Bruce Eckel posted on Threading Terminology over at artima.com. The beauty of the post is in following along as Bruce intuitively wrestles with Java 1.5 threads as a service to his Thinking in Java readers. As with most popular blogs, the discussion that follows is interesting. A common thought: despite the power and clarity of the Java 1.5 threading model, the same old pitfalls remain.
Multithreaded Applications: asynchrony - a blessing or curse?
We have the power to create a new thread in our code. We create a thread and send it on a mission. Our new thread is J.E.B. Stuart, and his task is to find the Union's flank and then report back immediately. Meanwhile the main thread can continue its march towards Washington. At this point we have two elements operating asynchronously, but eventually the main thread may have to wait for Stuart's report. Here the strategy begins to unravel: the main thread has to halt its advance and await notification from Stuart.
As the wait proceeds, the main thread may have to choose between retiring from the field and proceeding on to battle. One problem is that the main thread still has a responsibility to Stuart: he cannot be left orphaned precariously near the Union's lines. In any case, Stuart must eventually be ordered to terminate his patrol.
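Bruce's post is about Java 1.5, but the choreography is the same in any threading API. Here is a minimal sketch of the campaign in Python (the language I have been tinkering with lately); the scout function, the report, and the timeout are all my own inventions:

```python
import threading

report = []                     # shared state: Stuart's report lands here
recall = threading.Event()      # the order to terminate the patrol

def scout_the_flank():
    """Patrol until the flank is found or the recall order arrives."""
    if not recall.is_set():
        report.append("Union flank found")   # illustrative reconnaissance

stuart = threading.Thread(target=scout_the_flank, name="jeb-stuart")
stuart.start()                  # send the scout on his mission

# ... the main thread continues its march towards Washington ...

stuart.join(timeout=5.0)        # halt the advance and await notification
if stuart.is_alive():           # no report came: he must not be orphaned
    recall.set()                # order him to terminate his patrol
    stuart.join()

print(report)
```

Even this toy has to answer the hard questions: how long do we wait, and who tells Stuart to come home?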
The essence of multithreaded apps is independence and asynchrony, yet the reality is that lines of communication must be maintained between threads, and protocols must be established to handle the situations that will arise.
Synchrony - the beating heart of time
Many of the elements of a threading service are synchronization objects: Mutexes, Semaphores, Critical Sections, Signals, Events, and Monitors. With a few simple lines of code we loose a fury upon the world, and then write reams of code trying to control this bastard child. Ultimately the synchrony of single-threadedness is revealed as more powerful than the asynchronous multithread. It is the only way to subdue the creature.
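To be fair to the mutex, the protocol is only a few lines when everyone honors it. A sketch, with a shared balance of my own invention:

```python
import threading

balance = 0
balance_lock = threading.Lock()      # the mutex we loosed upon the world

def deposit():
    global balance
    with balance_lock:               # every thread must honor the protocol...
        balance += 1                 # ...or the shared state is silently corrupted

threads = [threading.Thread(target=deposit) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert balance == 100                # holds only because the lock serializes access
```

One forgotten `with balance_lock` anywhere in the program and that assertion becomes a roll of the dice.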
Multiprocess or Multithread
Multithreaded programs have been identified as more performant and more scalable than their multiprocess counterparts. A good example is process-spawning web servers, such as CGI servers, versus threaded servers such as mod_perl. One very brilliant blogger whose URL I cannot recall (I will update this blog when I remember) suggests that multiprocess is far more robust than multithreaded. I especially enjoyed his term for hard-to-find multithreaded bugs: "heisenbugs".
However, if one process is dependent on the state of another process, synchronicity issues will still exist. What about multimachine distributed systems and such? Message queue applications are used to sync up the constituents.
In conclusion - good, bad, indifferent or just reality.
It is a good thing that programming languages and platforms continue to put a fine point on their support for multithreaded programs. Yet multithreaded support needs to exist in the viscera of applications that need threads. It is not a problem for language designers; it is a problem for application designers. It is a problem for very experienced application designers.
Monday, October 03, 2005
Friday, July 01, 2005
Optimization Patterns
The following is a reply to the post The Limits of the MVC Design Pattern.
It seems like the "Limit of the MVC Design Pattern" is that it defies optimization.
It is no surprise that it would be difficult to optimize a general solution such as an architectural framework or design pattern. Maybe we are stumbling across the need for "Optimization Patterns".
One example of an Optimization Pattern is refactoring, which is optimizing the structure of code.
One argument against the need to optimize is that performance is a shrinking concern. Sometimes it seems brain-dead to continually reload data-structures instead of caching, but reloading and requerying may have no effect on the bottom line - which is the "user experience".
New evidence suggests that software performance may be very important. Stored data is growing at incredible rates, rates that even surpass Moore's Law. This is good news for us geeks since complex solutions involving caches, multiple threads and efficient algorithms will be in demand.
So MVC promotes reuse and decoupling, which are good things, but what patterns promote optimization? One answer may be the Inversion of Control (IoC) pattern. Briefly, IoC is used in frameworks because it facilitates the design of configurable systems. Therefore "Optimization Patterns" really are "Flexibility Patterns".
Systems will have to be designed with extension and modification points to support changes not yet imagined.
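By way of illustration, here is a toy IoC arrangement in Python. The names (DirectLoader, CachingLoader, ReportView) are hypothetical - a sketch of the flexibility argument, not any particular framework:

```python
def query_database(key):
    return "row for %s" % key            # stand-in for a real (slow) query

class DirectLoader:
    """Requery the data store on every request."""
    def load(self, key):
        return query_database(key)

class CachingLoader:
    """Same interface, different trade-off: memory for speed."""
    def __init__(self, inner):
        self.inner, self.cache = inner, {}
    def load(self, key):
        if key not in self.cache:
            self.cache[key] = self.inner.load(key)
        return self.cache[key]

class ReportView:
    def __init__(self, loader):          # control is inverted: the view is
        self.loader = loader             # handed its dependency from outside
    def render(self, key):
        return self.loader.load(key)

view = ReportView(CachingLoader(DirectLoader()))   # optimization by configuration
print(view.render(42))
```

Swapping the loader changes the performance profile without touching the view - which is the sense in which flexibility buys optimization.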
Sunday, March 27, 2005
Last night I dreamt of trees...
of data, trees of reality.
To begin: yesterday I spent a good part of the day working on my latest project, which combines learning Python with brushing up on some basic data structures, such as Binary Trees. Maybe "brushing up" isn't a good way to describe my intentions. I am not satisfied with knowing that I can code up a Binary Tree whenever I care to. What I am looking for is a visceral, deep, intuitive understanding.
I coded up an AVL Tree, which is a balanced Binary Tree. In order to test the tree, I had to come up with some specific values to insert and then determine the resulting tree. This work was done not on a computer but with pen and paper. After several hours of sketching trees, I was suddenly struck by the limitation of the Binary Tree. As the tree is traversed from root to leaf, branches of the tree are eliminated from consideration. All that matters are the options that lie ahead. This is what makes Binary Trees so powerful for searching, but not so great for modeling activities. In real life, branches that connect nodes of existence come and go, seemingly at random. Nodes are connected and removed due to past events and probabilities. We seem to have the ability to skip around the tree of reality, to teleport amongst its various nodes.
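For the record, the heart of the exercise. This is a condensed sketch of the kind of Python I was checking on paper, not the project itself; any real AVL Tree needs deletion too:

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right, self.height = key, None, None, 1

def height(n):
    return n.height if n else 0

def rotate_right(y):                    # left-heavy: lift the left child
    x = y.left
    y.left, x.right = x.right, y
    y.height = 1 + max(height(y.left), height(y.right))
    x.height = 1 + max(height(x.left), height(x.right))
    return x

def rotate_left(x):                     # right-heavy: the mirror image
    y = x.right
    x.right, y.left = y.left, x
    x.height = 1 + max(height(x.left), height(x.right))
    y.height = 1 + max(height(y.left), height(y.right))
    return y

def insert(node, key):
    if node is None:
        return Node(key)
    if key < node.key:
        node.left = insert(node.left, key)
    else:
        node.right = insert(node.right, key)
    node.height = 1 + max(height(node.left), height(node.right))
    balance = height(node.left) - height(node.right)
    if balance > 1 and key < node.left.key:     # left-left
        return rotate_right(node)
    if balance > 1:                             # left-right
        node.left = rotate_left(node.left)
        return rotate_right(node)
    if balance < -1 and key >= node.right.key:  # right-right
        return rotate_left(node)
    if balance < -1:                            # right-left
        node.right = rotate_right(node.right)
        return rotate_left(node)
    return node

root = None
for k in [10, 20, 30]:     # a straight line of keys...
    root = insert(root, k)
print(root.key)            # ...rebalanced so that 20 is the root
```

Sketching those rotations by hand is exactly where the pen-and-paper hours went.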
But reality is a Binary Tree. Consider the effect of time on reality.
Time delimits and defines reality. Choices are made, paths are taken. We don't have to concern ourselves with options on paths we are not on. Reality is a Binary Tree, could you imagine otherwise? The choices we make resolve to a path, which in turn resolves to a line. The time-line.
I looked at the smallest of Binary Trees, three nodes, one of which was the parent with its two children, and saw infinite possibilities. I looked at a main branch of a large tree and, even with all of its sub-trees to consider, all that came to mind was the removal of possibility, the refinement of the real.
So I went to sleep last night and I dreamt of trees. Elements of life, connected by branches now seen.
Wednesday, March 02, 2005
Temporal Anomalies
My favorite Star Trek: The Next Generation plots involve "temporal anomalies," in which the time-line is disrupted and the crew of the Enterprise is plunged into the grips of an alternate reality. In real life, temporal anomalies do exist - in multi-threaded software applications.
The most fundamental building block of computer algorithms is the sequential execution of code. Code is always executed sequentially for each "unit of execution" in a computer program. A unit of execution in a computer program is known as a thread.
A thread is a time-line in code, a unit of reality if you will. Maybe this is why programming with multiple threads seems so odd to me.
Operations on threads involve things like putting threads to sleep, which is equivalent to suspending time. Threads are "blocked" which is like having time run into a log-jam. A use for "blocking" is to make a thread wait so that it may "join" with another thread (time-line).
Note that other forms of reality, such as state and data (which are like matter), are accessible by all threads in a program. Therefore a protocol has to exist between threads to ensure that time-lines don't corrupt matter. For example, you would not want to go back in time and marry your parent (sorry for the visual).
Multi-threaded programs have their own set of common problems. One is "deadlock", in which time-lines conflict over access to matter and effectively block each other out. The other is the "race condition", in which the outcome depends on the unpredictable ordering of one time-line's actions relative to another's.
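A deadlock fits in a dozen lines. A contrived sketch - two time-lines taking the same two locks in opposite order:

```python
import threading, time

fork = threading.Lock()      # two pieces of "matter"...
knife = threading.Lock()     # ...contended by two time-lines

def diner(first, second):
    with first:
        time.sleep(0.1)      # give the other time-line a head start
        with second:         # each now waits on the lock the other holds
            pass

a = threading.Thread(target=diner, args=(fork, knife))
b = threading.Thread(target=diner, args=(knife, fork))
a.start(); b.start()
a.join(); b.join()           # never returns: the time-lines are frozen forever
```

The sleep makes the collision all but certain here; in real programs it happens just rarely enough to ruin your month.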
I blogged about object-orientation in Hammer Of The Gods, where I made the case that every "thing" can be classified as an object. None of the OO languages I use effectively model threads. Instances of objects have scoped life-times. Objects are just "matter" that exists on the time-line. Threads are the time-line. Threads transcend objects.
I have two dark visions regarding threads. One vision is of a thread, endlessly spinning through time, its time-line eternal, all context and state of interest long since extinguished. This happens, which is why the Windows Task Manager is a handy tool. The seconds it takes to launch the tool and shut down the errant task are many eternities to a lost thread.
The other vision is a nightmare involving the use of garbage-collected or reference-counted memory management. I see a thread, drifting through an empty, irrelevant universe, holding a reference to an irrelevant object. Partners in the void. Neither able to release the other from futility. Neither able to redeem the other's existence.
Wednesday, February 23, 2005
Hammer Of The Gods
Objects abound. There are objects that move and objects that are fixed; objects that reside within other objects and objects that contain other objects. Objects crawl, eat, defecate, breed, and die on the surface of other objects. Objects revolve in systems around other massive objects. What about thoughts, concepts, feelings? What about light, gravity, water, the wind? We can consider these objects as well.
So everything is-an Object (reminds me of a very popular programming language). Of course the term object is just an extremely general way to classify things. Software design involves the discovery and classification of objects that reside within a domain, and the creation of objects that serve to frame and facilitate the domain objects.
Object-Oriented Design is a tool and as a tool it has a very specific function. There is an old cliche, "if you have a hammer, everything looks like a nail". But everything is an object, therefore everything is-a nail, in this sense.
Certainly if we name an element in our domain or system, we are speaking of objects. A very powerful naming technique is the use of metaphor. Successful metaphors surround the geekosystem. We don't invent new words, we overload the ones we already have. Words like 'file', 'folder', 'menu', 'icon'. Objects, every one of them. We even overload verbs, for example, 'browse', 'click', 'surf'. One new verb that comes to mind is to 'google'. One 'googles' when at Google. Google is a service (metaphorically speaking), a service is an object.
'Naming' and the use of metaphor is a very important capability for software developers. I think we tend to underestimate the power of well-crafted names and metaphors. The word 'object', with regards to software development, is a great name. The term was coined in the 1960s and is still effective, relevant, and a source of mystery.
The power in the name 'object' comes from the abstract, vague and nebulous nature of the name. By under-specifying what an object is, by allowing the term 'object' to breathe, we give 'objects' great capability. So let's go the whole nine yards and say that everything is an object. It's true anyway.
It's important to be able to live in the abstract, but every now and again we need to touch concrete. The typical way to teach OO is to provide concrete examples and simple tenets such as 'an object is data and the methods that work with the data'. But objects live alongside electron-clouds and Heisenberg. Too much concrete - too much uncertainty. I think most of the breakthroughs in OO came from the early pioneers (Smalltalkers) who didn't have rules and concrete. They just created. And speaking of clouds, it kind of makes you wonder if Grady Booch was on to something with his cloud-shaped OO design notation. At least he gave Heisenberg his due.
Wednesday, February 16, 2005
Graphs and the Space-Time Continuum
I was given a small hammer when I was just a wee lad. With hammer in hand, I wandered my home searching for something to fix. Later, my father had to retrace my steps with wood-filler and paint, repairing various door-frames, sills and mouldings. I don't do much with hammers anymore; the tools of my trade are software technologies. When I learn something new, it's like getting a new tool, and I go in search of an application. This approach is a little non-optimal, but it is how I learn. My latest interest is the data-structure known as the graph. Like most software engineers, I learned about graphs in school. Graphs are connected networks of nodes. One of the key points is that the connectivity of nodes is established by having nodes refer to other nodes.
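To make that key point concrete, here is a graph in its entirety - connectivity is nothing but nodes holding references to other nodes. A minimal sketch, names my own:

```python
class Node:
    def __init__(self, name):
        self.name = name
        self.neighbors = []          # the edges: plain references to other nodes

    def connect(self, other):
        self.neighbors.append(other)
        other.neighbors.append(self) # undirected: the edge runs both ways

a, b, c = Node("a"), Node("b"), Node("c")
a.connect(b)
b.connect(c)                         # a -- b -- c
print([n.name for n in a.neighbors]) # ['b']
```

No library, no framework; just objects referring to objects.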
I have never worked with anyone who has ever had the need to create a graph, including the Binary Search Tree. Today, most languages have built-in data-structures such as lists, hash-tables, and vectors. Data is modeled and stored in a database. It is easy to consign the graph to the realm of academia and the exotic. As others discount the graph, I may find a way to use it to my advantage, to make it a permanent part of my toolkit.
To begin, I close my textbook and strap myself into my rocket and launch to low earth orbit. Now adrift in vacuum, away from code examples and technical detail, I once again view the graph. Ignoring the obvious, such as computer networks, air travel routes, and state-machines, what do I see? Anything that interacts with anything probably is a graph. One of my favorite tools is Object-Oriented design. Interacting objects are more exciting to me than functional flow. Interacting objects are nothing more than nodes in a graph.
Can I enhance my ability to model the world with OO by thinking in terms of graphs? Will graphs add a new perspective to my ability to analyze?
Next I dock with my star-ship and warp out of the solar-system. I subject the graph to various experiments. One thing to note is that a graph can exist in three dimensions. Perfect examples are Tinker Toys and Connects. But 3-D graphs can be squashed flat and still exist in 2-D. Therefore it is easy to surmise that graphs of many dimensions exist and can be represented in just two dimensions. Infinity in two dimensions.
Now I launch the graph at the event-horizon of a black-hole. As the graph compresses, the intent of each node becomes distorted. Yet the relationships remain. The differences between the nodes become less as they merge to a singularity and become one. One node, associated with itself. With trepidation, I leave the void, the memory of that brave graph emblazoned in my mind. I return home, reopen my textbook and begin my journey anew.
Thursday, January 27, 2005
Techie or Manager
I drove into work the other day listening to meditative incantations. As I drove, I tried to see the world as a child: water-towers became docked space ships, and streetlights became uni-podded aliens attempting to disintegrate me. Besides the fact that I am weird, why would I do this? No, I didn't smoke anything. It's an exercise I like to do when I am doing software design. I like to keep imaginative and open. This is important because software is soft. Software developers create their own realities. Maybe a little insanity is good for those who create alternate realities.
Anyway, I arrived at work, mentally prepared, but not for what was waiting for me. A note summoned me to my boss's office - kind of formal and all. I immediately itemized all my misdeeds of the past week or so. Nothing that I couldn't defend, or so I thought. Well, to cut to the chase, I was promoted - to Lead Software Engineer.
My heart sank. To begin with, my mentor, leader, and collaborator of 6 years was moving on. He was the interpersonal hub and arbitrator between the disparate elements of the project. He also had to do all the crappy admin stuff. He never really got to do much software design or code. I sort of led the software design and coding efforts as one of the project's Alpha Geeks.
As I thought about it I realized that I really had it made. I get to play with computers for a living. I get to devise, conceptualize and realize. I don't wanna be no manager. I am already living my dream. So I said 'No'. I told my manager that developing software is my passion and I meant it. You can't argue with that. I also mentioned that she could count on me while transitioning in a new Lead.
My heart rose like the mid-day sun.
I had made the right choice.
Tuesday, January 18, 2005
hello, world!
So what do I have to say that hasn't been said before? Well, I guess I am searching for something that seems real but doesn't exist. It's interesting to hear software developers talk about languages and design methodologies, because it ain't real. It's all abstraction. One thing all developers have is the ability to work with newly created layers of reality. I remember when "too abstract" was a problem. Now I say, "yeah, ain't it cool." Mathematicians have searched for meaning in the digits of pi. Physicists have searched for answers about the fabric of reality in quantum theory. Programmers also find the fabric of reality in code. So what happens when abstraction gets a hold of us? We look deeply into the machine and see not ones and zeros but a mystery. A mystery that we will try to solve but never will, because it's only real in our minds.