CREDIT: KOON-KIU YAN AND NITIN BHARDWAJ
Yan et al. have compared the transcriptional control network in the bacterium Escherichia coli to the network depiction (known as the call graph) of the Linux kernel, which is the central component of a highly popular operating system. Both systems feature (i) master regulators (yellow in the graphic), which send directions to targets; (ii) middle managers (red), which both send and receive orders; and (iii) workhorses (green), which are controlled but do not control others. For the bacterium, there are many workhorses but relatively few regulators at the other levels. The Linux call graph is top-heavy; that is, it is more populated at the master regulator and middle-manager levels. In other words, a workhorse in the transcriptional network usually has only a few supervisors, but in Linux, a workhorse answers to a large number of regulators.
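The three-level hierarchy above can be read as a simple rule on a directed graph: a node that only sends control edges is a master regulator, one that both sends and receives is a middle manager, and one that only receives is a workhorse. A minimal sketch of that rule (a toy illustration, not the authors' analysis pipeline; the `classify` function and the miniature example network are assumptions of mine):

```python
def classify(edges):
    """Classify nodes of a directed graph by the three-level scheme.

    edges: iterable of (source, target) pairs, where source controls
    (or calls) target. Returns a dict mapping node -> level name.
    """
    sources = {s for s, _ in edges}   # nodes that send orders
    targets = {t for _, t in edges}   # nodes that receive orders
    levels = {}
    for node in sources | targets:
        sends = node in sources
        receives = node in targets
        if sends and receives:
            levels[node] = "middle manager"
        elif sends:
            levels[node] = "master regulator"
        else:
            levels[node] = "workhorse"
    return levels

# Hypothetical miniature network: A controls B; B controls C and D.
example = [("A", "B"), ("B", "C"), ("B", "D")]
print(classify(example))
# A comes out as a master regulator, B as a middle manager,
# and C and D as workhorses.
```

On a real call graph or regulatory network one would count many in- and out-edges per node rather than mere membership, but the degree-based classification is the same idea.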
This is quite usual: living organisms exist in separation, without many regulable options or connected interfaces (only sexual exchange is regulable; feeding systems are automatic).
The authors also contrasted evolution in the two systems by looking at the functions that persist in 24 versions of the Linux source code relative to genes that persist in 200 phylogenetically distinct bacteria. For E. coli, the workhorses showed the greatest persistence, whereas for Linux, there was persistence at all three levels, but mostly in the master regulators and middle managers.
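The persistence measure described here can be thought of as the fraction of snapshots (kernel versions, or phylogenetically distinct genomes) in which an element appears. A toy sketch under that assumption (the `persistence` function and the hypothetical version sets are illustrative, not the paper's actual data or method):

```python
def persistence(snapshots):
    """Fraction of snapshots in which each element appears.

    snapshots: list of sets of element names (e.g. kernel functions
    per Linux version, or genes per bacterial genome).
    Returns a dict mapping element -> persistence in [0, 1].
    """
    counts = {}
    for snap in snapshots:
        for elem in snap:
            counts[elem] = counts.get(elem, 0) + 1
    n = len(snapshots)
    return {elem: c / n for elem, c in counts.items()}

# Hypothetical three "versions" of a code base:
versions = [
    {"init", "sched"},
    {"init", "sched", "probe"},
    {"init"},
]
scores = persistence(versions)
print(scores)  # "init" persists in all three snapshots (score 1.0)
```

Grouping these scores by the three levels from the classification above would then reveal which level is most conserved in each system.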
Basically, the claim is that conservation in programming is more intensive and extensive than in natural processes of any kind. But nature is not meant to serve static purposes, and software is not designed to live and adapt constantly, nor to replicate itself. Nature uses static cycles only where the principles of grounding require it; if chaos were required to implement language interchanges, nature would implement chaotic repositions in order to survive.
The authors interpret these differences in terms of two design principles. The need for cost-effectiveness (or reusability) is central in programming, and robustness (that is, resistance to breakdown due to failure of a part) is the driving factor in biological systems.
Not a very convincing insight. Both systems need long- and short-term adaptability (a concept that subsumes both reusability and immediate exception handling; basically, hacking the environment can stand for this too). Reusability is not the pilot of programming; it is just the easiest paradigm for implementing code storage. Nor does resistance to breakdown seem to stand for much in biological systems. For example, if we think about swapping RTB for RUB in computers, we see that RTB is of the same importance as RUB, because RUB systems on a platform that is not very RTB are not very UF. I regard recyclability as the main law in nature (embedded even in coupling, in languages, and in food chains, and basically in both HGo2F-CFc2C breathing systems construing Earth's main living platform), and in computers, conservativity and communication. Software created without any communication with the outer world is not software. Finally, computers are just an evolution of nature, not in the sense of progress but in the sense of layer occupation and diversity management. So the difference between these two platforms can spring from any kind of transformative between the human-electronics-networks layer and nature's layer of basic chemistry-conversion primitives.
Evolution, they speculate, goes from top to bottom in software, but from bottom to top in biological systems.
Really just speculation. I consider evolution, at least in software, to be driven; personally I feel it is bug-driven, and mostly just a copying of the outer world into the schemes, for distinct purposes. Basically, I find the evolution of one distinct object per time and per project very natural, and its being plastic in the abstraction and convolution phases natural for the same reason. Finally, let us speculate about where the top and the bottom are in biological systems, or philosophize about whether there is even a hierarchy in nature. Not really good stuff for science :)