
Proving the power of Darwin's theory to the most hardened sceptic




The new theories of evolution



Steve Jones

Last Updated: 8:23PM BST 16 January 2009



Darwin's ideas are being used by scientists to develop new drugs and plan phone networks, says Steve Jones


Last week I found myself answering hard questions about evolution in the echoing halls of an organisation established by an anti-evolutionist.


The Natural History Museum (founder: Richard Owen, described by the normally mild Charles Darwin as "spiteful, extremely malignant, and clever") was hosting a "question time".


Beside me on the platform were the distinguished biologists, Richard Dawkins and Lewis Wolpert, and in front of us, beneath the vast skeleton of Dippy the Dinosaur, several hundred delegates of the annual GECCO meeting.


Nothing to do with extinct and terrible lizards or their modern relatives with sticky feet, but members of the Genetic and Evolutionary Computation Conference, whose main sessions were held at University College London.


Computers have long been used to model biological evolution (and Dawkins himself has played a part in this) but Darwin would have been amazed at the ways his ideas are being used by computer scientists to solve non-biological problems.


Evolution works, in the factory as much as the field; and military tactics, automatic music transcription and marine architecture - just some of the topics discussed - prove that Darwin's notion of unintelligent design can sometimes beat the most expert engineer.


The theoreticians use evolutionary robotics, genetic algorithms and their relatives to mimic the notion of descent with modification.


The equivalents of mutation, sex and natural selection crack challenges too complex for the fine scalpel of pure mathematics.


A set of rough and ready solutions compete within the machine, are randomly changed and reshuffled until, after many generations of breeding from the winners, a vastly improved version emerges.
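The loop described here (rough solutions compete, are randomly changed and reshuffled, and the winners breed) can be sketched in a few lines. The following is an illustrative toy, not anyone's production code; the bit-counting objective and every parameter are invented for the demo:

```python
import random

def fitness(bits):
    # Toy objective: score a solution by its count of 1-bits.
    return sum(bits)

def evolve(pop_size=20, length=16, generations=60, seed=0):
    rng = random.Random(seed)
    # A set of rough and ready solutions to start from.
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Breed from the winners: keep the fitter half of the population.
        pop.sort(key=fitness, reverse=True)
        winners = pop[: pop_size // 2]
        children = []
        while len(winners) + len(children) < pop_size:
            a, b = rng.sample(winners, 2)       # "sex": one-point crossover
            cut = rng.randrange(1, length)
            child = a[:cut] + b[cut:]
            child[rng.randrange(length)] ^= 1   # random mutation
            children.append(child)
        pop = winners + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # after many generations, close to the optimum of 16
```

Nothing in the loop "knows" what a good answer looks like; selection pressure alone drags the population toward it, which is exactly the unintelligent-design point.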


On the wilder shores of algorithmia rest papers on The Induction of Fuzzy Rules with Artificial Immune Systems and on Using DNA to Generate 3D Organic Art Forms - a picture from the fertile computer of William Latham, of Goldsmiths College, London, one of the authors of that idea, can be seen above.


A whole group of intriguing conference titles turns on the Biblical injunction to "Go to the ant, thou sluggard; consider her ways, and be wise".


Much wisdom has emerged from studying those busy beings.


Ant colony optimisation, as the technique is called, turns on the fact that such insects exploit food in what appears to be an intelligent, but is in fact an entirely mindless, way.


One ant stumbles on a tasty item. It brings a piece back to the nest, wandering as it does, and leaving a trail of scent; a second tracks that pathway back to the source, making random swerves of its own; a third, a fourth, and so on until soon the active little creatures converge on the shortest possible route, marked by a highly-perfumed highway along which they scurry with every impression of planning.


As the food dwindles, one by one the animals give up, and the track slowly evaporates.


The computer scientists fill their machines with virtual ants and give them the task of finding their way through a maze or graph, leaving a coded signal as they pass until, just like the ants, the fastest route emerges.
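In miniature, the pheromone feedback works like this. The sketch below is a deterministic caricature (real ant colony optimisation routes random individual ants): a unit of ant traffic splits between two routes in proportion to their pheromone, scent evaporates everywhere, and shorter trips deposit more. The graph and all constants are invented:

```python
# Two routes from nest to food: the short one has length 2, the long one length 4.
lengths = {"short": 2.0, "long": 4.0}
pheromone = {"short": 1.0, "long": 1.0}

EVAPORATION = 0.9   # fraction of scent surviving each round
DEPOSIT = 1.0       # scent laid per unit of traffic, scaled by 1/length

for _ in range(100):
    total = pheromone["short"] + pheromone["long"]
    for route in pheromone:
        share = pheromone[route] / total              # fraction of ants taking this route
        pheromone[route] *= EVAPORATION               # the old trail slowly evaporates
        pheromone[route] += share * DEPOSIT / lengths[route]  # short trips lay stronger scent

print(pheromone["short"] > pheromone["long"])  # True: traffic converges on the short route
```

The positive feedback (more scent attracts more traffic, which lays more scent) is the whole mechanism; no ant, real or virtual, ever compares route lengths.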


The technique is used in planning the most efficient design of a phone network, the best use of the gates at Heathrow and the management of wireless messages through a grid of receivers. In the phone system, for example, each message leaves a digital scent-mark as it passes through a node and, as it builds up, the fastest track soon attracts the most traffic.


A few of the papers emphasised the lessons to be learned from nature; from real, rather than electronic, insects and, as a mere biologist struggling to understand computer-speak, that is a relief.


Ants do behave in a remarkably digital fashion. The ground around a colony is at its busiest on a warm day, with plenty of food around.


Its inhabitants decide en masse on the best strategy. A few go out as patrollers, returning to the nest only when they find some food. Back home, they are sniffed at by their fellows - and, as ant scent changes when they are out in the open the stay-at-homes can sense how many explorers are returning.


However, those domestic types need several hits on a patroller, spaced a few seconds apart, before they will venture outside. A shortage of food, or a hungry bird, means that not many of the wanderers return and the colony stays quiet; but once a threshold is reached a hungry mass suddenly pours out - and woe betide any tasty grub that gets in the way.
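That "several hits, spaced a few seconds apart" rule is itself a tiny algorithm. A sketch in Python, with the required contact count and the time window invented for illustration:

```python
def should_forage(contact_times, needed=3, max_gap_s=10.0):
    """True once `needed` successive patroller contacts arrive, each within
    `max_gap_s` seconds of the previous one (all numbers are illustrative)."""
    if needed <= 1:
        return bool(contact_times)
    run = 1
    for prev, cur in zip(contact_times, contact_times[1:]):
        # Extend the run of closely spaced contacts, or reset it.
        run = run + 1 if cur - prev <= max_gap_s else 1
        if run >= needed:
            return True
    return False

print(should_forage([0, 4, 7]))     # True: three contacts a few seconds apart
print(should_forage([0, 40, 80]))   # False: contacts too sparse, colony stays quiet
```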


Even bacteria go in for a form of group-think based on mathematical rules.


A certain bug that gets into many hospital patients does little harm until it indulges its unpleasant talent of forming a sticky layer over a wound or a lung, with fatal results.


It's the swarming ant effect again: when the bacteria are rare there is no point in their holding hands with a neighbour to make a sheet because no more than a few cells would be linked. They divide instead.


As they do, they pump out a chemical signal, and as numbers rise that reaches a threshold. The billions of individuals then suddenly join into an adhesive and lethal film.
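The threshold logic is simple enough to write down. A toy model of the switch (the signal strength, threshold, and doubling rule are invented numbers, not real microbiology):

```python
def biofilm_forms(cells, signal_per_cell=0.01, threshold=1.0, blocker=0.0):
    # Signal concentration rises with population density; a signal-blocking
    # drug scales down the concentration the cells perceive.
    return cells * signal_per_cell * (1.0 - blocker) >= threshold

population = 1
while not biofilm_forms(population):
    population *= 2   # below the threshold, the cells just divide

print(population)                              # 128: first size at which the film forms
print(biofilm_forms(population, blocker=0.5))  # False: blocked signal, no film
```

The second call is the drug-design idea in one line: the population is unchanged, but with the signal molecule half-blocked the threshold is never crossed.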


The latest idea, an exciting one in these days of antibiotic resistance, is to design drugs that block the signal molecule and might save the patient.


Strangely enough, one candidate for the job is garlic extract - which for bacteria, as for Britons, sends out a chemical message of mutual repulsion. Genetic algorithms are already used in drug development and the mathematicians will, no doubt, soon be on the garlic case too.


Evolution in the computer may soon overcome evolution in the real world, as bacteria and their digital equivalents use mutation and natural selection to defeat the challenges that human ingenuity throws at them.


That should prove the power of Darwin's theory to even the most hardened sceptic.


Evolution of Evolution: "The New Theories of Evolution"



A recent post at telegraph.co.uk purported to offer updated "proof" for the theory of evolution. You know, the "utterly proven" theory about the origin of the species that millions of scientists have dedicated their lives to re-proving, much the way they do other "Laws" of science such as gravity, entropy, and displacement...er...oh, wait.


Be sure to read his article before continuing.


As proof, the author offered several contrived examples: from modern technology to insect and bacterial behavior to show how the evidence is mounting in such a way that, soon, even the "most hardened skeptics" will become convinced of the truthfulness of evolution. Waiting...waiting...hm...still no.


The first absurd claim comes when the author claims that, "Computers have long been used to model biological evolution." Since I am a computer programmer, let me offer my opinion on this.

The computer that could truly model even the splitting of a single cell into two cells has not yet been invented. No computer existing today has the memory or processing power it would take even to begin the insanely complex task of modeling biological reproduction, the foundation of "evolution". Though you may read about today's supercomputers being used to simulate "protein folding", keep in mind that protein folding is to single-cell division what memorizing ten decimal digits is to quantum physics and doctorate-level calculus. The two can hardly be compared, even though the first is a building block of the second.

Secondly, all digital systems that simulate physical systems must do so at pre-determined "resolutions". In the world of 3D graphics, for instance, as a car speeds towards a wall, the picture of the car and wall may be drawn 10 times, 100 times, or 1,000 times depending on how fast the "real-time" engine can satisfy the "geometry engine's" will to keep the car in the right place given the amount of time elapsed. If the computer can redraw the complex vertices and shaders that make up one frame only twice every second, then a very choppy presentation will show the car hitting the wall in 3 seconds. If the computer can draw 10 frames per second, the display will show a much smoother rendition of the car hitting the wall, but still in just 3 seconds. If the display can render 32 or more "fields" (half-frames, every other "scan line"), then the display will actually fool the human eye into believing that it is watching a "full-motion" rendering of the car hitting the wall. However, this isn't actually true: in 3 seconds' time, only about 100 frames can be rendered at 32 fields per second.

But if you've ever seen high-speed photography of a crash dummy hitting an airbag or a bullet flying from the barrel of a gun, then you know that even at a million frames per second, small changes are reflected in EVERY FRAME! So how many states you choose to measure in a second determines how many states you capture and take into account. In biological and physical systems, these states are the moments in time that you are watching; between these moments, the vast majority of time passes without observation. If you're watching a tortoise crawl through the sand, then 32 frames per second will certainly suffice. You can pretty much guarantee that the tortoise didn't run off and get a burger and fries between frames 17 and 18. If, however, you're watching the blindingly fast reactions between the molecules that make up the proteins that make up a strand of DNA interacting with an mRNA strand, then a million frames per second doesn't do it justice at all. The only way to model such complexity in our day is to elongate time.
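The frame-rate arithmetic above can be made concrete. A small illustration, using the 3-second car-vs-wall event and the sample rates already mentioned:

```python
def frames_rendered(duration_s, fps):
    # Number of discrete states a simulation captures at a given sample rate.
    return int(duration_s * fps)

for fps in (2, 10, 32, 1_000_000):
    gap = 1.0 / fps   # unobserved time between consecutive samples
    print(f"{fps:>9} fps -> {frames_rendered(3, fps):>9} frames, {gap:.2e} s between them")
```

Whatever the rate, the gap between samples is nonzero, which is the point being made: the simulated world is only ever observed at isolated instants.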


A former manager of mine used to work for Amdahl designing I/O bus architectures. He said that they ran a weeks-long software simulation of one of their chips, but only simulated a total of 6 seconds' worth of actual I/O once the chip was finally constructed. Software is much, much slower than hardware. Even more so, software is infinitely slower than the real world. No matter what resolution you choose to measure a physical system at, virtually all time passes between your measurements. Any attempt to circumvent this method of modeling and optimize the output is merely a model of one's assumptions, not of the real world. The more optimized the output, the less of a true model it is and the more of a model of one's assumptions it is.


All that to say this: if someone today claims to use computers to simulate evolution, you must parse their words. They purport to have modeled unspeakable numbers of creatures over eons of time and arrived at a result in short order. In order to do that, your "model" would have to be nothing BUT assumptions. Thus, if you write a program to tell you that evolution is true, then don't be surprised if it does!


The second absurd claim is that computer and telephony networks exercise "evolution" when passing data through them. Now, every network programmer knows what a "trace route" is. In Windows, the tracert command echoes back the "route" used for data to travel from your computer to some remote computer and back, along with the elapsed time between each hop. Most also know that if you tracert the same host on different days, it will often result in a different route than before.

If I'm getting data from a computer in India, the data might make 20-30 "hops" from one router to another along the internet backbone. The request for the data goes from my computer to my local router, usually in the same building. From there it "hops" to my internet service provider: Comcast, Earthlink, AOL, or some other provider. After 2 or 3 "hops" at Comcast, it goes out to the real internet and finds its way to the host computer via a series of hops that tend to move it geographically closer and closer to its destination. Finally it's received by their internet service provider, forwarded to their company's router and finally to the specific computer that has the data I want. The data then takes the reverse route, usually, but not always, identical to the request route. But all those "hops" hit routers in between Comcast's and, say, Bangalore Electronics's routers. Those public routers are "learning" systems in that they are constantly measuring the response times from various other routers and re-routing packets of data around slower or over-burdened routers to keep the total "latency" (response time) low. When routers fail to auto-optimize routes, human administrators can go in and define "static routes" to override the router's artificial intelligence and assert a better plan.
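The "learning" described here is just latency bookkeeping; nothing in the code need be evolutionary. A minimal sketch of latency-based next-hop selection, with made-up router names and latencies:

```python
# Recently measured round-trip latencies to each neighbouring router (invented values).
latency_ms = {"router_a": 40.0, "router_b": 25.0, "router_c": 60.0}

def next_hop(measurements):
    # Forward traffic toward whichever neighbour is currently fastest.
    return min(measurements, key=measurements.get)

print(next_hop(latency_ms))        # router_b
latency_ms["router_b"] = 90.0      # router_b becomes congested...
print(next_hop(latency_ms))        # ...and traffic re-routes to router_a
```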



Breathe! Ok, now what Dr. Chimp is trying to foist on the public is that this auto-optimization, similar in all digital networks, is proof of evolution!!! These systems were designed from the ground up to do exactly what they do, and this is proof of evolution! Yikes!


The third absurd claim is that the behavior of colonies of ants proves evolution! Now admittedly, when he claims it proves evolution, he's talking about the tendency of randomness to produce an improved system. The problem is that the things he states as randomness could easily be rephrased as "applied design". The scent trail, foraging specialists, carriers, and sudden, mass, hunger-based movements of colonies that he observes are all programmed responses using intricate instruments that pre-exist the duration of his experiment. In this clever, misleading way, the bar for "proving evolution" gets set so low that a child cutting across a field to get home proves evolution!!!


The fourth example is not any better. That bacteria in clusters suddenly change their behavior in a way that kills their host and thus, ultimately themselves is, by no means, producing a better system. I think he kind of lost focus towards the end of his chat.


If this is what passes for proof of evolution amidst the echoes of learned professors in the higher halls of learning these days, then I'm not all that worried about a sudden shift towards the hopeless world view of evolution. And by the way, Steve, your article is misnamed.


P.S. - Keep an eye out for Ben Stein's new movie, Expelled. It is documented proof of the sleight of hand and scholastic intimidation that is commonplace in the scientific community today.


A 'genetic algorithm' can be viewed as a computational model of some basic genetic mechanisms such as reproduction, crossover, and mutation. GAs are used in computer simulations to solve complex problems and indeed work very, very well. GAs also demonstrate that evolution need not be a gradual process, but can instead be characterized by sudden bursts of change spreading through a population of solutions! However, any model is by definition a simplification of reality. Therefore GAs can at best provide evidence in favor of the theory of biological evolution; never absolute proof.



The vanaras ("ape-like humanoids") of the Ramayana; they surely must be a species in between apes and men.


Actually, there were a lot of sub-human species which lived alongside Homo sapiens (that's modern man) right up to the last ice age, which ended about 10,000 years ago.


As a matter of fact, if you were to travel to Europe some 40,000 years ago, you could meet up with Neanderthals; if you travel to China - Peking Man; if you travel to Indonesia - Java Man; and if you travel to the Philippines - the hobbits. There were at least SIX sub-human species in existence over the past 5 million years.


Any one of them could be the vanaras of the Ramayana.

