Many people feel comfortable today not believing in God because they feel that the great complexity evident in the world can be fully explained by the Darwinian theory of evolution.  But does that really make sense?  Does it even pass the smell test, or is it a sort of 19th-century fable, kept desperately alive by a group of biologists we are simply supposed to trust on blind faith?

No analogy is perfect, but let’s consider one that should at least get the gears turning a bit.

Darwin's theory of evolution posits that organisms can give rise to new species ("speciation"), even species of greater complexity, through two processes:  random mutation and natural selection.  Mutations can arise from radiation, from copying errors during cell replication, and so on.  Natural selection covers the idea that a mutation beneficial to the organism is likely to be retained, while a detrimental one is likely to be weeded out.
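
To make those two processes concrete, here is a minimal sketch of the mutate-and-select loop in Python.  Everything in it is illustrative: the genome is a toy string, the mutation rate is made up, and the fitness function is a stand-in, since the theory itself does not hand us a formula for fitness.

```python
import random

ALPHABET = "ACGT"

def mutate(genome: str, rate: float = 0.01) -> str:
    """Copy the genome, introducing a random error at each position with the given rate."""
    return "".join(
        random.choice(ALPHABET) if random.random() < rate else base
        for base in genome
    )

def fitness(genome: str) -> int:
    """Toy stand-in: reward 'G' content. A real organism's fitness is not a known formula."""
    return genome.count("G")

def generation(genome: str) -> str:
    """One round of mutation plus selection: keep the mutated copy only if it scores at least as well."""
    candidate = mutate(genome)
    return candidate if fitness(candidate) >= fitness(genome) else genome

genome = "".join(random.choice(ALPHABET) for _ in range(100))
for _ in range(1_000):
    genome = generation(genome)
print(genome)
```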

Now, there’s a lot of hand-waving here.  First, since mutations are more often detrimental than beneficial to the organism, not every mutation is going to promote evolution.  Second, it is not shown that natural selection retains beneficial mutations faster than it fails to reject detrimental ones.  (In other words, since mutations tend to be detrimental more often than beneficial, one would expect a cumulative effect over time that looks like genetic entropy rather than evolution.)
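
The bookkeeping behind that parenthetical can be written out.  With hypothetical rates for beneficial and detrimental mutations and an imperfect selection filter, the expected fitness change per mutation event is a two-term sum; whether it comes out positive or negative depends entirely on the numbers you assume, which is exactly the point, since the theory is rarely presented with those numbers attached.

```python
# All parameters below are hypothetical placeholders, not measured values.
p_beneficial = 0.001   # fraction of mutation events that help
p_detrimental = 0.010  # fraction of mutation events that hurt
gain = 1.0             # average fitness gained from a beneficial mutation
loss = 1.0             # average fitness lost to a detrimental mutation
miss_rate = 0.2        # fraction of detrimental mutations selection fails to reject

# Expected fitness change per mutation event:
expected_change = p_beneficial * gain - p_detrimental * miss_rate * loss
print(expected_change)  # -0.001 with these numbers: losses leak in faster than gains arrive
```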

But consider a simple analogy.  Let’s say we have a blank hard disk.  Information is stored on the disk, at the lowest level, as 0s and 1s, with each bit represented by the orientation of magnetization on the medium.  These bit states can be unintentionally flipped, for example by stray radiation.  The hard disk also has a piece of software called firmware installed on a chip, with algorithms that assist in reading from and writing to the disk and in correcting data errors.  Let us say, for the sake of argument, that the firmware includes a program that simulates natural selection:  it can detect bit-level changes that are beneficial for the overall system and keep them, while throwing out the changes that are harmful.
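
Here is a minimal sketch of that hypothetical firmware, assuming the disk is a plain byte array and, crucially, assuming that "beneficial for the overall system" can be collapsed into a single score.  That second assumption is doing enormous work: the toy score below just counts set bits, because no real firmware has any way to measure whether a disk image is closer to being a working operating system.

```python
import random

def system_score(disk: bytearray) -> int:
    """Toy stand-in for 'beneficial to the overall system': count the set bits.

    This is the hand-waved part. Nothing on a real drive can score how
    close an arbitrary pile of bits is to a bootable OS.
    """
    return sum(bin(byte).count("1") for byte in disk)

def selection_step(disk: bytearray) -> None:
    """One cycle of simulated natural selection: flip a random bit, keep it only if the score does not drop."""
    i = random.randrange(len(disk))
    mask = 1 << random.randrange(8)
    before = system_score(disk)
    disk[i] ^= mask  # the 'radiation-induced' mutation
    if system_score(disk) < before:
        disk[i] ^= mask  # detrimental change: roll it back

disk = bytearray(1024)  # a tiny 'blank' disk
for _ in range(10_000):
    selection_step(disk)
```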

My question is:  how long would it take, through nothing but those random radiation-induced flips, to go from a blank hard drive to a fully functional Windows 10 system?  How much time would it take to develop that level of complexity purely from random mutations to the 0s and 1s?

If that seems like an unfair question, then let’s start with a hard drive that already has Windows for Workgroups 3.11 installed.  How many years of random radiation effects and copy errors would it take for that to evolve into Windows 10?
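
For a sense of scale, here is some hedged back-of-envelope arithmetic.  Suppose, very generously, that only about 100 MB worth of bits need to end up different between the two installs, and that random flips arrive at a billion per second.  These figures are assumptions picked for illustration, not measurements of either operating system, and they ignore selection entirely; they only show how large the raw space of bit patterns is.

```python
import math

# Hypothetical sizing: assume only ~100 MB of bits must differ between
# Windows 3.11 and Windows 10 (a huge simplification in evolution's favor).
differing_bits = 100 * 1024 * 1024 * 8

# The region admits 2**differing_bits distinct bit patterns; the integer is
# unwieldy, so we work with its order of magnitude instead.
log10_patterns = differing_bits * math.log10(2)

# Assume a billion random bit flips per second, running since the Big Bang.
flips_per_second = 1e9
seconds_since_big_bang = 13.8e9 * 365 * 24 * 3600
log10_flips = math.log10(flips_per_second * seconds_since_big_bang)

print(f"bit patterns over the differing region: ~10^{log10_patterns:,.0f}")
print(f"random flips since the Big Bang:        ~10^{log10_flips:,.0f}")
```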

An evolutionary biologist might simply answer, “Oh, give it a trillion years and it’s possible!”  But to be honest, I don’t think it could ever happen.  It just doesn’t make sense.

People are biased toward believing in evolution because they see progress all around them and project it onto biology.  We used to have rotary phones; now we have smartphones.  Evolution!  We used to have aqueducts; now we have indoor plumbing.  See, evolution!  But the move from rotary phone to smartphone did not happen through random mutations; it was the application of human intelligence that drove the progress, and the increase in complexity is explained by that intelligence.

(And for the record, a single human cell is significantly more complex than even a smartphone.  In fact, a single human cell arguably holds more information than the entire Windows 10 source code, if you convert its contents to a computer storage measurement.  The complexity of a human cell was entirely unknown in Darwin’s time.)
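
For the storage conversion, one common back-of-envelope estimate, with the caveat that it counts only the raw DNA sequence and nothing else in the cell: the human genome is roughly 3.2 billion base pairs, and each base is one of four letters, which is two bits.

```python
base_pairs = 3_200_000_000  # approximate length of the human genome
bits_per_base = 2           # four possible bases (A, C, G, T) = 2 bits each

genome_bytes = base_pairs * bits_per_base / 8
print(f"~{genome_bytes / 1e6:.0f} MB of raw sequence per cell")  # ~800 MB
```

And that 800 MB is just the sequence; the machinery that reads, error-corrects, and executes it sits on top of that.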

What do you think:  does it make sense?  Going from a fish to a human is a far greater increase in complexity than going from Windows 3.11 to Windows 10.  And I can tell you that anyone who thinks the latter could happen through random bit flips simply does not understand how source code works.  So what does that say about a biologist who believes the former is possible?