DARWIN’S BIG IDEA
Since the dawn of history, the eye and other complex biological systems had baffled materialists. How could they exist without a designer? That changed in 1859, when the naturalist Charles Darwin published his revolutionary book, On the Origin of Species. The big idea in Darwin’s book was that life in all its complexity came about by a process he called natural selection. In other words, according to Darwin, no designer is needed. Materialists were elated.
Darwin postulated that natural selection was entirely responsible for the complexity of organs like the eye, addressing the issue in a special section entitled “Organs of Extreme Perfection and Complication.”
There Darwin argued that the eye might have developed in any number of ways, reasoning that even a partially developed eye would offer a creature some evolutionary advantage.
His explanation for the gradual development of such complex systems certainly had its critics, but by and large his ideas were embraced because they helped to explain a great deal of the observable phenomena of our world.
As the evolutionary movement grew, a great deal of evidence seemed to confirm Darwin’s theory—the kind of evidence presented in high school textbooks. Adaptability, survival of the fittest, and other Darwinian tenets are clearly demonstrable within a given species. Materialist Richard Dawkins remarks of Darwin’s acceptance among most biologists, “Today the theory of evolution is about as much open to doubt as the theory that the earth goes round the sun….”1
As an atheist, Dawkins seems to applaud Darwin as the hero behind a purposeless world of chance. He writes, “Darwin’s theory of evolution by natural selection is satisfying because it shows us a way in which simplicity could change into complexity, how unordered atoms could group themselves into ever more complex patterns until they ended up manufacturing people. Darwin provides a solution, the only feasible one so far suggested, to the deep problem of our existence.”2
Since Darwin’s theory was conceived in the mid-nineteenth century, before the discovery of DNA and the intricacies of how life works at the molecular level, there was no scientific evidence at the time to refute his claims. By the mid-twentieth century, Darwinism had gained widespread acceptance, but mounting evidence persuaded some scientists that his theory was incapable of accounting for life’s intricate complexity.
This led to a series of meetings where scientists from various disciplines attempted to hammer out a coherent and unified theory of evolution. The result was called the “evolutionary synthesis,” also known as Neo-Darwinism.
But as Dr. Michael Behe, associate professor of biochemistry at Lehigh University, notes in his book Darwin’s Black Box, “One branch of science was not invited to the meetings [that produced the evolutionary synthesis], and for good reason. It did not yet exist.”3 Behe is referring to his own field of study, biochemistry.
Behe’s field did not begin until later in the century, after the advent of the electron microscope. Yet biochemistry is perhaps the most critical of all the disciplines for this study, because it analyzes life at the cellular level and observes the molecular foundations of living organisms.
If Darwin’s general theory of evolution is a valid explanation of how life can develop wholly apart from outside intelligence, then it must be shown to work at the molecular level. But does Darwin’s theory hold up under such scrutiny?