The history of human evolution is clear: we all came from Africa . . . or was it the Middle East? Or Europe?
Recently discovered fossils from Dmanisi, Georgia, may rewrite the prevailing evolutionary ideas about human origins. While scientists once believed that Homo erectus (a supposed ancestor of Homo sapiens) migrated out of Africa nearly a million years ago, these new skulls, jawbones, and limb bones show “far more primitive-looking” humans who populated Eurasia much earlier—around 1.8 million years ago by secular dating methods.
What this means, according to Steve Connor of the Independent, is that
there was a very early migration out of Africa of a group of human-like “hominins”, who may have spent a long interlude in Eurasia before migrating back into Africa, where they contributed to the further evolution of the Homo genus—the family of man.
In other words, the old ideas about humans evolving in Africa and then spreading out are becoming more and more unlikely. The new view is that “hominins” left Africa, evolved, and came back at least once.
For creationists, there are several interesting aspects of this discovery—once we move beyond the presuppositions. First, we should point out that these humans are classified as “primitive” because of their small stature (1.44–1.5 meters, or a little under 5 feet, tall); their smaller brain volume (about 600 cc, compared to a typical range of 1,000–1,450 cc in modern people); and their “archaic” upper limbs, a description the article does not explain. However, even the researchers concede that these ancient people at Dmanisi used stone tools and showed quite modern morphology.
Astute readers will notice that what these scientists call primitive does not mean this population was anything less than fully human. While their brains may have had a lower capacity than the known range for most humans living in the present, their use of stone tools and hints of human culture (e.g., caring for an individual with no teeth) suggest a fully human social life. A smaller average brain volume is not an indicator of deficient intelligence; we could just as easily point out that Neanderthals had a larger brain, on average, than most people today. Rather, this new find simply shows that the range of human brain volume was much greater in the past than it is now. Alternatively, factors such as nutritional deficiencies (e.g., iodine) or genetic disorders could also explain the reduced capacity and height.
The so-called hobbits of Indonesia were small in stature and have also been targeted by claims of being less than human. They had a brain the size of a grapefruit, but made sophisticated tools just like Homo sapiens did. However, because of differences in wrist bones and brain capacity, they are demoted to a more “primitive” state. This is a convenient story for evolutionists, who need past “hominins” to be less humanlike to show that evolution has made modern humanity the “pinnacle” of history.
In many ways, this is a subtle form of racism against humans living in the past. Because they don’t look exactly like humans today, these scientists make them out to be sub-human, the same way Ota Benga was considered a sideshow oddity. But just like Ota Benga, these individuals from Dmanisi—and the hobbits—were no less human than those reading this article. Different does not mean lesser.
Tooth loss and molecular decay purportedly prove Charles Darwin right. Does that mean brushing twice a day keeps evolution away?
In a study heralded as providing “support” for Darwin’s ideas, biologists at the University of California–Riverside set out to show that mammals lacking tooth enamel (or lacking teeth altogether) still retain the enamelin gene (a gene involved in enamel production), though in a decayed, nonfunctional form. The reason for this hypothesis? Fossils of some toothless mammals (e.g., baleen whales) reveal that they once had teeth, and mammals with teeth lacking enamel (e.g., sloths) once had enamel.
Not surprisingly, the group found exactly this. With the help of modern gene sequencing, the researchers discovered mutations in the enamelin gene “that disrupt how the enamelin protein is coded, resulting in obliteration of the genetic blueprint for the enamelin protein.”
According to the report, this vindicates Darwin because
[t]he fossil record demonstrates that the first mammals had teeth with enamel. Mammals without enamel therefore must have descended from mammals with enamel-covered teeth.
If that seems underwhelming, perhaps you would be more persuaded by Professor Mark Springer (head of the study), who informs any doubters that
[i]n our research we clearly see the parallel evolution of enamel loss in the fossil record and the molecular decay of the enamelin gene into a pseudogene in representatives of four different orders of mammals that have lost enamel.
The results of this study show quite clearly the effects of different starting assumptions. The Bible tells us—long before this research or Darwin arrived on the scene—that the Curse (Genesis 3) impacted every living thing when God removed some of His sustaining power. This is most evident through the mutational decay of the genome, which often causes various disorders.
It’s no surprise, then, that mammalian genes have lost function over time. These creatures were designed for a much different world than the one after the Fall, but God gave them enough variability to survive—even with genetic entropy.
What we see from this evidence is that the mammals in this study were created with the information for teeth and enamel. Through mutations, these genes lost function, and some of the animals lost teeth. Far from lending credence to Darwinism, this research reveals how impossible molecules-to-man evolution really is. Mammals likely lost the ability to form teeth and enamel because of genetic degradation—the mammals, however, have always been mammals.
It’s a zinc world after all?
When the Miller-Urey experiment was performed in 1953, the two scientists generated amino acids in an environment they believed was much like the early earth (a “reducing” atmosphere of methane, hydrogen, ammonia, and water vapor). The problem, however, is that the conditions they used are no longer considered an accurate picture of the earth at the time life supposedly arose.
Nowadays, the accepted idea is that the early earth had a “neutral” atmosphere of carbon dioxide. Performing the Miller-Urey experiment with this type of atmosphere fails to generate any amino acids. This is a problem for naturalists, and it is further compounded by UV rays that often foil even the most fantastic abiogenesis suppositions.
But naturalists have never met a problem they couldn’t leave God out of. Since God couldn’t have created life (according to them), a new explanation is needed, which is where the work of Armen Mulkidjanian of the University of Osnabrueck, Germany, and Michael Galperin of the U.S. National Institutes of Health comes in.
According to the two scientists, life originated on structures similar to deep-sea vents. Although the location is nothing new, their explanation is:
They argue that under the high pressure of a carbon-dioxide-dominated atmosphere, zinc sulfide structures could form on the surface of the first continents, where they had access to sunlight. Unlike many existing theories that suggest UV radiation was a hindrance to the development of life, Mulkidjanian and Galperin think it actually helped.
Why zinc? Zinc sulfide stores the energy of light, and this property makes the “zinc world” hypothesis seem more plausible than origin-of-life theories that cannot account for UV rays. Beyond this, Mulkidjanian and Galperin point out that some proteins in modern cells contain high levels of zinc, especially those considered “evolutionarily old.”
Every year, new ways in which life could have originated pop into existence, get media coverage, and then fade away. Life came from rocks, mud, crystals, and now zinc. Sometimes scientists piece theories together to give them new—well—life, as with the deep-sea vents here, and other times they claim life came from all of the above.
The bigger question that rarely gets asked is this: why so many hypotheses? Naturalists would claim that science works by making hypotheses and testing them. We agree. But abiogenesis does not lie within the realm of science.
Take the zinc world, for example. The human body contains other metals besides the one these researchers have focused on, and the proteins that are supposedly “evolutionarily old” contain much more than just zinc. But because zinc is important to this claim, the scientists find zinc and suddenly have “proof.” Why not a “rust world” because blood contains iron? Or a “sodium world” because the oceans contain salt?
With operational science, we can take a claim and perform repeatable experiments. Abiogenesis research will never move past conjecture and hints because there’s nothing to test and no first-hand account to check. Scientists can point to interesting aspects of our chemical composition or dream up life-nurturing scenarios, but they cannot perform scientific exploration on imagination. Taking something that we can see (e.g., zinc in proteins) and affixing “evolutionarily old” to the front of it only diverts attention from the fact that spontaneous generation goes against the laws of nature—both in the present and however many billions of years these scientists care to throw at the problem.
If there is one solid fact tying all these various hypotheses about life springing up from non-life together, it is that it must have happened. The only alternative is a Creator—and many scientists would much rather be accountable to zinc than the God they so desperately want to keep out of the equation.
The scene is reminiscent of a movie: intrepid explorers pierce the wilderness and stumble upon a trove of amazing species rarely—if ever—seen before.
In this case, the explorers were a team of scientists from Britain, New Zealand, and the United States who trekked down into a crater in Papua New Guinea that is 1 kilometer (0.62 miles) deep and 3 kilometers (1.86 miles) wide. Over the course of five weeks, the team documented at least 21 new species or sub-species, including frogs with fangs, fish, and a giant rat that may be the largest living rat. (Some of these are shown in an image gallery on the Guardian’s website.)
To understand how this discovery impacts the creation-evolution debate, we’ll first need to consider what was and was not found. The scientists did discover a number of isolated species that, because of the variety in each created kind, have become remarkably unique. Protected in the crater, they still express certain traits (e.g., frog fangs and rat size) that other members of the kind have lost. (Most frogs, in fact, have teeth.)
Each of these discoveries reminds us of just how much diversity is possible within each created kind since all the land animals departed from the Ark. Much of the information in the genome of those original animals has been expressed, repressed, or lost through speciation and mutation. But the creativity shown in each kind is truly amazing.
What the team did not uncover, however, are examples of what one would expect to find according to evolution. An area such as this crater is a living laboratory for what Darwin predicted: changes from one kind into another kind.
Everything the team discovered fits with animals we’re accustomed to: birds, fish, rats, etc. Even though some of these animals have unique features, there is still no evidence of fish growing legs or salamanders sprouting mammal hair. Frogs with fangs are still frogs. In a similar fashion, creatures on the famous Galápagos Islands, though often cited as evidence of evolution, suffer this same defect (as far as evolution is concerned). Finches and tortoises, no matter the size of their beaks or the shapes of their shells, will always be finches and tortoises.
Natural selection and mutations—even in this pristine wilderness of Papua New Guinea—can do nothing more than work with existing genetic material to cause changes within each kind. In other words, don’t hold your breath for the announcement of the reptiliobird.
As far back as 2001, creationists pointed to liposuctioned fat as a source of stem cells that didn’t require the destruction of embryonic human life. Good to see secular scientists catching on.
Nearly two years ago, researchers discovered a method to transform skin cells into induced pluripotent stem (iPS) cells, stem cells that can become other cell types (e.g., neural, cardiac, cartilage, and others). These iPS cells could lead to cures for various disorders and to organ regrowth and regeneration. This important breakthrough undermined the perceived need to destroy human life in order to harvest stem cells (especially since so-called embryonic stem cells have proven harmful to recipients).
Now a group of doctors and plastic surgeons at the Stanford University School of Medicine has turned to what one researcher calls a “readily available, great natural resource”: fat cells. The benefits of using fat cells to make iPS cells are that one liter of fat produces hundreds of millions of stem cells, that they take less time to culture (about twice as fast as skin cells), and that they are easier to “program” for use throughout the body. And most patients have a few fat cells they’d be willing to donate.
Don’t expect to see this new method in a clinic near you for a few years, however. Any new treatments must first pass clinical testing and receive FDA approval. In addition, there are still many questions regarding the efficacy of these stem cells, the time required to “coach” them before reintroducing them to the patient, and safety concerns.
For Christians, new sources of God-honoring stem cells are always good news. Even if embryonic stem cells (ESCs) had proven useful (which remains in serious doubt), there’s no justification for destroying one life to save another. The model for Christians is to lay down one’s life for others—not to take life away for one’s own selfish gain.
Even as far back as 2001, Ken Ham argued that there was no reason to study ESCs, since
non-embryonic stem cells have had proven laboratory and clinical successes and don’t require any loss of human life. For example, stem cells have been extracted from hippocampal and periventricular regions of the brain, umbilical cord blood, pancreatic ducts, hair follicles, skin biopsies, and liposuctioned fat.
There was never any excuse to rely on ESCs when so many other viable and ethical sources are—and have been—available. This new research simply emphasizes the fact that adult stem cells are worth their weight (pun intended).
http://www.answersingenesis.org/articles/2009/09/12/news-to-note-09122009