A recently reported study by meteorologist Carl Drews has suggested a way in which the waters of the Red Sea could have parted “naturally,” enabling the children of Israel, led by Moses, to cross as pharaoh’s army pursued (Exodus 14). Such an explanation might leave God with credit for the timing of the event, but little else.
The courses of rivers and the locations of shorelines do change over time. Using satellite imagery to suggest a site where a branch of the Nile River delta once drained into a coastal lagoon, Drews has proposed that a 63 mph wind blowing for 12 hours could have pushed back a wall of water, exposed a land bridge, and held it open for the crossing. This effect, called wind setdown, is roughly the opposite of storm surge. The wind setdown concept is somewhat more refreshing than the popular claim that the Israelites waded through a “reed sea,” a claim that inevitably leads to the joke about the little boy amazed that God could drown the Egyptian army in six inches of water. Nor is the wind setdown idea inconsistent with the biblical record (Exodus 14:21–22) that God caused the sea to go back by a strong east wind all that night; after all, God created the wind and the water and can command them to behave as He chooses. However, a physical explanation for a miraculous phenomenon is not necessary, and demanding one before believing a biblical miracle is foolhardy.
Asked to comment on the study, Ken Ham, President and CEO of Answers in Genesis and the Creation Museum, states the following:
The parting of the Red Sea was a miracle. It was an extraordinary act of God (Exodus 14). Yet, God used a force of nature—wind—to bring about this miracle. But there is no need to come up with a naturalistic explanation of a supernatural event.
This current research involving computer modeling is based on the assumption that such an event has a naturalistic explanation, and that nothing supernatural was involved. Regardless of whatever results are found and ideas that are proposed, the researchers have accepted one part of the account in the Bible—that the Red Sea crossing by the Israelites really occurred—but they have already ruled out the rest of the account: that it was the result of a supernatural event. Besides, any such research can never ultimately prove or disprove what happened, and the only way we could know for sure is if there was an eyewitness account. The Bible gives us a record from the ultimate Eyewitness, the God of Creation.
When, as recorded in the New Testament, Jesus calmed a storm on the Sea of Galilee, His disciples were amazed that the winds and the sea obeyed Him (Luke 8:24–25 and Matthew 8:26–27). The end of the storm was a natural event miraculously obeying a supernatural God. Yet just a few years later, Jesus arose from the grave. These same disciples were eyewitnesses to the evidence of that miraculous event, an event for which there is no possible naturalistic explanation. We must be careful to avoid limiting our faith to only those acts of God that we can explain according to the laws of the physical world. He miraculously created those physical laws, and He can miraculously use them or override them as He chooses.
New research concludes that, consistent with the biological evidence, the Neanderthals were quite human.
A seven-year study of two separate Neanderthal cultures in Italy (to be published in the December issue of the Journal of Archaeological Method and Theory) is “rehabilitating” the Neanderthal people’s image in the eyes of the world. While these cultures coexisted with a “modern” Homo sapiens population, the southernmost of the two Neanderthal groups was geographically separate and likely had no contact with them. Previous findings of cultural innovations among Neanderthals had prompted suggestions that they were somewhat less than human, with cultures merely contaminated by the “real human” cultures. Now, the discovery of a wide range of tools, ornaments, and hunting implements in a Neanderthal culture remote from conventional human populations demonstrates that Neanderthals did not require exposure to a supposedly truly “sentient” species; they had the creative ability to devise on their own the tools needed to cope with their harsh environment.
A new study of the Simian Immunodeficiency Virus (SIV), a virus infecting “almost all African monkeys” but not sickening them, has led to speculation about how long the virus has been around in its present form.
Regardless of the conclusions drawn about the virus’s longevity, the related Human Immunodeficiency Virus (HIV) continues to have a mysterious past as scientists speculate about why HIV made its deadly and epidemic appearance in the 20th century. Scientists study the mutations of these viruses in hopes of getting some clue to help in the treatment of HIV sufferers.
The study found that Bioko, an island cut off from the African mainland when melting Ice Age water flooded the land bridge to the continent (now 19 miles of open water), was home to four distinct strains of the simian virus, SIV, strains which match well with the SIV strains infecting similar species found on the mainland. Although evolutionists point to several Ice Ages and speculate that this land bridge was submerged at the end of the last one “10,000 years ago,” we affirm that one single Ice Age occurred. This Ice Age, based on a biblical understanding of history and on meteorological models (see Setting the Stage for an Ice Age), probably began soon after the global Flood and probably lasted about 700 years. In any case, the new findings suggest that the ancestor of present-day SIV existed on the African continent several thousand years ago.
Scientists are trying to determine how long ago the SIV ancestor appeared on the scene, using apparent mutation rates as a “molecular clock.” In view of the new data suggesting these virus strains are “at least 10,000 years old,” the scientists “now believe that all the S.I.V. strains infecting monkeys and apes across Africa diverged from a common ancestor between 32,000 and 78,000 years ago.”
Commenting on the viral “molecular clock,” Answers in Genesis’s Dr. Georgia Purdom (a molecular geneticist) gives the following explanation:
A molecular clock is not an independent factor that can measure time but rather is dependent on external factors for its calibration. If the calibration is wrong, like the starting assumption by the scientists that the virus is at least 10,000 years old, then the clock will be wrong too, giving exaggerated dates. It is interesting to note this quote from the NY Times article that “Previous methods of dating the virus had concluded it was a few hundred to 2,000 years old, ‘and that just didn’t seem right,’ Dr. Hahn said.” It didn’t “seem right” to them because of their evolutionary presuppositions, and so, in their desire to get a date that “seems” more reasonable to them, they change the interpretation of the evidence to make it better fit their presuppositions or starting point.
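The calibration dependence described above can be sketched numerically: a molecular-clock date is essentially the measured genetic divergence divided by an assumed substitution rate, so the computed date scales inversely with whatever calibration rate is assumed. The divergence and rate values below are hypothetical illustrations, not data from the study.

```python
# Illustrative molecular-clock arithmetic (hypothetical numbers).
# A clock date is divergence / rate, so rescaling the assumed
# substitution rate rescales every inferred date by the same factor.

def clock_age(divergence: float, subs_per_site_per_year: float) -> float:
    """Estimated age in years for a given genetic divergence and assumed rate."""
    return divergence / subs_per_site_per_year

divergence = 0.04   # hypothetical: 4% sequence divergence between strains

fast_rate = 2e-5    # hypothetical "fast" calibration (substitutions/site/year)
slow_rate = 5e-7    # hypothetical "slow" calibration

print(clock_age(divergence, fast_rate))   # 2,000 years
print(clock_age(divergence, slow_rate))   # 80,000 years
```

The same divergence measurement yields a date forty times larger under the slower calibration, which is why the choice of calibration, not the raw data, drives the headline numbers.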
Furthermore, rather than producing new information and becoming the agents of evolution, mutations (such as those that apparently changed the simian virus) actually represent a loss or a reshuffling of the information already present in the genome. Thus, the emergence of the deadly human immunodeficiency virus in the 20th century, while yet unexplained, is definitely not an example of evolution.
Creationists are frequently accused of ignoring “real science” in favor of their own agendas. Well, when it comes to embryonic stem cell research, the U.S. government, at least the present administration, is guilty of ignoring “real science” in favor of its political agendas . . .
So says Mike Bowman, legal counsel with Alliance Defense Fund, in an opinion piece just published by CNN.com. A federal court recently issued an injunction against the Obama administration’s guidelines favoring embryonic stem cell research on the grounds that those guidelines violate a 1995 law requiring that taxpayers’ money be directed toward research that “produces treatments instead of destroyed embryos.” Indeed, there are two key issues in view here, neither in favor of embryonic stem cell research.
Stem cell research involves the use of human cells that have the potential to reproduce and transform into a variety of cell types. These cells are used to replace damaged or diseased cells in human beings. There are two main sources of stem cells: live human embryos, each essentially a test-tube baby that never made it to a womb, and adult cells, which can be reprogrammed to produce induced pluripotent stem cells (iPSCs).
The foremost moral and ethical issue arises from the fact that embryonic stem cells come from human embryos, each a “genetically distinct human being from the moment of conception.” These tiny beings are conceived in the laboratory, but then destroyed as their cells are harvested.
While research using adult stem cells and iPSCs has produced a number of remarkable medical treatments for a variety of dreadful diseases, and done so with minimal side effects, embryonic stem cell research cannot yet claim a single successful medical treatment. Despite administration claims that many people will suffer if money isn’t appropriated to embryonic stem cell research, as Bowman states, “embryonic stem cell researchers have produced no treatments at all.” And despite claims by many in the government, the media, and Hollywood, now “only 33% of U.S. voters believe that taxpayer money should be spent on embryonic stem cell research.” The majority of people are beginning to realize that directing money to research that destroys embryos and produces no useful results is not only the height of poor stewardship but also robs fruitful iPSC research of those funds.
Numerous secular scientists, pioneers in stem cell research, are ready to relegate embryonic stem cell research to the trash heap of disappointing dreams. The “father of human embryonic stem cell” research, James Thomson, predicted back in 2007 that embryonic stem cell research would be abandoned in favor of the iPSCs, which “do all the things embryonic stem cells do.” Former NIH director Bernadine Healy has termed embryonic stem cell research “obsolete.”
Our position against human embryonic stem cell research, no matter who is funding it, is governed primarily by the desire to protect human life, created in the image of God, but also by the need to practice good financial stewardship. That is why we applaud the increasing number of research scientists willing to take a stand and abandon human embryonic stem cell research in favor of iPSCs.
Recent measurements of radioactive decay rates at Purdue University, as well as at laboratories in Europe, gave scientists who place their faith in the absolute accuracy of radiometric dating methods a bit of a jolt when the rates appeared to vary slightly at the times of solar flares.
Scientists at the National Institute of Standards and Technology, working with others from Oak Ridge National Laboratory and several universities, have now restored their own confidence in radiometric dating methods by demonstrating that neutrinos, such as those emitted in a solar flare, do not alter the decay rate of gold-198.
Efforts to discover variation in radioactive decay rates in the past century have shown none. Nevertheless, these experiments—of necessity—can only be conducted in the observable present. When we, as young earth creationists, point out that radiometric dating is based on several unprovable assumptions, including the assumption that the decay rates of radioactive isotopes have never varied, we are pointing to the possibility that decay rates may have varied significantly in the distant past, at times remote from the realm of observable science.
Thus, while the scientists at NIST can now sleep well at night knowing that solar flares apparently alter not decay rates but only the reliability of their instrumentation, there is still no way for them to prove that decay rates have always been the same. There is, in fact, mounting evidence for the idea that radioactive clocks “ticked faster” in the far past, yielding countless discrepancies between actual verifiable rock ages and those determined by radiometric dating. The huge numbers calculated for the age of the earth may comfort evolutionists who think that “given long enough, anything can happen,” but those numbers are not really supportable when the data is examined with an eye to its shaky foundations.
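The dependence of a radiometric age on the assumed decay rate can be made explicit with the standard decay law, t = ln(N0/N)/λ, where λ is the decay constant. The sketch below uses a hypothetical isotope ratio and a rough potassium-40 half-life for illustration; it simply shows that the same measured ratio implies a proportionally smaller age under a proportionally larger decay constant.

```python
import math

# Radiometric age from the decay law N = N0 * exp(-lam * t),
# solved for t:  t = ln(N0 / N) / lam.
def radiometric_age(parent_remaining: float, decay_constant: float) -> float:
    """Age in years implied by the fraction of parent isotope remaining."""
    return math.log(1.0 / parent_remaining) / decay_constant

# Hypothetical example: half of the parent isotope remains.
half_life_today = 1.25e9                   # roughly potassium-40, in years
lam_today = math.log(2) / half_life_today  # decay constant, assumed constant

age_constant_rate = radiometric_age(0.5, lam_today)
print(f"{age_constant_rate:.3e} years")    # one half-life: ~1.25e9 years

# If the decay constant had ever been 1000x larger, the identical
# isotope ratio would correspond to an age 1000x smaller at that rate.
age_faster_decay = radiometric_age(0.5, lam_today * 1000)
print(f"{age_faster_decay:.3e} years")     # ~1.25e6 years
```

The arithmetic itself is uncontroversial; the point of contention is only the assumption that λ measured today has held for all time.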
(Additional unprovable assumptions also haunt radiometric methods. See below for further reading.)
A study of 600 generations of fruit flies, just reported in Nature, sought to unravel the mystery of how advantageous mutations could become fixed in the population.
Note that even when an “advantageous mutation” becomes fixed in a population, the mutation is not producing new genetic information and, more importantly, the mutation—no matter how fixed—is not producing a new creature or a transitional creature, just a fruit fly better adapted for its environment at the time.
Nevertheless, this study, finding that the advantageous mutation—specifically, the tendency to hatch earlier—didn’t really “take,” concluded that “unconditionally advantageous alleles rarely arise, are associated with small net fitness gains or cannot fix because selection coefficients change over time.” In other words, it is very difficult for even a helpful mutation, which is rare, to take hold in a population because, by the time the advantage in the environment is felt, the conditions change to make the helpful mutation, which was a loss of information anyway, not so helpful anymore.
Specifically, the study affirmed the notion that changes in a population’s traits may arise from the fact that “many genes influence a trait” as these gene versions each gradually “become just a little more common.” This process is termed a soft sweep, in contrast to a hard sweep, which would require the sudden appearance of a trait through the novel mutation of a single gene.
In an effort to apply this information to human evolution, the NY Times article quotes a University of Chicago geneticist as saying that soft sweeps would affect human evolution by working “on the genetic variation already present in a population, without having to wait for a novel mutation to arise.” In reality, however, this process is not evolution at all as no new creature or transitional creature is produced—just another group of humans.
Thus, when the article touts this study for increasing our understanding of “how evolution works at the genomic level,” it errs, because the study doesn’t show the evolution of anything new, just the natural selection of a fruit fly that hatches earlier than its relatives.
A new fossil find containing a wide variety of species of animals all buried together has been unearthed at a utility dig site near Los Angeles.
Of the 35 different species already identified, not a single one has been claimed as a transitional form. The finding of such a wide variety and number of creatures all buried together (about 1,450 bone fragments having been recovered) is consistent with a post-Flood Ice Age burial under local catastrophic watery conditions, particularly given the location near Los Angeles, where so much else of Ice Age origin has been found (e.g., the La Brea tar pits).
The fossilized creatures have been “dated by observing the layers of sediment they were found in and fall at about 1.4 million years ago”; “comparison of the fossils with those from other sites revealed their age.” This claim should remind us of more problems with the dating methods. Carbon dating, based on the radioactive decay of carbon-14 with a half-life of only 5,730 years, could not possibly date any organic material at 1.4 million years; by the mathematics of the situation, the carbon-14 would have long since decayed below any detectable level. Therefore, fossils are dated according to the layers they are found in, as these were. However, radiometric dating methods do not work on sedimentary rock, the kind of rock in which fossils are generally found. Hence, even more assumptions are required.
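The carbon-14 arithmetic above is easy to check: 1.4 million years is about 244 half-lives of 5,730 years each, leaving a remaining fraction of roughly 2 to the power of minus 244, which is vanishingly small. A quick sketch:

```python
# Fraction of carbon-14 remaining after t years, from its 5,730-year half-life.
HALF_LIFE_C14 = 5730.0  # years

def c14_fraction_remaining(years: float) -> float:
    """Fraction of the original carbon-14 left after the given time."""
    return 0.5 ** (years / HALF_LIFE_C14)

half_lives = 1.4e6 / HALF_LIFE_C14
print(round(half_lives))              # 244 half-lives
print(c14_fraction_remaining(1.4e6))  # ~2.8e-74, effectively zero
```

A fraction that small is many orders of magnitude below what any instrument can measure, which is why carbon dating is never even attempted on material claimed to be millions of years old.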
http://www.answersingenesis.org/articles/2010/09/25/news-to-note-09252010