Join us now for the year-end highlights of News to Note 2012. We’ll recap news items covering key issues in the battle for biblical authority, scientific discernment, and intellectual honesty. This year some evolutionists even took a shot at Doctor Who. Their determination to subvert biological and biblical truth by getting the public to think of dinosaurs as feathered beasts on an evolutionary path shows that dinosaurs have become a powerful propaganda tool. Bible-believing creationists must therefore always be prepared to show how biblical history can explain dinosaurs.

1. The Doctor’s dinosaurs

Smithsonian blog: “Dinosaurs on a Spaceship”

What’s wrong with this picture? (Evolutionists critique Doctor Who.)

Dinosaurs, a perennial favorite in pop culture, finally made an appearance in September in BBC One’s revived television program Doctor Who. After all, Doctor Who has free run of time and space, so meeting up with dinosaurs was only a matter of, well, time!

One of the Cretaceous critters featured in “Dinosaurs on a Spaceship,” the 2nd episode of Doctor Who (series 7). Paleo-bloggers complain that this fellow and his companions are not the fluffy-feathered bird-handed flesh-rippers evolutionists “know” they were. Image from the BBC One TV trailer embedded at blogs.smithsonianmag.com. (Be sure to watch the whole trailer, but if you, like River Song, can’t abide spoilers, wait until you’ve seen the show to read the blog.)

“Dinosaurs on a Spaceship,” episode 2 of series 7, treated those who love Doctor Who to a delightful program packed with comedy, drama, and a bit of the dark side. (That’s it: no spoilers here! But if you hate spoilers, you might want to delay reading the sources of this news item until you’ve watched it yourself.) In the wake of the episode’s debut, dino-bloggers such as freelance writer Brian Switek and the writers of “Love in the Time of Chasmosaurs,”1 while admitting the program was pretty well done, took the show to task for its “errors.”

First of all, Switek wrote, “My apologies to the Doctor, ‘pterodactyl’ isn’t the proper term for these animals. The proper general term for these flapping archosaurs is ‘pterosaur.’ ‘Pterodactyl’ is an outdated term derived from the genus name of the first pterosaur recognized by science, but the term isn’t used by specialists anymore. It’s time to put ‘pterodactyl’ to rest.” Of course, within the science fiction world, the Doctor can travel virtually anywhere in space and time, so nothing he says is really out of date, and scientific terminology, which is man-made, changes all the time.

More troubling to evolutionary paleontologists and those who blog about their beliefs is the distinct lack of feathers on the Doctor’s Cretaceous castaways. One blogger wrote, “Alas, Doctor Who trotted out the usual silly Jurassic Park-esque bunny-handed monstrosities, virtually devoid of feathers.”2 Concerning a young Tyrannosaurus, Switek writes, “Sadly the juvenile tyrant is neither fuzzy nor sufficiently awkward-looking.” He believes fossil evidence indicates that such juveniles were not only “leggy” and “slim” but also that they were “fluffy flesh-rippers.” (Fossils of Yutyrannus are among those that appear to have not feathers but some sort of artifact best termed “dino-fuzz.”)

The raptors in the show also suffered from “insufficient feathery coats,” Switek writes, adding, “Filmmakers seem reluctant to drape feathers over dromaeosaurids, but . . . we know that these dinosaurs had exquisite plumage covering almost their entire body. If you’re going to have raptors, they should be intricately feathery.” The presumption that dromaeosaurids—the family of “raptors”—were “intricately feathery” comes from “dino-fuzz” (such as tufts of filaments on Sinornithosaurus, but without the anatomical characteristics of actual feathers) combined with the evolutionary determination that birds must be the evolutionary descendants of dinosaurs.

Evolutionists since Darwin have desperately needed evidence that feathers evolved from something. Darwin himself commented, “The sight of a feather in a peacock’s tail, whenever I gaze at it, makes me sick!”3 Furthermore, evolutionists have been unable to document evolutionary progression in feather morphology over time, despite their insistence that filaments like these were the evolutionary prototypes of actual feathers. And some fossils with clear anatomical evidence of genuine feathers, such as Microraptor gui (see Did Microraptor gui invent the biplane before the Wright brothers?), are simply fossils of true birds, such as the ones in your backyard.4 Only evolutionary imagination connects these dots. The Bible records God’s eyewitness account that He created flying creatures—which of course include birds—the day before He created land animals—which include dinosaurs—and man. Thus only the grossest of biblical compromise can suppose that birds evolved from anything, including dinosaurs.

So creationists who enjoy good science fiction can continue to relegate the comments about millions of years of evolution in our favorites to the category of “fiction,” knowing full well that the God of the Bible has told us the truth. Meanwhile, we can also chuckle that at least this time filmmakers chose to ruffle the feathers of the paleo-bloggers by failing to jump on the “intricately feathered” bandwagon.

2. Walking up the evolutionary tree

Evolutionists regularly try to explain how we became human. Because they deny the explanation provided by our Creator in the Bible, they imagine an increasingly complex tree of life, supposing all the traits that make us different from animals are simply random animal add-ons that worked. Bipedality is currently viewed as the key skill that supposedly gave our ape-like ancestors the chance to think on their feet and develop bigger brains. And to cooked food goes the credit for fueling those bigger brains. The following two news items explore some of the fallacies in these supposed secrets of humanity’s imaginary ascent up the intellectual ladder.

ScienceDaily: “Humans Began Walking Upright to Carry Scarce Resources, Chimp Study Suggests”

Are we human because we learned to walk on our own two feet . . . or do we walk upright because we’re human?

Apes are able to walk upright only with an inefficient leg-swinging gait and ordinarily do so only for relatively short distances. Their pelvic anatomy will not permit them to walk habitually in a genuinely efficient bipedal gait. So why would an ape choose to go to the extra effort to walk upright when he is perfectly designed to comfortably and enjoyably walk on all fours? These chimpanzees were subjects in a study that provided an incentive for this behavior: favorite food. In photo A, a chimp is carrying stones for crushing nuts in his left hand and foot. He has coula nuts stuffed in his mouth and in his right hand. (The inset shows nuts provided to tempt the ape up onto two feet.) Photo B shows a chimp making off with three papayas using all his resources: both hands and mouth to carry, leaving only two feet for his escape. Image credit: S. Carvalho et al., “Chimpanzee carrying behaviour and the origins of human bipedality,” Current Biology 22, no. 6 (2012): R180, www.cell.com.

Since fossils are notoriously reluctant to walk, “the earliest evidence of habitual bipedality”5 eludes evolutionists. But because they assume humans and chimpanzees share a common ape-like ancestor, an international team of researchers, who in March 2012 published studies of chimpanzee behavior in Current Biology, believe they have discovered “the selective advantage that led to the origin of hominin bipedality.”5

“Bipedality as the key human adaptation may be an evolutionary product of persisting competitive strategies that ultimately set our ancestors on a separate evolutionary path,”5 they write. Believing natural selection is the driving force that propelled ape-like ancestors up the evolutionary tree of life, the team wished to determine what survival advantage bipedality could bequeath. So they gave eleven chimps a small supply of coula nuts—a chimp favorite—and watched the show . . . for 44.5 hours.

The chimps were so eager to scarf up coula nuts that they walked upright in intermittent bursts three times as often when the nuts abounded. “Presence of coula nuts also stimulated more varied forms of carrying: chimpanzees used not only their hands,” they report, “but also mouths and feet to transport items and frequently employed more than one body part at a time, thereby increasing the number of items that could be carried simultaneously.”5

In a companion study, chimps were observed raiding crops. The most ambitious thieves—the ones carrying the most contraband—resorted to bouts of bipedality, using their hands and mouths to carry more.

“These chimpanzees provide a model of the ecological conditions under which our earliest ancestors might have begun walking on two legs,” explains anthropologist Brian Richmond. “Something as simple as carrying—an activity we engage in every day—may have, under the right conditions, led to upright walking and set our ancestors on a path apart from other apes that ultimately led to the origin of our kind.” The somewhat Lamarckian idea here is that ape-like ancestors best able to walk upright could carry away more food during tough times, survive and reproduce, and eventually through natural selection produce a bipedal anatomy. In other words, “Over time, intense bursts of bipedal activity may have led to anatomical changes.”

Based on the researchers’ logic, since eager chimps used their mouths and even feet to carry extra nuts, we might well ask why humans didn’t evolve cheek pouches like hamsters and toes designed to grip on the go. Furthermore, exercising a physical skill feasible within the anatomical constraints of a creature’s design does not demonstrate that it is ancestrally related to another kind of creature.

These researchers admit the impossibility of determining the origin of bipedality “from the fossil or archaeological records.”5 Yet they believe they can determine the evolution of bipedal anatomy and a resulting transformation to human-ness by observing counterfeits of bipedality in living animals not anatomically equipped for true bipedality. Humans are anatomically designed for bipedal walking. The angles of the leg and thigh bones at the knee, the structure of the feet, the shape of the pelvis, and the arrangement of the muscles all contribute to an integrated design allowing humans to walk upright without the exaggerated side-to-side swing an ape must use to walk on two feet. Having a better ape-walk would not produce this constellation of anatomical changes.

But the real point—aside from the fact there is no evidence that humans and apes share an ape-like ancestor anyway—is that the study implies we became human because we learned how to walk. Biblically we know that God created animals (including apes) and humans as separate creations on the sixth day of Creation Week. Humans were made in the image of God with unique mental and spiritual attributes. And while our Common Designer gave us certain similar physical features, He also created apes and humans with many distinct differences. Nothing in the fossil record or genetics confirms humans evolved from ape-like ancestors. Neither does this study tell us anything about how human ancestors learned to walk. Humanity’s ancestors were Adam and Eve. And God created Adam and Eve fully able to walk on their own two feet.

3. Brainfood

Science: “Raw Food Not Enough to Feed Big Brains”

Cooking: the key to evolutionary success

This is the cover of the 2009 edition of Richard Wrangham’s book, Catching Fire: How Cooking Made Us Human. In it he claims that because cooked food made more calories available faster than foraging, hominid ancestors were able to evolve into progressively more advanced human forms. He also suggests that cooking increased social interaction and so helped develop human civilization. Thus cooking gets the credit for fueling the intellectual advancement of early humans. Wrangham’s ideas have been examined in several studies this year. In none of the studies, however, has any researcher been able to demonstrate any way that ape-like ancestors could acquire the genetic information to develop larger and more intellectually and spiritually advanced brains. Image credit: Richard Wrangham, Catching Fire: How Cooking Made Us Human (2009), from www.brainwaving.com.

Nutritional maxims remind us that our brains need a steady supply of energy, exhort us to feed our kids a healthy breakfast before school, and instruct us on the foods we should eat to think our best. Extrapolating from nutritional observations, Brazilian evolutionary neuroscientists Suzana Herculano-Houzel and Karina Fonseca-Azevedo have defined the metabolic limits of brain growth possible for primates on a raw food diet. Since juicers hadn’t yet been invented, they claim cooking was the key enabling humanity’s ancestors to evolve bigger brains.

“If you eat only raw food, there are not enough hours in the day to get enough calories to build such a large brain,” says Herculano-Houzel. “We can afford more neurons, thanks to cooking.”

Evolutionists believe humans evolved from ape-like ancestors. Modern apes have smaller brains than humans. And human brains utilize 20% of the body’s energy at rest compared to only 9% for other primates.6 Evolutionists therefore search through the sands of time to find what fueled the evolution of bigger and presumably brighter brains. Herculano-Houzel and Fonseca-Azevedo write, “The human brain is a linearly scaled-up primate brain in its relationship between brain size and number of neurons.”7

The neuroscientists counted the number of brain neurons in 13 species of primates as well as a number of other mammals. Having thus confirmed that brain size and the number of neurons are proportional, they estimated the number of neurons in living great apes, “extinct hominins,”7 and modern humans. Then they calculated the energy requirements for those neurons. And finally, adjusting for body mass, they estimated how long each primate would have to eat raw food each day to support its brain’s needs.

Chimps and orangutans could get by on 7.3 and 7.8 hours of feeding a day, respectively, but gorillas need to munch their veggies for 8.8 hours a day. Modern humans cross the line of feasibility with a 9.3-hour requirement. “Apes can’t afford both brain and body,” Herculano-Houzel says, and “King Kong could not exist.”8
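
For readers who want to see the shape of this arithmetic, here is a minimal sketch of the kind of calculation the neuroscientists describe: tally the energy cost of the neurons and of the rest of the body, then divide by the calories an hour of raw-food foraging can supply. The neuron counts are rough, ballpark estimates, and the energy constants and foraging intake rate below are illustrative assumptions, so the hours printed will not exactly match the 7.3 to 9.3 hour figures quoted above.

```python
# Illustrative sketch of the metabolic bookkeeping described above.
# Neuron counts are rough estimates; the energy constants and the assumed
# foraging-intake rate are for illustration only, so these hours will not
# exactly reproduce the published 7.3-9.3 h/day figures.

KCAL_PER_BILLION_NEURONS = 6.0          # approx. daily cost per billion neurons
KLEIBER_COEFF = 70.0                    # body cost: kcal/day ~ 70 * (mass in kg) ** 0.75
ASSUMED_KCAL_PER_FORAGING_HOUR = 300.0  # assumed raw-food intake rate

SPECIES = {
    # name: (neurons in billions, body mass in kg) -- rough estimates
    "chimpanzee": (28.0, 45.0),
    "orangutan": (32.0, 75.0),
    "gorilla": (33.0, 125.0),
    "human": (86.0, 70.0),
}

def required_foraging_hours(neurons_billions, body_mass_kg):
    """Hours per day of raw-food feeding needed to cover brain plus body energy costs."""
    brain_kcal = neurons_billions * KCAL_PER_BILLION_NEURONS
    body_kcal = KLEIBER_COEFF * body_mass_kg ** 0.75
    return (brain_kcal + body_kcal) / ASSUMED_KCAL_PER_FORAGING_HOUR

for name, (neurons, mass) in SPECIES.items():
    print(f"{name:>10}: ~{required_foraging_hours(neurons, mass):.1f} h/day on raw food")
```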

Homo erectus, an early human with an average brain size thought to have been slightly smaller than that of modern humans, fell into the calculations between the gorillas and modern humans. Since archaeologists have found evidence that Homo erectus cooked (see News to Note, April 7, 2012), the neuroscientists conclude that a raw diet placed a limit on potential brain power, a limit that Homo erectus overcame with its hearth fires. They write, “This limitation was probably overcome in Homo erectus with the shift to a cooked diet. Absent the requirement to spend most available hours of the day feeding, the combination of newly freed time and a large number of brain neurons affordable on a cooked diet may thus have been a major positive driving force to the rapid increase in brain size in human evolution.”7

“The reason we have more neurons than any other animal alive is that cooking allowed this qualitative change—this step increase in brain size,” Herculano-Houzel explains. “By cooking, we managed to circumvent the limitation of how much we can eat in a day.” Evolutionists consider this analysis support for Harvard primatologist Richard Wrangham’s “cooking hypothesis.” The title of Wrangham’s book, Catching Fire: How Cooking Made Us Human, sums up his contention that learning to control fire and cook food was the secret to humanity’s evolutionary success. (See News to Note, August 27, 2011 and News to Note, April 7, 2012 for more information.) He pointed out that cooking makes more calories available faster and facilitates growth. Wrangham agrees that this study confirms “an ape could not achieve a brain as big as in recent humans while maintaining a typical ape diet.”

Of course, the notion that learning to control fire has been the key to humanity’s success is nothing new. Ancient Greeks believed that the demigod Prometheus stole fire and gave it to man in defiance of Zeus’s restrictions. Control of fire and all sorts of technology have historically advanced human civilization (there being no other kind). But the idea that being able to cook food transformed ape-like nonhumans into humans by enabling them to grow more neurons falls into the category of myth, like the Promethean story. Both are “just-so stories.” Evolutionists assume humans evolved from ape-like ancestors simply because we exist, not because of experimental proof.

“Gorillas are stuck with this limitation of how much they can eat in a day; orangutans are stuck there; H. erectus would be stuck there if they had not invented cooking,” Herculano-Houzel reasons. “The more I think about it, the more I bow to my kitchen. It's the reason we are here.” She concludes, “Much more than harnessing fire, what truly allowed us to become human was using fire for cooking.”8 But her belief about our origins—which unlike the number of neurons in a brain and the number of calories obtainable from food is not amenable to actual scientific testing—is based purely upon her evolutionary worldview, not upon verifiable science.

Knowing that bigger brains require an efficient supply of energy in no way supports the notion of human evolution. Neither cooking food nor learning to walk upright could transform a hypothetical ape-like ancestor into a human being. God created the first two humans in His image, having unique mental and spiritual attributes, on the sixth day of Creation Week, about 6,000 years ago. According to Genesis chapter one, He made land animals, including apes, the same day. He created all kinds of living things to reproduce after their kinds, not to evolve into more complex kinds. And while our common Designer gave us some similar physical features, He created apes and humans with distinct intellectual and spiritual differences. Nothing in the fossil record or genetics actually demonstrates human evolution from ape-like ancestors; such connections are the unverifiable conclusions of evolutionists superimposed on the actual “facts.”

4. Starburst

Have scientists actually discovered a stellar nursery? Have astronomers ever witnessed the birth of stars anywhere other than on science fiction programs or on computer screens displaying the results of software designed by an evolutionary astronomer to simulate what his mind imagines but has never observed? Or do discoveries like a bluish galaxy 5.7 billion light-years away actually provide strong evidence for a young universe?

FOX News: “Supermom galaxy birthing stars on cosmic scale discovered”

A bright blue galaxy: is it bursting with new stars?

About 5.7 billion light-years away, this newly discovered bluish galaxy is at the center of a cluster of galaxies producing the brightest X-ray glow ever seen. But on what basis do its discoverers conclude it is rapidly producing new stars? Image: AP Photo/NASA on www.foxnews.com.

Is a distant galaxy awaking to bring forth new stars? The galaxy at the center of faraway galaxy cluster SPT-CLJ2344-4243, discovered by astronomers using NASA’s Chandra X-ray telescope, is believed by its discoverers to be producing new stars at a phenomenal rate—about two a day. They write, “The central galaxy, which is both the most massive and most luminous galaxy in the cluster, is considerably bluer than the rest of the member galaxies, suggesting significantly younger stellar populations.”9
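
For those wondering where “about two a day” comes from, the conversion is simple arithmetic: the discoverers report a star-formation rate of roughly 740 solar masses of new stellar material per year, and dividing that by the days in a year, assuming an average new star of about one solar mass, yields roughly two stars per day. The assumed average stellar mass is an illustrative simplification.

```python
# Back-of-the-envelope conversion behind the "about two stars a day" figure.
# The ~740 solar-masses-per-year rate is the value reported by the discoverers;
# the assumed average mass of a newly formed star (~1 solar mass) is an
# illustrative simplification.

REPORTED_SFR_MSUN_PER_YEAR = 740.0   # star-formation rate inferred for the central galaxy
ASSUMED_MEAN_STAR_MASS_MSUN = 1.0    # assumed average mass of a new star

stars_per_day = REPORTED_SFR_MSUN_PER_YEAR / 365.25 / ASSUMED_MEAN_STAR_MASS_MSUN
print(f"~{stars_per_day:.1f} stars per day")  # roughly 2
```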

The discovery is particularly unusual because the central galaxies in clusters are usually redder and therefore believed by evolutionary astronomers to be older and inactive. As MIT’s Michael McDonald, lead author of the study just published in Nature, says, the central galaxy is mature, about 6 billion years old, and is the kind that doesn’t typically “do anything new . . . what we call red and dead. It seems to have come back to life for some reason.” The team of astronomers therefore calls the galaxy cluster “Phoenix,” after the mythical bird that rose from the ashes.

Star formation has actually never been observed. Sometimes astronomers or the media refer to star formation as if they are actually seeing it happen, but they are not. When astronomers see phenomena near stars, they cannot know that the phenomena they are observing have anything to do with star formation. And, as creationist astronomer Dr. Danny Faulkner of the University of South Carolina—Lancaster points out, “At the distance of this galaxy, individual stars are not visible.”

The blueness of the distant galaxy is the reason the researchers believe it is rising from its ashes. Bright blue stars are the hottest stars and burn their fuel very rapidly. The maximum possible age for each color of star can be estimated, and only red dwarf stars actually have enough fuel to have been burning since the time of the supposed big bang. Both creationist and evolutionary astronomers agree a blue star could not last more than a few million years. Since the latter claim the universe is 13.7 billion years old, they must assume stars have been forming and burning out since it began. They believe stars are still forming today and that blue stars are those most recently formed.
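
The “few million years” figure comes from a standard fuel-budget estimate: a star’s main-sequence lifetime scales roughly as its mass divided by its luminosity, and luminosity rises steeply with mass (roughly as mass to the 3.5 power), so the hottest, most massive blue stars exhaust their fuel fastest. Here is a minimal sketch of that textbook scaling, using an assumed 25-solar-mass blue star as the example:

```python
# Textbook fuel-budget estimate of a star's main-sequence lifetime:
#   lifetime ~ fuel / burn rate ~ M / L, with L ~ M**3.5 (rough scaling),
# normalized to about 10 billion years for a 1-solar-mass star.
# The 25-solar-mass case is an assumed example of a hot blue star.

SUN_LIFETIME_YEARS = 10e9       # approximate main-sequence lifetime of the Sun
MASS_LUMINOSITY_EXPONENT = 3.5  # rough mass-luminosity scaling for main-sequence stars

def main_sequence_lifetime_years(mass_solar: float) -> float:
    """Rough lifetime: t ~ 10 Gyr * M / M**3.5 = 10 Gyr * M**(-2.5)."""
    return SUN_LIFETIME_YEARS * mass_solar ** (1 - MASS_LUMINOSITY_EXPONENT)

for mass in (1, 5, 25):  # in solar masses
    print(f"{mass:>3} M_sun: ~{main_sequence_lifetime_years(mass):.2e} years")
# The 25-solar-mass star comes out at roughly 3 million years -- "a few million years."
```

Both camps accept this fuel budget; the disagreement described above is over what a blue star’s short fuel supply implies about the age of the universe.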

Dr. Faulkner explains, “When a galaxy such as this appears very blue, it likely is because the galaxy's light is dominated by these bright, hot stars with very short lifetimes. However, the conclusion that there is a great deal of star birth happening there is an inference that is laden with evolutionary assumptions.”

Since astronomers with a biblical worldview believe the universe is only about 6,000 years old, blue stars are easily explained. God formed them on the fourth day of Creation Week as His eyewitness account in Genesis 1:16 says. They’ve only been burning for 6,000 years and so have plenty of fuel remaining.

Could stars be forming today? After all, we see supernovae occur in space, so why not star formation too? Although the Bible does not say God is not making more stars, it does say He finished the work of Creation on the sixth day. Genesis 2:1 says, “Thus the heavens and the earth, and all the host of them, were finished.”10 When a star goes supernova, an existing star explodes. But star formation and star explosion are not the same.

Michigan State astronomer Megan Donahue commented that the findings will help “answer this basic question of how do galaxies form their stars.” Although “long-age” astronomers like Donahue assume stars must be forming, they do not know how. They believe swirling hydrogen gas must cool and condense until it is dense enough to possess enough gravity to prevent re-expansion. First, however, gases tend to expand, not contract. Second, if a swirling mass of gas contracted, it would spin faster to conserve angular momentum, and that increased angular velocity would oppose continued contraction. Finally, such collapse of a nebula would greatly magnify its magnetic field, again opposing the shrinkage required to form a star. Thus the ongoing formation of stars seems contrary to the laws of physics, given the conditions that exist in space.
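
The “dense enough to possess enough gravity to prevent re-expansion” condition that secular astronomers invoke is usually quantified with the textbook Jeans mass: the minimum mass a gas cloud of a given temperature and density must have before its own gravity can overcome its thermal pressure. A minimal sketch of that estimate follows; the cloud temperature and particle density used here are assumed, illustrative values.

```python
import math

# Textbook Jeans-mass estimate: the minimum mass a gas cloud of temperature T
# and density rho must exceed before gravity can overcome thermal pressure.
# The cloud temperature and particle density below are assumed, illustrative values.

k_B = 1.381e-23   # Boltzmann constant, J/K
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
m_H = 1.673e-27   # mass of a hydrogen atom, kg
M_SUN = 1.989e30  # solar mass, kg

def jeans_mass_kg(T_kelvin: float, n_per_m3: float, mu: float = 2.3) -> float:
    """M_J = (5 k T / (G mu m_H))**1.5 * (3 / (4 pi rho))**0.5."""
    rho = n_per_m3 * mu * m_H                         # mass density, kg/m^3
    thermal_term = (5 * k_B * T_kelvin / (G * mu * m_H)) ** 1.5
    geometry_term = (3 / (4 * math.pi * rho)) ** 0.5
    return thermal_term * geometry_term

# Assumed cold, dense cloud: T = 10 K, n = 1e10 particles per cubic meter
print(f"Jeans mass ~ {jeans_mass_kg(10.0, 1e10) / M_SUN:.1f} solar masses")
```

Note that this estimate addresses only the gravity-versus-pressure balance; it says nothing about how a contracting cloud would shed the angular momentum and magnetic flux discussed above.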

So is this “starburst” an actual birthplace of stars? Neither we nor they can be certain at this point how to interpret this bluer, hotter galaxy in the Phoenix cluster, but the idea that it is bursting with new stars is only an interpretation based on the evolutionary demands to explain the existence of blue stars in a billions-of-years-old galaxy. In reality, the existence of hot blue stars that could not possibly last for billions of years is evidence for a young universe.

5. Gene Genesis

Believers in molecules-to-man evolution insist life can randomly evolve from non-living elements, a process that has never been observed. Despite claims that information-carrying molecules can randomly evolve to replicate themselves and transmit blueprints to new generations of primordial life, no process exists by which such coded information could spontaneously arise. Furthermore, simpler organisms have no demonstrable way to acquire the genetic information to become new, more complex kinds of organisms. This year some evolutionary scientists claimed they witnessed genes acquiring new functions—evolution in action. Yet as you read on, you will see that they have only witnessed old genes performing functions they already had.

Scientific American: “Gene Genesis: Scientists Observe New Genes Evolving from Mutated Copies”

Evolutionists claim to have demonstrated the evolution of a new function through gene duplication.

Mirror, mirror, what a spare! Some mutations involve duplication of genes. Evolutionary geneticists have long suggested that such duplicates could randomly mutate until they acquired new functions without any loss or harm to the original organism. Yet like this kitty’s copycat, a copy contains no new information. Furthermore, mutations within a copy can degrade the information it contains, but not produce new information. Some scientists now claim they have seen gene duplicates develop new functions, a necessary prerequisite to the evolutionary scenario. However, they have not. Image credit: Christian Holmér (CC) via flickr from www.scientificamerican.com.

How can an old gene learn new tricks? That has long been a problem for evolutionists trying to explain how increasing genetic complexity evolved. Despite implicit faith that “gene duplications allow evolution of genes with new functions,”11 no one has actually shown how that could happen. Now, scientists from Sweden’s Uppsala University and the University of California, Davis, believe that they have witnessed just such evolution of a novel genetic function.

Because some random mutations involve gene duplications, geneticist Susumu Ohno suggested in 1970 that duplicate copies of genes acquired new and useful functions, got enhanced by natural selection, and added to the genetic complexity of evolving organisms. Since mutations are generally not helpful, however, evolutionists have had a hard time showing how duplicated genes could survive the process of natural selection long enough to become useful. And since mutations don’t actually add any new information, they have also had a hard time coming up with any examples to demonstrate how copies of old genes could acquire new functions.

To get around this problem, John Roth, Dan Andersson, and their colleagues decided to assume that the useful function to be “acquired” was already present and only needed to be amplified by time, chance, and natural selection. Many genes have multiple functions. In their model, they chose a strain of the bacterium Salmonella that had lost the main gene needed to make the amino acid tryptophan. However, the bacteria had another gene, one for making the amino acid histidine, which encoded an enzyme also able to produce tryptophan, albeit weakly. They grew the bacteria in a tryptophan-deprived environment for 3,000 generations and discovered that the surviving population had multiple copies of the “dual-function gene.”12 Those with duplication mutations had been favored for survival because they had greater ability to make the tryptophan they needed.
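
As a purely conceptual illustration (not the researchers’ actual experiment or model), a toy simulation can show the dynamic described above: in a tryptophan-poor medium, cells carrying extra copies of the weakly tryptophan-producing gene grow slightly faster, so duplication mutants spread through the population over many generations. Every parameter in this sketch is an assumption chosen for illustration.

```python
import random

# Toy illustration of the selection dynamic described above. In a tryptophan-poor
# medium, cells with extra copies of a weakly tryptophan-producing gene grow a bit
# faster, so duplication mutants spread. All parameters are illustrative assumptions,
# not values from the published Salmonella experiment.

POP_SIZE = 1000
GENERATIONS = 300
DUPLICATION_RATE = 1e-3        # assumed chance per cell per generation of gaining a copy
FITNESS_PER_EXTRA_COPY = 0.02  # assumed growth advantage per extra gene copy

def simulate(seed: int = 1) -> float:
    rng = random.Random(seed)
    copies = [1] * POP_SIZE  # every cell starts with one copy of the dual-function gene
    for _ in range(GENERATIONS):
        # Selection plus drift: resample the next generation weighted by growth rate.
        weights = [1.0 + FITNESS_PER_EXTRA_COPY * (c - 1) for c in copies]
        copies = rng.choices(copies, weights=weights, k=POP_SIZE)
        # Occasional duplication mutations add another copy of the existing gene.
        copies = [c + 1 if rng.random() < DUPLICATION_RATE else c for c in copies]
    return sum(copies) / POP_SIZE

print(f"Mean gene copies per cell after {GENERATIONS} generations: {simulate():.2f}")
```

Note that nothing in this sketch creates information the cells did not already possess; it only multiplies copies of an existing gene, which is precisely the point made below.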

Microbial evolutionary biochemist Antony Dean of the University of Minnesota, Twin Cities, commenting on the discovery, said, “Ohno will go down as a very important historical figure, but Andersson has the new model for how genes duplicate. His theory is square one.”12

The main problem with using this discovery to support evolutionary theory is that no new function actually came into existence. The dual-function gene already existed in the organism. The genetic information was not new; it was already there. Natural selection in the tryptophan-deprived environment favored the survival and reproduction of those bacteria that had multiple copies of the gene, but no novel function had to evolve. The original bacteria already had a redundant way to make tryptophan.

By analogy, a book contains information. A million copies of the book do not contain a million-times more information, just more copies of the same information.

These bacteria did not evolve an innovation. They did not acquire new genetic information, as duplication of an existing gene is nothing new. They did not even do anything new or innovative with old information. All they did was experience a duplication mutation that allowed them to efficiently express an existing ability to manufacture tryptophan.

Cornell University evolutionary geneticist Richard Meisel cautions that this evolutionary mechanism may be limited to bacteria and viruses,12 which brings up another rather obvious point: nothing about this discovery provides a mechanism for Salmonella to become any new kind of more complex organism, only another variety of Salmonella. Mutations—even duplications that provide extra copies of something useful—do not provide new genetic information or the raw material that “moves bacteria in an upward evolutionary direction.”13

And Don’t Miss . . .

  • This year has finally seen the idea of “junk DNA” junked. Read more about it at Junk DNA and ENCODE Revisited.
  • Where will our curiosity take us? To Mars and to the future of space exploration, yes. But not back in time to prove life evolved on the Red Planet. Read more about the evolutionary motivation behind Curiosity’s mission at News to Note, August 11, 2012.
  • Did you know that secular humanists are engaged in a battle for your children? Popular TV personality Bill Nye has been crusading for your children, pleading with creationist parents not to teach their children biblical truth. Completely confusing experimental “here-and-now” science with historical (origins) science by equating technology (e.g., smoke detectors) made purposefully by intelligent human beings with the mindless, purposeless, directionless process of evolution, Nye claims that modern life as we know it will cease if the next generation doesn’t accept evolutionary dogma. Read more at Bill Nye’s Crusade for Your Kids and News to Note, December 1, 2012.
  • God’s Word declares the shedding of innocent blood to be abominable (Proverbs 6:16–17). This year a pair of bioethicists argued in favor of “after-birth abortions” on the ground that newborns “are not really persons.” Despite their reprehensible recommendations, they correctly point out that killing babies after birth and aborting the unborn are morally equivalent. Recently, in the aftermath of the horrific mass murder in Connecticut,14 our President rightly condemned the killing of these elementary school children. Yet given the pro-abortion policies of his administration, we must note the moral inconsistency and hypocrisy of all leaders who rightly condemn the actions of a school shooter in Connecticut while justifying legalized murder through abortions, abortifacient medications, and procedures that destroy human embryos. The Russian poet Anna Akhmatova, recalling Stalinist atrocities in her poem “Requiem,” wrote, “The hour has come to remember the dead. . . . I’d like to name you all by name, but the list has been removed and there is nowhere else to look.”15 These words are a fitting lamentation for 55 million unborn murder victims in this country since Roe v. Wade (1973) and hundreds of millions more in countries around the world—millions who remain nameless.

Footnotes

  1. chasmosaurs.blogspot.com/2012/09/dinosaurson-spaceship.html
  2. From chasmosaurs.blogspot.com/2012/09/dinosaurson-spaceship.html. The “bunny-handedness” relates to current evolutionary contentions that some dinosaur wrist bones appear to have sufficient symmetry to suggest they were mobile enough to turn inward like birds’, another effort to “prove” an evolutionary progression from dinosaurs to birds.
  3. F. Darwin, Ed., Letter to Asa Gray, dated April 3, 1860, The Life and Letters of Charles Darwin (New York: D. Appleton and Company, 1897), vol. 2, page 90 (from an unabridged facsimile edition of the 1897 edition copyright 2006 by Elibron Classics).
  4. Microraptor gui is an extinct bird with some unusual features, but it was a bird. Read more about this interesting creature in an article by Dr. David Menton, Did Microraptor gui invent the biplane before the Wright brothers?
  5. www.cell.com/current-biology/fulltext/S0960-9822(12)00082-6
  6. news.sciencemag.org/sciencenow/2012/11/live-chat-did-cooking-lead-to-b.html
  7. www.pnas.org/content/early/2012/10/17/1206390109
  8. news.nationalgeographic.com/news/2012/10/121026-human-cooking-evolution-raw-food-health-science
  9. M. McDonald, “A massive cooling-flow-induced starburst in the core of a luminous cluster of galaxies.” Nature 488 (16 August 2012): 349–352. doi:10.1038/nature11379
  10. Biological creations such as plants, animals, and human beings were designed to reproduce “after their kinds,” but there is no such corresponding attribute named in Scripture for stars.
  11. J. Nasvall, L. Sun, J. R. Roth, D. I. Andersson, “Real-Time Evolution of New Genes by Innovation, Amplification, and Divergence,” Science 338 (2012): 384, doi: 10.1126/science.1226521.
  12. E. Pennisi, “Gene Duplication’s Role in Evolution Gets Richer, More Complex,” Science 338 (2012): 316–317.
  13. A Poke in the Eye?
  14. News to Note, December 15, 2012
  15. “Requiem” by Anna Akhmatova, www.poemhunter.com/poem/requiem