Unlike today (when you can practically trip over three atheists just on the way to the supermarket), at the time when Sir Isaac Newton (often cited as the most famous and influential scientist who ever lived) penned this in the mid-1600s, it was a very true statement.
When you think about the concept of atheism as a whole, a system of belief in which practically everything is meaningless because our very existence is thought to have come about with no purpose or direction, it is extremely odious, for there can be no true meaning to anything at all!
In this worldview, someday you and I will die, our kids will die, their kids will die, and eventually the whole universe will reach “heat death” and die! No one will remember any of our relationships, accomplishments, hopes, dreams, or discoveries, and everything that ever happened will ultimately be for nothing. However, for the unrepentant sinner there is a silver lining to this worldview: the assumption that there is no judgment.
A moral law requires a moral lawgiver, and with the notion that there is no God comes the freedom to believe that no matter what we do in life, no matter how egregious our thoughts and deeds, if we are somehow able to “get away with it,” then we will actually get away with it! So as long as you just concentrate on the “good part,” so to speak, you can swallow the bitter pill of atheism with the honey of non-accountability quite smoothly. It’s like putting lipstick on a pig: it looks better but still stinks just as bad.
Similar to the example of getting children to take something they normally wouldn’t want, secular thinkers in the 1800s understood that in order to sway people to accept a naturalistic worldview, they’d need to “dress it up” a bit and make it more plausible and attractive. After all, sin-cursed people are born with a proclivity to reject the things of God, so enticing them with an atheistic outlook wouldn’t be that hard if an intellectually acceptable way to lead them toward that conclusion could back it up.
In order for someone to accept atheism, they must believe that what they see around them has somehow come about by random forces, replacing the need for a creator God. And in order for that to have happened, it is reasonable to assume the universe didn’t just pop into existence fully formed and functional the way we experience it today. So the concept of “millions of years” of earth history being a “fact” was crucial in establishing a naturalistic worldview among the populace in the West.
Like an image consultant or “date doctor” whose job it is to help their clients look their best while hiding their worst features, the concept of long ages was popularized by emphasizing its (supposed) innocuous effect on the dominant Christian worldview of the day, highlighting its so-called “scientific” attributes, and wrapping it all in a veneer of intellectual credibility.
Although this stood against the commonly accepted belief that the Bible’s timeline of approximately 6,000 years was true, the concept of long ages made rapid inroads against that understanding, often with help from those within the church itself.
Radiometric Dating Methods
Although the idea of an ancient earth was initially introduced through the notion of rock layers supposedly forming slowly, layer by layer, today the majority of people assume that the earth is billions of years old, often because they think dating methods like radioisotope (radiometric) calculations are foolproof. Radiometric dating was developed in the early 20th century1 and is considered by mainstream scientists and laypersons alike to be a very reliable means of measuring the absolute ages of rocks, and hence, of the earth.
But is there really a reliable, direct way to determine the absolute age of any fossil, rock, or the earth? Contrary to widespread belief, no such method exists. If you closely examine specific examples of dating by this method, its “proven validity” falls apart rather quickly; there is, in fact, no way to directly determine the age of any fossil or rock.
For example, the revised age of the earth according to the latest radiometric dating is between 4.55 billion and 4.6 billion years old. That range alone represents a considerable gap of 50 million years,2 which is hardly an “absolute” age for the earth. And although some may see this “fudge factor” as acceptable in the big picture, it can make the astute thinker realize there’s legitimate room for some real doubt regarding such methods.
Unfortunately, evolutionary scientists, as well as old-earth Christian geologists, fully embrace methods like varve, tree-ring chronology, and 14C dating as proven dating methods but ignore the glaring (but often little-known) weaknesses that these methods have.
How It Works
Some types of rocks, particularly igneous rocks formed from magma, contain radioactive isotopes of certain elements. In some cases these isotopes are unstable, or radioactive—called parent isotopes—and they are assumed to decay at a consistent rate into a stable isotope of another element—called the daughter isotope. Scientists can measure the ratio of the parent isotopes to their daughter isotopes in the rocks with a great degree of accuracy.
This gives scientists a way to calculate what is called a half-life, or the time required for one half of the amount of unstable material to decay into a more stable material (although the length of a half-life is different for every radioisotope, the half-life pattern is the same). “In other words, the less of the parent isotope (and the more of the daughter isotope) we measure in a specimen, the older we assume it to be.”3
By assuming the starting amount of parent material and how long it takes to decay, and then measuring how much is present in the sample being examined, scientists have a way to estimate how old the material is. However, it’s important to understand that rocks aren’t clocks. What scientists are measuring are not absolute dates (hence the range in the supposed age of the earth) but ratios, and the dates have to be inferred based on assumptions about the ratios.4
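The ratio-to-date inference described above can be sketched in a few lines of Python. This is only an illustrative sketch of the standard age equation, not any laboratory’s actual software, and the isotope amounts and half-life used are hypothetical round numbers:

```python
import math

def apparent_age(parent, daughter, half_life_years):
    """Standard radiometric age equation: t = half_life * log2(1 + D/P).

    Built on the assumptions discussed in the text: no daughter isotope
    at the start, a closed system, and a perfectly constant decay rate.
    """
    return half_life_years * math.log2(1 + daughter / parent)

# Hypothetical numbers: equal parts parent and daughter implies exactly
# one half-life (here, 1.25 billion years) has elapsed -- IF the
# assumptions hold.
print(apparent_age(parent=1.0, daughter=1.0, half_life_years=1.25e9))
```

Note that the measurement itself is only the parent/daughter ratio; the “age” appears only after the starting conditions and decay rate are assumed.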
It is vital to understand that assumptions about the facts we observe (rocks and fossils), rather than the facts themselves, are the key to how these dating methods operate. Since we are not able to go back in time and witness rocks or fossils being formed or see the decay rates of the atoms at the time when the earth was created, scientists need to make assumptions in order to predict the age of a fossil or rock sample. Again, since radiometric dating measures the rate of radioactive decay, one of the assumptions scientists make is the amount of the initial parent and daughter isotopes in a rock, since no one was present in the past to measure them.5
The Burning Candle Analogy
To see what type of assumptions are at play in these dating methods, a simple analogy can help: trying to determine the age of a candle by measuring its burning rate and height.
Pretend you wanted to know how quickly a normal wax candle burns away (decays, so to speak). You set up an experiment with a candle 12 inches high and light it. Let’s say for our analogy that every hour, one inch of the candle melts away: after two hours, the candle is only 10 inches tall; after four hours, it is 8 inches tall. From the data you’ve collected, you determine the “decay rate” of the candle is in fact one inch per hour.
Now let’s say you walk into a room you’ve never been in before and you see a burning candle that is six inches tall. Can you know how long it has been burning? Most people would say, based on the data collected earlier, that, yes, we can “know” with confidence that the candle has been burning for six hours (starting height of 12 inches, a decay rate of one inch per hour = a height of six inches after six hours).
But what if the candle was initially 16 inches tall (we never observed its starting condition)? What if the candle had been lit 10 years ago, burned for a while then snuffed out, and had been re-lit five minutes before we walked into the room (we weren’t able to observe the processes that brought it to this state)? What if, unlike your experiment, someone has opened up a window in this room and a breeze has been blowing on the candle, making it burn faster than you observed in your experiment (you don’t know what specific factors may have influenced the result)?
Can you see how we might think that if we can calculate the rate at which the wax is melting now and the amount of wax that has dripped, then we can go backwards and measure how long the candle has been burning? Our assumptions may be reasonable, logical, and “scientific” based on careful observation and measurements, but our conclusions could be wildly incorrect! Those calculations could only be accurate if we knew and received data from a trusted source and/or observed the candle before it started burning and all the way through the process.
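The candle arithmetic above can be put in a short sketch. The numbers come straight from the analogy, and the point is the same: the observation never changes, but the inferred burn time swings with the assumptions fed into it:

```python
def inferred_burn_time(observed_height, assumed_start_height, assumed_rate_per_hour):
    """How long we *infer* the candle has burned, given our assumptions."""
    return (assumed_start_height - observed_height) / assumed_rate_per_hour

# The same observation (a 6-inch candle) under three assumption sets:
print(inferred_burn_time(6, assumed_start_height=12, assumed_rate_per_hour=1))  # 6.0 hours
print(inferred_burn_time(6, assumed_start_height=16, assumed_rate_per_hour=1))  # 10.0 hours
print(inferred_burn_time(6, assumed_start_height=12, assumed_rate_per_hour=2))  # 3.0 hours (a breeze)
```

Three different “ages” from one measurement, depending entirely on the unobserved starting height and rate.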
Similar assumptions used in the candle analogy apply to all radiometric dating methods.
- You must first know the starting amount of the parent isotope at the beginning of the specimen’s existence.
- You must be certain that there were no daughter isotopes present in the beginning.
- You must be certain that neither parent nor daughter isotopes have ever been added or removed from the specimen.
- You must be certain that the decay rate of parent isotope to daughter isotope has always been the same.
However, just like the burning candle, if the set of assumptions is incorrect, then the resulting ages from these studies are also incorrect.6 It is true that we can measure decay rates using observational science. But these dating methods must assume certain things that supposedly occurred in the past, which can neither be repeated nor tested. The chance that at least one of the four assumptions is invalid becomes obvious when we look at some of the published radiometric “dates” found in scientific literature.
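To see how much a single one of these assumptions can move the result, here is a small hypothetical sketch: the same illustrative age equation as before, but with an assumed amount of initial daughter isotope added as a parameter. All numbers are invented for illustration:

```python
import math

def apparent_age(parent, daughter_measured, half_life_years, assumed_initial_daughter=0.0):
    """Age inferred from isotope amounts, given an assumed starting daughter amount."""
    radiogenic = daughter_measured - assumed_initial_daughter
    return half_life_years * math.log2(1 + radiogenic / parent)

# One hypothetical sample, two different assumptions about its starting condition:
print(apparent_age(1.0, 1.0, 1.25e9))        # assume no initial daughter: 1.25 billion "years"
print(apparent_age(1.0, 1.0, 1.25e9, 0.9))   # assume most daughter was inherited: far younger
```

Nothing about the sample changed between the two lines; only the unobservable starting condition did.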
Around the world, a number of studies have attempted to validate radiometric dating and failed.
For example, rocks of known age from observed events, like Mount St. Helens’s 1986 eruption, have been tested, and several different radiometric dating methods returned dates of millions of years, even though we know the rock was only about 10 years old when it was tested.7
Also, the north rim of the Grand Canyon bears lava flows from volcanoes that erupted after the canyon formed, yet potassium-argon dating methods have determined them to be a billion years “older” than the oldest rocks at the bottom of the canyon!
And near Hawaii, lava from underwater volcanoes known to have erupted in AD 1801 has yielded dates ranging from 160 million up to 3 billion years old, an incredible spread for what many people consider such “reliable” methods.8
The point being, if you can’t trust these dating methods with rocks you do know the age of, why would you trust them with rocks you don’t know the age of, especially considering the huge number of unproven assumptions that go into the methodology behind them?
A Trustworthy Timeline
Secular humanism, founded on the declaration that the story of evolution removes any need to believe in a Creator God, is now the dominant force shaping progressive minds in our society. This ideology rests on an undergirding belief in supposed millions of years of earth history as fact, and radiometric dating is certainly a linchpin of that platform of thought.
Unfortunately, many Christian scientists and laypersons today continue to choose the path of conformity to what secular science has been dictating, wrongly assuming that the grand scope of evolutionary thought is truth and that the specific, plainly read truths presented in the Bible are not to be taken literally. In doing so, they have conceded biblical authority to secular forces, who have used it to take control of Western society.
By controlling the education and media of the day, those forces impose these ideas on the masses, and it can be summed up this way: he who controls the present controls the past, and he who controls the past controls the future. Christians need to learn, and teach their children and the world around them, that the “date doctors” of the day, who promote these so-called scientific dating methods, do not trump the revealed Word of God. Rather than trust the false conclusions based on the supposed age of the rocks, we should continue to put our trust in the Rock of Ages.