Last Updated December 6, 2003
The Myth of c Decay
One of the more moronic creationist arguments for a young universe is that the speed of light was faster in the past than it is today. Since we can see starlight from millions of lightyears away, it must have taken the light millions of years to reach us, so the universe can't be a paltry 6,500 years old. Anyone with a brain should be able to grasp the logic behind this...anyone except, of course, creationists.
How do they arrive at this conclusion? Simple. They assume that the Bible is inerrant (a laughable assumption, at best) and that the universe is 6,500 years old (another laughable assumption), then attempt to create evidence to support those assumptions. This is the polar opposite of the scientific method, which finds facts first and draws conclusions from those facts. This essay will put to rest the stupidity of c decay on the magnitude they are discussing.
Before we begin, it should be noted that there is legitimate scientific research going into the prospect of a decaying c. The scientific community takes evidence supporting this prospect very seriously, since the implications are absolutely enormous, as this essay will show. However, it is doubtful that c has been decaying by such a huge amount as to lessen the estimated age of the universe by six orders of magnitude.
Rate of Decay
The speed of light is a velocity (m/s), so any decay it undergoes will be an acceleration (m/s2). Now, since creationists don't bother to provide a mechanism for this decay, we are left with a few choices of which kind of decay to analyze. We'll use a constant linear decay (the speed of light decreases by a fixed number of m/s per unit of time). So, let's say we have a star that is measured to be 100 million lightyears away. According to creationists, this star can be no older than 6,500 years. So, we have a discrepancy. Astronomers, cosmologists and physicists will tell you that the light from that star can be no less than 100 million years old.
To begin, we'll compare the creationist upper limit with the scientific lower limit. Creationists maintain that the distance is the same (100 million lightyears), but the travel time is only 6,500 years. If light covered 100 million lightyears in such a small amount of time, it'd amount to an average velocity of 4.6E12m/s (about 15,000 times greater than it is now). We know that it must have taken 6,500 years for c to decay from an unknown velocity down to 3E8m/s. Since the decay is constant, the average velocity is just the mean of the initial and final velocities, so the initial velocity must have been about 9.2E12m/s, and the acceleration (any change in velocity or direction is considered an acceleration; it is not limited to increases in velocity) works out to about -45m/s2, or -1.42E9m/s per year. So, if the speed of light today is 3E8m/s, then it would have been 1.72E9m/s this time last year, nearly six times as fast as it is now.
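As a sanity check, here is a minimal Python sketch of the arithmetic above. The constants are rounded, the variable names are mine, and the scenario (a 100-million-lightyear trip compressed into 6,500 years, with constant deceleration down to today's c) is the creationist premise being lampooned, not real physics.

```python
# Linear c-decay arithmetic: light crosses 100 million lightyears in
# 6,500 years, decelerating at a constant rate to today's c.

LY = 9.461e15          # meters per lightyear
YEAR = 3.156e7         # seconds per year

d = 100e6 * LY         # distance to the star, m
t = 6500 * YEAR        # creationist travel time, s
c_now = 3e8            # current speed of light, m/s

v_avg = d / t                      # average speed over the trip (~4.6e12)
v_start = 2 * v_avg - c_now        # initial speed, constant deceleration (~9.2e12)
a = (c_now - v_start) / t          # required acceleration, m/s^2 (~-45)
c_last_year = c_now - a * YEAR     # implied speed of light one year ago (~1.7e9)

print(f"average speed : {v_avg:.2e} m/s")
print(f"initial speed : {v_start:.2e} m/s")
print(f"deceleration  : {a:.1f} m/s^2")
print(f"c last year   : {c_last_year:.2e} m/s")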
This is clearly ridiculous and flies in the face of observed fact. But, while we're having fun with other people's stupid ideas, let's take it one step further. According to creationists, all the stars were created at once. So, what do we get when we use a star that's 200 million lightyears away as a base? We get a different rate of decay! So, light's velocity from one star slows down more quickly than the light's velocity from another star. It is absolutely incredible that people still buy into this clearly moronic idea.
That's Not What Setterfield Said!
Barry Setterfield was the first to introduce the idea of c decay with any kind of testable evidence. Setterfield took 41 measurements of the speed of light made since 1675 and plotted them on a curve, not on a straight line like we used above. He surmised that the point at which the curve reached infinity must have been the creation point of the universe. Not surprisingly, it turned out to be around 4040 B.C. Similarly unsurprising is the fact that Setterfield has no relevant science degree (like a physics, astronomy or astrophysics degree), or he'd be able to grasp the immense implications of a c that started out as infinite and exponentially decayed.
However, Setterfield's credibility was shattered when he made the claim that c had reached its minimum value around the year 1960 and would decay no further, just around the time we began to develop much more accurate and precise measuring instruments. Coincidence? He automatically assumes that data on the speed of light taken 300 years ago will be just as valid as data taken in 1960! He doesn't even think of attributing the discrepancies in value to human error or imprecise and inaccurate measuring instruments. Furthermore, he gives no mechanism for this decay, nor does he give a reason that it stopped just around the time technology became advanced enough to measure the speed of light with consistent, precise results. Nor did he submit his work to any scientific journal (probably because he knew that real scientists would dismiss it as the garbage it was). He just wrote a book, which creationists today still herald forth as some of the best evidence for their opinions.
Gravitational Warp Decay
Quick lesson in general relativity. Most people (hopefully) know from high school physics that anything with mass exerts a certain amount of attractive force on everything else. On scales like your body, the force is so small as to be insignificant, but it's still there. This is why you are on the floor instead of floating. The Earth, with its huge mass, is pulling down on you, causing you to undergo a constant acceleration downward of 9.8m/s2. What they don't tell you in high school is why these forces are exerted. General relativity reveals that for us. It turns out that anything with mass creates warps in spacetime. Your body has a very minuscule warp in spacetime around it (isn't that special?). By comparison, the Earth's warp is huge. What these warps do is decrease the effective distance between you and something else. This distance decrease exhibits all the properties of a force (attraction, measurability, et cetera). So, you're standing on Earth right now because the Earth's mass has warped spacetime in such a way that the distance between you and it has become zero (practically speaking).
You may have heard that light always travels in a straight line. You may have also heard that a black hole's gravitational field is so strong that it can pull light off its path. These two statements may seem contradictory, but with a relativistic understanding of gravity, they fit together and make sense. When something traverses a distance, it traverses spacetime. But, on local scales (i.e. our solar system), spacetime is not flat! This is because of the gravitational warps created by the planets and sun. So, a beam of light passing near a black hole or through our solar system will not follow a straight line according to us. The light will be following the path that spacetime has laid out for it which, according to us, is curved and, according to the light, is straight. A black hole warps spacetime enough that light is visibly pulled off its course of travel. Since light always follows the path that spacetime gives it, we can use the path light follows to gauge how large the gravitational warp around something is. Our sun, for example, causes grazing light to deviate from its course by only a few millionths of a radian (about 1.75 arcseconds).
OK, so where am I going with this? It starts with the formula used to determine the gravitational warp's magnitude, the angle by which a grazing beam of light is deflected: deviation = 4GM/(c2r), where G is the gravitational constant, M is the object's mass and r is the distance from its center.
Were c undergoing a process of decay, like creationists claim, then all gravitational warps should be getting larger and larger as time goes on, and must have started out trivially small! The deviation of light's path caused by an object is directly proportional to the amount of force exerted by that object on another one. So, the Earth's gravitational pull would have been less a thousand years ago than it is today by a very significant amount! But, unlike creationists, I won't just leave this as a statement with no numbers. To determine the amount of increase in gravitational warp vs. the amount of increase in acceleration due to gravity (g), we must look at the equations governing both. To find g, we use the formula GM/r2. Now, what do we do to get the gravitational acceleration from the value for straight-line deviation? Well, the two equations are very similar. In fact, if you have the gravitational acceleration value, you need only multiply by 4r/c2 to derive the straight-line deviation. So, to go from the deviation to acceleration, you only need to divide by 4r/c2. So, if the Earth was preformed by God, would it be able to hold an atmosphere? Using our previous linear decay rate, the speed of light at that time would have been in the area of 9E12m/s. The Earth would have bent light by a mere 3E-18 radians, a warp about 900,000,000 times smaller than today's (the deviation scales as 1/c2). If the strength of gravity tracked that warp, the surface acceleration would have been a pitiful 1.1E-8m/s2, nearly a billion times less than it is now! There is absolutely no way in hell that Earth could have even held itself together, even if it were preformed, much less hold an atmosphere! Adam and Eve would have floated right off into the vacuum of space!
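The deflection comparison can be sketched in a few lines of Python. This is an illustration under the essay's own assumption that the warp (deflection angle) scales as 1/c2 per the formula 4GM/(c2r) and that surface gravity would track the warp; the constants are rounded and the function names are mine.

```python
# Light deflection at Earth's surface, now vs. with a decayed-from c of 9e12 m/s.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_earth = 5.972e24     # mass of Earth, kg
R_earth = 6.371e6      # radius of Earth, m

def deflection(c):
    """Deflection angle of a grazing light beam, in radians: 4GM/(c^2 r)."""
    return 4 * G * M_earth / (c**2 * R_earth)

c_now = 3e8            # today's speed of light, m/s
c_then = 9e12          # linear-decay estimate for the creation epoch, m/s

theta_now = deflection(c_now)    # ~2.8e-9 rad
theta_then = deflection(c_then)  # ~3.1e-18 rad

# If surface gravity scaled with the warp, it would be weaker by the same
# factor of (c_then/c_now)^2 = 9e8:
g_then = 9.8 * theta_then / theta_now
print(f"warp now : {theta_now:.2e} rad")
print(f"warp then: {theta_then:.2e} rad")
print(f"g then   : {g_then:.2e} m/s^2")   # ~1.1e-8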
Even more problems arise when we consider the planets' original states of motion. If God set the Earth in a state of motion around the sun, then that state of motion would be constantly disrupted by the increasing gravitational attraction due to a decaying c. This would, in turn, increase the force pulling the planets in orbit around the sun, thus drawing them closer to the sun as time went on. If the Earth were suddenly wrenched off its natural orbital path toward the sun, the effects would be catastrophic, as anyone with any marginal knowledge of Newtonian physics could tell you. People would literally go flying through the air, buildings would fly apart and the oceans would undergo a monstrous shift toward the landmasses, creating titanic tsunamis and tidal waves. Imagine all of this happening regularly on Earth under the creationist explanation! Earth would literally be torn apart, and any life would cease to be.
Now, some of you reading this may be a little confused about how the speed of light can affect the pull of gravity. So, I've written a more in-depth explanation of the general relativistic conclusion here.
Constantly Decreasing Energy Output from Nuclear Fusion
A postulate of special relativity is that energy and mass are two forms of the same overall thing (we just call it energy, because there is much more energy than mass in the universe) by the relation E=mc2. Through this relation, we know that the absolute maximum amount of energy that can be released from 1kg of matter is 9E16J (m=1kg and c=3E8m/s). This relation is readily visible and takes place every second in our sun. The sun releases approximately 4E26J of energy per second, or 4E26 Watts (W). So, using our relation, we can easily see that the sun is converting about 4.4E9kg of mass into energy every second.
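The conversion rate follows directly from rearranging E=mc2 to m=E/c2; a two-line Python check (solar luminosity rounded to 4E26 W, as above):

```python
# Mass converted to energy by the sun each second, via m = E / c^2.

c = 3e8                 # speed of light, m/s
L_sun = 4e26            # solar power output, W (J/s)

mass_per_second = L_sun / c**2
print(f"{mass_per_second:.2e} kg/s")   # ~4.4e9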
Now, creationists believe that all the stars were created at the same time, and in order for Adam and Eve to have lived on Earth in the beginning, our sun must have been created in its current state, expending the same amount of its mass as it does now. So, we won't have to worry about the standard astrophysical problems of stars going through different stages. Creationists have made disproving their ridiculous claims quite easy. At the time the sun was created, with c at 9E12m/s, it would have been radiating 3.6E35W, about 900,000,000 times as much as it does now! Needless to say, Adam and Eve would have been cooked while floating helplessly off Earth due to the lack of gravitational attraction between them and it. Earth itself would have been vaporized from being heated to such extreme temperatures in a state where its own gravitational binding energy was insufficient to even hold itself together!
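Since E=mc2, holding the mass-conversion rate fixed while inflating c multiplies the power output by (c_then/c_now)2. A quick sketch of that scaling, using the linear-decay value of 9E12m/s from earlier:

```python
# Solar luminosity if the sun burned the same 4.4e9 kg/s, but with c = 9e12 m/s.

m_rate = 4.4e9          # kg of mass converted per second (today's rate)
c_now = 3e8             # m/s
c_then = 9e12           # m/s, linear-decay estimate

L_then = m_rate * c_then**2        # ~3.6e35 W
ratio = (c_then / c_now)**2        # ~9e8
print(f"luminosity then: {L_then:.1e} W")
print(f"increase factor: {ratio:.0e}")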
No Measurable Redshift in Light from Stars
When light travels to us from a star, it must first escape the gravitational well of that star. This escape requires energy. Now, you may ask how the amount of energy in a beam of light is measured. Well, without delving into a more accurate quantum physics explanation, light is made of mediating particles called photons (which are, in fact, the particles that mediate all electromagnetic interactions). The photon is light's way of transferring energy and momentum. Thus, each photon carries a certain amount of energy with it that it must expend in order to escape the gravitational field of a star. When a photon loses energy this way, we call it a redshift, because, as photons lose energy, their oscillation frequencies drop and their wavelengths become longer. The wavelength carried by a photon determines whether or not it falls within our visible spectrum (between about 400 and 700 nanometer wavelengths). The visible spectrum goes from red (long wavelengths, low frequencies, low energy) to violet (short wavelengths, high frequencies, high energy). Since the photon loses energy in the escape from gravitational fields, it shifts toward the red end of the spectrum, so we have the term redshift.
How does all this apply? Well, if light was so much faster, then gravity must have been much weaker, which means that light escaping stars back then would have had to expend virtually no energy to escape the gravity of its originator star! This is completely out of line with astrophysical measurements, which detect exactly the gravitational redshifts that photons must undergo to climb out of their stars' gravitational wells!
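To put a number on this, the fractional energy loss for a photon escaping a star's surface can be approximated (in the weak-field limit) as z = GM/(rc2). This is my own illustrative sketch, with rounded solar values, showing how the shift collapses to nothing if c were 9E12m/s:

```python
# Gravitational redshift from the surface of a sun-like star,
# weak-field approximation z = GM / (r c^2).

G = 6.674e-11          # gravitational constant
M_sun = 1.989e30       # mass of the sun, kg
R_sun = 6.96e8         # radius of the sun, m

def grav_redshift(c):
    return G * M_sun / (R_sun * c**2)

z_now = grav_redshift(3e8)      # ~2.1e-6, routinely measured in solar spectra
z_then = grav_redshift(9e12)    # ~2.4e-15, hopelessly unmeasurable
print(f"z now : {z_now:.1e}")
print(f"z then: {z_then:.1e}")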
According to Einstein's relation E=mc2, the value of c controls the relationship between matter and energy. This relationship, in turn, defines pretty much everything in the universe, from nuclear reactions to gravity and force ratios. I couldn't even begin to attempt listing all the effects of a decaying c. The idea is clearly ludicrous, and the scientific community utterly shredded Setterfield's work when he published his book in 1981. Creationists have largely ignored the damning criticisms that Setterfield's work has received and still trumpet his book as proof of Creationism. Setterfield's solution is tantamount to using a bomb to plug a hole. It only creates more problems for the creationist position, which is already stupid enough. If the speed of light has changed (which is possible), it certainly hasn't changed exponentially.