Certain isotopes are unstable and undergo radioactive decay, slowly and steadily transforming, atom by atom, into a different isotope.
This rate of decay is constant for a given isotope, and the time it takes for half of the atoms in a sample of a particular isotope to decay is its radioactive half-life.
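As a minimal illustration of what a half-life means (a sketch of ours, not something from the article; the function name and the example numbers are assumptions), the surviving fraction of a parent isotope is simply halved once per half-life:

```python
def remaining_fraction(elapsed_years: float, half_life_years: float) -> float:
    """Fraction of a parent isotope still present after `elapsed_years`.

    Each half-life cuts the surviving amount in half:
    N(t) = N0 * (1/2) ** (t / half_life).
    """
    return 0.5 ** (elapsed_years / half_life_years)

# Example: after one half-life, half of the original isotope remains;
# after two half-lives, a quarter remains.
print(remaining_fraction(4.47e9, 4.47e9))   # 0.5
print(remaining_fraction(8.94e9, 4.47e9))   # 0.25
```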
But for humans, whose life spans rarely exceed 100 years, how can we be so sure of that ancient date? Even the Greeks and Romans realized that layers of sediment in rock signified old age.
But serious scientific interest in geological age did not begin until the late 1700s, when Scottish geologist James Hutton, observing sediments building up on the landscape, set out to show that rocks were time clocks.
Layers of rock build one atop another — find a fossil or artifact in one layer, and you can reasonably assume it’s older than anything above it.
Paleontologists still commonly use biostratigraphy to date fossils, often in combination with paleomagnetism and tephrochronology.
Geologist Ralph Harvey and historian Mott Greene explain the principles of radiometric dating and its application in determining the age of Earth.
By measuring the ratio of lead to uranium in a rock sample, geologists can determine its age.
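To sketch the arithmetic behind that ratio (our illustration, not the article's method; it assumes the standard uranium-238 to lead-206 decay, no lead present when the rock formed, a closed system, and intermediate daughter isotopes ignored), the age follows from the decay relation D/P = e^(lam * t) - 1:

```python
import math

# Approximate half-life of uranium-238, in years (assumed value for this sketch).
U238_HALF_LIFE_YEARS = 4.468e9

def u_pb_age_years(pb_to_u_ratio: float) -> float:
    """Estimate a rock's age from its lead-206 / uranium-238 atom ratio.

    Assumes the rock contained no lead when it formed and has been a
    closed system since, so the daughter/parent ratio D/P = e**(lam * t) - 1.
    """
    lam = math.log(2) / U238_HALF_LIFE_YEARS  # decay constant, per year
    return math.log(1.0 + pb_to_u_ratio) / lam

# Equal numbers of lead and uranium atoms (ratio 1.0) imply one half-life has passed:
print(f"{u_pb_age_years(1.0):.3e} years")  # ~4.468e+09
```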
A submethod within biostratigraphy is faunal association: researchers can sometimes assign a rough age to a fossil based on the established ages of other fauna from the same layer, especially microfauna, which evolve faster and therefore leave shorter spans in the fossil record for each species.
Paleomagnetism: Earth's magnetic polarity flip-flops about every 100,000 to 600,000 years. That polarity is locked into iron-bearing minerals as rock forms, so a rock's magnetic signature can be matched against the established timeline of reversals.
Before Hutton, the Bible had provided the only widely accepted estimate for the age of the world: about 6,000 years, with Genesis as the history book.
Hutton's theories were short on evidence at first, but by 1830, as scientists documented geological layering, most concurred that Noah's ark was more allegory than reality.