The Earth is about 4.54 billion years old, contrary to the unsubstantiated beliefs of Young-Earth Creationists.

The British naturalist William Smith realized that two layers of rock containing similar fossils are probably the same age. His nephew, John Phillips, built on this realization to conclude that the Earth was at least 96 million years old.

Next up, William Thomson (later Lord Kelvin) calculated how long the Earth would have taken to cool from a molten state to its current temperature. His calculations yielded an age between 10 and 100 million years, but his model assumed the Earth had no internal heat source; neither Thomson nor anybody else knew about radioactivity yet.
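Thomson's approach can be sketched with a back-of-envelope calculation. He modeled the Earth as a body cooling by conduction, whose surface temperature gradient shrinks over time; measuring today's geothermal gradient then pins down the elapsed time. The numbers below are illustrative modern-style values, not Thomson's exact inputs:

```python
import math

# For a conductively cooling half-space, the surface temperature gradient
# after time t is dT/dz = T0 / sqrt(pi * kappa * t), so t can be solved
# from a measured gradient.
T0 = 2000.0        # assumed initial temperature excess, kelvin (illustrative)
gradient = 0.025   # geothermal gradient, K per metre (~25 K/km, typical)
kappa = 1.2e-6     # thermal diffusivity of rock, m^2/s (typical value)

t_seconds = (T0 / gradient) ** 2 / (math.pi * kappa)
t_myr = t_seconds / 3.156e13  # seconds per million years
print(f"Conductive cooling age: ~{t_myr:.0f} million years")
```

With these inputs the estimate lands at a few tens of millions of years, squarely inside Thomson's 10-to-100-million-year range; the method is sound, but the assumption of no internal heat source is what makes the answer far too young.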

Radioactivity was discovered in 1896, and it was soon found that radioactive decay gives off heat. This meant that the Earth could stay warm long after its formation. Scientists have since confirmed that much of Earth’s internal heat comes from radioactive decay.

Radioactive decay also led to another way to measure age. When an element (such as uranium) undergoes radioactive decay, it becomes a different element (such as lead), called the daughter element. By measuring the amount of the radioactive parent element relative to the amount of its daughter element, and knowing how fast the parent decays, we can determine the age of a rock.

The process of using radioactive decay to determine an object's age is called radiometric dating and has been used to determine the Earth’s age.
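The arithmetic behind radiometric dating follows directly from the exponential decay law. A minimal sketch, using uranium-238's half-life and an illustrative parent/daughter measurement:

```python
import math

# A parent isotope with decay constant lam decays as P(t) = P0 * exp(-lam * t).
# Each decayed parent atom becomes a daughter atom, so D(t) = P0 - P(t).
# Solving for t gives: t = ln(1 + D/P) / lam.
half_life = 4.468e9             # uranium-238 half-life, years
lam = math.log(2) / half_life   # decay constant, per year

def radiometric_age(parent, daughter):
    """Age in years from measured parent and daughter atom counts."""
    return math.log(1 + daughter / parent) / lam

# Illustrative measurement: equal parent and daughter amounts mean
# exactly one half-life has elapsed.
print(f"{radiometric_age(1.0, 1.0):.3e} years")
```

In practice geochronologists use several parent/daughter pairs (uranium-lead, potassium-argon, rubidium-strontium) and cross-check them, which is part of why the 4.54-billion-year figure is so robust.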

We can also apply radiometric dating to meteorites, which condensed from the same cloud of gas and dust as the Earth. Their measured ages of about 4.5 billion years independently confirm the Earth’s age.

The Sun shines by fusing hydrogen into helium. Because fusion steadily converts hydrogen into helium, the ratio of hydrogen to helium in the Sun’s core can be used to determine its age. Not surprisingly, the Sun is about 4.57 billion years old: only slightly older than the Earth. There are other ways to determine the age of the Sun as well.
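Real solar-age estimates come from detailed stellar models, but the basic bookkeeping of hydrogen fusion can be sketched: fusing hydrogen into helium releases about 0.7% of the fuel's mass as energy, and dividing the available fuel by the Sun's luminosity gives a rough main-sequence lifetime. The 10% core fraction below is a common rule of thumb, not a measured value:

```python
M_sun = 1.989e30      # solar mass, kg
L_sun = 3.828e26      # solar luminosity, W
c = 2.998e8           # speed of light, m/s
efficiency = 0.007    # fraction of mass released as energy in H -> He fusion
core_fraction = 0.10  # rough fraction of solar hydrogen hot enough to fuse

energy_available = core_fraction * M_sun * efficiency * c ** 2  # joules
lifetime_s = energy_available / L_sun
lifetime_gyr = lifetime_s / 3.156e16  # seconds per billion years
print(f"Rough main-sequence lifetime: ~{lifetime_gyr:.0f} billion years")
```

This crude estimate gives a total lifetime of roughly ten billion years, consistent with the Sun being about 4.57 billion years old and roughly halfway through its hydrogen-burning phase.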

That the ages of the Earth and Sun are chronologically consistent (with respect to cosmological theories and actual observations of solar system formation—the Hubble Space Telescope has taken pictures of protoplanetary disks) gives us confidence that: