Radioactive elements were incorporated into the Earth when the Solar System formed.
All rocks and minerals contain tiny amounts of these radioactive elements.
A commonly used radiometric dating technique relies on the breakdown of potassium-40 (40K) to argon-40 (40Ar). Measuring the ratio of 40K to 40Ar in an igneous rock can tell us the amount of time that has passed since the rock crystallized.
If an igneous or other rock is metamorphosed, its radiometric clock is reset, and potassium-argon measurements can then be used to tell the number of years that have passed since metamorphism.
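The age calculation behind potassium-argon dating can be sketched in a few lines. This is an illustrative sketch only: the constants below (a 40K half-life of about 1.25 billion years and a branching fraction of roughly 0.105, since only part of the 40K decays to 40Ar) are commonly quoted approximate values, not a definitive calibration, and the function name is my own.

```python
import math

# Approximate constants (assumptions for illustration):
HALF_LIFE_K40 = 1.25e9                      # half-life of 40K, in years
LAMBDA_TOTAL = math.log(2) / HALF_LIFE_K40  # total decay constant of 40K
BRANCH_TO_AR = 0.105                        # fraction of 40K decays yielding 40Ar

def k_ar_age(ar40, k40):
    """Years since crystallization, from measured amounts of 40Ar and 40K.

    Assumes a closed system: no argon trapped at crystallization and
    no gain or loss of potassium or argon afterwards.
    """
    ratio = ar40 / k40
    return (1.0 / LAMBDA_TOTAL) * math.log(1.0 + ratio / BRANCH_TO_AR)
```

With these assumptions, a rock whose measured 40Ar/40K ratio equals the branching fraction (0.105) yields an age of exactly one half-life, about 1.25 billion years.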
Each atom is thought to be made up of three basic parts.
The nucleus contains protons (tiny particles each with a single positive electric charge) and neutrons (particles without any electric charge); electrons, each carrying a single negative charge, orbit the nucleus.
Since the 1950s, geologists have used radioactive elements as natural "clocks" for determining the numerical ages of certain types of rocks, measured from the moment a rock forms. "Forms" means the moment an igneous rock solidifies from magma, a sedimentary rock layer is deposited, or a rock heated by metamorphism cools off.
It is this resetting process that gives us the ability to date rocks that formed at different times in Earth history.
What makes this decay process so valuable for determining the age of an object is that each radioactive isotope decays at its own fixed rate, which is expressed in terms of its half-life. The rate of decay (given the symbol λ) is the fraction of the "parent" atoms that decay in unit time; for geological purposes, the unit of time is taken as one year. Textbooks, media, and museums routinely present ages of millions of years, yet few people know how radiometric dating works or what assumptions drive the conclusions.
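The relationship between the decay rate λ and the half-life can be shown with a short sketch. It uses the standard exponential decay law N/N0 = exp(-λt) with λ = ln(2)/half-life; the function name is hypothetical.

```python
import math

def remaining_fraction(half_life_years, elapsed_years):
    """Fraction of the original parent atoms left after elapsed_years.

    Uses the exponential decay law N/N0 = exp(-lambda * t),
    where lambda = ln(2) / half_life.
    """
    lam = math.log(2) / half_life_years   # decay constant, per year
    return math.exp(-lam * elapsed_years)
```

After one half-life, half the parent atoms remain; after two half-lives, a quarter, and so on. This fixed, predictable rate is what makes the isotope usable as a clock.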
Carbon-14 dating is a method used for young (less than about 50,000-year-old) materials, typically organic matter preserved in sedimentary deposits.
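The carbon-14 age calculation follows the same decay law, run in reverse: measure how much of the original 14C remains and solve for the elapsed time. A minimal sketch, assuming the commonly quoted 14C half-life of 5,730 years (the function name is my own):

```python
import math

HALF_LIFE_C14 = 5730.0  # years; commonly quoted 14C half-life (assumption)

def c14_age(fraction_remaining):
    """Years elapsed, given the fraction of the original 14C still present.

    Inverts N/N0 = exp(-lambda * t) to get t = -ln(N/N0) / lambda.
    """
    lam = math.log(2) / HALF_LIFE_C14
    return -math.log(fraction_remaining) / lam
```

A sample retaining half its original 14C gives an age of one half-life, about 5,730 years; below roughly 50,000 years' worth of decay, too little 14C remains to measure reliably, which is why the method is limited to young materials.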