When news breaks of an archaeological find, we often hear about how the age of the sample was determined using radiocarbon dating, otherwise simply known as carbon dating. Deemed the gold standard of archaeology, the method was developed in the late 1940s and is based on the idea that radiocarbon (carbon-14) is constantly being created in the atmosphere: neutrons produced by cosmic rays convert atmospheric nitrogen into carbon-14, which then combines with oxygen to form CO2 and is incorporated into plants during photosynthesis. When the plant, or the animal that consumed the foliage, dies, it stops exchanging carbon with the environment, and from then on it is simply a case of measuring how much of the carbon-14 has decayed to determine its age. But new research conducted by Cornell University could be about to turn the field of archaeology on its head with the claim that there could be a number of inaccuracies in commonly accepted carbon dating standards.
Does carbon dating prove the earth is millions of years old?
Learn how variations in atomic structure form isotopes of an element and how the three natural isotopes of carbon differ from each other. Meet paleoclimatologist Scott Stine, who uses radiocarbon dating to study changes in climate. Find out what it means for an isotope to be radioactive and how the half-life of carbon-14 allows scientists to date organic materials.
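The half-life relationship mentioned above translates directly into an age estimate: if you measure what fraction of a sample's original carbon-14 remains, the exponential decay law gives the elapsed time. The following sketch illustrates this calculation, assuming the commonly quoted carbon-14 half-life of about 5,730 years and ignoring the calibration corrections real laboratories apply:

```python
import math

HALF_LIFE_C14 = 5730.0  # approximate half-life of carbon-14, in years

def radiocarbon_age(remaining_fraction: float) -> float:
    """Estimate a sample's age in years from the fraction of its
    original carbon-14 still present, using N(t) = N0 * (1/2)**(t / T)."""
    if not 0.0 < remaining_fraction <= 1.0:
        raise ValueError("remaining_fraction must be in (0, 1]")
    return HALF_LIFE_C14 * math.log(remaining_fraction) / math.log(0.5)

# A sample with half its carbon-14 left is one half-life old,
# and one with a quarter left is two half-lives old:
print(round(radiocarbon_age(0.5)))   # 5730
print(round(radiocarbon_age(0.25)))  # 11460
```

Because the remaining fraction shrinks by half every 5,730 years, the method runs out of measurable carbon-14 after roughly 50,000 years, which is why it is used for archaeology rather than for dating rocks.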
A carbon atom that has been in existence since the beginning of the Earth may spend some of its time in bacteria, then in fish and animals, and so on. How does it really work? The key to carbon dating is that the carbon-14 isn't carbon that's been on Earth ever since the Earth was formed. The carbon-14 used in carbon dating is carbon that's been newly made. Where that comes from is when cosmic rays - high-energy particles from space - hit the Earth's atmosphere, they interact with atoms and send neutrons flying around.