texanarob said: The earth is demonstrably young using all scientific theories, other than radiometric dating.

Please explain why it was widely accepted--even by the religious zealots of the time--that the world was old a few centuries before the development of radiometric techniques.
Also, please briefly state how you know so much about radiometric dating. I'd like to know where to start showing you how wrong you are. For my part, I've taken several university classes (graduate level) on the topic and have used several radiometric dating methods on multiple industrial projects. I've done this professionally.
texanarob said: Radiometric dating (such as carbon dating) has never been demonstrated to be accurate,

This can only be called a lie. Radiometric dating relies upon the same principles as radioactive tracers in medicine--every time a doctor uses a radioactive tracer, it amounts to a test of those principles. Radiometric dating has also been corroborated, to a surprising extent, by non-radiometric methods such as varves, corals, bivalves, and tree rings. Then there are the natural reactors. I believe there are three of them on Earth, and they serve as good tests for this sort of thing.
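For anyone who wants the arithmetic behind "these principles": every radiometric method rests on the exponential decay law. Here is a minimal Python sketch for the carbon-dating case; the numbers in the example calls are illustrative, and real labs additionally correct the raw result against calibration curves.

```python
import math

# Decay law: N(t) = N0 * exp(-lam * t), with lam = ln(2) / half_life.
# Carbon dating measures the fraction of C-14 remaining relative to the
# living-tissue value and inverts the decay law for t.

HALF_LIFE_C14 = 5730.0  # years

def c14_age(fraction_remaining):
    """Years elapsed for a sample retaining this fraction of its
    original C-14 (assumes a constant atmospheric starting ratio,
    which real labs correct for with calibration curves)."""
    lam = math.log(2) / HALF_LIFE_C14
    return -math.log(fraction_remaining) / lam

print(c14_age(0.5))   # one half-life: ~5730 years
print(c14_age(0.25))  # two half-lives: ~11460 years
```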
texanarob said: and always seems to over estimate the known ages of specimens.

Please, please tell us the study you're citing as evidence of this. I have my suspicions, but I want you to put this noose around your neck yourself.
texanarob said: This is due to it's uniformitarian assumptions, where temperature, pressure and concentrations of chemicals are assumed to have been constant for extreme periods of time.

Again, this is a lie. Radiometric dating is actually fairly complicated, and done properly it serves as a test of those very assumptions. I was once given a list of radiometric dates, along with the corresponding atomic families and the minerals sampled. From that (and a healthy dose of igneous stratigraphy) I was able to deduce multiple episodes of volcanic activity, including intrusions, actual volcanism (meaning the magma broke through the surface), and contact metamorphism (if you didn't see that coming, you didn't deserve to be in the class, but being able to pinpoint the maximum temperature was nice).
The concept you are so blithely ignoring here is closure temperature. Each mineral and each radioactive decay sequence has a particular temperature above which the atoms can more or less freely move out of the crystal structure. Given that radioactive decay usually changes the chemical identity of the atom, the daughter isotopes no longer fit the crystal lattice and tend to leave rather readily. Below that temperature, the atoms remain locked in as defects in the crystal structure (a common enough thing--crystals with defects are more stable over time than those without). This temperature can be surprisingly low--some of the more complex silicates have extremely low closure temperatures, as their structure provides what amount to pipelines out of the crystal, and the uranium decay chain includes radon, which is a gas. The closure temperature is frequently below the point where a mineral will metamorphose at all, and almost always well below the melting point.
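If you're curious how closure temperature is actually quantified, Dodson's 1973 formulation is the standard starting point. Below is a minimal Python sketch that solves it by fixed-point iteration (Tc appears on both sides of the equation); the parameter values in the example are purely illustrative, not measured values for any particular mineral.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def closure_temperature(E, D0, a, cooling_rate, A=55.0):
    """Dodson (1973) closure temperature, solved by fixed-point
    iteration: Tc = (E/R) / ln(A * D0 * tau / a^2), where
    tau = R * Tc^2 / (E * cooling_rate).

    E            activation energy for diffusion, J/mol
    D0           pre-exponential diffusion coefficient, m^2/s
    a            effective diffusion radius of the grain, m
    cooling_rate cooling rate, K/s (positive number)
    A            geometry factor (55 sphere, 27 cylinder, 8.7 sheet)
    """
    Tc = 800.0  # initial guess, kelvin
    for _ in range(100):
        tau = R * Tc**2 / (E * cooling_rate)
        Tc_new = (E / R) / math.log(A * D0 * tau / a**2)
        if abs(Tc_new - Tc) < 1e-6:
            break
        Tc = Tc_new
    return Tc

# Illustrative (not measured) parameters: a 100-micron grain cooling
# at ~10 K per million years.
MYR = 3.156e13  # seconds per million years
Tc = closure_temperature(E=2.7e5, D0=1e-6, a=100e-6,
                         cooling_rate=10.0 / MYR)
print(f"{Tc - 273.15:.0f} deg C")  # roughly 530 C with these inputs
```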
You're also ignoring the fact that temperature and pressure have essentially no impact on radioactive decay rates. You simply can't squeeze atoms together tightly enough at crustal temperatures and pressures to affect their nuclei. Concentrations can influence things, but that's where good old-fashioned stratigraphy comes in--NO ONE does radiometric dating without first doing a stratigraphic analysis.
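On the concentration point specifically, the isochron method is worth spelling out, because it removes the need to assume anything about the initial daughter concentration at all. A minimal Python sketch for the Rb-Sr case, using synthetic (made-up) data just to show the arithmetic round-trips:

```python
import math
import numpy as np

# Rb-Sr isochron: for cogenetic minerals, Sr87/Sr86 plotted against
# Rb87/Sr86 falls on a line whose slope is exp(lam*t) - 1. The
# intercept recovers the initial Sr87/Sr86, so no assumption about the
# starting daughter concentration is needed--and disturbed samples
# simply fail to form a line, which is a built-in consistency check.

LAM_RB87 = 1.42e-11  # Rb-87 decay constant, 1/yr

def isochron_age(rb_sr, sr_sr):
    """Least-squares isochron fit; returns (age_years, initial_ratio).
    rb_sr, sr_sr: measured Rb87/Sr86 and Sr87/Sr86 per mineral."""
    slope, intercept = np.polyfit(rb_sr, sr_sr, 1)
    return math.log(1 + slope) / LAM_RB87, intercept

# Synthetic data for a 1-Gyr-old rock with initial Sr87/Sr86 = 0.705:
t, init = 1.0e9, 0.705
rb_sr = np.array([0.1, 0.5, 1.0, 2.0, 5.0])
sr_sr = init + rb_sr * (math.exp(LAM_RB87 * t) - 1)
print(isochron_age(rb_sr, sr_sr))  # ~(1e9, 0.705)
```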