*spits Sprite and re-reads article*
Holy shit.
We are finally getting into the beginnings of NANOTECHNOLOGY!
Hardly.

DragonLord Seth said:
*flips open cell phone* Kirk to Enterprise.
That is LITERALLY the reason why there are flip-open cell phones: someone wanted a working Star Trek communicator.
Well, like I said, much of the risk can be reduced through the use of artificial chromosomes, which would keep us from mucking with whatever is already there, and could more easily be shut off in the case of some unknown side effect appearing.

Venats said:
Agreed, and I've been somewhat informed on the problems of genetic tinkering by my uncle, as he is in the biochemistry field (I work in Physics). His lab works with studying just simple bacteria, and he tells me how their cocktails can have such vast and varying effects with so little difference. They change one thing that causes some proteins to form differently, with even the slightest variation, and the end result often doesn't even live to reproduce.

ReiverCorrupter said:
Well, the mysteries of quantum mechanics notwithstanding, we would still need a holistic understanding of human physiology as it arises out of genetics and cellular mechanics. The greatest problem for us now is pleiotropy. Genes code for proteins, not traits, and the proteins they produce are used in thousands of different processes, some mundane, others not. Changing a gene could seem completely harmless through years of medical trials, but if that gene is somehow involved in producing a nutrient that we normally get from our diet, then the one day you run out of that nutrient is the day you die. Just as a crude example. That's why most people agree that any genetic enhancement we do have is going to have to be in the form of artificial chromosomes that we can turn off easily if any problems arise. As it stands now, it's just too dangerous and unpredictable to mess with such a complex and interdependent system.
Not something to touch lightly, and not something to touch for a long time. The worst part of it is, as you said, that you could understand everything about every part, and understand everything down the line for four billion parts of the equation, and then one change all the way at the end changes something... like whoops, you're now sterile.
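To make the pleiotropy point concrete, here's a deliberately toy sketch in Python. Every gene, protein, and process name in it is made up; it's just meant to show the shape of the problem, where one gene's protein gets reused by a process you weren't thinking about, so a "harmless" edit breaks something unrelated.

# Toy pleiotropy demo -- hypothetical names, not real biology.
# One gene -> one protein, but each protein is reused by several processes,
# so knocking out a gene for one trait silently breaks another.

GENE_TO_PROTEIN = {
    "gene_A": "protein_A",
    "gene_B": "protein_B",
}

PROCESS_NEEDS = {
    "muscle_growth":      ["protein_A"],
    "nutrient_synthesis": ["protein_A", "protein_B"],  # the non-obvious dependency
    "immune_response":    ["protein_B"],
}

def broken_processes(knocked_out_genes):
    """Every process that loses a protein it depends on."""
    missing = {GENE_TO_PROTEIN[g] for g in knocked_out_genes}
    return [proc for proc, needs in PROCESS_NEEDS.items()
            if any(p in missing for p in needs)]

# We only meant to tweak muscle growth...
print(broken_processes(["gene_A"]))   # ['muscle_growth', 'nutrient_synthesis']

Nothing in that little dependency map is mysterious once you can see all of it; the trouble is that the real genome has millions of such edges and nobody has the full map.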
Good point, though I'm sure mother nature has her tricks to ruin the best laid plans of mice and men.

ReiverCorrupter said:
Well, like I said, much of the risk can be reduced through the use of artificial chromosomes, which would keep us from mucking with whatever is already there, and could more easily be shut off in the case of some unknown side effect appearing.
Not to mention that we still cannot predict (unless this has changed recently) the folding patterns of said basic proteins. We have a new hire in our department who will be working specifically on trying to solve that problem using quantum theory, in overlap with our biology department. I wish them luck.

ReiverCorrupter said:
It all depends on computers. The biological system is too holistic to accommodate the normal scientific method, which tries to isolate only a small part and see how it works. In order to make progress we'll basically need complex computer simulations, and considering we can barely even simulate the physical properties of single proteins in a computer program now, it's going to be a while. It'll probably be 2050 before we start seeing the first commercial genetic enhancements, and it'll probably be simple stuff to help people stay in shape and be healthier. I imagine intellectual and cosmetic enhancements may be quite a bit more controversial.
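For a feel of why even single proteins are so brutal to simulate, here's a minimal Python sketch of the classic 2D HP lattice toy model. It's a deliberate cartoon of folding, nothing like a real prediction method: residues are just H (hydrophobic) or P (polar), a fold is a self-avoiding walk on a grid, and the only "physics" is a -1 energy per H-H contact.

# 2D HP lattice toy model: brute-force every conformation of a short chain.

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def enumerate_folds(sequence):
    """Brute-force all conformations; return (number of folds, best energy)."""
    n = len(sequence)
    stats = {"count": 0, "best": 0}

    def energy(path):
        index_at = {pos: i for i, pos in enumerate(path)}
        e = 0
        for i, (x, y) in enumerate(path):
            if sequence[i] != "H":
                continue
            for dx, dy in MOVES:
                j = index_at.get((x + dx, y + dy))
                if j is not None and j > i + 1 and sequence[j] == "H":
                    e -= 1  # each non-chain-neighbour H-H contact counted once
        return e

    def grow(path):
        if len(path) == n:
            stats["count"] += 1
            stats["best"] = min(stats["best"], energy(path))
            return
        x, y = path[-1]
        for dx, dy in MOVES:
            step = (x + dx, y + dy)
            if step not in path:
                grow(path + [step])

    grow([(0, 0), (1, 0)])  # pin the first bond to strip out one symmetry
    return stats["count"], stats["best"]

for seq in ["HPHPH", "HPPHHPHPPH"]:
    count, best = enumerate_folds(seq)
    print(seq, "folds:", count, "lowest energy:", best)

Each extra residue multiplies the number of conformations by roughly three, and this cartoon ignores the actual chemistry entirely, which is the part that really eats the computer time.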
That assumption can be argued to have ended, in some ways at the very least. We have not made much progress in computing power over the last several years, as we have effectively reached the physical limits of silicon transistor technology. All we've done recently is make "bigger" processor chips and find little loopholes or tricks to squeeze a bit more throughput out of the classic processor, but each individual unit itself hasn't seen an increase in throughput in a while. Even IBM's new SyNAPSE chip is dated by at least five years (as all they did was make it slightly smaller), and the concept itself has been around for over a decade.

ReiverCorrupter said:
This is all assuming computers continue to progress exponentially, which is a large assumption.
Yeah, the person who championed the exponential progress model is Ray Kurzweil, and he's freaking nuts: he thinks he's going to bring his dead father back to life with nanotech. Nuts. Plus he based the theory on a purely inductive argument that processing power seems to double every couple of years, without even referencing the physical limitations of computers.
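On the physical-limitations point, here's a quick back-of-the-envelope sketch in Python. The numbers are rough assumptions for illustration only: ~22 nm-class features as a starting point, density doubling every couple of years (so linear features shrink by about 1/sqrt(2) per step), and a silicon atom around 0.2 nm across.

# How long can naive doubling go before features are the size of an atom?
# All figures here are rough assumptions for illustration.

FEATURE_NM = 22.0        # assumed starting feature size
ATOM_NM = 0.2            # rough diameter of a silicon atom
YEARS_PER_DOUBLING = 2.0 # assumed density-doubling period

years, size = 0.0, FEATURE_NM
while size > ATOM_NM:
    size /= 2 ** 0.5     # doubling density halves area per device, so linear size / sqrt(2)
    years += YEARS_PER_DOUBLING
    print(f"+{years:2.0f} yr: ~{size:.2f} nm")

print(f"Naive doubling runs into atoms after roughly {years:.0f} years.")

Which is exactly the sort of ceiling an "it's always doubled before" argument never mentions.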
There is, in my opinion, an illusion of exponential progress at this point: newer "tech" is coming out rapidly and speeding things up, but what many people forget is that it's walking on an already paved road. Dunno, though, I'm not that much of a computer person.
Quantum computing, which is where AI research will have to go in this honest poster's opinion if they want to simulate real conscious AI, hasn't budged in five years and is still stuck on the same problems.
That is a bad pun, my friend.

Space Jawa said:
I have a feeling that these things are going to have a lot of bugs to work out before they're ready for use in people.