Well, shit. Here we go. http://www.damninteresting.com/on-the-origin-of-circuits/ For once, kickassfacts was totally on the ball.
That is scary as hell. I understand it was simple, and it took many iterations to accomplish the task... but...
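For anyone wondering what "many iterations" actually means here: the article's experiment is basically a mutate-and-select loop run thousands of times. This is just a toy sketch of that idea (the names and the bit-counting fitness are made up for illustration; the real experiment scored how well an evolved FPGA circuit distinguished two input tones):

```python
import random

random.seed(0)

GENOME_LEN = 32  # stand-in for a circuit's configuration bits

def fitness(genome):
    # Toy fitness: count of 1-bits. The real fitness measured actual
    # circuit behavior on the chip.
    return sum(genome)

def mutate(genome, rate=0.05):
    # Flip each bit with a small probability.
    return [bit ^ (random.random() < rate) for bit in genome]

# Start from a random "circuit" and keep whichever of parent/child
# scores at least as well, generation after generation.
genome = [random.randint(0, 1) for _ in range(GENOME_LEN)]
generation = 0
while fitness(genome) < GENOME_LEN:
    child = mutate(genome)
    if fitness(child) >= fitness(genome):
        genome = child
    generation += 1

print(generation)  # number of generations it took to hit a perfect score
```

Even this trivial version usually needs hundreds of generations, which gives a feel for why the real experiment took so long and why nobody could predict what it would come up with.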
I'm pretty confident we will kill ourselves off. The question is whether we go extinct because of war, weather, or the creation of our replacement.
hmm, any chance we could get them to speed this one up? I dunno about the rest of you guys, but if I've gotta go, it certainly wouldn't hurt my feelings if everyone else went with me XD
I'm pretty sure it will end with 99.9% of mankind simply starving to death while 0.1% (or less) own fucking everything and have their remaining loyal robots wage destructive war against an AI that deems their selfish ways a threat to all intelligent life.
Since we never had any guidelines or a plan, and even the simple will to survive seems to be difficult to come by...
To be honest, I don't see this particular advancement as an issue. If it were produced in conjunction with an intelligent AI, then maybe, but it also depends on the general disposition of the AI to begin with. As it stands right now? All this means is that in a few generations, we'll have computers that can be dropped or bumped around and then just repair themselves, and I'm okay with that.
First they created better versions of themselves, and I said nothing. Then they self-repaired, and again, I said nothing.
A bunker won't help you, as the chip has already adapted to the human instinct to survive apocalyptic events by going into bunkers.
Instead of creating things to help Skynet in its domination, I'd feel a lot safer if they'd develop things that point toward the positronic brain and the Three Laws.
Exactly. The definition of "human" has to be vaguely hardwired into all systems so they'll sort of self-destruct if they realize that their actions were somehow in violation of the laws.
Except then you make a humanoid robot and your loyal robots won't be able to hurt it while it kills all the humans.