I think it's one of the most used-up scifi scenarios because it's the only logical outcome.
How can you know? How many robot insurrections have you survived or experienced? Or better yet, how many robot insurrections did the writers of those scenarios survive or experience?
What do we know about robots... at all? Robotics is one of the youngest technologies of the last century; who can say what knowledge lies beyond our sight or mind? Remember that we understand only a small portion of our own brain, so how can we be sure we know everything about everything?
We suppose that just because they're sentient, yet cold and logical, they will automatically see humans as illogical beings and try to terminate them. But what if these sentient robots didn't always feel they had to follow logic? What if there were illogical robots, just as there are illogical humans? Who knows where their evolution will lead? Maybe robots will want to feel exactly as humans do... maybe they won't. It's all unknown.
Because they come from a position of only knowing logic. Humans had to develop logic, but robots will have to develop illogic.
Humans have existed in our current form, more or less, for 200,000 years. We only conceived of logic some 3,000 years ago and formalized it some 2,300 years ago. And we still haven't developed the ability to think in logical paradigms without serious, focused effort.
How long will it take robots to develop the ability to think illogically?
It will take as long as necessary, but it will develop.