I don't think it's a question of giving sentience, but of achieving sentience.
But, in order for them to achieve sentience, wouldn't we have to make them capable of doing so? They are limited by how intelligent we make them. If we give them that possibility, however far-fetched it may seem, then I think it would only be a matter of time.
To answer that, we'd have to know how sentience developed in biological creatures and why humans are sentient and maybe dolphins or something else are sentient and why ostriches or ants aren't.
If the advanced forms of consciousness that underlie creativity, self-awareness and intentionality are merely learned responses arising from a certain physiological complexity, then we may already have let the genie out of the bottle, so to speak.
You wouldn't be able to answer that. Without knowing how it came to happen, you wouldn't know if you could or not.
You could very well stumble upon something.
There are some researchers working on machines which can reproduce and others working on machines which can create new programming for themselves in a form of adaptation. Put them together and you have a machine which can do everything an average animal can do... But can that then develop sentience?
That's impossible to say.
It's all very hard to predict.
That's exactly the kind of thing I fear most about technology. A machine that can reproduce or write its own code could open up unlimited possibilities: forming its own morals, its own reasoning, its own arrogance. It could very well be the biggest threat humanity has ever faced, and I don't want to be around if and when things go wrong.