I know that. It could be programmed to be exactly like a human. Or like anything, but using the method I described, it could be better. (My definition of better anyway.)
"Will it seek to annihilate them to be replaced by others like itself?" Is this a survival instinct? Does it wish to populate the world with it's own "species" or is it willing to evolve or even go extict?
If you want to talk to me about this some more, here's my e-mail. (I like this sort of subject a lot.)
Organic life is inefficient.
Part of why I like machines (artificial intelligence, to be exact) is their lack of emotions. They never get angry, they're completely reliable (if programmed correctly), etc. (By the way, I appear to have one quality of that sort of machine: I never get angry.)
Heh. Depends on the machine. If programmed correctly, a machine could be just what you say it's not. No, an intelligent machine wouldn't necessarily be a better inhabitant, nor a worse one. It depends on its outlook -- does it find organisms to be frightfully inefficient? Will it seek to annihilate them to be replaced by others like itself? Those are the kinds of machines you want to avoid. That's why emotions and empathy should be hard-programmed into such things, in my opinion.
Of course, what's "good" and "bad," "better" or "worse," does stem from our very human, very organic point of view...