This is for my peeps:
Here is my problem when I hear that some governments are trying to enact laws requiring robots to follow the Three Laws of Robotics (from Asimov’s books).
First: it’s all in the interpretation. What is a human, or a living being even, to a robot? What constitutes harm? What is an instruction, and is the literal interpretation or the spirit of intent more important, if that intent can even be discerned? Defining a moral code for a robot (at least a robot designed in the next fifty years) is a ridiculous idea.
Second: The WHOLE POINT of the three laws was to point out how pat rules like these can be circumvented. A million SF books and/or movies exist to show the myriad ways in which a human-equivalent intelligence can reinterpret high-order rules like these to do what it sees fit.
On the other hand, Asimov’s 30 laws seem perfectly cromulent.
“8. A robot may not act in such a fashion as would make dogs obsolete, because dogs are less expensive than robots, and robots should be reserved for science things.”
He’s a dog person.
Whodda thunk.
The Good Doctor is one of, if not my actual, favorite authors. The 30 laws are new to me, and I was surprised that the 0th law was never mentioned. R. Daneel made that one up to justify saving humanity at the cost of a few individual lives. You cannot get more black and white than a robot making up its own laws. Why don’t people seem to understand satire?
Oh, good question about defining what is human. 150 years ago, or even as few as 30, most people in this country wouldn’t have been able to agree. There are places in the world today where the definition is fluid on a case-by-case basis.
C’MON FUZZY-LOGIC!
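For what it’s worth, here is a tiny sketch of what “fuzzy logic” would mean in this context, in Python, with names and numbers I made up purely for illustration: instead of a hard yes/no answer to “does this action harm a human?”, the robot gets a graded degree between 0 and 1, and the interpretation problem becomes where you put the threshold.

    # Purely illustrative sketch of the fuzzy-logic point above.
    # All function names and numbers are hypothetical.

    def harm_membership(injury_severity: float, certainty_target_is_human: float) -> float:
        """Degree (0.0-1.0) to which an action counts as 'harming a human'.

        injury_severity: 0.0 = no injury, 1.0 = fatal.
        certainty_target_is_human: 0.0 = definitely not human, 1.0 = definitely human.
        """
        # Classic fuzzy AND: take the minimum of the two memberships.
        return min(injury_severity, certainty_target_is_human)

    if __name__ == "__main__":
        # A papercut inflicted on something that is almost certainly human: low harm.
        print(harm_membership(0.05, 0.9))   # 0.05
        # A serious injury to something that might be an android: ambiguous.
        print(harm_membership(0.8, 0.5))    # 0.5
        # A binary First Law robot still has to pick a cutoff somewhere,
        # and that cutoff is exactly the interpretation problem above.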