This is an archive of past discussions. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page. |
Archive 1 | Archive 2 | Archive 3 |
This book includes an essay where Asimov discusses the three laws of tools. Does anyone know which essay? I've included the ref (12, I think) to the book in general, but as an FA it should have more detail, and I can't find my copy at present to look it up. -- Nate 11:39, 3 April 2007 (UTC)
I find doubtful the idea that if humans were to make robots, they would be made with such benevolent principles as those put forward in Isaac Asimov's laws of robotics. Individual humans would probably, on the whole, err on the side of caution and want to make robots obey the laws.
I propose that it is unlikely to be individual humans, with everyday human ethics and morality, who will be the entities driving the design of robots with the intelligence necessary to process such laws. Governments with strong incentives to use robots as weapons will likely be the first. The keenest edge of technology is that designed for warfare. Over $1000Bn was earmarked in 2004 for a decade of weapons technology funding. http://www.washingtonpost.com/wp-dyn/articles/A32689-2004Jun10.html .
As of early 2007, new laws of robotics are being drafted with somewhat different ethics from those proposed by Isaac Asimov. http://www.theregister.co.uk/2007/04/13/i_robowarrior/ . So although we can indulge ourselves in the fantasy and science fiction of I, Robot, let's be under no illusion: it is fantasy, and sadly not real life. To put it very bluntly, I believe the first machines able to process such laws will probably be built with the goal of extending their master's dominion, and to kill if it helps further that goal. Nick R Hill 21:10, 15 April 2007 (UTC)
It is well within the reach of the current state of the art to enable a military drone such as the US "Predator", as used in Afghanistan and Iraq, to fire its weapon automatically. It already finds and "locks onto" its target without any (human) operator intervention. I wonder if these Laws are the reason why the US military presently does not allow such a level of autonomy - the "fire button" is always under the control of the operators back at the base. Roger 09:10, 18 October 2007 (UTC)
Apparently yes, based on the above discussion and on the IEEE papers by Roger Clarke:
'Asimov's laws of robotics: Implications for information technology.' Part 1: IEEE Computer, Dec 1993, pp. 53–61. Part 2: IEEE Computer, Jan 1994, pp. 57–66.
If they should, then it's a sad fact that apparently Bill Gates never read the Asimov texts; otherwise Microsoft's products, in compliance with the Second Law, would obey their owners. -- AVM ( talk) 00:11, 22 November 2007 (UTC)
A story that proceeds from the logical consequences of an Asimov-style robot society, especially one with the Zeroth Law, is "With Folded Hands" by Jack Williamson, published in 1947. Rating: Five Planets.
This story ends the human race with a whimper.
Check it out, it should connect here.
Sean —Preceding unsigned comment added by Seanearlyaug ( talk • contribs) 00:28, 29 January 2008 (UTC)
Just a question that has probably been asked in the past: why is there no mention of the three rules governing RoboCop's behavior? It appears to me that they were created with Asimov's laws in mind.-- Jeremy ( Blah blah...) 03:25, 17 May 2008 (UTC)
It's been a few months since the issues were brought up with it. It's a fine section idea, and relevant, but poorly written and without sources. It is also a problem that half the section deals with one particular film, while the other half deals with films of Asimov's own works.
Since there has been no discussion of it on the talk page that I can see, I'm going to remove it, partly so that I can avoid including it in the audio version. This is pushing my Wikipedia boundaries -- please let me know if I'm in the wrong here.
Triacylglyceride ( talk) 17:25, 28 November 2008 (UTC)
While I know this isn't a real complaint about the article, does it really make sense for the fourth and fifth laws to be numbered the way they are? The lower the number of the law, the higher the precedence (so in a scenario where a robot telling a human it was a robot would harm the human, the robot would disobey the fourth law). However, the fifth law is higher priority than the fourth.
If a robot is to establish its identity as a robot at all times, it must know it is a robot. However, knowing it is a robot does not force it to tell everybody else it is a robot. A robot cannot follow through on the fourth law without obeying the fifth, yet it can follow through on the fifth law without obeying the fourth, which is the reverse of most of the other laws. 72.148.112.184 ( talk) 08:09, 2 July 2009 (UTC)
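The precedence rule assumed above (a lower-numbered law overrides a higher-numbered one) can be sketched as a simple ordered check. Everything in this sketch is a hypothetical illustration: the law predicates, the numbering, and the scenario are placeholders, not quotations from any of the stories.

```python
# Sketch of lower-number-wins precedence among hypothetical laws.
# Each law is a predicate saying whether a proposed action violates it;
# an action is permitted only if no law, checked in ascending-number
# (i.e. descending-priority) order, forbids it.

def action_permitted(action, laws):
    """laws: list of (number, forbids) pairs.
    Returns (True, None) if allowed, else (False, violated_law_number)."""
    for number, forbids in sorted(laws, key=lambda law: law[0]):
        if forbids(action):
            return (False, number)
    return (True, None)

# Invented placeholder laws for the scenario discussed above.
laws = [
    (1, lambda a: a.get("harms_human", False)),
    (4, lambda a: a.get("conceals_robot_identity", False)),
]

# Revealing its identity would harm a human, so Law 1 (checked first,
# because it has the lower number) forbids the action.
print(action_permitted({"harms_human": True}, laws))  # (False, 1)
```

Because the list is sorted by law number before checking, a conflict is always resolved in favor of the lower-numbered law, which is the behavior the comment above takes for granted.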
Want Responsible Robotics? Start With Responsible Humans
Military robots are sure to put these three laws to shame. -- Taxa ( talk) 23:18, 1 September 2009 (UTC)
...the first passage in Asimov's short story "Liar!" (1941) that mentions the First Law is the earliest recorded use of the word robotics...
Should this passage be right at the head of the article, given that we are talking about the Three Laws, not the word "robotics" per se? Further, shouldn't it actually read ...that also alludes to the First Law..., since the Laws were not directly stated until "Runaround" in 1942?
Jubilee♫clipman 22:28, 6 October 2009 (UTC)
Should something be mentioned in the article about how, in the new Battlestar and Caprica shows, there is no mention of the laws, and robots are apparently programmed without any sense of not attacking humans, even at the initial stage? For example, on Twitter ( http://twitter.com/SergeGraystone), there was this tweet: "Technically nothing, I suppose. It is not as though there are laws of robotics. RT @clockpie What prevents conspiring against humans? 9:55 PM Feb 5th from web." -- RossF18 ( talk) 14:59, 14 February 2010 (UTC)
What's a robot to do if two humans give it orders that contradict one another? The 2nd law says it has to listen to any order given to it by any human (or by any qualified personnel, or whatever; that doesn't matter), but the two persons could disagree. Example:
person1:"robot do the dishes" person2:"robot do not do the dishes"
pretty basic example, but this problem in the laws could have much bigger consequences if the persons were to disagree on something bigger than dishes —Preceding unsigned comment added by 207.61.78.21 ( talk) 17:42, 14 October 2007 (UTC)
A human would be in a similar bind in this situation. Presumably, the robot, like a human, would have some means of deciding which order to obey. For example, if person1 was superior to person2, the robot might say "I am sorry, but I have instructions to do the dishes." Or, if it were ambiguous, the robot might request more information, for example "I have instructions from person1 to do the dishes. Do you wish to override these instructions?" Just like with humans, a robot would presumably not give equal weight to commands from different people.-- RLent 20:37, 15 October 2007 (UTC)
If you order a robot to not obey the order of someone, you're basically ordering it to not listen to the 2nd law. Which means your order basically contradicts itself: it can't refuse to follow that order, or it would violate the 2nd law, but it can't follow it either, because that would also violate the 2nd law. If these laws are supposed to supersede everything else the robot experiences in its operation, simply telling it wouldn't really fix the problem. The problem is that the law doesn't say anything about anyone being superior (and even if it did, you could have two people on the same level), so if the robot ever gets an order to disregard an order, it's still stuck in a loop. So it's not going to do anything... which could be yet another violation... you know what, the thing is just going to end up in failure mode. That's what I think, unless you can come up with something different. —Preceding unsigned comment added by 207.61.78.21 ( talk • contribs)
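The arbitration scheme RLent describes above (a robot weighting Second Law orders by the authority of the speaker, and asking for clarification when authorities tie) can be sketched roughly. Everything here, including the speaker names and the numeric priority values, is a hypothetical illustration, not anything from Asimov's stories:

```python
# Hypothetical sketch of order arbitration between conflicting humans.
# Orders are weighted by the authority of the speaker; a tie between
# equal authorities (the deadlock the anonymous editor describes) is
# escalated back to the humans rather than looping forever.

def resolve_orders(orders):
    """orders: list of (speaker, authority, command) tuples.
    Returns the winning command, or a clarification request on a tie."""
    best = max(orders, key=lambda o: o[1])
    ties = [o for o in orders if o[1] == best[1]]
    if len(ties) > 1:
        # Equal authority: neither order can override the other, so
        # escalate instead of entering the failure mode described above.
        speakers = " and ".join(o[0] for o in ties)
        return f"Conflicting orders from {speakers}; please clarify."
    return best[2]

print(resolve_orders([("person1", 2, "do the dishes"),
                      ("person2", 1, "do not do the dishes")]))
# person1 outranks person2, so the robot does the dishes
```

This sidesteps the loop only by adding something the Second Law itself doesn't provide: an explicit ranking of humans, plus an escalation rule for ties, which is exactly the gap the discussion above identifies.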
pgr94 ( talk) 01:42, 5 August 2010 (UTC)
Here are some articles that specifically mention the 3 laws:
pgr94 ( talk) 06:39, 5 August 2010 (UTC)
Were the laws really inspired by the Hippocratic oath?