In his science fiction, Isaac Asimov described three laws governing the behavior of robots: a robot may not harm a human being, must obey human orders, and must protect its own existence.
Now the British Standards Institution has officially issued a set of guidelines on robot ethics that are more complex and mature than Asimov's three laws. The time has come for our society to set a code of conduct for robots, and it is hard not to feel that science fiction and reality are intertwining.
The British Standards Institution (BSI) is the UK's national standards body, with over 100 years of history and considerable authority worldwide. The full title of this ethical guide is "Guide to the ethical design and application of robots and robotic systems" (hereinafter the "Guide"). It is aimed mainly at robotics researchers and manufacturers, guiding them on how to assess the ethical hazards of a robot. The ultimate goal is to ensure that the intelligent robots humans produce can be integrated into the existing ethical norms of human society.
The standard is designated BS 8611 and was released on September 15th at the "Social Robotics and AI" conference in Oxford, England. Alan Winfield, a professor of robotics at the University of the West of England, said it is the industry's first published standard for the ethical design of robots.
Although the document itself is dry, the scenarios it depicts read as if they were drawn straight from science fiction. Robots that deceive, robots that cause addiction, and robots whose capacity for self-learning exceeds their intended scope are all listed as hazards that designers and manufacturers must take into consideration.
The Guide opens with a set of broad principles: robots should not be designed solely or primarily to kill or harm humans; humans, not robots, are the responsible agents; and it should always be possible to find out who is responsible for any robot and its behavior.
The Guide also addresses some controversial topics, for example, whether humans should form emotional bonds with robots, especially robots designed to interact with children and the elderly.
According to Noel Sharkey, a professor at the University of Sheffield, this is an example of robots deceiving humans unintentionally: robots have no emotions, but humans sometimes believe otherwise. He pointed out that a recent study placed small robots in a kindergarten; the children grew very fond of them and believed the robots had more cognitive awareness than their family pets.
The Guide recommends that designers give more attention to the transparency of robots, but scientists say this is difficult to put into practice, because people do not know exactly how AI systems, especially deep learning systems, reach their decisions.
An AI agent does not follow a fixed program to carry out its task; it arrives at a successful strategy through millions of trials and attempts. This process of learning and decision-making is sometimes beyond the understanding of its own human designers.
The Guide also raises the issue of gender and racial bias in robots. Many deep learning systems are trained on data from the Internet, and such data carries inherent biases.
A robot's biases directly affect how the technology is applied: it will learn more about certain groups while neglecting others. In downstream applications, this means robots will treat people differently depending on who they are.
Existing examples already bear this out: speech recognition systems understand women's voices less well, and facial recognition finds it easier to identify white faces than Black faces. In the long run, in future medical applications, robots may deliver less accurate diagnoses for women and minorities.
Prof Sharkey said: "We also need a robot black box that can be opened for scrutiny. If a robot is shown to be racist, we can switch it off and take it off the street."
Two weeks ago, a Reuters article reported that some scientists believe sex robots could make people addicted, because a robot never refuses.
The Guide also mentions this phenomenon of "over-reliance on robots," but gives designers no concrete guidance to follow; that is something to be added in future work.
Via The Guardian
Extended reading:
Prejudice and discrimination in computer systems: In addition to killing, there are other ways
Studies suggest sex robots could reduce the spread of sexually transmitted diseases and illegal sex trafficking. Would you try one?