Artists preemptively punish a robot so it will not hunt humans!


Visitors to the "Futur en Seine" digital festival in Paris watched the punishment of a "naughty robot".

In the art installation by French artist Filipe Vilas-Boas and his collaborator, architect Paul Coudamy, a robotic arm writes, again and again on a piece of paper, that it must NOT hunt humans, lest it defy the first law of robotics.

The arm sits at a classic wooden school desk, filling the pages of a notepad with the same phrase: "I must not hunt humans … I must not hunt humans".


“Preventive punishment is necessary to avoid any future disobedience”,

commented Vilas-Boas, referring to the science-fiction books of Isaac Asimov, who in 1942, in his short story “Runaround”, formulated the Three Laws of Robotics, which all artificial intelligence units of the future would have to obey.

These laws provide that:

1). A robot may not injure a human being or, through inaction, allow a human being to come to harm

2). A robot must obey the orders given to it by human beings, except where such orders would conflict with the first law

and

3). A robot must protect its own existence, as long as such protection does not conflict with the first or second law

Of course, Asimov formulated these laws in plain English, leaving their “translation” into algorithms to the developers of the future.
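As a rough illustration of what such a "translation" might look like (this sketch is not from the article, and every name and field in it is hypothetical), the strict priority ordering of the three laws can be modeled as an ordered list of constraints: when two candidate actions conflict, the robot prefers the one that violates only lower-priority laws.

```python
# Hypothetical sketch: Asimov's Three Laws as an ordered list of
# constraints. Lower index = higher priority.

LAWS = [
    ("First Law",  lambda a: not a["harms_human"]),    # never harm a human
    ("Second Law", lambda a: a["obeys_human_order"]),  # obey human orders
    ("Third Law",  lambda a: a["preserves_itself"]),   # protect itself
]

def first_violated_law(action):
    """Index of the highest-priority law the action violates (len(LAWS) if none)."""
    for i, (_, permitted) in enumerate(LAWS):
        if not permitted(action):
            return i
    return len(LAWS)

def choose(actions):
    """Pick the action whose worst violation is as low-priority as possible."""
    return max(actions, key=first_violated_law)

# Obeying an order to hunt a human violates the First Law, so the robot
# prefers refusing the order, which violates only the Second Law.
hunt   = {"harms_human": True,  "obeys_human_order": True,  "preserves_itself": True}
refuse = {"harms_human": False, "obeys_human_order": False, "preserves_itself": True}
best = choose([hunt, refuse])
print(best["harms_human"])  # False: the refusal wins
```

This is only a toy decision rule, of course; the real difficulty Asimov left open is deciding what "harms a human" means for a concrete action in the first place.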

Besides being a popular writer, Asimov was simply a biochemistry professor without any particular background in the field, at a time when robotics was still in its infancy (he died in 1992 of kidney and heart failure resulting from AIDS).

But technology is now advancing rapidly toward robotic autonomy, stoking fears that the coming generation of robots could lead to a dystopian kind of anthropomorphism.

Vilas-Boas, who "punishes" the robot together with Coudamy, dreams of a future in which artificial intelligence will free people from the burden of hard, long-term labor, laying the foundations for a fairer global community.

“If this is not achieved, then we will witness the precarious marginalization of parts of our society!”

The future to be avoided!

Vilas-Boas shares the concern of eminent scientists and entrepreneurs about the potential risks of misusing technological progress. Stephen Hawking, Elon Musk and Bill Gates have repeatedly expressed their fears about the future inequalities of a fully automated society.

Stephen Hawking: The sword of Damocles of inequality!

At the end of last year, the Cambridge University theoretical physicist warned of a possible “disappearance” of middle-class jobs due to automation, which would widen economic inequality on a global scale.

“Now, more than at any other time in our history, our species needs to work together”,

Hawking wrote in the Guardian, stressing the need to redistribute part of the accumulated wealth to less developed societies in order to give their development new impetus.

Elon Musk: A safety net even for those who will never work!

Tesla and SpaceX CEO Elon Musk, for his part, speaking at the World Government Summit in Dubai in February, predicted that so many jobs would be lost that governments would have to establish a universal basic income, even for the unemployed and for people who will never get a job.

“I think it will be necessary. There will be fewer and fewer jobs that a robot will not be able to do better”, he said.

Bill Gates: A robot tax!

In an interview with Quartz, Microsoft founder Bill Gates (also the world's richest man) argued that the US government should already be taxing companies that replace human workers with automation systems, in order to fund vocational retraining for displaced workers and their absorption into similar or even entirely different sectors.

Over the course of his career, Bill Gates has donated more than $31 billion to charities and to humanitarian and development projects in the less favored regions of the planet.

Source: http://www.skai.gr

