Tech world calls for global ban on “killer robots”

Published by cocotang on 2015-08-13

Some of the biggest names in science and technology have called for a global ban on “killer robots”, amid warnings that crossing that threshold would start a new global arms race.  

The intervention by more than 1,000 experts in the field of artificial intelligence came in an open letter, which was also signed by Professor Stephen Hawking, the cosmologist, Elon Musk, Tesla’s chief executive, and Steve Wozniak, the co-founder of Apple. 

Although “robot soldiers” are still confined to the drawing board, the rapid advances in computational power and artificial intelligence have raised the prospect that the military could field one within two decades. 

The petition, which will be presented on Tuesday at the International Joint Conference on Artificial Intelligence in Buenos Aires, warns that the development of weapon systems that can independently identify and attack targets without any human intervention would create the “third revolution in warfare” after the invention of gunpowder and nuclear weapons. 

It paints a stark scenario of future conflicts akin to something from the film franchise Terminator. 

“Autonomous weapons are ideal for tasks such as assassinations, destabilising nations, subduing populations and selectively killing a particular ethnic group”, the letter states. 

“We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.” 

The United Nations is so concerned about the development of what it calls Lethal Autonomous Weapons that last year it convened its first-ever meeting to discuss the risks posed by the new technology. 

The letter is the second this year co-ordinated by the Future of Life Institute (FLI) criticising the introduction of artificial intelligence to the battlefield. But the latest petition is far harder-hitting, calling outright for a ban on these weapon systems. 

The Pentagon is one of the biggest backers of robotics research, and in 2013 one of its research arms, the Office of Naval Research, awarded a $7.5m grant to researchers at Tufts, Brown, Rensselaer Polytechnic Institute, Georgetown and Yale to investigate how autonomous robots could be taught the difference between right and wrong.

