Forums
Paranormal Forum
Science & Technology
"Physical Intelligence" - DARPA Constructs Robots With "Real" Brain
<blockquote data-quote="Opmmur" data-source="post: 68780" data-attributes="member: 13"><p><span style="font-size: 26px"><strong><a href="http://www.nbcnews.com/technology/futureoftech/activists-un-put-killer-robots-crosshairs-6C9633925" target="_blank">Activists, UN put 'killer robots' in the crosshairs</a></strong></span></p><p> </p><p><img src="http://msnbcmedia4.msn.com/j/streams/2013/April/130429/6C7150507-rheinmetall-skyshield.streams_desktop_large.jpg" alt="" class="fr-fic fr-dii fr-draggable " style="" /></p><p>Angela Blattner / Rheinmetall Air Defence AG</p><p> </p><p><span style="font-size: 18px">Skyshield cannons can be outfitted with a "Modular, Automatic and Network-capable Targeting and Interception System" (MANTIS) that provides automated target detection and engagement, with human supervision. The German Air Force first took delivery of the system in November 2012.</span></p><p> </p><p><span style="font-size: 18px">Nearly every fighting ship in the U.S. Navy carries a <a href="http://www.navy.mil/navydata/fact_display.asp?cid=2100&tid=487&ct=2" target="_blank">Phalanx</a> defense system, a computerized Gatling gun set on a six-ton mount that uses radar to spot targets flying out of the sky or cruising across the ocean's surface. Once it "evaluates, tracks, engages and performs a kill assessment," a human gives the order to rattle off 4,500 rounds per minute.</span></p><p> </p><p><span style="font-size: 18px">This sort of "<a href="http://www.usnwc.edu/getattachment/d6f79610-c65c-4df1-919a-e60cc96f7bfe/Autonomous-Weapon-Systems-Repl-to-Critics-HNSJ.aspx" target="_blank">supervised</a>" automation is not out of the ordinary. When Israel's "Iron Dome" radar spots an incoming missile, it can automatically fire a counter missile to intercept it. 
The <a href="http://www.rheinmetall-defence.com/en/media/editor_media/rm_defence/publicrelations/pressemitteilungen/2013_1/2013_Rheinmetall_IDEX_Air_Defence.pdf" target="_blank">German Air Force's Skyshield system</a> can now also shoot down its targets with very little human interaction.</span></p><p> </p><p><span style="font-size: 18px">For years, "sniper detectors" have pointed telltale lasers at shooters who are firing on troops; DARPA is even <a href="http://www.darpa.mil/Our_Work/STO/Programs/Counter-Sniper_Program_%28C-Sniper%29.aspx" target="_blank">working on a version</a> that operates "night and day" from a moving military vehicle that's under fire. Meanwhile, sniper rifles themselves are getting smarter: In the case of the <a href="http://www.nbcnews.com/technology/gadgetbox/futuristic-rifle-turns-novice-sharpshooter-1B7916613" target="_blank">TrackingPoint precision guided firearm</a>, the operator pulls the trigger, but the gun's built-in computer decides when the bullet flies.</span></p><p> </p><p><span style="font-size: 18px">"We are not in the 'Terminator' world and we may never reach there," says Peter Singer, author of "Wired for War" and director of the <a href="http://www.brookings.edu/about/centers/security-and-intelligence" target="_blank">Center for 21st Century Security and Intelligence </a>at the Brookings Institution. 
"But to say there isn't an ever increasing amount of autonomy to our systems — that's fiction."</span></p><p> </p><p><span style="font-size: 18px">Preparing for a future in which robots may be given a tad more independence, an international coalition of human rights organizations, including Human Rights Watch, is banding together to propose a treaty ban on "killer robots."</span></p><p> </p><p><span style="font-size: 18px">The <a href="https://redstone.msnbc.msn.com/streamseditorial/streams/entries/edit/entries/stopkillerrobots.org" target="_blank">Campaign to Stop Killer Robots</a> publicly launched April 23 with the goal of bringing the discussion about autonomous weapons systems to regular people, not just politicians and scientists. Also this month, the <a href="http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf" target="_blank">United Nations Special Rapporteur recommended</a> a suspension of autonomous weapons — or "lethal autonomous robotics" — until their control and use are discussed in detail. But critics of those reports argue that it's too early to call for a ban because the technology in question does not yet exist. Others say this is exactly the reason to start talking now.</span></p><p> </p><p><span style="font-size: 18px">"Our feeling is that [it is] morally and ethically wrong that these machines make killing decisions rather than humans [making] killing decisions," <a href="http://www.hrw.org/bios/stephen-goose" target="_blank">Stephen Goose</a>, director of the arms division at Human Rights Watch, told NBC News.</span></p><p> </p><p><span style="font-size: 18px">The group clarifies that it isn't anti-robot, or anti-autonomy — or even anti-drone. 
It's just that when a decision to kill is made in a combat situation, they want to ensure that decision will always be made by a human being.</span></p><p> </p><p><span style="font-size: 18px">Goose says the title of the new campaign is deliberately provocative and designed to catalyze conversation. He said, "If you have a campaign to stop 'Fully autonomous weapons,' you will fall asleep."</span></p><p> </p><p><span style="font-size: 18px">"The problem with modern robotics is there's no way a robot can discriminate between a civilian and a soldier," said <a href="http://staffwww.dcs.shef.ac.uk/people/N.Sharkey/" target="_blank">Noel Sharkey</a>, a professor of artificial intelligence and robotics at the University of Sheffield in the U.K. and an outspoken <a href="http://icrac.net/" target="_blank">advocate for "robot arms control."</a> "They can just about tell the difference between a human and a car."</span></p><p> </p><p><span style="font-size: 18px">But a treaty prohibition at this time is unnecessary and "might even be counterproductive," cautions <a href="http://www.law.columbia.edu/fac/Matthew_Waxman" target="_blank">Matthew Waxman</a>, a national security and law expert at Columbia Law School. Waxman told NBC News that he anticipates a day when robots may be better than human beings at making important decisions, especially in delicate procedures like surgeries.</span></p><p> </p><p><span style="font-size: 18px">"In some of these contexts, we are going to decide not only is it appropriate for machines to operate autonomously, we may demand it, because we are trying to reduce human error," said Waxman.</span></p><p> </p><p><span style="font-size: 18px"><a href="http://www.usnwc.edu/Academics/Faculty/Michael-Schmitt.aspx" target="_blank">Michael Schmitt</a>, professor of international law and chairman of the U.S. Naval War College, told NBC News that a ban now, as a matter of law, is a "bad idea." 
When Human Rights Watch wrote a <a href="http://www.hrw.org/news/2012/11/19/ban-killer-robots-it-s-too-late" target="_blank">50-page report on the future of robotic warfare</a>, Schmitt wrote a rebuttal in Harvard's National Security Journal. His main argument: "International humanitarian law's restrictions on the use of weapons ... are sufficiently robust to safeguard humanitarian values during the use of autonomous weapon systems."</span></p><p> </p><p><span style="font-size: 18px"><a href="http://www.brookings.edu/experts/singerp" target="_blank">Singer</a>, whose work has made him an ombudsman in the growing debate over robotic warfare, says that now is the time to talk — now, when Google cars are guiding themselves through San Francisco's streets and algorithm-powered stock trading accounts crash markets based on keywords.</span></p><p> </p><p><span style="font-size: 18px">Singer thinks the debate needs to gain traction before governments and big companies become invested in the technology — and begin to influence the direction of policy. "People aren't pushing for more autonomy in these systems because it is cool. They're pushing for it because companies think they can make money out of it," he said.</span></p></blockquote><p></p>
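The "supervised automation" pattern the article keeps returning to (Phalanx, MANTIS, Skyshield) has a simple shape: the machine detects, tracks, and assesses targets on its own, but the engagement itself waits for a human's explicit authorization. Here's a minimal sketch of that human-in-the-loop structure in Python. All names (`Track`, `assess`, `engagement_loop`) and thresholds are hypothetical, purely illustrative, and obviously nothing like a real fire-control system:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Track:
    track_id: int
    kind: str            # automated classifier output, e.g. "missile", "aircraft"
    threat_score: float  # 0.0-1.0 from the automated threat assessment

def assess(track: Track) -> bool:
    """Automated stage: the system flags tracks it considers engageable.

    This is the part Phalanx-style systems do on their own:
    "evaluates, tracks, engages and performs a kill assessment."
    """
    return track.kind == "missile" and track.threat_score > 0.9

def engagement_loop(tracks: List[Track],
                    authorize: Callable[[Track], bool]) -> List[int]:
    """Human-in-the-loop stage: the machine proposes, a human disposes.

    `authorize` stands in for the human operator's decision; no track
    is engaged unless it returns True, no matter what `assess` says.
    """
    engaged = []
    for track in tracks:
        if assess(track) and authorize(track):
            engaged.append(track.track_id)
    return engaged
```

The policy debate in the article is essentially about that `authorize` callable: "fully autonomous" means replacing the human decision with another automated function, while the Campaign to Stop Killer Robots wants a guarantee that it always remains a person.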