"Physical Intelligence" - DARPA Constructs Robots With "Real" Brain

Opmmur

Time Travel Professor
Messages
5,049
"Physical Intelligence" - DARPA Constructs Robots With "Real" Brain

We may soon see a new generation of robots that act and think just like humans.
The ambitious goal of DARPA's "physical intelligence" project is to create robots that are controlled by a living brain!

We have previously seen that it is possible to create robots that are self-aware.

Now, the Department of Defense is reportedly almost finished building robots with "real" brains.
According to National Defense Magazine, the "physical intelligence" program is a research and development initiative launched back in 2009 in order "to understand intelligence as a physical phenomenon and to make the first demonstration of the principle in electronic and chemical systems."
A Pentagon-funded team of researchers has constructed a tiny machine that would allow robots to act independently.

Unlike traditional artificial intelligence systems that rely on conventional computer programming, this one "looks and 'thinks' like a human brain," said James K. Gimzewski, professor of chemistry at the University of California, Los Angeles.

Gimzewski is a member of the team that has been working under the sponsorship of the Defense Advanced Research Projects Agency on a program called "physical intelligence." This technology could be the secret to making robots that are truly autonomous, Gimzewski said during a conference call hosted by Technolink, a Los Angeles-based industry group.

This project does not use standard robot hardware with integrated circuitry, he said.
The device his team constructed can perform actions similar to those of a human without being programmed the way a traditional robot is, Gimzewski said.

Participants in this project include Malibu-based HRL Laboratories (formerly Hughes Research Laboratories) and the University of California, Berkeley's Freeman Laboratory for Nonlinear Neurodynamics. The latter is named after Walter J. Freeman, who has been working for 50 years on a mathematical model of the brain based on electroencephalography (EEG) data, the recording of electrical activity in the brain.
What sets this new device apart from any other is that it has nano-scale interconnected wires that form billions of connections, like a human brain, and it is capable of remembering information, Gimzewski said. Each connection is a synthetic synapse.

A synapse is what allows a neuron to pass an electric or chemical signal to another cell. Because its structure is so complex, most artificial intelligence projects so far have been unable to replicate it.
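To make the "synthetic synapse" idea a little more concrete, here is a minimal, purely illustrative sketch in Python: a connection whose strength changes with use, feeding a simple leaky integrate-and-fire neuron. The class names, the Hebbian learning rule, and all the numbers are assumptions made for illustration, not a description of the actual UCLA/DARPA hardware.

```python
# Illustrative sketch only: a "synthetic synapse" as a connection whose
# strength (conductance) changes with use, so the connection itself
# stores memory. Names, rule, and constants are assumptions.

class Synapse:
    def __init__(self, weight=0.1, learning_rate=0.05, decay=0.001):
        self.weight = weight            # analogous to a memristive conductance
        self.learning_rate = learning_rate
        self.decay = decay

    def transmit(self, pre_spike: bool) -> float:
        """Pass a signal scaled by the current connection strength."""
        return self.weight if pre_spike else 0.0

    def update(self, pre_spike: bool, post_spike: bool) -> None:
        """Hebbian-style plasticity: strengthen when both sides fire together."""
        if pre_spike and post_spike:
            self.weight += self.learning_rate * (1.0 - self.weight)
        else:
            self.weight -= self.decay * self.weight   # slow forgetting


class Neuron:
    def __init__(self, threshold=0.5, leak=0.9):
        self.potential = 0.0
        self.threshold = threshold
        self.leak = leak

    def step(self, input_current: float) -> bool:
        """Leaky integrate-and-fire: accumulate input, spike past threshold."""
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0
            return True
        return False


# Repeatedly pairing input and output activity strengthens the synapse,
# which is the loose sense in which the connections "remember" information.
syn, post = Synapse(), Neuron()
for _ in range(20):
    fired = post.step(syn.transmit(True) + 0.4)   # 0.4 = assumed extra drive
    syn.update(True, fired)
print(f"learned weight: {syn.weight:.2f}")
```

The only point of the sketch is that the memory lives in the connection strengths themselves, not in a separate program.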

Is a new generation of robots controlled by a living brain "being born"?
A "physical intelligence" device would not require a human controller the way a robot does, said Gimzewski. The applications of this technology for the military would be far reaching, he said. An aircraft, for example, would be able to learn and explore the terrain and work its way through the environment without human intervention, he said. These machines would be able to process information in ways that would be unimaginable with current computers.

Artificial intelligence research over the past five decades has not been able to generate human-like reasoning or cognitive functions, said Gimzewski. DARPA's program is the most ambitious he has seen to date. "It's an off-the-wall approach," he added.

Studies of the brain have shown that one of its key traits is self-organization. "That seems to be a prerequisite for autonomous behavior," he said.

"Rather than move information from memory to processor, like conventional computers, this device processes information in a totally new way."

This could represent a revolutionary breakthrough in robotic systems, said Gimzewski.
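For a rough flavor of what "processing information in the memory itself" can mean, a textbook Hopfield network is a useful stand-in: patterns are stored in the connection weights, and recall happens by letting the network settle on its own rather than by a processor fetching data from memory. This is a standard model chosen for illustration, not the DARPA device.

```python
# Rough illustration of in-memory, self-organizing computation: a tiny
# Hopfield network stores a pattern in its weights and recalls it by
# settling, with no separate processor/memory split. Textbook model only.
import numpy as np

def train(patterns: np.ndarray) -> np.ndarray:
    """Hebbian storage: the weight matrix *is* the memory."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)
    return w / len(patterns)

def recall(w: np.ndarray, state: np.ndarray, steps: int = 10) -> np.ndarray:
    """Self-organization: repeatedly update units until the state settles."""
    s = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return s

stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1]])
w = train(stored)
noisy = stored[0].copy()
noisy[:2] *= -1                 # corrupt two bits
print(recall(w, noisy))         # settles back to the stored pattern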

It is not clear, however, that the Pentagon is ready to adopt this technology for weapon systems. The Obama administration's use of drones in "targeted killings" of terrorist suspects has provoked a backlash and prompted the Pentagon to issue new rules for the use of robotic weapons.

"Autonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force," said a Nov. 2012 Defense Department policy statement. Autonomous weapons, the document said, must "complete engagements in a timeframe consistent with commander and operator intentions and, if unable to do so, [must] terminate engagements or seek additional human operator input before continuing the engagement."


Should robots with a "real" brain have the same rights as humans?
Yes!
No!
I don't know...
 

BlastTyrant

Senior Member
Messages
2,601
No, a robot doesn't have any rights, lol. This is a very, very big mistake. The human brain is a corrupt tool that is very easily misguided; it would end up like I, Robot.
 

TnWatchdog

Senior Member
Messages
7,099
I think there are many robots in politics...but they are not as intelligent.
 

Opmmur

Time Travel Professor
Messages
5,049
Activists, UN put 'killer robots' in the crosshairs


Angela Blattner / Rheinmetall Air Defence AG

Skyshield cannons can be outfitted with a "Modular, Automatic and Network-capable Targeting and Interception System" (MANTIS) that provides automated target detection and engagement, with human supervision. The German Air Force first took delivery of the system in November 2012.

Nearly every fighting ship in the U.S. Navy carries a Phalanx defense system, a computerized Gatling gun set on a six-ton mount that uses radar to spot targets flying out of the sky, or cruising across the ocean's surface. Once it "evaluates, tracks, engages and performs a kill assessment," a human gives the order to rattle off 4,500 rounds per minute.

This sort of "supervised" automation is not out of the ordinary. When Israel's "Iron Dome" radar spots incoming missiles, it can automatically fire a counter missile to intercept it. The German Air Force's Skyshield system can now also shoot down its targets with very little human interaction.

For years, "sniper detectors" have pointed telltale lasers at shooters who are firing on troops; DARPA is even working on a version that operates "night and day" from a moving military vehicle that's under fire. Meanwhile, sniper rifles themselves are getting smarter: In the case of the TrackingPoint precision guided firearm, the operator pulls the trigger, but the gun's built-in computer decides when the bullet flies.

"We are not in the 'Terminator' world and we may never reach there," says Peter Singer, author of "Wired for War" and director of the Center for 21st Century Security and Intelligence at the Brookings Institution. "But to say there isn't an ever increasing amount of autonomy to our systems — that's fiction."

Preparing for a future in which robots may be given a tad more independence, an international coalition of human rights organizations, including Human Rights Watch, is banding together to propose a treaty ban on "killer robots."

The Campaign to Stop Killer Robots publicly launched April 23 with the goal of bringing the discussion about autonomous weapons systems to regular people, not just politicians and scientists. Also this month, the United Nations Special Rapporteur recommended a suspension of autonomous weapons — or "lethal autonomous robotics" — until their control and use is discussed in detail. But critics of those reports argue that it's too early to call for a ban because the technology in question does not yet exist. Others say this is the reason to start talking now.

"Our feeling is that [it is] morally and ethically wrong that these machines make killing decisions rather than humans [making] killing decisions," Stephen Goose, director of the arms division at the Human Rights Watch, told NBC News.

The group clarifies that it isn't anti-robot, or anti-autonomy — or even anti-drone. It's just that when a decision to kill is made in a combat situation, they want to ensure that decision will always be made by a human being.

Goose says the title of the new campaign is deliberately provocative and designed to catalyze conversation. He said, "If you have a campaign to stop 'Fully autonomous weapons,' you will fall asleep."

"The problem with modern robotics is there's no way a robot can discriminate between a civilian and a soldier," said Noel Sharkey, a professor of artificial intelligence and robotics at the University of Sheffield in the U.K. and an outspoken advocate for "robot arms control." "They can just about tell the difference between a human and a car."

But a treaty prohibition at this time is unnecessary and "might even be counterproductive," cautions Matthew Waxman, a national security and law expert at Columbia Law School. Waxman told NBC News that he anticipates a day when robots may be better than human beings at making important decisions, especially in delicate procedures like surgeries.

"In some of these contexts, we are going to decide not only is it appropriate for machines to operate autonomously, we may demand it, because we are trying to reduce human error," said Waxmann.

Michael Schmitt, professor of international law and chairman of the international law department at the U.S. Naval War College, told NBC News that a ban now, as a matter of law, is a "bad idea." When Human Rights Watch wrote a 50-page report on the future of robotic warfare, Schmitt wrote a rebuttal in Harvard's National Security Journal. His main argument: "International humanitarian law's restrictions on the use of weapons ... are sufficiently robust to safeguard humanitarian values during the use of autonomous weapon systems."

Singer, whose work has made him an ombudsman of sorts in the growing debate over robotic warfare, says that now is the time to talk: now, when Google cars are guiding themselves through San Francisco's streets and algorithm-powered stock trading programs crash markets based on keywords.

Singer thinks the debate needs to gain traction before governments and big companies become invested in the technology — and begin to influence the direction of policy. "People aren't pushing for more autonomy in these systems because it is cool. They're pushing for it because companies think they can make money out of it," he said.
 

Opmmur

Time Travel Professor
Messages
5,049
Futuristic rifle turns novice into sharpshooter


It all goes back to "Top Gun." In the heads-up display on Maverick's Tomcat, you can see a computer compensate for human aim with precision laser guidance and careful calculations. How long before that technology made its way to to a conventional hunting rifle? It's here now, with a price tag of $17,000 to $21,000.

We came to Las Vegas the first week of January, the way we always do, for the Consumer Electronics Show. The vast trade show features over 3,300 exhibitors, and covers 1.9 million square feet. But there are no shooting ranges at CES. To check out TrackingPoint, we had to drive out to the hills outside of town.

As someone who not only isn't a marksman but pretty much avoids guns altogether, I approached the TrackingPoint rifle a bit gingerly. However, when the company's president, Jason Schauble, walked me through it, I realized that as long as I paid attention (and observed the basic safety rules of firearms), I would be able to hit that target without trouble. Not 15 minutes later, I did — at a distance of nearly seven football fields.

How does it work? A laser rangefinder identifies the target, and tells the gun where to aim to hit it, given conditions such as humidity, wind, and the typical ballistic drop you'd expect from a bullet shot from a gun at such a distance.
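As a back-of-the-envelope illustration of the kind of compensation being described, here is the simplest possible ballistic-drop estimate, ignoring drag and wind entirely; the muzzle velocity and range below are assumed numbers, not TrackingPoint specifications.

```python
# Crude version of the ballistic compensation described above: estimate how
# far the bullet falls over its flight time and convert that into an aim-up
# correction. Real fire-control solvers also model drag, wind, humidity, etc.
g = 9.81                    # gravity, m/s^2
muzzle_velocity = 850.0     # m/s (assumed)
range_m = 640.0             # roughly seven football fields, in meters

flight_time = range_m / muzzle_velocity          # ignores drag
drop = 0.5 * g * flight_time ** 2                # vertical fall in meters
holdover_mrad = (drop / range_m) * 1000          # aim-up correction, milliradians

print(f"flight time ~{flight_time:.2f} s, drop ~{drop:.2f} m, "
      f"hold-over ~{holdover_mrad:.1f} mrad")
```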

You pick your target by dropping a pin on it using the camcorder-like zoom lens. When you want to shoot that target, you line up crosshairs inside the scope with the pin you dropped. The weirdest thing is, when you squeeze the trigger, it doesn't fire. You have to squeeze the trigger and line up the crosshairs with your mark. When you do, the gun goes boom, and the target takes a bullet.
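Here is a small sketch of that tag-then-fire behavior as described: the trigger arms the shot, but the round is only released once the crosshair comes back onto the tagged point within a small tolerance. The function, field names, and tolerance value are hypothetical, not TrackingPoint's actual firmware.

```python
# Sketch of the described tag-then-fire gating logic (hypothetical names).
from dataclasses import dataclass

@dataclass
class Point:
    x: float   # horizontal position in the scope view (mrad)
    y: float   # vertical position in the scope view (mrad)

def should_fire(trigger_squeezed: bool, crosshair: Point, tag: Point,
                tolerance_mrad: float = 0.2) -> bool:
    """Release the shot only when the trigger is held AND the crosshair
    is within tolerance of the tagged aim point."""
    if not trigger_squeezed:
        return False
    dx, dy = crosshair.x - tag.x, crosshair.y - tag.y
    return (dx * dx + dy * dy) ** 0.5 <= tolerance_mrad

# The shooter squeezes and holds; the gun waits for alignment.
tag = Point(1.5, -4.3)                              # pin dropped on the target
print(should_fire(True, Point(2.0, -4.0), tag))     # off target -> False
print(should_fire(True, Point(1.55, -4.25), tag))   # aligned   -> True
```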

No matter where you stand in the gun debate, the technology is an impressive system. The rifle will be available soon from TrackingPoint. Watch the video above for the whole story.
 
