Self-driving cars ought to make a considerable contribution to road safety. Computers are, after all, so much smarter than people, and they do not drink alcohol before sitting down at the wheel. Yet perhaps we feel an ounce of concern. Does the machine always make a better decision than I would have? If such a choice arises, will it protect me, or should I tremble?
On the military battlefield, the questions are even more pressing.
Drones have already raised ethical objections. It is certainly convenient to be able to sit at a screen and kill terrorists two continents away. The distance between operator and target can, however, create a video-game mentality. Someone hit by mistake, though, gets no new life.
Missiles, too, are fired from long range. Primitive land mines have killed people indiscriminately for ages. Robots equipped with artificial intelligence (AI) nevertheless give warfare an entirely new dimension, especially if they can act wholly or partly without human decisions.
AI has developed rapidly in recent years, and this obviously extends to military applications. Last week's issue of The Economist describes both the "opportunities" and the problems.
That humans and machines collaborate is not new and hardly controversial. Autonomous vehicles can evacuate the wounded or carry supplies. But there are already dozens of weapons systems that can identify targets, aim and kill without anyone pressing a button. They can defend themselves, but also lie in wait for the right opportunity to attack.
For now, it is human beings who determine what is to be pressed and when. No Terminator, as in the 1984 film with Arnold Schwarzenegger, is to be let loose.
With AI, and with machines that can both learn from experience and make plans, other perspectives open up. 3D printers can put them in place wherever they are needed. Robots do not need to be buried. And they draw no pensions, as one British general puts it.
Acting correctly, however, depends on things such as communication systems, which the enemy can disrupt. Images can fool the computer: a rifle can be taken for a turtle, as one expert put it at the People and Defence national conference in Sälen last week.
Autonomous weapons that become too independent could, in theory, get up to just about anything. Science fiction? Maybe.
Throughout history, people have shown themselves fully capable of terrible atrocities and war crimes, in so-called civilized countries as well as in remote jungles. Still, everyone should in principle agree that machines must not decide who lives or dies. It must be possible to hold someone accountable, whether governments, generals or AI engineers.
That is not straightforward. To begin with, the same machine can be equipped with different software, and determining when a weapon becomes autonomous is a challenge even with all the facts at hand. The technical leaps in AI constantly put up new barriers to attempts at regulation and monitoring.
This is reflected in how different states view the matter. The United States, Russia and the United Kingdom are among those saying no to a ban on autonomous weapons. China will not refrain from developing them. Dictatorships here and there will likely make their own trade-offs about autonomy.
Effective control is therefore difficult to achieve. The Economist argues that the traditional laws of war, such as the Geneva and Hague conventions, are the right starting point. Protecting civilians and not using more violence than necessary sounds reasonable.
But it does not always work like that; no one stopped the barbarity in Syria, the former Yugoslavia or Rwanda. And when an army is at a disadvantage, resolve can waver, and the autonomous robots are called in.
The atomic bomb is probably still effective as a doomsday weapon, but it requires a great deal of infrastructure that only a few have access to. The terrorist movement IS has made use of drones. When artificial intelligence spreads throughout society, even small players can get hold of the equipment.
Rules are to be desired, but tricky to enforce. The difficulty of controlling the robot arms race must not, however, lead the world simply not to bother trying.