A super-skilled AI might negate any risk of jamming and enable fleets of smart FPV drones to attack simultaneously without human operators

Saturday, September 23rd, 2023

An AI racing drone recently beat human pilots, raising the question of when AI drones will transform warfare:

“The AI is superhuman because it discovers and flies the best maneuvers, also it is consistent and precise, which humans are not,” says Scaramuzza. He notes that, as with AlphaGo, Swift was able to use moves — in this case flight trajectories — which the human champions did not even think were possible.

[…]

A $400 FPV with the warhead from an RPG rocket launcher can knock out a tank, personnel carrier or artillery piece from several miles away, or chase down and destroy a truck traveling at high speed. They are cheap enough to use against individual footsoldiers and can dive into trenches. But each one requires a skilled human pilot. Ukrainian sources say the training takes around a month to achieve proficiency, and many people fail the course.

FPV success rates appear to vary wildly, with different sources citing 20%, 30%, 50% or 70% — much appears to depend on the exact situation, the presence of jamming, and the skill of the pilot. A super-skilled AI might push that rate far above 70%, negate any risk of jamming and enable fleets of smart FPV drones to attack simultaneously without human operators.
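As a rough back-of-the-envelope illustration (mine, not the article's), the success rate matters so much because the expected number of drones spent per target scales as 1/p when each sortie is treated as an independent attempt with hit probability p. The sketch below assumes exactly that, plus the $400 unit cost quoted above; real sorties are not independent, since jamming, terrain and pilot skill all correlate across attempts.

```python
# Back-of-the-envelope sketch (not from the article): expected cost per
# successful FPV strike if each sortie is an independent attempt with hit
# probability p. Expected attempts then follow a geometric distribution: 1/p.

UNIT_COST_USD = 400  # per-drone cost quoted in the article


def expected_cost_per_kill(hit_rate: float, unit_cost: float = UNIT_COST_USD) -> float:
    """Expected spend on drones per target destroyed, assuming independent attempts."""
    if not 0 < hit_rate <= 1:
        raise ValueError("hit_rate must be in (0, 1]")
    return unit_cost / hit_rate  # E[attempts] = 1/p


for rate in (0.2, 0.3, 0.5, 0.7, 0.9):
    print(f"hit rate {rate:.0%}: ~{1 / rate:.1f} drones, ~${expected_cost_per_kill(rate):,.0f} per kill")
```

Under those assumptions, moving from a 20% to a 90% hit rate cuts the expected cost per destroyed target from about $2,000 to under $450, which is the economic case behind "far above 70%."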

[…]

Swift relies on having reliable information on the speed, location and orientation of the drone in real time. This is far more challenging outdoors, where there are changes in illumination, wind gusts and other variables to contend with.

Also, Swift has to learn the course ahead of time to work out its flight path.

“The current system only works for drone racing and for a specific racing track of which you perfectly know the map,” says Scaramuzza.

The neural network that navigates through the gates is trained specifically for that layout. The other problem is that Swift trains on a specific setup, and if conditions change – for example, if the wind changes direction – all its learning may be wasted.

“Swift’s perception system and physics model assumes that the appearance of the environment and its physics are both consistent with what was observed during training,” says Scaramuzza. “If this assumption fails, the system can fail.”
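Scaramuzza's caveat amounts to a distribution-shift assumption: the policy is only trusted while the world looks like the training data. Swift's internals aren't described in this excerpt, so the sketch below is only a generic illustration of that kind of sanity check, with made-up feature names and thresholds; it simply flags when live conditions fall outside the envelope seen during training so a system could abort or hand back control.

```python
# Generic sketch (not Swift's actual code): flag when live flight conditions
# drift outside the envelope observed during training, reflecting the quoted
# assumption that appearance and physics match what was seen in training.
from dataclasses import dataclass


@dataclass
class TrainingEnvelope:
    """Per-feature (min, max) ranges observed during training (hypothetical features)."""
    ranges: dict[str, tuple[float, float]]

    def out_of_distribution(self, observation: dict[str, float]) -> list[str]:
        """Return names of features whose current value falls outside the training range."""
        flagged = []
        for name, value in observation.items():
            lo, hi = self.ranges.get(name, (float("-inf"), float("inf")))
            if not lo <= value <= hi:
                flagged.append(name)
        return flagged


# Hypothetical envelope learned indoors on a fixed, well-mapped track.
envelope = TrainingEnvelope(ranges={
    "wind_speed_mps": (0.0, 1.5),      # indoor air currents only
    "scene_brightness": (0.4, 0.8),    # controlled lighting
    "gate_displacement_m": (0.0, 0.1),  # gates essentially where the map says
})

live = {"wind_speed_mps": 6.2, "scene_brightness": 0.95, "gate_displacement_m": 0.05}
violations = envelope.out_of_distribution(live)
if violations:
    print("Outside training envelope:", violations)  # e.g. abort or hand control back
```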

Comments

  1. Bomag says:

    So, when AI is robust enough to give us self driving cars, and I can finally relax and enjoy life, the FBI will then have complete ability to track and take out badthinkers; so I will have a new set of worries.

  2. McChuck says:

    The ‘AI’ racing drone beat humans, but only over the course it was trained on. When they moved the rings 6″, it lost every time.

  3. Allen says:

    Old me: “There’s no way real scientists could be as reckless as those in science fiction novels.”
