Microsoft has revealed Project AirSim, a new platform that uses the metaverse to safely train and test autonomous flight.
In the metaverse, which presents realistic simulated environments, AI models can fly millions of missions in seconds, learning to react to countless variables just as they would in the physical world. They build an understanding of how an aircraft handles rain or snow, how weather conditions affect the battery, and much more.
Project AirSim harnesses the power of Azure to create large volumes of data for training AI models on the exact actions to take at each stage of flight, from takeoff to cruising to landing.
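The idea of generating large volumes of labeled flight data across many simulated conditions can be sketched in miniature. The snippet below is purely illustrative and does not reflect Project AirSim's actual APIs; every name in it is a placeholder, and the "policy" is deliberately trivial.

```python
import random

# Hypothetical sketch: generate labeled training samples by simulating many
# short flights under randomized weather. All names are illustrative and
# do NOT reflect Project AirSim's real APIs.

PHASES = ["takeoff", "cruise", "landing"]

def simulate_episode(rng):
    """Simulate one flight, recording a (state, action) sample per phase."""
    wind = rng.uniform(0, 15)      # wind speed in m/s
    rain = rng.random() < 0.3      # whether it is raining this episode
    samples = []
    for phase in PHASES:
        # A trivially simple stand-in policy: throttle back in bad weather.
        action = "reduce_speed" if (wind > 10 or rain) else "nominal"
        samples.append({"phase": phase, "wind": wind,
                        "rain": rain, "action": action})
    return samples

def build_dataset(n_episodes, seed=0):
    """Run many episodes and pool all samples into one dataset."""
    rng = random.Random(seed)
    data = []
    for _ in range(n_episodes):
        data.extend(simulate_episode(rng))
    return data

dataset = build_dataset(1000)
print(len(dataset))  # 1000 episodes x 3 flight phases = 3000 samples
```

In a real pipeline, the simulator (not a random-number generator) would supply the physics, and the recorded samples would feed a learned policy rather than a hard-coded rule.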
It will also include libraries of simulated 3D environments encompassing a wide range of urban and rural landscapes, as well as a suite of advanced pre-trained AI models to aid in the automation of aerial infrastructure inspection, last-mile deliveries, and urban air mobility.
High-fidelity simulation was also central to AirSim, an earlier open-source project from Microsoft Research that has since been decommissioned but inspired today's debut. AirSim was a popular research tool, but it required extensive coding and machine learning expertise.
Microsoft has now turned that open-source technology into an end-to-end platform that enables Advanced Aerial Mobility (AAM) clients to test and train AI-powered aircraft in the metaverse more easily.
Developers will be able to use Project AirSim to access pre-trained AI building blocks, such as sophisticated models for recognizing and avoiding obstacles and performing precision landings.
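To give a flavor of what an obstacle-avoidance building block must decide, here is a minimal geometric sketch: keep a safety radius around obstacles and, if breached, propose an evasive waypoint. This is a toy illustration under assumed conventions (metres, z pointing up), not Project AirSim's models or API.

```python
import math

# Toy sketch of an obstacle-avoidance decision, NOT Project AirSim's API.
# Positions are (x, y, z) tuples in metres; z is assumed to point up.

def too_close(drone, obstacle, safety_radius=5.0):
    """Return True if the drone is within safety_radius of the obstacle."""
    dx, dy, dz = (d - o for d, o in zip(drone, obstacle))
    return math.sqrt(dx * dx + dy * dy + dz * dz) < safety_radius

def avoid(drone, obstacle, safety_radius=5.0):
    """If too close, suggest a waypoint that climbs clear; else hold course."""
    if too_close(drone, obstacle, safety_radius):
        x, y, z = drone
        return (x, y, z + safety_radius)  # climb away from the obstacle
    return drone
```

A learned model replaces this hand-written rule with behavior trained on millions of simulated encounters, but the decision it must make is the same shape.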