My Master’s thesis in engineering was part of an artistic project that used drones to create a puppet show: « Isabella’s Flight ».
My job was to develop a program capable of controlling multiple drones simultaneously in an indoor environment.
Multicopters are naturally unstable, and for that reason a localization method was required to measure their positions and guide them where we wanted.
No pilot would fly a drone blindfolded!
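Closing the loop between position measurements and drone commands is typically done with a feedback controller. As a minimal sketch (not the thesis implementation — the gains, the single-axis scope, and the toy point-mass model are all illustrative assumptions), a PID controller driven by localization measurements might look like this:

```python
# Hypothetical sketch: a 1-axis PID position controller of the kind used to
# steer a drone from external position measurements. Gains and the toy
# double-integrator plant are illustrative, not values from the project.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        """One control step: return a commanded acceleration."""
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def simulate_hover(target=1.0, steps=4000, dt=0.005):
    """Toy simulation: a unit point mass driven by the controller output,
    starting at altitude 0 and asked to hold `target` metres."""
    pid = PID(kp=4.0, ki=0.5, kd=3.0, dt=dt)
    pos, vel = 0.0, 0.0
    for _ in range(steps):
        accel = pid.update(target, pos)
        vel += accel * dt
        pos += vel * dt
    return pos
```

In a real flight stack the measurement would come from the motion-capture system at each control tick, and one such loop would run per axis (x, y, z, yaw).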
Let’s put the plan into action
Multiple localization techniques exist for drones: GPS, ultra-wide band, lasers… and motion capture.
We chose the latter for its numerous advantages. First, its scalability: locating more drones in the same space only requires more markers, not more sensors.
Placing multiple markers on one body lets us measure not only the position of the object but also its orientation, which made yaw control straightforward.
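To illustrate why orientation falls out of a multi-marker setup, here is a minimal sketch (the marker layout is an assumption for illustration, not the thesis configuration): with two markers fixed along the drone's forward axis, the yaw is just the heading of the vector between them.

```python
import math

def yaw_from_markers(front, back):
    """Heading angle (radians) of the vector from the back marker to the
    front marker, projected onto the horizontal (x, y) plane.
    Markers are (x, y, z) tuples in the motion-capture frame."""
    dx = front[0] - back[0]
    dy = front[1] - back[1]
    return math.atan2(dy, dx)

# A drone pointing along the x-y diagonal has a yaw of 45 degrees.
yaw = yaw_from_markers(front=(1.0, 1.0, 0.2), back=(0.0, 0.0, 0.2))
```

In practice the mocap software solves this fit for a full rigid body with several markers and returns the whole 3D orientation, but the two-marker case shows the principle.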
The measurement frequency can reach hundreds of hertz, and the precision is extremely high: a stationary object placed in the middle of a 10 × 10 × 5 m space can be located with a precision of 50 microns!
Motion capture is also typically used indoors, which matched our specifications.
The main disadvantage of this technique is its upfront cost, which is prohibitive for a university project.
Fortunately, we could access an indoor flying range with 12 motion capture cameras of the brand Qualisys at ID2Move.
This system is intuitive to use and can export its measurements to other software, such as the Python program that I wrote.
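The exact streaming protocol depends on the Qualisys setup, so as a hypothetical sketch (the `x,y,z,yaw` message format below is an assumption for illustration, not the real Qualisys wire format), the receiving side of such an export could be as simple as parsing each incoming pose message before handing it to the controller:

```python
def parse_pose(message):
    """Parse a hypothetical 'x,y,z,yaw' pose string (metres, radians)
    into a dict. The real Qualisys stream uses its own protocol and
    SDK; this only illustrates the measurement-to-controller handoff."""
    x, y, z, yaw = (float(v) for v in message.split(","))
    return {"x": x, "y": y, "z": z, "yaw": yaw}

pose = parse_pose("1.25,-0.40,1.80,0.05")
```

Each parsed pose would then feed the position and yaw control loops at the mocap frame rate.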
The project was a success: the drone was able to fly and follow a pre-registered trajectory with a maximum deviation of 30 cm, and to hover with a precision of 3 cm.
Gaëtan Permentiers – UCLouvain