Zürich-based Daedalean aims to bring full "level-5" autonomy to the personal electric aircraft of the near future. Daedalean's engineers apply insights from modern robotics, deep learning and computer vision to build an autonomous guidance system that meets the highest bar for safety-critical aerospace systems. Their product provides vision-based guidance and navigation intended to enable both unmanned operations and personal transport certified for Visual Flight Rules (VFR) conditions.
UAVenture's AirRails autopilot system makes the commercial operation of unmanned Vertical Takeoff and Landing (VTOL) vehicles accessible to all through its advanced flight control and highly intuitive flight planning and monitoring software. Since 2015 UAVenture has focused on hybrid VTOLs, a fusion of multirotor and fixed-wing technology, and is now a leader in making this technology available for mapping, search and rescue, surveillance and agricultural applications.
The two Swiss start-ups have signed an exclusive partnership to integrate Daedalean's vision-based technology into UAVenture's AirRails VTOL autopilot system for unmanned applications and to make it available to all AirRails users. This partnership allows UAVenture to offer its customers unrivalled guidance, navigation and control for the most challenging UAV applications.
The system, christened Magpie, will be a lightweight version of Daedalean's technology for personal transport, featuring real-time vision-based detection of suitable emergency landing locations as well as detection of wires and other obstacles during approach and landing.
Trials have already started
The cooperation allows Daedalean to validate its engineering in real flight at scale, and to gather a high volume of high-quality imagery, correlated with actual flight data, across a multitude of realistic environments for training and testing its systems. Initial offerings to a selected group of manufacturers and their customers will be made at a reduced charge in exchange for this data. Trials of the system have already begun, with the first systems expected to be available to customers later this year.
Picture: Downward-facing segmentation by an on-board deep neural network, part of Daedalean's landing guidance system. The system recognizes roofs, low and high vegetation, and other obstacles; a sufficiently large gray patch is a candidate for an (emergency) landing site. The system updates in real time on board, running on a small single-board computer. © Daedalean AG 2018