In addition to the future of work, health, and mobility, MSRM PIs are researching forward-looking technologies and their applications for the future of our environment. To this end, Prof. Jia Chen, Prof. Timo Oksanen, Prof. Sami Haddadin, Prof. Markus Ryll, Prof. Stefan Leutenegger, and their teams presented for the first time a combination of prototypes showing how AI and intelligent robots could be used for sustainable agriculture and environmental monitoring at Die Neue Sammlung - The Design Museum in Munich. Their research opens up the possibility of a home-office workplace for farmers. State Minister for Science and Art Bernd Sibler and TUM President Prof. Thomas Hofmann attended the demonstration.
The MSRM presented two demonstrators. The first combined the research activities of Prof. Chen, Prof. Oksanen, and Prof. Haddadin and showed how machine intelligence can be used in agriculture and environmental monitoring, with remote-controlled drones supporting the technology on the ground from the air.
A team led by Prof. Jia Chen, Professorship of Environmental Sensing and Modeling, developed an intelligent sensor system for environmental monitoring. The demo showed how mobile sensors mounted on a drone could in the future calibrate the stationary sensors' readings and interpolate concentration maps across the city using machine learning algorithms. For this, one sensor node was located in Dürnast and a second at Die Neue Sammlung - The Design Museum in Munich. The sensor system measured nitrogen oxides, fine dust, temperature, and humidity every second; the values were transmitted in real time and displayed at the museum. The sensor system is innovative and smart in terms of energy management and drift compensation. This autumn, the team will permanently install 50 of these sensor systems to generate a real-time air-quality map of Munich. The sensor system shown in this demo is funded by the Bavarian Ministry for the Environment and Consumer Protection.
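The interplay between a stationary node and a mobile calibration reference can be sketched in a few lines of Python. All names, numbers, and the additive drift model below are illustrative assumptions, not the project's actual software:

```python
import random


class SensorNode:
    """Toy stationary sensor node with a simple additive drift-compensation
    scheme; the real system's energy management and calibration are more
    sophisticated."""

    def __init__(self, node_id):
        self.node_id = node_id
        self.offset = 0.0  # additive drift-correction offset (ppb)

    def raw_no2(self):
        # Simulated raw NO2 reading: true value 20 ppb, plus measurement
        # noise, plus 3 ppb of accumulated sensor drift.
        return 20.0 + random.gauss(0.0, 1.0) + 3.0

    def calibrate(self, reference_no2, raw_no2):
        # Drift compensation: align the node with a trusted mobile reference,
        # e.g. a calibrated sensor carried past the node by a drone.
        self.offset = reference_no2 - raw_no2

    def reading(self):
        # One drift-corrected sample, as it would be transmitted in real time.
        return {"node": self.node_id, "no2_ppb": self.raw_no2() + self.offset}
```

In this sketch, a single fly-by of the drone updates `offset`, after which every per-second reading is reported in the reference sensor's frame.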
Pollutant emissions in agriculture can be further reduced with intelligent aids, for example autonomous tractors; this is the research of Prof. Oksanen, Chair of Agrimechatronics, and his team. Participants could watch an autonomous tractor mowing on the TUM research farm in Freising, with the wayfinding taking place fully automatically. The advantage of robotized agricultural machines is that they can save diesel fuel and thus reduce CO2 and NOx emissions.

Drones can be used to optimize the autonomous navigation of the tractor: they can monitor the environment during operation and thus recognize obstacles, for example. The researchers of the Environmental Robotics Lab at the MSRM, under the direction of Prof. Haddadin, Chair of Robotics and Systems Intelligence, have developed a framework for the ultra-long-range teleoperation of mobile robots. Although the term sounds complex, the system is now very easy to use. Dr. Maria Danninger, Chief of Technology and Innovation at MSRM, explained that all that is needed today to control a drone remotely is a conventional internet connection and an ordinary computer.
The drone also remembers its starting position and can return home after the mission is over. Currently, mainly specialists and research organizations use telerobotics because of the expensive interfaces for robot teleoperation. With this demo, the team wanted to show that access to telerobotics is possible for everyone. A unique feature of the MSRM’s interface is that it was developed in close interdisciplinary cooperation with human-computer interaction researchers. The current system emerged from a user-centered design process, with the aim of a workflow so intuitive that even non-robotics experts can master the teleoperation of the drone, almost like a computer game.
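The return-home behaviour described above can be illustrated with a minimal sketch. The class name, the straight-line flight model, and the fixed step size are assumptions for illustration, not MSRM's actual framework:

```python
import math


class TeleopDrone:
    """Toy model of a teleoperated drone that remembers its takeoff
    position and can fly straight back to it."""

    def __init__(self, position):
        self.position = list(position)
        self.home = tuple(position)  # starting position, remembered at takeoff

    def move(self, dx, dy, dz):
        # One teleoperation command: translate by the given offsets.
        self.position = [p + d for p, d in zip(self.position, (dx, dy, dz))]

    def distance_to_home(self):
        return math.dist(self.position, self.home)

    def return_home(self, step=1.0):
        # After the mission, fly back toward the stored start position
        # in fixed-size steps along the straight-line direction.
        while self.distance_to_home() > step:
            dist = self.distance_to_home()
            direction = [(h - p) / dist for p, h in zip(self.position, self.home)]
            self.move(*(d * step for d in direction))
        self.position = list(self.home)  # final snap onto the home position
```

A usage example: after a few `move` commands, a single `return_home()` call brings the drone back to where it took off.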
But in the future, flying robots should become increasingly autonomous and no longer need a pilot. For this step “into independence”, the drone must be able to recognize its surroundings and, at some point, also manipulate them “from the air”, e.g. using gripping systems. To illustrate what that could look like, the second demonstrator showed the research work of Prof. Stefan Leutenegger, formerly of Imperial College London (ICL), and Prof. Markus Ryll, recently brought to Munich from MIT. Prof. Leutenegger, Professorship of Machine Learning for Robotics, explained that for a drone to fly fully automatically, it needs accurate localization in its environment. The drone uses cameras and AI to build up a map of the environment in which it can also locate itself in real time. Attendees could see, through the drone’s eyes, how it produced a real-time map of the surroundings of the demo booth and located itself within it. For safety reasons, the camera-equipped drone did not fly inside Die Neue Sammlung - The Design Museum; instead, it was moved around manually. Prof. Ryll, Professorship of Autonomous Aerial Systems, showed on screen a simulated dual-arm drone that can grasp and collect items and could be used for harvesting apples: one step further toward enabling a home office for farmers.
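The idea of building a map while simultaneously locating oneself in it can be illustrated with a toy 2D grid model. A real system like the one demonstrated fuses camera images with learned features; this sketch, with its hypothetical class and method names, deliberately omits all of that:

```python
class MappingDrone:
    """Toy sketch of simultaneous mapping and localization on a 2D grid.
    Pose is tracked by dead reckoning; sensed obstacles are inserted into
    a shared world-frame map using the current pose estimate."""

    def __init__(self):
        self.pose = (0, 0)     # estimated position in grid cells
        self.occupied = set()  # occupancy map built so far

    def move(self, dx, dy):
        # Dead-reckoned pose update from commanded motion.
        x, y = self.pose
        self.pose = (x + dx, y + dy)

    def observe(self, relative_obstacles):
        # Obstacles are sensed relative to the drone and transformed into
        # world coordinates via the current pose estimate before being mapped.
        x, y = self.pose
        for ox, oy in relative_obstacles:
            self.occupied.add((x + ox, y + oy))

    def is_free(self, cell):
        # The finished map supports queries such as collision checking.
        return cell not in self.occupied
```

Because map and pose share one coordinate frame, any error in the pose estimate corrupts the map, which is why real systems correct the pose against the map continuously.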