The augmented agronomist: Synthesis of AI, ML and robotics to assist decision support

With recent advances in Artificial Intelligence and Machine Learning, and the maturity gained in many robotic applications and domains, this project sets out to provide agronomists with dedicated technological support for assessment and decision making. Adopting innovative paradigms already successfully deployed in telemedicine and telecare, a mobile robotic “proxy”, equipped with multi-modal sensing to facilitate visual as well as other modes of inspection (e.g. NIR, moisture, …), will be developed and field-tested in the context of soft fruit production. The objectives of the project are:

  • Shared Control and Assisted Assessment from a Mobile Robotic Platform,
  • Integration of Automated Diagnosis Employing Multi-Modal Sensing,
  • Robotic Telepresence facilitated through Adaptive Augmented/Virtual Reality Interfaces.

Consequently, decision support and outcomes from the automated analysis, as well as control-relevant information, are provided by means of virtual and augmented reality, offering the operator an immersive experience and fluid shared control and assessment. The operator is in a closed loop with the system, despite their remote location, enabling them to assess the situation effectively and decide on interventions quickly with all relevant information at hand. The project is closely linked with the RASberry project (https://rasberryproject.com/) and will have access to its software and hardware resources to minimise risks and maximise synergies.
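
To make the intended closed loop concrete, the following minimal Python sketch traces one pass of it: multi-modal readings are analysed automatically, the resulting scores are rendered as decision support for the remote agronomist, and the operator's decision closes the loop. All names, thresholds and the toy heuristic standing in for a trained model are purely illustrative assumptions, not part of the project's or RASberry's software stack.

```python
"""Minimal sketch of the operator-in-the-loop decision-support cycle.

All class and function names are illustrative placeholders; a real deployment
would use the project's robot middleware, sensing drivers and AR/VR front end.
"""

from dataclasses import dataclass
from typing import Dict


@dataclass
class MultiModalReading:
    """One inspection sample combining visual and non-visual channels."""
    plant_id: str
    rgb_features: Dict[str, float]   # e.g. colour/texture statistics from the camera
    nir_reflectance: float           # near-infrared reflectance (hypothetical 0..1 scale)
    soil_moisture: float             # volumetric soil moisture (hypothetical 0..1 scale)


def automated_diagnosis(reading: MultiModalReading) -> Dict[str, float]:
    """Stand-in for the ML-based analysis: returns per-condition confidence scores."""
    # Placeholder heuristic instead of a trained model: low NIR reflectance and
    # low soil moisture are treated as weak evidence of plant stress.
    stress = max(0.0, 1.0 - reading.nir_reflectance) * 0.6 + max(0.0, 0.3 - reading.soil_moisture)
    stress = min(stress, 1.0)
    return {"water_stress": stress, "healthy": 1.0 - stress}


def render_decision_support(reading: MultiModalReading, scores: Dict[str, float]) -> str:
    """Stand-in for the AR/VR overlay presented to the remote agronomist."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    lines = [f"Plant {reading.plant_id}:"] + [f"  {name}: {score:.2f}" for name, score in ranked]
    return "\n".join(lines)


def operator_decision(scores: Dict[str, float]) -> str:
    """Stand-in for the human decision closing the loop (here a simple threshold)."""
    return "schedule_irrigation" if scores["water_stress"] > 0.5 else "no_action"


if __name__ == "__main__":
    # One pass around the loop: sense -> diagnose -> present -> decide.
    sample = MultiModalReading("row3-plant17", {"greenness": 0.42},
                               nir_reflectance=0.35, soil_moisture=0.12)
    scores = automated_diagnosis(sample)
    print(render_decision_support(sample, scores))
    print("Operator decision:", operator_decision(scores))
```

In the actual system, the heuristic would be replaced by the learned diagnosis models developed in the project, and the text rendering by the adaptive AR/VR interface; the sketch only illustrates how sensing, automated analysis and the remote operator's decision connect.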

Research objectives

The scientific questions addressed by this PhD project are mainly:

  • How can a mobile robotic platform, equipped with multi-modal sensing, be effectively used by a remote agronomist to assess a situation and make decisions?
  • How can AI and ML technology be employed to support decision-making?
  • What is the performance of different AI and ML techniques in this scenario, and which methods are well suited?