Description

The SNOW flagship aims to develop, integrate, demonstrate, and evaluate hybrid AI capabilities for an autonomous system that can operate safely and effectively in an open world.

Problem Context

Robotic systems have recently moved from closed, prepared environments into the open, real world. Being able to move through the real world is not enough, however: these systems must also operate intelligently in it. An open world brings new challenges. The environment can change and may contain unknowns. In addition, a robotic system in an open world will likely serve multiple purposes rather than a single one, and it will often have to cooperate with others, whether humans or other machines. To this end, the SNOW flagship aims to develop, integrate, demonstrate, and evaluate hybrid AI capabilities for an autonomous system that can operate safely and effectively in an open world. Within the project we develop AI that makes robotic systems capable of autonomously understanding their open, real-world environment while simultaneously planning the remainder of their operation.

In the previous iteration of SNOW (SNOW 1.0), we developed autonomous capabilities that allowed a Spot robot (from Boston Dynamics) to navigate its surroundings and successfully complete a search-and-rescue task. We also developed capabilities that enabled the robot to be tasked with inspection and surveillance questions, for which it could autonomously acquire information from its environment and use that information to draw conclusions and answer the questions. We showcased this capability by using the enhanced Spot to inspect an industrial site.

Generally, the research from previous years assumed that autonomous systems do not manipulate their operational environment; they are passive observers. In some cases, however, autonomous systems do need to interact with their environment, for instance when an inspection requires physical contact, such as taking a lab sample or pushing a sensor onto a material to obtain proper coupling.

Solution

In SNOW 2.0, we focus on creating autonomous capabilities precisely for systems that do have to interact with their environment. Such systems need to be able to plan their interactions and interventions, and to evaluate them during execution while coping with new situations (e.g. learning to manipulate novel objects). Within the project, we will focus on capabilities that allow autonomous systems to recognise the affordances of objects and the causal effects of (physically) interacting with them. This will eventually allow autonomous systems to perform physical interventions in their environment and to manipulate objects when necessary and legally permitted.

We will continue with the Spot robot, now equipped with an additional hand for pushing and grabbing, and we will continue with the use case of inspecting an industrial site. Compared to 2022, the task will be made more complex by moving from an inspection-survey task to an inspection-answer task: the robot will be expected to proactively seek answers to any abnormalities it discovers. To evaluate the degree of autonomy of the system when enhanced with the new capabilities, we will compare the number of human interventions needed in the new, more complex task against the original, simpler task, as sketched below.
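To make this evaluation concrete, the sketch below shows one way the intervention comparison could be computed. It is a minimal illustration in Python, not SNOW project code: the Run record, the autonomy_score function, and the example counts are all hypothetical, assuming only that each task run logs how often a human operator had to step in.

    from dataclasses import dataclass

    @dataclass
    class Run:
        """One execution of an inspection task by the robot (hypothetical record)."""
        task: str            # e.g. "inspection-survey" or "inspection-answer"
        interventions: int   # times a human operator had to step in
        subtasks_total: int  # subtasks the robot attempted in this run

    def autonomy_score(run: Run) -> float:
        """Fraction of subtasks completed without human intervention.

        1.0 means fully autonomous operation; lower values mean the
        operator had to intervene more often relative to the task size.
        """
        if run.subtasks_total == 0:
            return 0.0
        return 1.0 - run.interventions / run.subtasks_total

    # Illustrative numbers only: the more complex inspection-answer task
    # can still score higher if interventions grow slower than task size.
    survey = Run(task="inspection-survey", interventions=3, subtasks_total=10)
    answer = Run(task="inspection-answer", interventions=4, subtasks_total=20)

    for run in (survey, answer):
        print(f"{run.task}: autonomy score {autonomy_score(run):.2f}")

Normalising by the number of subtasks keeps the comparison fair even though the inspection-answer task is larger than the original survey task.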

Results

Contact

  • Willeke van Vught, Deputy Research Manager and Project Manager - Human-Machine Teaming, TNO, e-mail: willeke.vanvught@tno.nl