Attentive Support


A simulation-based implementation of the attentive support robot introduced in the paper To Help or Not to Help: LLM-based Attentive Support for Human-Robot Group Interactions.
See the project website for an overview.

Setup

Python 3.8 - 3.11
Prerequisites for building the simulator workspace: g++, cmake, libxml2, Qt5, qwt, OpenSceneGraph, Bullet Physics (distribution-specific package lists below, with an install sketch after them)

<details>
<summary>Ubuntu 20</summary>
libxml2-dev, qt5-default, libqwt-qt5-dev, libopenscenegraph-dev, libbullet-dev, libasio-dev, libzmq3-dev, portaudio19-dev
</details>

<details>
<summary>Ubuntu 22</summary>
libxml2-dev, qtbase5-dev, qt5-qmake, libqwt-qt5-dev, libopenscenegraph-dev, libbullet-dev, libasio-dev, libzmq3-dev, portaudio19-dev
</details>

<details>
<summary>Fedora</summary>
cmake, gcc-c++, OpenSceneGraph-devel, libxml2, qwt-qt5-devel, bullet-devel, asio-devel, cppzmq-devel, python3-devel, portaudio
</details>
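For example, on Ubuntu 22 the prerequisites can be installed in one go. This is a sketch assuming apt and sudo access; the package names are taken from the Ubuntu 22 list above:

# Ubuntu 22: install the build prerequisites listed above
sudo apt install -y g++ cmake libxml2-dev qtbase5-dev qt5-qmake libqwt-qt5-dev \
  libopenscenegraph-dev libbullet-dev libasio-dev libzmq3-dev portaudio19-dev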

Clone this repo and change into it: git clone https://github.com/HRI-EU/AttentiveSupport.git && cd AttentiveSupport
You can either run the setup script (bash build.sh) or follow these steps:

  1. Get the submodules: git submodule update --init --recursive
  2. Create a build directory in the AttentiveSupport directory (mkdir -p build) and change into it (cd build)
  3. Install the Smile workspace: cmake ../src/Smile/ -DCMAKE_INSTALL_PREFIX=../install; make -j; make install
    Note that if you have the Smile workspace installed somewhere else, you have to adjust the relative path in config.yaml accordingly. For details, check here.
  4. Install the Python dependencies: python -m venv .venv && source .venv/bin/activate && pip install -r requirements.txt
  5. Make sure you have an OpenAI API key set up; see the official instructions and the sketch after this list
  6. Enjoy 🕹️
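A minimal sketch for step 5, assuming the key is provided as an environment variable (the variable name matches the container examples below):

# Make the OpenAI API key available to the agent; add this line to your shell profile to persist it
export OPENAI_API_KEY=replace_me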

Containerized Runtime
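The run commands below assume the image has already been built locally under the tag localhost/attentive_support. A minimal build sketch, inferring the Dockerfile at the repository root from the ssh-server variant further down:

# Build the image from the repository root (include proxy settings as necessary)
docker build -t localhost/attentive_support .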

podman:

podman run \
-e OPENAI_API_KEY=replace_me \
-e WAYLAND_DISPLAY \
--net=host \
-it \
localhost/attentive_support

docker (rootless):

docker run \
-e OPENAI_API_KEY=replace_me \
-v /tmp/.X11-unix:/tmp/.X11-unix \
-it \
localhost/attentive_support
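If no window appears with the rootless docker setup, the X server may be rejecting the container's connection. A hedged troubleshooting sketch; the xhost call and the DISPLAY pass-through are assumptions, not part of the documented commands:

# Allow local (non-network) clients to connect to the X server, then pass DISPLAY into the container
xhost +local:
docker run \
  -e OPENAI_API_KEY=replace_me \
  -e DISPLAY \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  -it \
  localhost/attentive_support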

remote, rootless with internal ssh server:
In certain scenarios it might not be possible to display the graphical window, e.g., when running docker rootless on a remote machine with X11. For these scenarios, the docker image can be built with an internal ssh server: docker build --build-arg WITH_SSH_SERVER=true -t localhost/attentive_support . (include proxy settings as necessary). Then start the image:

docker run \
  -it \
  -p 2022:22 \
  localhost/attentive_support

This starts an ssh server on port 2022 that can be accessed with username root and password hri: ssh -X root@localhost -p 2022. Run the example script:

export RCSVIEWER_SIMPLEGRAPHICS=True
export OPENAI_API_KEY=replace_me
/usr/bin/python -i /attentive_support/src/tool_agent.py

Usage

Running the agent
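A minimal sketch for starting the agent locally, assuming the setup above and that the virtual environment from the setup steps lives at the repository root; the interactive launch and the script path mirror the containerized example:

# From the repository root: activate the environment and start the agent in an interactive Python session
source .venv/bin/activate
export OPENAI_API_KEY=replace_me
python -i src/tool_agent.py

Whatever src/tool_agent.py defines is then available at the interactive Python prompt.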

Customizing the agent

Additional features

Example

Running the simulation with "Move the red glass to Felix":
[Animation: demo sequence]

To reproduce the situated interaction scenario, run the following: