
Open AR-Sandbox

There seems to be a problem with bokeh/panel and Python at the moment. These versions are known to work:

pip install panel==0.14.4 bokeh==2.4.3 jupyter-bokeh==2.0.4

Welcome to the Open AR-Sandbox repository. If you do not know what this is all about, have a look at this video:

The CGRE Sandbox in action

What is an AR-sandbox?

Python 3 License: LGPL v3


:warning: Warning! It is unfortunate that we have to state this here, but: downloading the software and presenting it somewhere as your own work is serious scientific fraud! And if you develop the content further, then please push these developments back to this repository - in the very interest of scientific development (and also a requirement of the license). For more details, please consult the information below and the license.

Introduction

Augmented reality sandboxes (AR-sandboxes) are a great tool for science outreach and teaching due to their intuitive, interaction-enhancing operation. Recently, AR-sandboxes have become increasingly popular as interactive exhibition pieces, teaching aids, and toys.

AR-sandboxes consist of a box of sand that can be freely sculpted by hand. The topography of the sand is constantly scanned with a depth camera and a computed image is projected back onto the sand surface, augmenting the sandbox with digital information.
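Conceptually, this is a simple closed loop: scan the sand, compute an image, project it back. A minimal Python sketch of that loop, with a synthetic depth frame standing in for the camera and a hypothetical color ramp (not part of the Open AR-Sandbox API):

```python
# Minimal sketch of the AR-sandbox loop. The depth "frame" is a synthetic
# 2D list standing in for a real depth-camera reading; elevation_to_color
# is a hypothetical color ramp, not part of the Open AR-Sandbox API.

def elevation_to_color(depth_mm):
    """Map a depth reading (mm from the sensor) to a coarse color band."""
    if depth_mm > 1100:      # far from the sensor -> low areas -> "water"
        return "blue"
    elif depth_mm > 1000:    # mid range -> "lowland"
        return "green"
    else:                    # close to the sensor -> sand piles -> "mountain"
        return "brown"

def render_frame(depth_frame):
    """One loop iteration: turn the scanned topography into a projected image."""
    return [[elevation_to_color(d) for d in row] for row in depth_frame]

# Synthetic 2x3 depth frame (mm): a sand pile on the right-hand side.
frame = [[1150, 1050, 950],
         [1150, 1050, 950]]
print(render_frame(frame)[0])  # ['blue', 'green', 'brown']
```

A real sandbox repeats this for every camera frame, so the projection follows the sand as it is sculpted.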

However, most of these common AR-sandboxes are limited to the visualization of topography with contour lines and colors, as well as water simulations on the digital terrain surface. The potential of AR-sandboxes for geoscience education, and especially for teaching structural geology, remains largely untapped.

For this reason, we have developed Open AR-Sandbox, an augmented reality sandbox designed specifically for use in geoscience education. In addition to the visualization of topography, it can display geologic subsurface information such as the outcropping lithology, creating a dynamic and interactive geological map. The relations of subsurface structures, topography, and outcrop can be explored in a playful and comprehensible way.

Features

Some of the modules already implemented include:

Check the video below for some of the features in action: Open AR Sandbox Features

The Open AR-Sandbox software, as well as GemPy, is under continuous development, and more modules are being added for wider outreach. Some of the features we are currently working on include:

License, use and attribution

If you use Open AR-Sandbox in a scientific abstract or publication, please include appropriate recognition of the original work. For the time being, please cite our publication in the journal Geosphere:

Florian Wellmann, Simon Virgo, Daniel Escallon, Miguel de la Varga, Alexander Jüstel, Florian M. Wagner, Julia Kowalski, Hu Zhao, Robin Fehling, Qian Chen; Open AR-Sandbox: A haptic interface for geoscience education and outreach. Geosphere 2022; doi: https://doi.org/10.1130/GES02455.1

Directly in BibTeX format:

@article{10.1130/GES02455.1,
    author = {Wellmann, Florian and Virgo, Simon and Escallon, Daniel and de la Varga, Miguel and Jüstel, Alexander and Wagner, Florian M. and Kowalski, Julia and Zhao, Hu and Fehling, Robin and Chen, Qian},
    title = "{Open AR-Sandbox: A haptic interface for geoscience education and outreach}",
    journal = {Geosphere},
    year = {2022},
    month = {02},
    issn = {1553-040X},
    doi = {10.1130/GES02455.1},
    url = {https://doi.org/10.1130/GES02455.1},
    eprint = {https://pubs.geoscienceworld.org/gsa/geosphere/article-pdf/doi/10.1130/GES02455.1/5541527/ges02455.pdf},
}

Feel free to download and use the Open AR-Sandbox software! We do not provide any warranty or guarantee for its use. We also do not provide professional support, but we aim to answer questions posted as Issues on the GitHub page as quickly as possible.

Open AR-Sandbox is published under a GNU Lesser General Public License v3.0, which means that you are free to use it in a wide variety of ways (even commercially), as long as you do not make any modifications. However, if you plan to modify and redistribute the code, you also have to make it available under the same license!

Also, if you make any modifications, especially for scientific and educational use, then please contribute them back to the main project in the form of a pull request, as is common practice in the open-source community. If you have questions about the procedure, feel free to contact us.

These are the main conditions for using this library:

For more details on the license, please see the provided license file.


Requirements

You will need:

Mount the Kinect and projector facing down vertically, centered above the box. The optimal distance depends on the size of your sandbox and the optics of the projector; in our experience, a distance of 150 cm is well suited for an 80 cm x 100 cm box. More details on how to set up the Kinect and projector can be found in the 1_calib_projector.ipynb and 2_calib_sensor.ipynb notebooks, and, if you want to use the ArUco markers, in 3_calib_arucos.ipynb.
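As a rough sanity check of that geometry, you can estimate the projected image width from the projector's throw ratio (mounting distance divided by image width). The ratio used below is an assumed example value, not a recommendation; substitute the value from your projector's data sheet:

```python
# Estimate the projected image width from mounting height and throw ratio.
# throw_ratio = distance / image_width. The value 1.5 is an assumed example,
# not a recommendation -- check your projector's data sheet.
def image_width_cm(distance_cm, throw_ratio):
    return distance_cm / throw_ratio

# At the suggested 150 cm mounting distance:
print(image_width_cm(150, 1.5))  # 100.0 -> covers a 100 cm wide box
```

If the computed width is smaller than your box, you need either a shorter throw ratio or a larger mounting distance.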

Installation

First of all, you will need a healthy Python 3 environment. We recommend using Anaconda. In addition to some standard Python packages, you will need a specific setup depending on the Kinect version you are using. In the following we provide detailed installation instructions.
Now download or clone the open_AR_Sandbox repository from GitHub and set up the environment:

  1. First, clone the repository:
git clone https://github.com/cgre-aachen/open_AR_Sandbox.git
  2. Enter the newly downloaded project folder:
cd open_AR_Sandbox
  3. Create a new anaconda environment:
conda create -n sandbox-env python=3.8
  4. Whenever you want to use the sandbox and the packages we are about to install, you will have to activate the environment first:
conda activate sandbox-env

Standard packages

To install all the standard packages please use the requirements.txt file:

pip install -r requirements.txt

[RECOMMENDED] You can also make a local installation of the sandbox using the setup.py file:

pip install -e . 

[ALTERNATIVELY] You can use our sandbox_environment.yml file to install all the dependencies, including the extensions, in one step. Beware that you still need to install the Kinect sensor drivers yourself, according to your operating system.

conda env create -f sandbox_environment.yml

Download sample data

You have the option to download some publicly shared files from our Open AR-Sandbox project shared folder. You will need to do this if you want to run the tests, use the landslide simulations, and/or get the trained models for the Landscape generation module.

In the terminal type:

python3 sandbox/utils/download_sample_datasets.py

and follow the instructions in the terminal to download the specific files you need. We use Pooch to fetch our data files and store them locally on your computer in their respective folders. Running this code a second time will not trigger a download, since the files already exist.
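The caching behaviour is easy to reason about: before fetching, Pooch checks whether the target file already exists locally. A stdlib-only sketch of that idea (not Pooch's actual implementation, which additionally verifies file hashes):

```python
# Download-once caching, sketched with the standard library. Pooch's real
# fetcher also verifies a known hash before trusting the cached copy.
import os
import tempfile

def fetch(path, download):
    """Run `download(path)` only if the file is not cached yet."""
    if os.path.exists(path):
        return path, False          # cache hit: no download triggered
    download(path)                  # cache miss: fetch and store
    return path, True

# Simulate two runs; the "download" just writes a placeholder file.
target = os.path.join(tempfile.mkdtemp(), "sample.dat")
def write(p):
    with open(p, "w") as f:
        f.write("data")

_, first = fetch(target, write)     # first run downloads
_, second = fetch(target, write)    # second run reuses the local copy
print(first, second)
```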

You can also open the Jupyter notebook 'Download_datasets.ipynb' and follow the commands there.

Kinect Installation

For Windows

Kinect v1 - Future

There is still no support for Kinect v1...

Kinect V2 - PyKinect2

(Tested on Windows 10.) First, install the current Kinect SDK, including drivers. You can use the software bundle to test the connection to your Kinect before you continue.

To make Python and the Kinect SDK communicate, install the related PyKinect2 wrappers:

pip install pykinect2

Unfortunately, the configuration of PyKinect2 needs to be adjusted to work on a 64-bit system. Therefore, edit the Lib/site-packages/pykinect2/PyKinectV2.py file, go to line 2216, and comment it out:

# assert sizeof(tagSTATSTG) == 72, sizeof(tagSTATSTG)

Add the following lines below:

import numpy.distutils.system_info as sysinfo
required_size = 64 + sysinfo.platform_bits // 4
assert sizeof(tagSTATSTG) == required_size, sizeof(tagSTATSTG)
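The patch works because tagSTATSTG is larger on a 64-bit platform, where pointer-sized fields widen: the expected size works out to 80 bytes on 64-bit Python and 72 bytes on 32-bit. A quick stdlib check of the arithmetic, independent of PyKinect2:

```python
# Derive the platform word size and the expected tagSTATSTG size,
# mirroring the arithmetic of the patch above.
import struct

# Size of a pointer in bits: 64 on a 64-bit Python, 32 on a 32-bit one.
platform_bits = struct.calcsize("P") * 8

# The patched assertion expects 64 bytes plus 4 bytes per 16 platform bits.
required_size = 64 + platform_bits // 4
print(required_size)  # 80 on 64-bit Python, 72 on 32-bit
```

This is why the hard-coded `assert sizeof(tagSTATSTG) == 72` fails on 64-bit systems.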

For Linux

Kinect v1 - libfreenect

To make Open AR-Sandbox talk to the first-generation Kinect you will need the libfreenect drivers with Python wrappers. The installation is fairly straightforward on Linux and MacOS but challenging on Windows (in fact: if you pull it off, let us know how you did it!). The steps can be summarized as follows; if you run into installation problems, refer to the libfreenect documentation. To build libfreenect, you'll need:

Once these are installed, run the following commands:

sudo apt-get install git cmake build-essential libusb-1.0-0-dev
sudo apt-get install freeglut3-dev libxmu-dev libxi-dev
git clone https://github.com/OpenKinect/libfreenect
cd libfreenect
mkdir build
cd build
cmake -L .. # -L lists all the project options
cmake .. -DBUILD_PYTHON3=ON
make 


cd ../wrappers/python
python setup.py install
# now you can check if the installation worked by running an example
python demo_cv2_async.py

Kinect v2 - freenect2 or pylibfreenect2

For this we are going to use freenect2, a Python interface for the library libfreenect2.

git clone https://github.com/OpenKinect/libfreenect2.git
cd libfreenect2
sudo apt-get install build-essential cmake pkg-config
sudo apt-get install libusb-1.0-0-dev libturbojpeg0-dev libglfw3-dev
mkdir build && cd build
cmake .. -DENABLE_CXX11=ON -DENABLE_OPENCL=ON -DENABLE_OPENGL=ON -DBUILD_OPENNI2_DRIVER=ON -DCMAKE_INSTALL_PREFIX=$HOME/freenect2 -DCMAKE_VERBOSE_MAKEFILE=ON
make
make install
sudo cp ../platform/linux/udev/90-kinect2.rules /etc/udev/rules.d/

Now unplug and replug the Kinect sensor, and test the connection by running:

./bin/Protonect

To install the Python wrapper, first point pkg-config to your libfreenect2 installation:

export PKG_CONFIG_PATH=$HOME/freenect2/lib/pkgconfig

NOTE: if you installed freenect2 in another location, set the variable to the corresponding path. Then install the wrapper:

pip install freenect2

IMPORTANT: Up to this point, everything will work in a Python session started from the terminal. However, if you start Python from another source, the error ImportError: libfreenect2.so.0.2: cannot open shared object file: No such file or directory will appear every time you import the package. To fix this, export the variables again, or, for a more permanent solution, open the .bashrc file and paste the following at the end:

# set PATH to freenect2 to be imported in python
export PKG_CONFIG_PATH=$HOME/freenect2/lib/pkgconfig

Alternatively, copy the shared libraries directly into the lib folder of your conda environment (<your_path>/anaconda3/envs/<sandbox-env>/lib):

sudo cp $HOME/freenect2/lib/libfreenect2{.so,.so.0.2,.so.0.2.0} $HOME/anaconda3/envs/sandbox-env/lib/

LiDAR L515 Installation

Installing in Windows

First, go to the latest release page on GitHub and download and execute the file:

Intel.RealSense.Viewer.exe

Follow the instructions for the installation and update the firmware of your sensor. You should be able to use and see the depth and RGB image.

Installing in Linux

Detailed installation steps can be found in the Linux installation guide. The steps are as follows:

Reconnect the Intel RealSense depth camera and run realsense-viewer to verify the installation.

Running with python

After the sensor is installed on your platform, the Python wrapper can be easily installed via:

pip install pyrealsense2

If you have any problems with the installation, refer to the Intel RealSense Python installation instructions.

External Packages

GemPy

To use implicit geological models inside the sandbox, go to GemPy, clone or download the repository, and follow the GemPy installation instructions. With GemPy installed, you can follow the GempyModule tutorial.

pip install gempy

If using Windows, you will need to install Theano separately, as instructed here:

conda install mingw libpython m2w64-toolchain
conda install theano
pip install theano --force-reinstall

Optional: GemPy will print some output each time a frame is calculated, which can fill up the console. To suppress this, go to your GemPy installation and comment out line 381 in gempy/core/model.py:

# print(f'Active grids: {self._grid.grid_types[self._grid.active_grids]}')
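If you prefer not to edit the installed package, you can also silence the output around individual calls with the standard library. This is a generic Python technique, not something specific to GemPy; noisy_compute below is a stand-in for any chatty library call:

```python
# Suppress stdout around a single call instead of patching the library.
import io
from contextlib import redirect_stdout

def noisy_compute():
    # Stand-in for a library call that prints status messages.
    print("Active grids: ...")
    return 42

# Anything written to stdout during the call is swallowed.
with redirect_stdout(io.StringIO()):
    result = noisy_compute()

print(result)  # 42, with the status message suppressed
```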

Devito

This package uses the power of Devito to run wave propagation simulations. More about this can be found in notebooks/tutorials/10_SeismicModule/. Follow the Devito installation instructions.

pip install --user git+https://github.com/devitocodes/devito.git

PyGimli

This library is a powerful tool for geophysical inversion and modelling. Some examples can be found in notebooks/tutorials/11_Geophysics/. PyGimli can be installed following the installation instructions here.

We recommend creating a new environment in which PyGimli is already installed, and then installing the sandbox dependencies on top of it:

conda create -n sandbox-env -c gimli -c conda-forge pygimli=1.1.0

PyTorch

To use the LandscapeGeneration module, we need to install PyTorch. This module uses CycleGAN to take a topography from the sandbox, translate it into a DEM, and then display it on the sandbox again as a landscape image. To install the dependencies for this module, do:

#For Windows
pip install torch===1.6.0 torchvision===0.7.0 -f https://download.pytorch.org/whl/torch_stable.html
#For Linux
pip install torch torchvision
git clone https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix
cd pytorch-CycleGAN-and-pix2pix
pip install -r requirements.txt

Once this is installed, copy the trained model into the /notebooks/tutorials/09_LandscapeGeneration/checkpoints folder and then follow the notebook. Get in contact with us so we can provide you with the trained model for this module.

Pynoddy

To use Pynoddy, please follow the installation instructions. We recommend installing Noddy from source files.

Project Development

Open AR-Sandbox is developed at the research unit Computational Geoscience and Reservoir Engineering (CGRE) at RWTH Aachen University, Germany.

CGRE

Project Lead

Prof. Florian Wellmann, PhD

Maintainers (also external to CGRE)

Obtaining a full system

If you are interested in buying a fully operating set-up including appropriate hardware, pre-installed software, and set-up and maintenance, please contact Terranigma Solutions GmbH.