
<a href="https://github.com/aestream/aestream"><img src="https://github.com/aestream/aestream/raw/main/logo.png" /></a>

<p align="center"> <a href="https://github.com/aestream/aestream/actions"> <img src="https://github.com/aestream/aestream/workflows/Build%20and%20test/badge.svg" alt="Test status"></a> <a href="https://pypi.org/project/aestream/" alt="PyPi"> <img src="https://img.shields.io/pypi/dm/aestream" /> </a> <a href="https://github.com/aestream/aestream/pulse" alt="Activity"> <img src="https://img.shields.io/github/last-commit/aestream/aestream" /> </a> <a href="https://discord.gg/7fGN359"> <img src="https://img.shields.io/discord/723215296399147089" alt="chat on Discord"></a> <a href="https://doi.org/10.1145/3584954.3584997"><img src="https://zenodo.org/badge/DOI/10.1145/3584954.3584997.svg" alt="DOI"></a> </p>

AEStream sends event-based data from A to B. AEStream is both a command-line tool and a C++/Python library with built-in GPU acceleration for use with PyTorch and Jax. We support reading and writing from files, event cameras, network protocols, and visualization tools.

<img src="https://github.com/aestream/aestream/raw/main/docs/aestream_flow.png" />

Read more about the inner workings of the library in the AEStream publication.

Installation

Read more in our installation guide

The fastest way to install AEStream is by using pip: pip install aestream.

| Source | Installation | Description |
| --- | --- | --- |
| pip | <code>pip install aestream</code> <br/> <code>pip install aestream --no-binary aestream</code> | Standard installation <br/> Support for <a href="https://aestream.github.io/aestream/install.html#Event-camera-support">event cameras</a> and CUDA kernels (more info) |
| nix | <code>nix run github:aestream/aestream</code> <br/> <code>nix develop github:aestream/aestream</code> | Command-line interface <br/> Python environment |
| docker | See <a href="https://aestream.github.io/aestream/install.html">Installation documentation</a> | |

Contributions to support AEStream on additional platforms are always welcome.

Usage (Python): Load event files

Read more in our Python usage guide

AEStream can process .csv, .dat, .evt3, and .aedat4 files. You can either load a file directly into memory

from aestream import FileInput

FileInput("file.aedat4", (640, 480)).load()

or stream the file in real time to PyTorch, Jax, or NumPy

from aestream import FileInput

with FileInput("file.aedat4", (640, 480)) as stream:
    while True:
        frame = stream.read("torch")  # or "jax" or "numpy"
        ...

Usage (Python): Stream data from camera or network

Streaming data is particularly useful in real-time scenarios. We currently support Inivation, Prophesee, and SynSense devices over USB, as well as the SPIF protocol over UDP. Note: requires local installation of drivers and/or SDKs (see installation guide).

from aestream import UDPInput, USBInput

# Stream events from a DVS camera over USB
with USBInput((640, 480)) as stream:
    while True:
        frame = stream.read()  # a (640, 480) NumPy tensor
        ...

# Stream events from UDP port 3333 (the default)
with UDPInput((640, 480), port=3333) as stream:
    while True:
        frame = stream.read("torch")  # a (640, 480) PyTorch tensor
        ...

More examples can be found in our example folder. Note that the examples may require additional dependencies, such as Norse for spiking networks or PySDL for rendering. To install all requirements, run pip install -r example/requirements.txt from the aestream root directory.

Example: real-time edge detection with spiking neural networks

We stream events from a camera connected via USB and process them on a GPU in real time using the spiking neural network library Norse, in fewer than 50 lines of Python. The left panel in the video shows the raw signal, while the middle and right panels show horizontal and vertical edge detection, respectively. The full example can be found in example/usb_edgedetection.py.
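The spiking network in that example effectively applies oriented filters to the incoming event frames. As a rough, non-spiking illustration of the same idea, finite differences along each image axis act as crude edge detectors (the kernel choice, frame shape, and synthetic input below are ours, not taken from the example):

```python
import numpy as np

def edge_responses(frame):
    """Finite-difference edge maps along each image axis: a crude,
    non-spiking stand-in for the oriented filters in the example."""
    f = frame.astype(np.float32)
    along_axis0 = np.abs(np.diff(f, axis=0))  # responds to edges crossing axis 0
    along_axis1 = np.abs(np.diff(f, axis=1))  # responds to edges crossing axis 1
    return along_axis0, along_axis1

# A synthetic (640, 480) event-count frame containing a line of events
frame = np.zeros((640, 480), dtype=np.int32)
frame[300, 100:200] = 5  # line running along axis 1
d0, d1 = edge_responses(frame)
print(d0.sum() > d1.sum())  # True: the line's long edges dominate the axis-0 map
```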

Usage (CLI)

Read more in our CLI usage documentation page

Installing AEStream also gives access to the command-line interface (CLI) aestream. To use aestream, simply provide an input source and an optional output sink (defaulting to STDOUT):

aestream input <input source> [output <output sink>]

Supported Inputs and Outputs

| Input | Description | Example usage |
| --- | --- | --- |
| DAVIS, DVXPlorer | Inivation DVS camera over USB | input inivation |
| EVK cameras | Prophesee DVS camera over USB | input prophesee |
| File | Reads .aedat, .aedat4, .csv, .dat, or .raw files | input file x.aedat4 |
| SynSense Speck | Stream events via ZMQ | input speck |
| UDP network | Receives a stream of events via the SPIF protocol | input udp |
| Output | Description | Example usage |
| --- | --- | --- |
| STDOUT | Standard output (default) | output stdout |
| Ethernet over UDP | Outputs to a given IP and port using the SPIF protocol | output udp 10.0.0.1 1234 |
| File: .aedat4 | Output to .aedat4 format | output file my_file.aedat4 |
| File: .csv | Output to comma-separated values (CSV) format | output file my_file.csv |
| Viewer | View live event stream | output view |
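The CSV output is a plain text event list, which makes it convenient to post-process. A sketch of reading such a file back with NumPy, assuming one event per line with timestamp, x, y, and polarity columns (this column layout is an assumption; verify it against your own output):

```python
import io
import numpy as np

# Stand-in for a file produced by `output file my_file.csv`;
# the column order here is an assumption, not a guarantee.
csv_events = io.StringIO("1000,10,20,1\n1005,11,20,0\n1010,12,21,1\n")

events = np.loadtxt(csv_events, delimiter=",", dtype=np.int64)
timestamps, xs, ys, polarities = events.T
print(len(timestamps), timestamps[-1] - timestamps[0])  # 3 10
```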

CLI examples

| Example | Syntax |
| --- | --- |
| View live stream of Inivation camera (requires Inivation drivers) | aestream input inivation output view |
| Stream Prophesee camera over the network to 10.0.0.1 (requires Metavision SDK) | aestream input prophesee output udp 10.0.0.1 |
| Convert .dat file to .aedat4 | aestream input file example/sample.dat output file converted.aedat4 |

Acknowledgments

AEStream is developed by the contributors shown below.

The work has received funding from the EC Horizon 2020 Framework Programme under Grant Agreements 785907 and 945539 (HBP) and from the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy EXC 2181/1 - 390900948 (the Heidelberg STRUCTURES Excellence Cluster).

Thanks to Philipp Mondorf for interfacing with Metavision SDK and preliminary network code.

<a href="https://github.com/aestream/aestream/graphs/contributors"> <img src="https://contrib.rocks/image?repo=aestream/aestream" /> </a>

Citation

Please cite AEStream if you use it in your work:

@inproceedings{10.1145/3584954.3584997,
    author = {Pedersen, Jens Egholm and Conradt, Jörg},
    title = {AEStream: Accelerated event-based processing with coroutines},
    year = {2023},
    isbn = {9781450399470},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    url = {https://doi.org/10.1145/3584954.3584997},
    doi = {10.1145/3584954.3584997},
    booktitle = {Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference},
    pages = {86–91},
    numpages = {6},
    keywords = {coroutines, event-based vision, graphical processing unit, neuromorphic computing},
    location = {San Antonio, TX, USA},
    series = {NICE '23}
}