TensorStream

TensorStream is a C++ library for decoding real-time video streams (e.g., RTMP) to CUDA memory. On top of decoding it provides additional features such as color conversion, resizing, cropping, normalization, and conversion of the result to PyTorch tensors.

The library supports both Linux and Windows.

A simple example of how to use TensorStream for deep learning tasks:

from tensor_stream import TensorStreamConverter, FourCC, Planes

reader = TensorStreamConverter("rtmp://127.0.0.1/live", cuda_device=0)
reader.initialize()
reader.start()

while need_predictions:
    # read the latest available frame from the stream
    tensor = reader.read(pixel_format=FourCC.BGR24,
                         width=256,                 # resize to 256x256 px
                         height=256,
                         normalization=True,        # normalize to range [0, 1]
                         planes_pos=Planes.PLANAR)  # dimension order [C, H, W]

    # tensor dtype is torch.float32, device is 'cuda:0', shape is (3, 256, 256)
    prediction = model(tensor.unsqueeze(0))

Note: All processing inside TensorStream is performed on the GPU, so the output tensor is also located in GPU memory.
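
If a frame needs to be inspected on the host (for example, to save or display it), it can be copied back from GPU memory with standard PyTorch calls. The snippet below is a minimal sketch that continues the example above; the host-side conversion is illustrative and not part of TensorStream itself.

# 'tensor' is the (3, 256, 256) float32 CUDA tensor returned by reader.read()
frame_chw = tensor.cpu()                 # copy from GPU to host memory
frame_hwc = frame_chw.permute(1, 2, 0)   # [C, H, W] -> [H, W, C] for image tools
frame_np = frame_hwc.numpy()             # NumPy array with values in [0, 1] (normalization=True)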

Install TensorStream

Dependencies

It is convenient to use TensorStream in Docker containers. Dockerfiles are provided to build an image with all the necessary dependencies.

Installation from source

TensorStream source code

git clone -b master --single-branch https://github.com/osai-ai/tensor-stream.git
cd tensor-stream

C++ extension for Python

On Linux:

python setup.py install

On Windows:

set FFMPEG_PATH="Path to FFmpeg install folder"
set path=%path%;%FFMPEG_PATH%\bin
python setup.py install
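
After installation, a quick smoke test is to check that the extension imports; the module name tensor_stream matches the import used in the example above.

python -c "from tensor_stream import TensorStreamConverter; print('TensorStream OK')"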

C++ library

On Linux:

mkdir build
cd build
cmake ..

On Windows:

set FFMPEG_PATH="Path to FFmpeg install folder"
mkdir build
cd build
cmake ..
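
On both platforms the commands above only configure the project; the actual build can then be driven through CMake's generic build command (shown here as a typical follow-up step, not a TensorStream-specific requirement):

cmake --build . --config Release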

Building examples and tests

Examples for Python and C++ can be found in the c_examples and python_examples folders. Tests for C++ can be found in the tests folder.

Python example

Can be executed via Python after the TensorStream C++ extension for Python has been installed.

cd python_examples
python simple.py

C++ example and unit tests

On Linux:

cd c_examples  # tests
mkdir build
cd build
cmake -DCMAKE_PREFIX_PATH=$PWD/../../cmake ..

On Windows:

set FFMPEG_PATH="Path to FFmpeg install folder"
cd c_examples or tests
mkdir build
cd build
cmake -DCMAKE_PREFIX_PATH=%cd%\..\..\cmake ..

Docker image

To build the TensorStream Docker image, the PyTorch version needs to be passed via the TORCH_VERSION build argument:

docker build --build-arg TORCH_VERSION=2.0 -t tensorstream .

Run the container with a bash command line and follow the installation guide:

docker run --gpus=all -ti tensorstream bash

Usage

Python examples

  1. The simple example demonstrates RTMP to PyTorch tensor conversion. Let's consider some usage scenarios (a sketch of how these scenarios map onto the Python API follows this list):

Note: You can pass --help to get the list of all available options, their descriptions and default values.

python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -o dump

Warning: Dumps significantly affect performance. The suffix .yuv will be added to the output filename.

python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -w 720 -h 480 --resize_type NEAREST -o dump

Note: Besides the nearest resize algorithm, bilinear, bicubic and area (similar to OpenCV's INTER_AREA) algorithms are available.

Warning: Resize algorithms are applied to NV12 frames, so bit-to-bit (b2b) equality with popular frameworks, which perform resizing in formats other than NV12, isn't guaranteed.

python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -w 720 -h 480 -o dump -n 100
python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -w 720 -h 480 --crop 0,0,320,240 -o dump -n 100

Warning: Crop is applied before the resize algorithm.

python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -w 720 -h 480 -o dump -n 100 --normalize True
python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -w 720 -h 480 -o dump -n 100 --planes MERGED
python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -w 720 -h 480 -o dump -n 100 --planes MERGED --buffer_size 5

Warning: The buffer size should be less than or equal to the decoded picture buffer (DPB) size.

python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -w 720 -h 480 -o dump -n 100 --planes MERGED --cuda_device 0
python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -w 720 -h 480 -o dump -n 100 --planes MERGED --framerate_mode NATIVE
python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -w 720 -h 480 -o dump -n 100 --planes MERGED --skip_analyze
python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -w 720 -h 480 -o dump -n 100 --planes MERGED --timeout 2
python simple.py -i tests/resources/bunny.mp4 -fc RGB24 -w 720 -h 480 -o dump -n 100 --planes MERGED -v HIGH -vd CONSOLE --nvtx
  2. The many_consumers example demonstrates how to use TensorStream when a stream has several consumers:
python many_consumers.py -i tests/resources/bunny.mp4 -n 100
  3. The different_streams example demonstrates how to use TensorStream when several streams should be handled simultaneously:
python different_streams.py -i1 <path-to-first-stream> -i2 <path-to-second-stream> -n1 100 -n2 50 -v1 LOW -v2 HIGH --cuda_device1 0 --cuda_device2 1

Warning: The default path to the second stream is relative, so different_streams.py needs to be run from its parent folder if no arguments are passed.
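
For item 1 above, the same scenarios can also be expressed directly through the Python API shown in the first example. The snippet below is a rough sketch of how the main CLI flags map onto TensorStreamConverter and read(); parameters for crop, buffer size, framerate mode, etc. are not shown here and should be taken from --help and the class documentation.

from tensor_stream import TensorStreamConverter, FourCC, Planes

# One converter per stream; cuda_device selects the GPU, mirroring --cuda_device above.
reader = TensorStreamConverter("tests/resources/bunny.mp4", cuda_device=0)
reader.initialize()
reader.start()

for _ in range(100):                                  # roughly corresponds to -n 100
    tensor = reader.read(pixel_format=FourCC.RGB24,   # -fc RGB24
                         width=720,                   # -w 720
                         height=480,                  # -h 480
                         normalization=True,          # --normalize True
                         planes_pos=Planes.MERGED)    # --planes MERGED
    # dtype is float32 and the tensor stays on the selected GPU;
    # with merged planes the dimension order is expected to be [H, W, C]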

PyTorch example

Real-time video style transfer example: fast-neural-style.

Documentation

Documentation is generated with Doxygen and can be built manually.
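
A typical manual build, assuming a Doxygen configuration file named Doxyfile exists in the repository (the file name and location are assumptions here, not confirmed by this page):

doxygen Doxyfile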

License

TensorStream is LGPL-2.1 licensed, see the LICENSE file for details.

Used materials in samples

Big Buck Bunny is licensed under the Creative Commons Attribution 3.0 license. (c) copyright 2008, Blender Foundation / www.bigbuckbunny.org