# ONNX Runtime PHP
:fire: ONNX Runtime - the high-performance scoring engine for ML models - for PHP
Check out an example
## Installation
Run:

```sh
composer require ankane/onnxruntime
```
Add scripts to `composer.json` to download the shared library:

```json
"scripts": {
    "post-install-cmd": "OnnxRuntime\\Vendor::check",
    "post-update-cmd": "OnnxRuntime\\Vendor::check"
}
```
And run:

```sh
composer install
```
## Getting Started
Load a model and make predictions

```php
$model = new OnnxRuntime\Model('model.onnx');
$model->predict(['x' => [1, 2, 3]]);
```
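The result maps each output name to its values. A minimal sketch of inspecting it (`y` is a hypothetical output name; check `$model->outputs()` for the real ones):

```php
$result = $model->predict(['x' => [1, 2, 3]]);
print_r(array_keys($result)); // output names, e.g. ['y']
```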
Download pre-trained models from the ONNX Model Zoo
Get inputs

```php
$model->inputs();
```
Get outputs

```php
$model->outputs();
```
Get metadata

```php
$model->metadata();
```
Load a model from a stream

```php
$stream = fopen('model.onnx', 'rb');
$model = new OnnxRuntime\Model($stream);
```
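A stream doesn't have to be backed by a file. Here's a sketch that loads a model from bytes already in memory (assuming `$bytes` holds the raw contents of an ONNX file):

```php
// write the in-memory model bytes to a temporary stream and rewind it
$stream = fopen('php://memory', 'r+b');
fwrite($stream, $bytes); // $bytes is assumed to hold the ONNX file contents
rewind($stream);
$model = new OnnxRuntime\Model($stream);
```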
Get specific outputs

```php
$model->predict(['x' => [1, 2, 3]], outputNames: ['label']);
```
## Session Options

```php
use OnnxRuntime\ExecutionMode;
use OnnxRuntime\GraphOptimizationLevel;

new OnnxRuntime\Model(
    $path,
    enableCpuMemArena: true,
    enableMemPattern: true,
    enableProfiling: false,
    executionMode: ExecutionMode::Sequential, // or Parallel
    freeDimensionOverridesByDenotation: null,
    freeDimensionOverridesByName: null,
    graphOptimizationLevel: GraphOptimizationLevel::None, // or Basic, Extended, All
    interOpNumThreads: null,
    intraOpNumThreads: null,
    logSeverityLevel: 2,
    logVerbosityLevel: 0,
    logid: 'tag',
    optimizedModelFilepath: null,
    profileFilePrefix: 'onnxruntime_profile_',
    sessionConfigEntries: null
);
```
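For example, to keep inference on a single thread (often sensible when a web server already runs many PHP workers in parallel; a sketch, not a universal recommendation):

```php
$model = new OnnxRuntime\Model(
    'model.onnx',
    intraOpNumThreads: 1, // threads used within an operator
    interOpNumThreads: 1  // threads used across operators
);
```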
## Run Options

```php
$model->predict(
    $inputFeed,
    outputNames: null,
    logSeverityLevel: 2,
    logVerbosityLevel: 0,
    logid: 'tag',
    terminate: false
);
```
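For example, to fetch a single output and raise the per-run log threshold (this assumes the model actually has an output named `label`):

```php
$output = $model->predict($inputFeed, outputNames: ['label'], logSeverityLevel: 3);
```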
## Inference Session API

You can also use the Inference Session API, which follows the Python API.

```php
$session = new OnnxRuntime\InferenceSession('model.onnx');
$session->run(null, ['x' => [1, 2, 3]]);
```
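As in Python, the first argument to `run` is the list of output names to fetch, with `null` meaning all outputs. A sketch assuming an output named `label`:

```php
$session->run(['label'], ['x' => [1, 2, 3]]);
```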
The Python example models are included as well.

```php
OnnxRuntime\Datasets::example('sigmoid.onnx');
```
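Assuming the helper returns the path to the bundled file, it can be passed straight to a model:

```php
// 'sigmoid.onnx' ships with the library; Datasets::example is assumed to return its path
$model = new OnnxRuntime\Model(OnnxRuntime\Datasets::example('sigmoid.onnx'));
```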
## GPU Support
### Linux and Windows

Download the appropriate GPU release and set:

```php
OnnxRuntime\FFI::$lib = 'path/to/lib/libonnxruntime.so'; // onnxruntime.dll for Windows
```

and use:

```php
$model = new OnnxRuntime\Model('model.onnx', providers: ['CUDAExecutionProvider']);
```
### Mac

Use:

```php
$model = new OnnxRuntime\Model('model.onnx', providers: ['CoreMLExecutionProvider']);
```
## History

View the changelog
## Contributing

Everyone is encouraged to help improve this project. Here are a few ways you can help:
- Report bugs
- Fix bugs and submit pull requests
- Write, clarify, or fix documentation
- Suggest or add new features
To get started with development:

```sh
git clone https://github.com/ankane/onnxruntime-php.git
cd onnxruntime-php
composer install
composer test
```