mnn-llm


Example Projects

Model Export and Download

To export LLM models to ONNX or MNN format, use llm-export.
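As a sketch of what an export run may look like, assuming llm-export's `llmexport` CLI with `--path` and `--export` flags (verify the exact flags against the llm-export README):

```shell
# Hypothetical llm-export invocation -- flag names are assumptions.
# Export a downloaded HuggingFace-style model directory to ONNX:
llmexport --path Qwen2-1.5B-Instruct --export onnx
# Or export directly to MNN:
llmexport --path Qwen2-1.5B-Instruct --export mnn
```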

ModelScope model downloads:

<details> <summary>Qwen series</summary> </details> <details> <summary>GLM series</summary> </details> <details> <summary>LLaMA series</summary> </details> <details> <summary>Others</summary> </details>
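As an example of pulling one of the converted models, ModelScope repositories can typically be cloned with git; the repository path below is an assumed example — substitute the model you actually need:

```shell
# Example ModelScope download via git (repo path is an assumption):
git clone https://www.modelscope.cn/zhaode/Qwen2-1.5B-Instruct-MNN.git
```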

Build


Local Build

# clone
git clone --recurse-submodules https://github.com/wangzhaode/mnn-llm.git
cd mnn-llm

# linux
./script/build.sh

# macos
./script/build.sh

# windows msvc
./script/build.ps1

# python wheel
./script/py_build.sh

# android
./script/android_build.sh

# android apk
./script/android_app_build.sh

# ios
./script/ios_build.sh

Build macros:

By default, the CPU backend is used and multimodal capabilities are disabled. To enable another backend or capability, add the corresponding MNN build macro to the script that builds MNN.
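For instance, enabling a GPU backend would mean passing the matching MNN cmake option when the script configures MNN; the macro names below (MNN_OPENCL, MNN_CUDA) follow MNN's documented cmake options, but check them against the MNN docs for your version:

```shell
# Sketch: add MNN build macros to the cmake line inside script/build.sh,
# e.g. to enable the OpenCL backend instead of CPU-only:
cmake .. -DMNN_OPENCL=ON
make -j4
```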

Run

# linux/macos
./cli_demo ./Qwen2-1.5B-Instruct-MNN/config.json # cli demo
./web_demo ./Qwen2-1.5B-Instruct-MNN/config.json ../web # web ui demo

# windows
.\Debug\cli_demo.exe ./Qwen2-1.5B-Instruct-MNN/config.json
.\Debug\web_demo.exe ./Qwen2-1.5B-Instruct-MNN/config.json ../web

# android
adb push libs/*.so build/libllm.so build/cli_demo /data/local/tmp
adb push model_dir /data/local/tmp
adb shell "cd /data/local/tmp && export LD_LIBRARY_PATH=. && ./cli_demo ./Qwen2-1.5B-Instruct-MNN/config.json"
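Each demo above is pointed at a config.json describing the exported model. A minimal sketch of such a file, with field names assumed from MNN's LLM config conventions (verify against the config.json shipped with your exported model):

```json
{
  "llm_model": "llm.mnn",
  "llm_weight": "llm.mnn.weight",
  "backend_type": "cpu",
  "thread_num": 4,
  "precision": "low",
  "memory": "low"
}
```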
