Update build from source instructions #21468

Merged · 5 commits · Sep 11, 2024
166 changes: 58 additions & 108 deletions docs/genai/howto/build-from-source.md
## Pre-requisites

- `cmake`
- `.NET 6` (if building C#)

## Clone the onnxruntime-genai repo

```bash
git clone https://github.com/microsoft/onnxruntime-genai
cd onnxruntime-genai
```

## Download ONNX Runtime binaries

By default, the onnxruntime-genai build expects to find the ONNX Runtime headers and binaries in a folder called `ort` in the root directory of onnxruntime-genai. You can put the ONNX Runtime files in a different location and pass that location to the onnxruntime-genai build via the `--ort_home` command line argument.
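For orientation, the default layout can be sketched like this (a minimal sketch; the actual header and library files come from the ONNX Runtime package you download below):

```bash
# Default layout the onnxruntime-genai build looks for:
#   onnxruntime-genai/
#     ort/
#       include/   # ONNX Runtime headers, e.g. onnxruntime_c_api.h
#       lib/       # ONNX Runtime shared libraries
mkdir -p ort/include ort/lib
ls ort
```

If you keep the files elsewhere, pass that directory to the build via `--ort_home` instead.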


These instructions assume you are in the `onnxruntime-genai` folder.

#### Windows

These instructions use `win-x64`. Replace this if you are using a different architecture.

```bash
curl -L https://github.com/microsoft/onnxruntime/releases/download/v1.19.2/onnxruntime-win-x64-1.19.2.zip -o onnxruntime-win-x64-1.19.2.zip
tar xvf onnxruntime-win-x64-1.19.2.zip
move onnxruntime-win-x64-1.19.2 ort
```

#### Linux and Mac

These instructions use `linux-x64-gpu`. Replace this if you are using a different architecture.

```bash
curl -L https://github.com/microsoft/onnxruntime/releases/download/v1.19.2/onnxruntime-linux-x64-gpu-1.19.2.tgz -o onnxruntime-linux-x64-gpu-1.19.2.tgz
tar xvzf onnxruntime-linux-x64-gpu-1.19.2.tgz
mv onnxruntime-linux-x64-gpu-1.19.2 ort
```
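The release archives follow a predictable naming scheme, so the download URL can be composed from a version and a platform string. This is a sketch, assuming the `vX.Y.Z/onnxruntime-<platform>-X.Y.Z` convention holds for your target; check the releases page for the exact artifact name:

```bash
# Compose the release download URL from a version and a platform string.
ORT_VERSION=1.19.2
ORT_PLATFORM=linux-x64-gpu   # e.g. win-x64, osx-arm64, linux-aarch64
ORT_URL="https://github.com/microsoft/onnxruntime/releases/download/v${ORT_VERSION}/onnxruntime-${ORT_PLATFORM}-${ORT_VERSION}.tgz"
echo "$ORT_URL"
```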

#### Android

If you do not already have an `ort` folder, create one.

```bash
mkdir ort
```

```bash
curl -L https://repo1.maven.org/maven2/com/microsoft/onnxruntime/onnxruntime-android/1.19.2/onnxruntime-android-1.19.2.aar -o ort/onnxruntime-android-1.19.2.aar
cd ort
tar xvf onnxruntime-android-1.19.2.aar
cd ..
```

## Build the generate() API

This step assumes that you are in the root of the onnxruntime-genai repo, and that you have followed the previous steps to place the ONNX Runtime headers and binaries in the folder specified by `--ort_home`, which defaults to `onnxruntime-genai/ort`.

#### Build ONNX Runtime on Linux
All of the build commands below have a `--config` argument, which takes the following options:
- `Release` builds release binaries
- `Debug` builds binaries with debug symbols
- `RelWithDebInfo` builds release binaries with debug info
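Since a typo in the config value only surfaces once the build is underway, it can help to validate it first. A small illustrative sketch (the `CONFIG` variable and the guard are not part of the repo's scripts):

```bash
# Validate a --config value before handing it to build.py.
CONFIG=${CONFIG:-Release}
case "$CONFIG" in
  Release|Debug|RelWithDebInfo)
    echo "building with --config $CONFIG" ;;
  *)
    echo "unknown config: $CONFIG (expected Release, Debug, or RelWithDebInfo)" >&2
    exit 1 ;;
esac
```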

### Build Python API

#### Windows CPU build

```bash
python build.py --config Release
```


#### Windows DirectML build

```bash
python build.py --use_dml --config Release
```

#### Linux build

```bash
python build.py --config Release
```

#### Linux CUDA build

```bash
python build.py --use_cuda --config Release
```

#### Mac build

```bash
python build.py --config Release
```

### Build Java API

```bash
python build.py --build_java --config Release
```

### Build for Android

If building on Windows, install `ninja`.

```bash
pip install ninja
```

Run the build script.

```bash
python build.py --build_java --android --android_home <path to your Android SDK> --android_ndk_path <path to your NDK installation> --android_abi [armeabi-v7a|arm64-v8a|x86|x86_64] --config Release
```

Change `--config` to `Debug` for debug builds.

## Install the library into your application

### Install Python wheel

```bash
cd build/wheel
pip install *.whl
```

### Install NuGet

_Coming soon_

### Install JAR

Copy `build/Windows/Release/src/java/build/libs/*.jar` into your application.

### Install AAR

Copy `build/Android/Release/src/java/build/android/outputs/aar/onnxruntime-genai-release.aar` into your application.


### Install C/C++ header file and library

#### Windows

Use the header in `src\ort_genai.h` and the libraries in `build\Windows\Release`.

#### Linux

Use the header in `src/ort_genai.h` and the libraries in `build/Linux/Release`.


