Installation#
Prerequisites#
The current version of OpenVINO™ Training Extensions was tested in the following environment:
Ubuntu 20.04
Python >= 3.10
uv (https://github.com/astral-sh/uv) for dependency and environment management
Installing uv#
To use OpenVINO™ Training Extensions with uv, you first need to install the uv tool.
You can install it in one of the following ways:
curl -LsSf https://astral.sh/uv/install.sh | sh
This method installs uv globally as a fast and portable binary.
After installation, make sure uv is available in your PATH.
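If the shell cannot find uv after running the installer, its install directory may need to be added to PATH. The standalone installer typically places the binary under ~/.local/bin, but check the path the installer prints; the line below is a sketch under that assumption:
export PATH="$HOME/.local/bin:$PATH"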
pip install uv
This installs uv inside the currently active Python environment.
After installation, confirm it works:
uv --version
Install OpenVINO™ Training Extensions for users (CUDA/CPU)#
1. Install the OpenVINO™ Training Extensions package, either from PyPI or from a local source in development mode:
# Create a virtual environment using uv
uv venv .otx --python 3.10 # or 3.11
source .otx/bin/activate
# Install from PyPI
uv pip install otx
# Clone the training_extensions repository:
git clone https://github.com/open-edge-platform/training_extensions.git
cd training_extensions
# Create a virtual environment with uv
uv venv .otx --python 3.10 # or 3.11
source .otx/bin/activate
# Install the package in editable mode with base dependencies
uv pip install -e .
# Install OTX in development mode
uv pip install -e '.[dev]'
2. Once the package is installed in the virtual environment, you can use the full OpenVINO™ Training Extensions command line functionality.
otx --help
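As an additional smoke test, you can ask for help on an individual subcommand; the subcommand name below is illustrative and may differ between versions:
otx train --help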
Install OpenVINO™ Training Extensions for users (Intel GPUs)#
Install OpenVINO™ Training Extensions from source to use Intel XPU functionality:
git clone https://github.com/open-edge-platform/training_extensions.git
cd training_extensions
uv venv .otx --python 3.10 # or 3.11
source .otx/bin/activate
uv pip install -e . --extra-index-url https://download.pytorch.org/whl/test/xpu
Note
Please refer to the PyTorch XPU installation guide to install prerequisites and resolve any potential issues.
Once installed, use the command-line interface:
otx --help
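To verify that PyTorch can see the Intel GPU, a quick check along these lines may help (this assumes a PyTorch build with XPU support that exposes the torch.xpu namespace):
python -c "import torch; print(torch.xpu.is_available())"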
Install OpenVINO™ Training Extensions for developers#
Install tox with the tox-uv plugin using uv’s tool system:
uv tool install tox --with tox-uv
Create a development environment using tox:
# Replace '310' with '311' if using Python 3.11
tox devenv venv/otx -e unit-test-py310
source venv/otx/bin/activate
Now you’re ready to develop, test, and make changes — all reflected live in the editable install.
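For example, you could run a quick subset of the tests from inside the activated environment; this sketch assumes the repository keeps its unit tests under tests/unit:
pytest tests/unit -x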
Note
By installing tox with uv tool, you ensure it runs in a reproducible and isolated environment, with uv used internally to manage dependencies for each test environment.
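To see which tox environments are available (unit tests, integration tests, and so on), you can list them; the command below assumes tox 4, which the tox-uv plugin requires:
tox list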
Install OpenVINO™ Training Extensions by using Docker#
1. Run the following commands to build two Docker images: otx:${OTX_VERSION}-cuda and otx:${OTX_VERSION}-cuda-pretrained-ready.
git clone https://github.com/open-edge-platform/training_extensions.git
cd training_extensions/docker
./build.sh
2. After that, you can check whether the images were built correctly:
docker image ls | grep otx
Example output:
otx 2.0.0-cuda-pretrained-ready 4f3b5f98f97c 3 minutes ago 14.5GB
otx 2.0.0-cuda 8d14caccb29a 8 minutes ago 10.4GB
otx:${OTX_VERSION}-cuda is a minimal Docker image with CUDA support.
otx:${OTX_VERSION}-cuda-pretrained-ready includes pre-trained models on top of the base image.
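To start a container from one of these images, a command along the following lines could be used; the tag matches the example output above, and the --gpus flag (which requires the NVIDIA Container Toolkit) is an assumption to adapt to your setup:
docker run --gpus all -it --rm otx:2.0.0-cuda-pretrained-ready /bin/bash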
Run tests#
To run tests locally, install development dependencies:
uv pip install -e '.[dev]'
pytest tests/
To run integration tests using tox:
uv tool install tox --with tox-uv
tox -e integration-test-all
Note
The first time tox is run, it will create virtual environments and install all required dependencies. This may take several minutes before the actual tests begin.
Troubleshooting#
1. If you encounter issues with uv pip, update uv:
pip install --upgrade uv
2. If you’re having issues installing torch or mmcv, check CUDA compatibility with your PyTorch version. Update your CUDA toolkit and drivers if needed. See CUDA 11.8 Installer.
3. If you’re behind a proxy server, set your proxy environment variable:
export HTTP_PROXY=http://<user>:<password>@<proxy>:<port>
uv pip install <package>
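Depending on your network, the HTTPS variant may also be needed (whether both are required is environment-specific):
export HTTPS_PROXY=http://<user>:<password>@<proxy>:<port>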
4. For CLI-related issues, check the help message:
otx --help
5. To see additional messages from jsonargparse, enable debug output:
export JSONARGPARSE_DEBUG=1 # 0: Off, 1: On
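For example, to see the extra jsonargparse output while inspecting the CLI (the command shown is only illustrative):
JSONARGPARSE_DEBUG=1 otx --help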