
This module contains examples that demonstrate the use of the Deep Java Library (DJL). You can find more examples in our djl-demo GitHub repository.

Examples are included for both training and inference.

These examples focus on the overall experience of training and inference. We keep reusable components in separate modules so that you can take advantage of them in your own applications. For examples and references on creating datasets, see the basic dataset module. For examples and references on building models and translators, see our basic model zoo.

You may be able to find more translator examples in our engine specific model zoos: Apache MXNet, PyTorch, and TensorFlow.

More examples and demos of applications featuring DJL are located in our demo repository.


Prerequisites

  • You need to have Java Development Kit version 8 or later installed on your system. For more information, see Setup.
  • You should be familiar with the API documentation in the DJL Javadoc.

Getting started: 30 seconds to run an example

Building with the command line

This example supports building with both Gradle and Maven. To build, use either of the following commands:

  • Gradle build
cd examples

# for Linux/macOS:
./gradlew jar

# for Windows:
..\gradlew jar
  • Maven build
cd examples
mvn package -DskipTests

Run example code

With the Gradle application plugin, you can execute the example code directly. For more information on running each example, see the example's documentation.

The following command executes an object detection example:

  • Gradle
cd examples

# for Linux/macOS:
./gradlew run

# for Windows:
..\gradlew run
  • Maven
cd examples
mvn clean package -DskipTests
mvn exec:java
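To run an example other than the default, you can select its main class on the command line. A sketch, assuming the `main` system property and `exec.mainClass` parameter used by the DJL examples build — the exact fully qualified class name may differ between DJL versions, so check each example's documentation:

```sh
cd examples

# Gradle: select the example's main class via the "main" system property
./gradlew run -Dmain=ai.djl.examples.inference.ObjectDetection

# Maven: build once, then select the main class via exec.mainClass
mvn package -DskipTests
mvn exec:java -Dexec.mainClass="ai.djl.examples.inference.ObjectDetection"
```

Arguments for an example can be appended after the main class selection (with Gradle, via `--args`).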

Engine selection

DJL is engine agnostic, so it is capable of supporting different backends. By default, DJL automatically selects the proper native library for you and downloads it from the internet. If your production environment doesn't have network access, you can distribute DJL's offline native packages together with your application to avoid downloading engine native libraries at runtime.
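For example, to bundle the PyTorch engine with its native library for offline use, you can declare a platform-specific native artifact alongside the engine dependency. A hedged Maven sketch — the versions and classifier below are illustrative, so check the DJL documentation for the exact coordinates matching your DJL release and platform:

```xml
<!-- PyTorch engine (Java side) -->
<dependency>
    <groupId>ai.djl.pytorch</groupId>
    <artifactId>pytorch-engine</artifactId>
    <version>0.28.0</version>
    <scope>runtime</scope>
</dependency>
<!-- Offline native library for one platform (illustrative classifier) -->
<dependency>
    <groupId>ai.djl.pytorch</groupId>
    <artifactId>pytorch-native-cpu</artifactId>
    <classifier>linux-x86_64</classifier>
    <version>2.1.1</version>
    <scope>runtime</scope>
</dependency>
```

Note that recent DJL releases also expect a matching `pytorch-jni` artifact when you pin a native package; the DJL PyTorch engine documentation lists the required combination for each release.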

With Apache MXNet, PyTorch, TensorFlow, and ONNX Runtime, you can choose different builds of the native library. You can also see: