# Examples
This module contains examples to demonstrate use of the Deep Java Library (DJL).
The following examples are included for training:
- Train your first model
- Transfer learning example
- Train SSD model example
- Multi-label dataset training example
The following examples are included for inference:
- Image classification example
- Single-shot object detection example
- Bert question and answer example
- Instance segmentation example
- Pose estimation example
- Action recognition example
These examples focus on the overall experience of training and inference. We keep components that are reusable within separate modules for other users to take advantage of in their own applications. For examples and references on creating datasets, look at the basic dataset module. For examples and references on building models and translators, look in our basic model zoo.
You may be able to find more translator examples in our engine specific model zoos: Apache MXNet, PyTorch, and TensorFlow.
More examples and demos of applications featuring DJL are located in our demo repository.
## Prerequisites
- You need to have Java Development Kit version 8 or later installed on your system. For more information, see Setup.
- You should be familiar with the API documentation in the DJL Javadoc.
## Getting started: 30 seconds to run an example
### Building with the command line
The examples module supports building with both Gradle and Maven. To build, use either of the following commands:
- Gradle build

  ```sh
  cd examples

  # for Linux/macOS:
  ./gradlew jar

  # for Windows:
  ..\gradlew jar
  ```
- Maven build

  ```sh
  cd examples
  mvn package -DskipTests
  ```
### Run example code
With the Gradle application plugin, you can execute example code directly.
For more information on running each example, see the example's documentation.
The following command executes an object detection example:
- Gradle

  ```sh
  cd examples

  # for Linux/macOS:
  ./gradlew run

  # for Windows:
  ..\gradlew run
  ```
- Maven

  ```sh
  cd examples
  mvn clean package -DskipTests
  mvn exec:java
  ```
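To run a different example, the examples build lets you select the main class with the `main` system property (this follows DJL's examples convention; `ai.djl.examples.inference.ObjectDetection` is the object detection example's main class, and each example's documentation lists its own). A minimal sketch of composing the command:

```shell
# Pick the example's main class; check the example's documentation
# for the exact fully-qualified class name.
MAIN_CLASS=ai.djl.examples.inference.ObjectDetection

# The examples build passes -Dmain to the Gradle application plugin
# to select which example's main class to run.
echo "./gradlew run -Dmain=${MAIN_CLASS}"
```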
## Engine selection
DJL is engine-agnostic, so it can support different backends.
For Apache MXNet, PyTorch, TensorFlow, and ONNX Runtime, you can choose among different builds of the native library.
We recommend automatic engine selection, which downloads the best engine for your platform and available hardware during the first run.
To activate automatic selection, add ai.djl.mxnet:mxnet-native-auto:1.7.0-backport
(for MXNet) or ai.djl.pytorch:pytorch-native-auto:1.7.0
(for PyTorch) as a dependency.
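A minimal `build.gradle` sketch of declaring one of these dependencies (assuming a Gradle project; the coordinates and versions are the ones listed above, and only one engine should be chosen):

```groovy
// build.gradle — pick ONE automatic engine dependency
dependencies {
    // Apache MXNet automatic native library selection:
    runtimeOnly "ai.djl.mxnet:mxnet-native-auto:1.7.0-backport"

    // or, for PyTorch instead:
    // runtimeOnly "ai.djl.pytorch:pytorch-native-auto:1.7.0"
}
```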
You can also see: