Prerequisites: Maven 3.3+ and JDK 1.8+
Clone the triton-inference-server/common repository:
git clone https://github.com/triton-inference-server/common/ -b <common-repo-branch> common-repo
<common-repo-branch> should be the release branch matching the version of the Triton server that you intend to use (e.g. r21.05).
Copy the *.proto files into the library's src/main/proto directory:
$ cd library
$ cp ../common-repo/protobuf/*.proto src/main/proto/
After copying, the protobuf files should be present under library/src/main/proto. Compile the library to generate the client stubs:
$ mvn compile
Once compiled, the generated *.java files can be found under the library/target/generated-sources folder.
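These generated classes are ordinary gRPC stubs and can be used directly from Java. As a quick sanity check, here is a minimal sketch (the class name HealthCheck is hypothetical, not part of the examples). It assumes the generated code lands in the inference package with outer class GrpcService, the defaults for grpc_service.proto, and that io.grpc and protobuf-java are on the classpath:

import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

import inference.GRPCInferenceServiceGrpc;
import inference.GRPCInferenceServiceGrpc.GRPCInferenceServiceBlockingStub;
import inference.GrpcService.ServerLiveRequest;
import inference.GrpcService.ServerReadyRequest;

public class HealthCheck {
  public static void main(String[] args) {
    // Assumes Triton's gRPC endpoint is on localhost:8001 (the default gRPC port).
    ManagedChannel channel =
        ManagedChannelBuilder.forAddress("localhost", 8001).usePlaintext().build();
    GRPCInferenceServiceBlockingStub stub =
        GRPCInferenceServiceGrpc.newBlockingStub(channel);

    // ServerLive and ServerReady are health RPCs defined in grpc_service.proto.
    boolean live = stub.serverLive(ServerLiveRequest.newBuilder().build()).getLive();
    boolean ready = stub.serverReady(ServerReadyRequest.newBuilder().build()).getReady();
    System.out.println("Server live: " + live + ", ready: " + ready);

    channel.shutdownNow();
  }
}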
To run the example clients, copy the generated stubs into the examples folder:
$ cd ..
$ cp -R library/target/generated-sources/protobuf/java/inference examples/src/main/java/inference
$ cp -R library/target/generated-sources/protobuf/grpc-java/inference/*.java examples/src/main/java/inference/
See the examples project, which contains both Scala and Java sample clients.
$ cd examples
$ mvn clean install
To run the Java client:
$ mvn exec:java -Dexec.mainClass=clients.SimpleJavaClient -Dexec.args="<host> <port>"
host: the host where the Triton Inference Server is running
port: the server's gRPC port (the default is 8001)
To run the Scala client:
$ mvn exec:java -Dexec.mainClass=clients.SimpleClient -Dexec.args="<host> <port>"
Both examples run inference against the simple example model. The Scala example is more comprehensive and also exercises APIs such as server ready and model ready. Running either client should produce output similar to the following; a sketch of the corresponding client code appears after the output.
name: "OUTPUT0"
datatype: "INT32"
shape: 1
shape: 16
name: "OUTPUT1"
datatype: "INT32"
shape: 1
shape: 16
1 + 1 = 2
1 - 1 = 0
2 + 2 = 4
2 - 2 = 0
3 + 3 = 6
3 - 3 = 0
4 + 4 = 8
4 - 4 = 0
5 + 5 = 10
5 - 5 = 0
6 + 6 = 12
6 - 6 = 0
7 + 7 = 14
7 - 7 = 0
8 + 8 = 16
8 - 8 = 0
9 + 9 = 18
9 - 9 = 0
10 + 10 = 20
10 - 10 = 0
11 + 11 = 22
11 - 11 = 0
12 + 12 = 24
12 - 12 = 0
13 + 13 = 26
13 - 13 = 0
14 + 14 = 28
14 - 14 = 0
15 + 15 = 30
15 - 15 = 0
16 + 16 = 32
16 - 16 = 0
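For orientation, the following is a minimal sketch of the kind of client code behind the output above (the class name MinimalInferClient is hypothetical, not the bundled SimpleJavaClient). It assumes the generated classes live in the inference package with outer class GrpcService, the defaults for grpc_service.proto, and that the server returns the output data for the simple model as little-endian bytes in raw_output_contents; names may differ slightly depending on the common-repo branch used.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

import com.google.protobuf.ByteString;
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

import inference.GRPCInferenceServiceGrpc;
import inference.GRPCInferenceServiceGrpc.GRPCInferenceServiceBlockingStub;
import inference.GrpcService.InferTensorContents;
import inference.GrpcService.ModelInferRequest;
import inference.GrpcService.ModelInferResponse;

public class MinimalInferClient {
  public static void main(String[] args) {
    String host = args.length > 0 ? args[0] : "localhost";
    int port = args.length > 1 ? Integer.parseInt(args[1]) : 8001;

    // Plaintext channel to Triton's gRPC endpoint.
    ManagedChannel channel =
        ManagedChannelBuilder.forAddress(host, port).usePlaintext().build();
    GRPCInferenceServiceBlockingStub stub =
        GRPCInferenceServiceGrpc.newBlockingStub(channel);

    // Two INT32 input tensors of shape [1, 16] filled with 1..16,
    // which is what the simple example model expects.
    InferTensorContents.Builder input0Data = InferTensorContents.newBuilder();
    InferTensorContents.Builder input1Data = InferTensorContents.newBuilder();
    for (int i = 1; i <= 16; i++) {
      input0Data.addIntContents(i);
      input1Data.addIntContents(i);
    }

    ModelInferRequest.InferInputTensor input0 =
        ModelInferRequest.InferInputTensor.newBuilder()
            .setName("INPUT0").setDatatype("INT32")
            .addShape(1).addShape(16)
            .setContents(input0Data)
            .build();
    ModelInferRequest.InferInputTensor input1 =
        ModelInferRequest.InferInputTensor.newBuilder()
            .setName("INPUT1").setDatatype("INT32")
            .addShape(1).addShape(16)
            .setContents(input1Data)
            .build();

    ModelInferRequest request = ModelInferRequest.newBuilder()
        .setModelName("simple")
        .addInputs(input0)
        .addInputs(input1)
        .build();

    ModelInferResponse response = stub.modelInfer(request);

    // The simple model computes OUTPUT0 = INPUT0 + INPUT1 and OUTPUT1 = INPUT0 - INPUT1.
    // Assumption: output data arrives as little-endian bytes in raw_output_contents,
    // in the same order as the outputs listed in the response.
    int[] sum = toInts(response.getRawOutputContents(0));
    int[] diff = toInts(response.getRawOutputContents(1));
    for (int i = 0; i < 16; i++) {
      System.out.println((i + 1) + " + " + (i + 1) + " = " + sum[i]);
      System.out.println((i + 1) + " - " + (i + 1) + " = " + diff[i]);
    }

    channel.shutdownNow();
  }

  // Decode a raw output buffer into int values (little-endian, 4 bytes per INT32).
  private static int[] toInts(ByteString raw) {
    ByteBuffer buf = raw.asReadOnlyByteBuffer().order(ByteOrder.LITTLE_ENDIAN);
    int[] values = new int[buf.remaining() / 4];
    buf.asIntBuffer().get(values);
    return values;
  }
}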