• Stars: 22
• Rank: 1,013,498 (Top 21%)
• Language: C++
• License: BSD 3-Clause "New...
• Created: about 3 years ago
• Updated: 3 months ago


Repository Details

OpenVINO backend for Triton.

More Repositories

1. server (Python, 7,321 stars): The Triton Inference Server provides an optimized cloud and edge inferencing solution.
2. pytriton (Python, 661 stars): PyTriton is a Flask/FastAPI-like interface that simplifies Triton's deployment in Python environments.
3. client (C++, 451 stars): Triton Python, C++, and Java client libraries, and gRPC-generated client examples for Go, Java, and Scala.
4. python_backend (C++, 444 stars): Triton backend that enables pre-processing, post-processing, and other logic to be implemented in Python.
5. tensorrtllm_backend (Python, 439 stars): The Triton TensorRT-LLM backend.
6. fastertransformer_backend (Python, 409 stars)
7. tutorials (Python, 403 stars): Tutorials and examples for the Triton Inference Server.
8. model_analyzer (Python, 374 stars): Triton Model Analyzer is a CLI tool for understanding the compute and memory requirements of Triton Inference Server models.
9. backend (C++, 231 stars): Common source, scripts, and utilities for creating Triton backends.
10. model_navigator (Python, 148 stars): Triton Model Navigator is a tool that automates model deployment on the Triton Inference Server.
11. dali_backend (C++, 116 stars): Triton backend for running GPU-accelerated data pre-processing pipelines implemented with DALI's Python API.
12. onnxruntime_backend (C++, 109 stars): The Triton backend for ONNX Runtime.
13. pytorch_backend (C++, 93 stars): The Triton backend for PyTorch TorchScript models.
14. vllm_backend (Python, 84 stars)
15. core (C++, 78 stars): The core library and APIs implementing the Triton Inference Server.
16. fil_backend (Jupyter Notebook, 63 stars): FIL backend for the Triton Inference Server.
17. common (C++, 53 stars): Common source, scripts, and utilities shared across all Triton repositories.
18. hugectr_backend (Jupyter Notebook, 48 stars)
19. tensorrt_backend (C++, 40 stars): The Triton backend for TensorRT.
20. tensorflow_backend (C++, 39 stars): The Triton backend for TensorFlow.
21. paddlepaddle_backend (C++, 32 stars)
22. developer_tools (C++, 15 stars)
23. stateful_backend (C++, 10 stars): Triton backend that automatically manages model state tensors in the sequence batcher.
24. contrib (Python, 8 stars): Community contributions to Triton that are not officially supported or maintained by the Triton project.
25. third_party (C, 7 stars): Third-party source packages modified for use in Triton.
26. checksum_repository_agent (C++, 6 stars): The Triton repository agent that verifies model checksums.
27. identity_backend (C++, 6 stars): Example Triton backend that demonstrates most of the Triton Backend API.
28. redis_cache (C++, 5 stars): TRITONCACHE implementation of a Redis cache.
29. repeat_backend (C++, 5 stars): An example Triton backend that demonstrates sending zero, one, or multiple responses per request.
30. local_cache (C++, 2 stars): Implementation of a local in-memory cache for the Triton Inference Server's TRITONCACHE API.
31. square_backend (C++, 2 stars): Simple Triton backend used for testing.