server
The Triton Inference Server provides an optimized cloud and edge inferencing solution.
pytriton
PyTriton is a Flask/FastAPI-like interface that simplifies Triton's deployment in Python environments.
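As a rough sketch of that interface, assuming the quickstart-style pytriton API (the model name "Doubler", the tensor names, and the doubling function are illustrative placeholders):

    import numpy as np

    from pytriton.decorators import batch
    from pytriton.model_config import ModelConfig, Tensor
    from pytriton.triton import Triton

    @batch
    def infer_fn(data: np.ndarray):
        # Placeholder "model": double each batch of inputs.
        return [data * 2.0]

    with Triton() as triton:
        # Bind the Python callable so Triton exposes it as a model over HTTP/gRPC.
        triton.bind(
            model_name="Doubler",
            infer_func=infer_fn,
            inputs=[Tensor(name="data", dtype=np.float32, shape=(-1,))],
            outputs=[Tensor(name="result", dtype=np.float32, shape=(-1,))],
            config=ModelConfig(max_batch_size=64),
        )
        triton.serve()  # blocks and serves inference requests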
tensorrtllm_backend
The Triton TensorRT-LLM Backend.
client
Triton Python, C++, and Java client libraries, and GRPC-generated client examples for Go, Java, and Scala.
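A minimal Python HTTP client call might look like the following sketch; the model name and tensor names are assumptions, so substitute the ones from your model configuration:

    import numpy as np
    import tritonclient.http as httpclient

    # Connect to a running Triton server's HTTP endpoint (default port 8000).
    client = httpclient.InferenceServerClient(url="localhost:8000")

    # "my_model", "INPUT0", and "OUTPUT0" are placeholders.
    input_tensor = httpclient.InferInput("INPUT0", [1, 16], "FP32")
    input_tensor.set_data_from_numpy(np.random.rand(1, 16).astype(np.float32))

    response = client.infer(
        model_name="my_model",
        inputs=[input_tensor],
        outputs=[httpclient.InferRequestedOutput("OUTPUT0")],
    )
    print(response.as_numpy("OUTPUT0"))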
tutorials
This repository contains tutorials and examples for the Triton Inference Server.
python_backend
Triton backend that enables pre-processing, post-processing, and other logic to be implemented in Python.
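A Python-backend model is a model.py implementing the TritonPythonModel interface; the sketch below assumes a trivial model that doubles an input tensor (the tensor names must match the model's config.pbtxt):

    import triton_python_backend_utils as pb_utils

    class TritonPythonModel:
        """Minimal sketch of a Python-backend model.py."""

        def execute(self, requests):
            responses = []
            for request in requests:
                # "INPUT0" and "OUTPUT0" are illustrative tensor names.
                in0 = pb_utils.get_input_tensor_by_name(request, "INPUT0")
                out0 = pb_utils.Tensor("OUTPUT0", in0.as_numpy() * 2)
                responses.append(pb_utils.InferenceResponse(output_tensors=[out0]))
            return responses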
model_analyzer
Triton Model Analyzer is a CLI tool that helps you understand the compute and memory requirements of models served by the Triton Inference Server.
fastertransformer_backend
backend
Common source, scripts and utilities for creating Triton backends.
model_navigator
Triton Model Navigator is an inference toolkit designed for optimizing and deploying deep learning models, with a focus on NVIDIA GPUs.
vllm_backend
dali_backend
The Triton backend that allows running GPU-accelerated data pre-processing pipelines implemented in DALI's Python API.
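Pipelines for this backend are typically written with DALI's Python API and serialized into the model repository; a rough sketch (the input name, image sizes, and repository path are assumptions):

    from nvidia.dali import pipeline_def
    import nvidia.dali.fn as fn

    @pipeline_def(batch_size=32, num_threads=4, device_id=0)
    def preprocessing_pipeline():
        # Encoded images are fed in by Triton through an external source.
        raw = fn.external_source(device="cpu", name="DALI_INPUT_0")
        images = fn.decoders.image(raw, device="mixed")  # GPU-accelerated decode
        return fn.resize(images, resize_x=224, resize_y=224)

    # Serialize so the DALI backend can load the pipeline from the model repository.
    preprocessing_pipeline().serialize(filename="model_repository/dali_preproc/1/model.dali")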
onnxruntime_backend
The Triton backend for the ONNX Runtime.
pytorch_backend
The Triton backend for PyTorch TorchScript models.
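Serving a model through this backend starts from a saved TorchScript file; one way to produce it (the network and repository path below are placeholders):

    import torch

    class TinyModel(torch.nn.Module):
        """Placeholder network used only to illustrate the export step."""

        def __init__(self):
            super().__init__()
            self.linear = torch.nn.Linear(16, 4)

        def forward(self, x):
            return self.linear(x)

    # Trace to TorchScript and save it where the backend conventionally looks:
    # <model_repository>/<model_name>/<version>/model.pt
    traced = torch.jit.trace(TinyModel().eval(), torch.randn(1, 16))
    traced.save("model_repository/tiny_model/1/model.pt")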
core
The core library and APIs implementing the Triton Inference Server.
fil_backend
FIL backend for the Triton Inference Server.
common
Common source, scripts and utilities shared across all Triton repositories.
tensorrt_backend
The Triton backend for TensorRT.
hugectr_backend
triton_cli
Triton CLI is an open-source command-line interface that enables users to create, deploy, and profile models served by the Triton Inference Server.
tensorflow_backend
The Triton backend for TensorFlow.
paddlepaddle_backend
openvino_backend
OpenVINO backend for Triton.
developer_tools
stateful_backend
Triton backend for managing model state tensors automatically in the sequence batcher.
redis_cache
TRITONCACHE implementation of a Redis cache.
checksum_repository_agent
The Triton repository agent that verifies model checksums.
contrib
Community contributions to Triton that are not officially supported or maintained by the Triton project.
third_party
Third-party source packages that are modified for use in Triton.
identity_backend
Example Triton backend that demonstrates most of the Triton Backend API.
repeat_backend
An example Triton backend that demonstrates sending zero, one, or multiple responses for each request.
square_backend
Simple Triton backend used for testing.