ONNX TensorRT Template

This project is based on tiny-tensorrt.

News

It can now run the whole pipeline on the GPU, greatly improving efficiency, and pre-processing and post-processing can be customized on the GPU. (2021-6-29)

Features

  • Preprocessing on the GPU
  • Postprocessing on the GPU
  • Easily run the whole pipeline on the GPU
  • Custom ONNX model output nodes
  • Automatic engine serialization and deserialization
  • INT8 support
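To illustrate the GPU pre-processing feature above, here is a minimal sketch using OpenCV's CUDA modules (which is why the requirements ask for an opencv-contrib build). The function name `preprocess` and the resize/convert/normalize steps are illustrative assumptions, not this project's actual code:

```cpp
#include <opencv2/opencv.hpp>
#include <opencv2/cudawarping.hpp>   // cv::cuda::resize
#include <opencv2/cudaimgproc.hpp>   // cv::cuda::cvtColor

// Hypothetical GPU preprocessing: upload once, then resize,
// BGR->RGB conversion, and normalization all happen on the device,
// so the result can be fed to TensorRT without a round trip to host memory.
cv::cuda::GpuMat preprocess(const cv::Mat& frame, int netW, int netH) {
    cv::cuda::GpuMat gpuFrame, resized, rgb, floatImg;
    gpuFrame.upload(frame);                                // host -> device copy
    cv::cuda::resize(gpuFrame, resized, cv::Size(netW, netH));
    cv::cuda::cvtColor(resized, rgb, cv::COLOR_BGR2RGB);
    rgb.convertTo(floatImg, CV_32FC3, 1.0 / 255.0);        // scale to [0, 1]
    return floatImg;                                       // stays on the GPU
}
```

Post-processing can follow the same pattern in reverse: operate on the network output with `cv::cuda` routines and only download the final result.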

System Requirements

CUDA 10.0+

TensorRT 7

OpenCV 4.0+ (built with the opencv-contrib modules)

Installation

Make sure you have installed the dependencies listed above.

```shell
# clone project and submodule
git clone {this repo}
cd {this repo}
mkdir build && cd build && cmake .. && make
```

Then you can integrate it into your own project with libtinytrt.so and Trt.h.
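As a rough sketch of what integration against Trt.h might look like: the upstream tiny-tensorrt project exposes a `Trt` class with engine-building and inference methods, but the exact method names and signatures below (`CreateEngine`, `CopyFromHostToDevice`, `Forward`, `CopyFromDeviceToHost`) are assumptions based on that project and should be checked against the header you build:

```cpp
#include <vector>
#include "Trt.h"  // from this project's build output

int main() {
    Trt net;
    // Build (or deserialize, if the engine file already exists) a TensorRT
    // engine from an ONNX model -- serialization is handled automatically.
    net.CreateEngine("model.onnx", "model.engine");

    std::vector<float> input(/* input size */ 3 * 224 * 224);
    std::vector<float> output;

    net.CopyFromHostToDevice(input, 0);   // binding index 0: input
    net.Forward();                        // run inference on the GPU
    net.CopyFromDeviceToHost(output, 1);  // binding index 1: output
    return 0;
}
```

Binding indices and input dimensions depend on your model; with the GPU pre-processing path you would pass a device pointer instead of a host vector.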

Docs

Example C++ code showing how to use the OpenCV CUDA (GPU) modules during TensorRT inference.

About License

For the third-party modules and TensorRT, you need to follow their licenses.

For the parts I wrote, you can do anything you want.
