A great project for campus recruiting, autumn/spring hiring, and internships! Build a high-performance deep learning inference library from scratch, supporting inference for models such as LLaMA 2, U-Net, YOLOv5, and ResNet. Implement a high-performance deep learning inference library step by step
Updated Oct 26, 2024 - C++
FeatherCNN is a high performance inference engine for convolutional neural networks.
Adlik: Toolkit for Accelerating Deep Learning Inference
🔥 (yolov3 yolov4 yolov5 unet ...) A mini PyTorch inference framework inspired by darknet.
A library for high performance deep learning inference on NVIDIA GPUs.
High-performance cross-platform inference engine; you can run Anakin on x86 CPU, ARM, NVIDIA GPU, AMD GPU, Bitmain, and Cambricon devices.
A great project for campus recruiting, autumn/spring hiring, and internships: build from scratch an LLM inference framework supporting LLaMA 2/3 and Qwen2.5.
MIVisionX toolkit is a set of comprehensive computer vision and machine intelligence libraries, utilities, and applications bundled into a single toolkit. AMD MIVisionX also delivers a highly optimized open-source implementation of the Khronos OpenVX™ and OpenVX™ Extensions.
TinyTensor is a tool for running already trained neural network (NN) models, making them usable for inference tasks such as image classification, semantic segmentation, etc.
Repository for OpenVINO's extra modules
DaisyKit is an easy AI toolkit with face mask detection, pose detection, background matting, barcode detection, face recognition, and more - built with NCNN and OpenCV, with Python wrappers.
Neural network inference template for real-time critical audio environments - presented at ADC23
Runs LLaMA at extremely high speed
Unified Rule Engine. Graph rewriting system for the AtomSpace. Used as reasoning engine for OpenCog.
A library of C++ functions supporting applications of Stan in pharmacometrics
Custom C++ implementation of deep learning based OCR
RidgeRun Inference Framework
Arm Compute Library implementation of efficient low-precision neural networks