nihui's Projects
Mobile AI Compute Engine
MC3172-CMake Project
Minimal runtime core of Caffe: forward only, with GPU support and memory efficiency.
MNN is a lightweight deep neural network inference engine.
Benchmarking Neural Network Inference on Mobile Devices
An attempt to organize and compile MtF-related resources, to better help everyone~
⚡Super fast and lightweight anchor-free object detection model. 🔥Only 1.8 MB and runs at 97 FPS on a cellphone🔥
ncnn is a high-performance neural network inference framework optimized for mobile platforms
ncnn Android benchmark app
The MobileNet-SSD object detection Android example
The SqueezeNet image classification Android example
The style transfer Android example
The YOLOv5 object detection Android example
ncnn benchmark on various single board computers
Deploy NanoDet, the super fast and lightweight object detection model, in your web browser with ncnn and WebAssembly
Portrait segmentation in your web browser with ncnn and WebAssembly
Deploy SCRFD, an efficient high-accuracy face detection approach, in your web browser with ncnn and WebAssembly
Deploy YOLOv5 in your web browser with ncnn and WebAssembly
Tencent ncnn with added CUDA support
Android PaddleOCR demo, with inference by ncnn
ncnnRay++ is a CMake-based integration of raylib and the very popular Tencent ncnn deep learning library. ncnn is written in C++ and designed (but not exclusively) for edge computing devices. The project depends on the Vulkan SDK (Vulkan is Khronos' API for graphics and compute on GPUs).
Visualizer for neural network, deep learning and machine learning models