Topic: pruning
Something interesting about pruning.
pruning,Infrastructures™ for Machine Learning Training/Inference in Production.
User: 1duo
pruning,micronet, a model compression and deployment library. Compression: 1. quantization: quantization-aware training (QAT), high-bit (>2b) (DoReFa / Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference), low-bit (≤2b) / ternary and binary (TWN/BNN/XNOR-Net); post-training quantization (PTQ), 8-bit (TensorRT); 2. pruning: normal, regular, and group convolutional channel pruning; 3. group convolution structure; 4. batch-normalization fusion for quantization. Deployment: TensorRT, fp32/fp16/int8 (PTQ calibration), op-adapt (upsample), dynamic_shape.
User: 666dzy666
pruning,A PyTorch-based model pruning toolkit for pre-trained language models
User: airaria
Home Page: https://textpruner.readthedocs.io
pruning,TinyNeuralNetwork is an efficient and easy-to-use deep learning model compression framework.
Organization: alibaba
pruning,Pruning and other network surgery for trained Keras models.
User: benwhetton
pruning,A curated list of awesome edge machine learning resources, including research papers, inference engines, challenges, books, meetups and others.
Organization: bisonai
pruning,Awesome machine learning model compression research papers, tools, and learning material.
User: cedrickchee
pruning,(CVPR 2021, Oral) Dynamic Slimmable Network
User: changlin31
pruning,Embedded and mobile deep learning research resources
User: csarron
pruning,Config-driven, easy backup CLI for restic.
User: cupcakearmy
Home Page: https://autorestic.vercel.app/
pruning,"Hung-yi Lee Deep Learning Tutorial" (recommended by Prof. Hung-yi Lee 👍). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
Organization: datawhalechina
pruning,Tutorial notebooks for hls4ml
Organization: fastmachinelearning
Home Page: http://fastmachinelearning.org/hls4ml-tutorial/
pruning,A curated list of neural network pruning resources.
User: he-y
pruning,Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration (CVPR 2019 Oral)
User: he-y
Home Page: https://arxiv.org/abs/1811.00250
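The FPGM paper above prunes the filters nearest the geometric median of a layer, on the idea that such filters are the most replaceable. A rough numpy sketch of that selection rule (a paraphrase for illustration, not the authors' code), using sum-of-distances to all other filters as the proximity measure:

```python
import numpy as np

def fpgm_select(filters: np.ndarray, n_prune: int) -> np.ndarray:
    """Pick indices of the `n_prune` filters closest to the rest of the layer.

    `filters` has shape (out_channels, k): one flattened filter per row.
    A filter with a small total distance to all others sits near the
    geometric median of the layer and is treated as redundant.
    """
    # Pairwise Euclidean distances between all filters.
    diffs = filters[:, None, :] - filters[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    total = dists.sum(axis=1)
    # Smallest total distance = closest to the geometric median.
    return np.argsort(total)[:n_prune]

# Three near-duplicate filters plus one outlier: a duplicate is redundant.
f = np.array([[1.0, 1.0], [1.1, 0.9], [0.9, 1.1], [5.0, 5.0]])
idx = fpgm_select(f, n_prune=1)
```

The exact FPGM criterion computes the geometric median itself; the sum-of-distances shortcut above picks the same kind of "central, replaceable" filters and is enough to convey the idea.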
pruning,Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
User: he-y
Home Page: https://arxiv.org/abs/1808.06866
pruning,[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Support LLaMA, Llama-2, BLOOM, Vicuna, Baichuan, etc.
User: horseee
Home Page: https://arxiv.org/abs/2305.11627
pruning,Efficient computing methods developed by Huawei Noah's Ark Lab
Organization: huawei-noah
pruning,🤗 Optimum Intel: Accelerate inference with Intel optimization tools
Organization: huggingface
Home Page: https://huggingface.co/docs/optimum/main/en/intel/index
pruning,SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
Organization: intel
Home Page: https://intel.github.io/neural-compressor/
pruning,Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Organization: intellabs
pruning,PyTorch Model Compression
Organization: j-marple-dev
pruning,Learning both Weights and Connections for Efficient Neural Networks https://arxiv.org/abs/1506.02626
User: jack-willturner
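The entry above implements the classic magnitude-pruning paper (Han et al., 2015). As a minimal sketch of the core idea (not the repository's code), magnitude pruning zeroes the fraction of weights with the smallest absolute values:

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the `sparsity` fraction of weights with smallest |value|."""
    k = int(sparsity * weights.size)
    if k == 0:
        return weights.copy()
    # Threshold = k-th smallest absolute value; weights at or below it are cut.
    threshold = np.sort(np.abs(weights), axis=None)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

w = np.array([[0.5, -0.01, 0.2], [-0.03, 0.9, 0.05]])
pruned = magnitude_prune(w, sparsity=0.5)  # keeps the 3 largest-magnitude weights
```

In the paper this pruning step alternates with retraining so the surviving connections can recover the lost accuracy; the sketch shows only the pruning criterion.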
pruning,PyTorch Implementation of [1611.06440] Pruning Convolutional Neural Networks for Resource Efficient Inference
User: jacobgil
pruning,PyTorch implementation of the CVPR 2020 (Oral) paper "HRank: Filter Pruning using High-Rank Feature Map".
User: lmbxmu
Home Page: https://128.84.21.199/abs/2002.10179
pruning,A model compression and acceleration toolbox based on pytorch.
Organization: megvii-research
pruning,Reference ImageNet implementation of SelecSLS CNN architecture proposed in the SIGGRAPH 2020 paper "XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera". The repository also includes code for pruning the model based on implicit sparsity emerging from adaptive gradient descent methods, as detailed in the CVPR 2019 paper "On implicit filter level sparsity in Convolutional Neural Networks".
User: mehtadushy
pruning,FasterAI: Prune and Distill your models with FastAI and PyTorch
User: nathanhubens
Home Page: https://nathanhubens.github.io/fasterai/
pruning,Sparsity-aware deep learning inference runtime for CPUs
Organization: neuralmagic
Home Page: https://neuralmagic.com/deepsparse/
pruning,Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
Organization: neuralmagic
pruning,Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes
Organization: neuralmagic
pruning,ML model optimization product to accelerate inference.
Organization: neuralmagic
pruning,OpenMMLab Model Compression Toolbox and Benchmark.
Organization: open-mmlab
Home Page: https://mmrazor.readthedocs.io/en/latest/
pruning,Neural Network Compression Framework for enhanced OpenVINO™ inference
Organization: openvinotoolkit
pruning,PaddleSlim is an open-source library for deep model compression and architecture search.
Organization: paddlepaddle
Home Page: https://paddleslim.readthedocs.io/zh_CN/latest/
pruning,[ACL 2022] Structured Pruning Learns Compact and Accurate Models https://arxiv.org/abs/2204.00408
Organization: princeton-nlp
pruning,[ICLR 2024] Sheared LLaMA: Accelerating Language Model Pre-training via Structured Pruning
Organization: princeton-nlp
Home Page: https://arxiv.org/abs/2310.06694
pruning,AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
Organization: quic
Home Page: https://quic.github.io/aimet-pages/index.html
pruning,This repository contains a Pytorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" by Jonathan Frankle and Michael Carbin that can be easily adapted to any model/dataset.
User: rahulvigneswaran
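The lottery-ticket procedure referenced above, iterative magnitude pruning with a rewind to the initial weights, can be sketched roughly as follows; the training step is a stub and all names are illustrative, not the repository's API:

```python
import numpy as np

rng = np.random.default_rng(0)

def train(weights, mask, steps=1):
    """Stand-in for real SGD training; returns 'trained' masked weights."""
    return (weights + 0.1 * rng.standard_normal(weights.shape)) * mask

def lottery_ticket(init_w, rounds=3, prune_frac=0.2):
    """Iteratively prune the smallest surviving weights, rewinding to init."""
    mask = np.ones_like(init_w)
    w = init_w.copy()
    for _ in range(rounds):
        w = train(w, mask)
        # Prune `prune_frac` of the weights that are still alive.
        alive = np.abs(w[mask == 1])
        threshold = np.quantile(alive, prune_frac)
        mask = np.where(np.abs(w) <= threshold, 0.0, mask)
        # Rewind: surviving weights go back to their initial values.
        w = init_w * mask
    return w, mask

init = rng.standard_normal((4, 4))
ticket, mask = lottery_ticket(init)
```

The hypothesis is that the surviving subnetwork (`ticket`), trained in isolation from its original initialization, matches the accuracy of the full network; with a real `train` this loop reproduces that experiment.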
pruning,A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Organization: sforaidl
Home Page: https://kd-lib.readthedocs.io/
pruning,Observations and notes to understand the workings of neural network models and other thought experiments using Tensorflow
User: shekkizh
pruning,YOLO model compression and multi-dataset training.
User: spurslipu
pruning,MobileNetV2-YOLOv5s pruning and distillation, with ncnn and TensorRT deployment support. Ultra-light but with better performance!
User: syencil
pruning,An implementation of network-slimming pruning for YOLOv3.
User: talebolano
pruning,A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization and pruning.
Organization: tensorflow
Home Page: https://www.tensorflow.org/model_optimization
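The TensorFlow Model Optimization Toolkit above schedules pruning gradually during training; the widely used polynomial schedule (Zhu & Gupta, 2017) ramps sparsity from an initial to a final value. A plain-Python sketch of that formula (illustrative, not the toolkit's implementation):

```python
def polynomial_sparsity(step, begin_step, end_step,
                        initial_sparsity=0.0, final_sparsity=0.9, power=3):
    """Target sparsity at `step`, ramping from initial to final sparsity."""
    if step <= begin_step:
        return initial_sparsity
    if step >= end_step:
        return final_sparsity
    progress = (step - begin_step) / (end_step - begin_step)
    # Cubic by default: sparsity rises fast early, then flattens out.
    return final_sparsity + (initial_sparsity - final_sparsity) * (1 - progress) ** power

targets = [polynomial_sparsity(s, 0, 100) for s in (0, 50, 100)]
```

The fast-then-flat shape prunes aggressively while the network can still recover, then slows down as the remaining weights become load-bearing.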
pruning,An automated model-structure analysis and modification toolkit for PyTorch models, including a model compression algorithm library that analyzes model structures automatically.
Organization: thu-mig
pruning,OTOv1-v3, NeurIPS, ICLR, TMLR, DNN Training, Compression, Structured Pruning, Erasing Operators, CNN, Diffusion, LLM
User: tianyic
pruning,[CVPR 2023] Towards Any Structural Pruning; LLMs / SAM / Diffusion / Transformers / YOLOv8 / CNNs
User: vainf
Home Page: https://arxiv.org/abs/2301.12900