Topic: transformers Goto Github
Something interesting about transformers
transformers,Run a GPT model in the browser with WebGPU. An implementation of GPT inference in under ~1500 lines of vanilla JavaScript.
User: 0hq
Home Page: https://kmeans.org
transformers,A collection of CVPR 2024 papers and open-source projects
User: amusi
transformers,Open-source offline translation library written in Python
User: argosopentech
Home Page: https://www.argosopentech.com
transformers,An easy-to-use LLM quantization package with user-friendly APIs, based on the GPTQ algorithm.
Organization: autogptq
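As a hedged aside to the entry above: GPTQ itself minimizes layer-wise reconstruction error, which is more sophisticated than anything shown here. The following sketch only illustrates the basic quantize/store/dequantize round trip that any weight-quantization scheme builds on, in plain Python:

```python
# Toy round-to-nearest weight quantization (NOT the GPTQ algorithm, which
# additionally minimizes layer-wise reconstruction error). Shows the basic
# quantize -> store small ints -> dequantize round trip.

def quantize(weights, bits=4):
    """Map floats to signed integers in [-(2**(bits-1)), 2**(bits-1) - 1]."""
    qmax = 2 ** (bits - 1) - 1
    scale = max(abs(w) for w in weights) / qmax or 1.0  # guard all-zero input
    q = [max(-qmax - 1, min(qmax, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.12, -0.5, 0.33, 0.02]
q, scale = quantize(weights)          # 4-bit integers plus one float scale
restored = dequantize(q, scale)       # approximate reconstruction
```

The reconstruction error of each weight is bounded by the scale, which is what real quantizers shrink by choosing scales (and, in GPTQ's case, update orders) more carefully.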
transformers,Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
User: bentrevett
transformers,RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
User: blinkdl
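The "trained like a GPT (parallelizable)" claim above rests on replacing softmax attention with a linear recurrence. A toy sketch (a simplification, not RWKV's actual WKV/time-mixing formula) of a decayed key-value sum that can be evaluated either step by step or as a prefix sum over all positions:

```python
# Toy linear-attention recurrence (NOT RWKV's actual formula). The state
# s_t = decay * s_{t-1} + k_t * v_t can be computed sequentially (RNN-style
# inference, constant-size state) or expanded as a weighted prefix sum
# (parallelizable across the sequence during training).

def recurrent(ks, vs, decay=0.9):
    s, out = 0.0, []
    for k, v in zip(ks, vs):
        s = decay * s + k * v          # constant-size state carried forward
        out.append(s)
    return out

def parallel(ks, vs, decay=0.9):
    # Same quantity written as sum_{j<=t} decay^(t-j) * k_j * v_j, which a
    # training framework can compute for every t at once.
    return [sum(decay ** (t - j) * ks[j] * vs[j] for j in range(t + 1))
            for t in range(len(ks))]

ks = [0.5, 1.0, 0.2]
vs = [2.0, -1.0, 3.0]
```

Both forms produce identical outputs, which is the property that lets the same model run as an RNN at inference time and as a parallel sequence model at training time.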
transformers,🧠💬 Articles I wrote about machine learning, archived from MachineCurve.com.
User: christianversloot
transformers,Chinese Language Understanding Evaluation Benchmark (CLUE): datasets, baselines, pre-trained models, corpus, and leaderboard
Organization: cluebenchmark
Home Page: http://www.CLUEbenchmarks.com
transformers,An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites
User: cmhungsteve
transformers,:mag: LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
Organization: deepset-ai
Home Page: https://haystack.deepset.ai
transformers,An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
Organization: eleutherai
Home Page: https://www.eleuther.ai
transformers,An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
Organization: eleutherai
transformers,Efficient fine-tuning of ChatGLM-6B with PEFT
User: hiyouga
transformers,Unify Efficient Fine-tuning of 100+ LLMs
User: hiyouga
transformers,Robust recipes to align language models with human and AI preferences
Organization: huggingface
Home Page: https://huggingface.co/HuggingFaceH4
transformers,🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
Organization: huggingface
Home Page: https://huggingface.co/docs/peft
transformers,💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Organization: huggingface
Home Page: https://huggingface.co/docs/tokenizers
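The tokenizers entry above refers to a fast Rust implementation; as a hedged illustration of the core idea behind its BPE models, here is a toy byte-pair-encoding merge step in plain Python (real trainers add vocabularies, byte fallback, and much more):

```python
# Toy BPE merge step (a sketch of the idea only; the library itself is a
# fast Rust implementation). Find the most frequent adjacent symbol pair,
# then fuse every occurrence of it into a single symbol.
from collections import Counter

def most_frequent_pair(tokens):
    pairs = Counter(zip(tokens, tokens[1:]))
    return max(pairs, key=pairs.get) if pairs else None

def merge_pair(tokens, pair):
    out, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            out.append(tokens[i] + tokens[i + 1])  # fuse into one symbol
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out

tokens = list("low lower lowest")
pair = most_frequent_pair(tokens)      # a frequent pair such as ('l', 'o')
tokens = merge_pair(tokens, pair)
```

Repeating this loop until a target vocabulary size is reached is, in essence, BPE training; the learned merge list is then replayed to tokenize new text.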
transformers,Fengshenbang-LM is an open-source system of large models led by the Cognitive Computing and Natural Language Research Center of the IDEA Research Institute, serving as infrastructure for Chinese AIGC and cognitive intelligence.
Organization: idea-ccnl
transformers,OpenChat: Advancing Open-source Language Models with Imperfect Data
User: imoneoi
Home Page: https://openchat.team
transformers,Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Baichuan, Mixtral, Gemma, etc.) on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max). A PyTorch LLM library that seamlessly integrates with HuggingFace, LangChain, LlamaIndex, DeepSpeed, vLLM, FastChat, ModelScope, etc.
Organization: intel-analytics
Home Page: https://ipex-llm.readthedocs.io
transformers,A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Organization: intellabs
Home Page: https://intellabs.github.io/nlp-architect
transformers,Seamlessly integrate LLMs into scikit-learn.
User: iryna-kondr
Home Page: https://beastbyte.ai/
transformers,BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
User: jessevig
transformers,State of the Art Natural Language Processing
Organization: johnsnowlabs
Home Page: https://sparknlp.org/
transformers,A machine learning software for extracting information from scholarly documents
User: kermitt2
Home Page: https://grobid.readthedocs.io
transformers,🧑🏫 60 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Organization: labmlai
Home Page: https://nn.labml.ai
transformers,🔥🔥🔥🔥 (an earlier YOLOv7, not the official one) YOLO with Transformers and Instance Segmentation, with TensorRT acceleration! 🔥🔥🔥
User: lucasjinreal
transformers,Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
User: lucidrains
transformers,Simple command line tool for text to image generation using OpenAI's CLIP and Siren (Implicit neural representation network). Technique was originally created by https://twitter.com/advadnoun
User: lucidrains
transformers,Implementation of MusicLM, Google's new SOTA model for music generation using attention networks, in Pytorch
User: lucidrains
transformers,Implementation of RLHF (Reinforcement Learning with Human Feedback) on top of the PaLM architecture. Basically ChatGPT but with PaLM
User: lucidrains
transformers,Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
User: lucidrains
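The ViT entry above turns an image into a token sequence before any attention is applied. A minimal sketch of that front end (the repo's models then linearly project these patches and prepend a class token, none of which is shown here):

```python
# Hedged sketch of the ViT front end: split an image into non-overlapping
# patches and flatten each one into a vector. Real implementations follow
# this with a learned linear projection and positional embeddings.

def image_to_patches(image, patch):
    """image: H x W list of lists; returns one flattened vector per patch."""
    h, w = len(image), len(image[0])
    assert h % patch == 0 and w % patch == 0, "dims must divide evenly"
    patches = []
    for top in range(0, h, patch):
        for left in range(0, w, patch):
            flat = [image[top + i][left + j]
                    for i in range(patch) for j in range(patch)]
            patches.append(flat)
    return patches

img = [[r * 4 + c for c in range(4)] for r in range(4)]  # 4x4 "image"
patches = image_to_patches(img, 2)   # four 2x2 patches, each flattened
```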
transformers,A simple but complete full-attention transformer with a set of promising experimental features from various papers
User: lucidrains
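For the "full-attention transformer" entry above, the mechanism everything else is layered on is scaled dot-product attention. A minimal plain-Python sketch of it (an illustration of the mechanism, not the repo's actual implementation):

```python
# Minimal scaled dot-product attention: each query scores every key,
# the scores become softmax weights, and the output is the weighted
# average of the values.
import math

def softmax(xs):
    m = max(xs)                       # subtract max for numerical stability
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    """Q, K, V: lists of equal-length vectors. Returns one output per query."""
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
result = attention(Q, K, V)   # the query matches the first key more strongly
```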
transformers,Leveraging BERT and c-TF-IDF to create easily interpretable topics.
User: maartengr
Home Page: https://maartengr.github.io/BERTopic/
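The c-TF-IDF mentioned in the BERTopic entry above concatenates all documents in a topic into one "class document" and reweights term frequencies by how class-specific each term is. A small sketch assuming the commonly cited form W(t, c) = tf(t, c) * log(1 + A / f(t)), with A the average number of words per class and f(t) the term's total frequency:

```python
# Sketch of class-based TF-IDF (c-TF-IDF) as used in BERTopic-style topic
# modeling. Assumes the weighting W(t, c) = tf(t, c) * log(1 + A / f(t));
# the library's exact implementation may differ in details.
import math
from collections import Counter

def c_tf_idf(classes):
    """classes: dict mapping class name -> list of tokens (all docs joined)."""
    counts = {c: Counter(toks) for c, toks in classes.items()}
    total = Counter()                       # term frequency across all classes
    for cnt in counts.values():
        total.update(cnt)
    avg_words = sum(len(t) for t in classes.values()) / len(classes)
    return {c: {t: tf * math.log(1 + avg_words / total[t])
                for t, tf in cnt.items()}
            for c, cnt in counts.items()}

topics = {
    "sports": "goal match team goal".split(),
    "tech": "gpu model team model".split(),
}
weights = c_tf_idf(topics)
# "goal" occurs only in sports, so it outweighs the shared word "team" there
```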
transformers,Unified embedding generation and search engine. Also available on cloud - cloud.marqo.ai
Organization: marqo-ai
Home Page: https://www.marqo.ai/
transformers,18 lessons to get started building with Generative AI 🔗 https://microsoft.github.io/generative-ai-for-beginners/
Organization: microsoft
Home Page: https://microsoft.github.io/generative-ai-for-beginners/
transformers,Context aware, pluggable and customizable data protection and de-identification SDK for text and images
Organization: microsoft
Home Page: https://microsoft.github.io/presidio
transformers,💡 All-in-one open-source embeddings database for semantic search, LLM orchestration and language model workflows
Organization: neuml
Home Page: https://neuml.github.io/txtai
transformers,This repository contains demos I made with the Transformers library by HuggingFace.
User: nielsrogge
transformers,Chinese version of CLIP which achieves Chinese cross-modal retrieval and representation generation.
Organization: ofa-sys
transformers,OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
Organization: openvinotoolkit
Home Page: https://docs.openvino.ai
transformers,👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting wide-range of NLP tasks from research to industrial applications, including 🗂Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis etc.
Organization: paddlepaddle
Home Page: https://paddlenlp.readthedocs.io
transformers,Prompt Engineering | Prompt Versioning | Use GPT or other prompt-based models to get structured output. Join our Discord for Prompt Engineering, LLMs, and other latest research
Organization: promptslab
Home Page: https://discord.gg/m88xfYMbK6
transformers,This repository is a curated collection of links to various courses and resources about Artificial Intelligence (AI)
User: skalskip
transformers,A PyTorch-based Speech Toolkit
Organization: speechbrain
Home Page: http://speechbrain.github.io
transformers,Machine Learning Engineering Open Book
User: stas00
Home Page: https://stasosphere.com/machine-learning/
transformers,🔮 SuperDuperDB: Bring AI to your database! Build, deploy and manage any AI application directly with your existing data infrastructure, without moving your data. Including streaming inference, scalable model training and vector search.
Organization: superduperdb
Home Page: https://superduperdb.com
transformers,Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
User: thilinarajapakse
Home Page: https://simpletransformers.ai/
transformers,Prompt Engineering, Generative AI, and LLM Guide by Learn Prompting | Join our discord for the largest Prompt Engineering learning community
User: trigaten
Home Page: https://learnprompting.org
transformers,State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
User: xenova
Home Page: https://huggingface.co/docs/transformers.js