Topic: self-attention
Something interesting about self-attention
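Before the repository list, here is a minimal NumPy sketch of the core operation all of these projects build on, scaled dot-product self-attention. Function names, shapes, and weight matrices are illustrative assumptions, not taken from any listed repository:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (n, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n, n) pairwise attention logits
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # (n, d_v) context vectors
```

Every token attends to every other token, which is what gives self-attention its quadratic cost in sequence length, a cost several repositories below (e.g. SOFT, Informer) try to reduce.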
self-attention,Awesome Transformers (self-attention) in Computer Vision
User: alohays
self-attention,Representation learning on dynamic graphs using self-attention networks
User: aravindsankar28
self-attention,DSMIL: Dual-stream multiple instance learning networks for tumor detection in Whole Slide Image
User: binli123
self-attention,Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
User: brightmart
self-attention,[AAAI2023] A PyTorch implementation of PDFormer: Propagation Delay-aware Dynamic Long-range Transformer for Traffic Flow Prediction.
Organization: buaabigscity
self-attention,Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks. (framework-agnostic)
User: cbaziotis
self-attention,An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites
User: cmhungsteve
self-attention,Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (Pytorch and Tensorflow)
User: daiquocnguyen
self-attention,"Deep Learning Tutorial by Prof. Hung-yi Lee" (recommended by Prof. Lee 👍, the "Apple Book" 🍎). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
Organization: datawhalechina
self-attention,Pytorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
User: diego999
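Several entries in this list (diego999, gordicaleksa, petarv-) implement the Graph Attention Network from the paper linked above. A minimal NumPy sketch of one GAT layer follows; the function signature and variable names are illustrative assumptions, not code from those repositories:

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_layer(H, A, W, a):
    """One graph-attention layer (Veličković et al., 2017, sketch).
    H: (n, f) node features; A: (n, n) adjacency with self-loops included;
    W: (f, f') projection; a: (2*f',) attention vector."""
    Z = H @ W                                    # (n, f') projected node features
    f = Z.shape[1]
    src, dst = Z @ a[:f], Z @ a[f:]              # split a = [a_src || a_dst]
    e = leaky_relu(src[:, None] + dst[None, :])  # e[i,j] = LeakyReLU(a^T [z_i || z_j])
    e = np.where(A > 0, e, -np.inf)              # attend only over graph neighbours
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)    # softmax over each node's neighbours
    return alpha @ Z                             # (n, f') attended features
```

Unlike sequence self-attention, the softmax here is masked by the adjacency matrix, so each node aggregates only from its graph neighbours.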
self-attention,Recent Transformer-based CV and related works.
User: dirtyharrylyl
self-attention,[TNSRE 2021] "An Attention-based Deep Learning Approach for Sleep Stage Classification with Single-Channel EEG"
User: emadeldeen24
Home Page: https://ieeexplore.ieee.org/document/9417097
self-attention,Tensorflow implementation of "A Structured Self-Attentive Sentence Embedding"
User: flrngel
self-attention,[NeurIPS 2021 Spotlight] & [IJCV 2024] SOFT: Softmax-free Transformer with Linear Complexity
Organization: fudan-zvg
self-attention,Datasets, tools, and benchmarks for representation learning of code.
Organization: github
Home Page: https://arxiv.org/abs/1909.09436
self-attention,Multi-turn dialogue baselines written in PyTorch
User: gmftbygmftby
self-attention,My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
User: gordicaleksa
Home Page: https://youtube.com/c/TheAIEpiphany
self-attention,Variants of Vision Transformer and its downstream tasks
User: guanrunwei
self-attention,A Pytorch Implementation of "Attention is All You Need" and "Weighted Transformer Network for Machine Translation"
User: jayparks
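The "Attention is All You Need" implementations above center on multi-head attention, which runs several attention operations in parallel over split feature subspaces. A hedged NumPy sketch of the idea (shapes and names are my own assumptions, not the repository's API):

```python
import numpy as np

def multi_head_attention(X, Wq, Wk, Wv, Wo, n_heads):
    """Multi-head scaled dot-product attention (sketch of the Transformer layer).
    X: (n, d_model); Wq/Wk/Wv/Wo: (d_model, d_model); n_heads must divide d_model."""
    n, d_model = X.shape
    d_head = d_model // n_heads

    def split(M):  # (n, d_model) -> (n_heads, n, d_head)
        return M.reshape(n, n_heads, d_head).transpose(1, 0, 2)

    Q, K, V = split(X @ Wq), split(X @ Wk), split(X @ Wv)
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)  # (h, n, n) per-head logits
    scores -= scores.max(axis=-1, keepdims=True)         # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax per query row
    heads = weights @ V                                  # (h, n, d_head)
    concat = heads.transpose(1, 0, 2).reshape(n, d_model)  # re-concatenate heads
    return concat @ Wo                                   # final output projection
```

Each head can learn a different attention pattern; the output projection `Wo` mixes them back into a single representation.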
self-attention,[NeurIPS'22] Tokenized Graph Transformer (TokenGT), in PyTorch
User: jw9730
self-attention,A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
User: kaituoxu
self-attention,A Structured Self-attentive Sentence Embedding
User: kaushalshetty
self-attention,PyTorch Implementation of Google's Parallel Tacotron 2: A Non-Autoregressive Neural TTS Model with Differentiable Duration Modeling
User: keonlee9420
self-attention,Important paper implementations for Question Answering using PyTorch
User: kushalj001
self-attention,An implementation of DeepMind's Relational Recurrent Neural Networks (NeurIPS 2018) in PyTorch.
User: l0sg
self-attention,Implementing Lambda Networks using Pytorch
User: leaderj1001
self-attention,Implementing Stand-Alone Self-Attention in Vision Models using Pytorch
User: leaderj1001
self-attention,TransformerCPI: Improving compound–protein interaction prediction by sequence-based deep learning with self-attention mechanism and label reversal experiments (Bioinformatics 2020) https://doi.org/10.1093/bioinformatics/btaa524
User: lifanchen-simm
self-attention,The implementation of DeBERTa
Organization: microsoft
self-attention,This repository contains various attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras
User: monk1337
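The Bahdanau (additive) attention mentioned in the entry above predates the dot-product form: it scores a query against each key with a small feed-forward network instead of a dot product. A minimal NumPy sketch, with all names and shapes as illustrative assumptions:

```python
import numpy as np

def additive_attention(query, keys, W1, W2, v):
    """Bahdanau-style additive attention (sketch).
    query: (d_q,) decoder state; keys: (n, d_k) encoder states;
    W1: (d_a, d_q), W2: (d_a, d_k), v: (d_a,) learned parameters."""
    # score_j = v^T tanh(W1 q + W2 k_j), broadcast over the n keys
    scores = np.tanh(W1 @ query + keys @ W2.T) @ v  # (n,) alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                        # softmax over source positions
    context = weights @ keys                        # (d_k,) weighted sum of keys
    return context, weights
```

The tanh-MLP scoring makes additive attention usable when query and key dimensions differ, at the cost of extra parameters compared with scaled dot-product attention.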
self-attention,[ECCV 2024] Official PyTorch implementation of RoPE-ViT "Rotary Position Embedding for Vision Transformer"
Organization: naver-ai
Home Page: https://arxiv.org/abs/2403.13298
self-attention,Official PyTorch implementation of Fully Attentional Networks
Organization: nvlabs
Home Page: https://arxiv.org/abs/2204.12451
self-attention,[ICLR 2024] Official PyTorch implementation of FasterViT: Fast Vision Transformers with Hierarchical Attention
Organization: nvlabs
Home Page: https://arxiv.org/abs/2306.06189
self-attention,[ICML 2023] Official PyTorch implementation of Global Context Vision Transformers
Organization: nvlabs
Home Page: https://arxiv.org/abs/2206.09959
self-attention,Official PyTorch Implementation of MambaVision: A Hybrid Mamba-Transformer Vision Backbone
Organization: nvlabs
Home Page: https://arxiv.org/abs/2407.08083
self-attention,Graph Attention Networks (https://arxiv.org/abs/1710.10903)
User: petarv-
Home Page: https://petar-v.com/GAT/
self-attention,Text classification using deep learning models in Pytorch
User: prakashpandey9
self-attention,A list of efficient attention modules
User: separius
self-attention,2018 Baidu Machine Reading Comprehension Competition
User: shiningliang
self-attention,CCNet: Criss-Cross Attention for Semantic Segmentation (TPAMI 2020 & ICCV 2019).
User: speedinghzl
self-attention,Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Organization: the-ai-summer
Home Page: https://theaisummer.com/
self-attention,Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
User: tixierae
Home Page: http://arxiv.org/abs/1808.09772
self-attention,PyTorch implementation of the model presented in "Satellite Image Time Series Classification with Pixel-Set Encoders and Temporal Self-Attention"
User: vsainteuf
self-attention,PyTorch implementation of U-TAE and PaPs for satellite image time series panoptic segmentation.
User: vsainteuf
self-attention,[MIR-2023-Survey] A continuously updated paper list for multi-modal pre-trained big models
User: wangxiao5791509
self-attention,The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series". A fast and state-of-the-art (SOTA) deep-learning neural network model for efficient time-series imputation (impute multivariate incomplete time series containing NaN missing data/values with machine learning). https://arxiv.org/abs/2202.08516
User: wenjiedu
Home Page: https://doi.org/10.1016/j.eswa.2023.119619
self-attention,Code for the paper "MASTER: Multi-Aspect Non-local Network for Scene Text Recognition" (Pattern Recognition 2021)
User: wenwenyu
Home Page: https://arxiv.org/abs/1910.02562
self-attention,A PyTorch implementation of ViTGAN based on paper ViTGAN: Training GANs with Vision Transformers.
User: wilile26811249
self-attention,(ICLR 2022 Spotlight) Official PyTorch implementation of "How Do Vision Transformers Work?"
User: xxxnell
Home Page: https://arxiv.org/abs/2202.06709
self-attention,The GitHub repository for the paper "Informer" accepted by AAAI 2021.
User: zhouhaoyi