Vincent, B. T. (2015) Bayesian accounts of covert selective attention: a tutorial review
Updated Mar 11, 2015 - MATLAB
Deep Learning and Machine Learning mini-projects. Current Project: Deepmind Attentive Reader (rc-data)
An implementation of the Show, Attend and Tell paper in Tensorflow, for the OpenAI Im2LaTeX suggested problem
Exploration of various deep neural networks for Question Answering and Reading Comprehension
"Recurrent Models of Visual Attention" in TensorFlow
Four styles of encoder-decoder models in Python, using Theano, Keras, and Seq2Seq
Implementation of the AoA model from the paper "Attention-over-Attention Neural Networks for Reading Comprehension"
Hierarchical Attention Networks for Chinese Sentiment Classification
Bidirectional GRU with an attention mechanism on the IMDB sentiment analysis dataset
Residual Attention Network for Image Classification
Plug-and-Play version of Attention implemented in Tensorflow
Sequence-to-sequence model with attention, built from scratch in TensorFlow
PyTorch Implementation of Japanese Chatbot
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
A TensorFlow Implementation of the Transformer: Attention Is All You Need
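The common core of many of the repositories above, from seq2seq-with-attention to the Transformer, is an attention function that scores queries against keys and returns a weighted sum of values. As a minimal sketch (not taken from any of the listed repos), here is scaled dot-product attention, the variant defined in "Attention Is All You Need", in plain NumPy; the function name and shapes are illustrative assumptions:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention (illustrative sketch).

    Q: (n_q, d_k) queries, K: (n_k, d_k) keys, V: (n_k, d_v) values.
    Returns the (n_q, d_v) attended values and the (n_q, n_k) attention weights.
    """
    d_k = Q.shape[-1]
    # Similarity of each query to each key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over keys (shift by the row max for numerical stability)
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output row is a convex combination of the value rows
    return weights @ V, weights
```

Each row of `weights` sums to 1, so the output is a weighted average of the value vectors, with queries that match a key more strongly pulling the output toward that key's value.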