Natural Language Processing

Surgery on an Attentional Neural Network

Customising an LSTM model to better understand Attention in sequence-to-sequence text prediction

Explaining the concept of ‘Attention’ in natural language processing models by removing part of the memory function of a recurrent neural network encoder-decoder