We implemented QANet from scratch and improved baseline BiDAF. We also used an ensemble of BiDAF and QANet models to achieve EM/F1 of 69.47/71.96, ranking #3 on the leaderboard as of Mar 4, 2022.
An open-source implementation of the paper "A Structured Self-Attentive Sentence Embedding" published by IBM and MILA.
A reimplementation of the Transformer model introduced in the 2017 NeurIPS paper "Attention Is All You Need".
Implementations of different object detection neural networks for detecting pedestrians.
Learning parities with various neural network architectures
PyTorch re-implementation of the RNATracker model.
Implementations of a variety of Transformer/MSA-based architectures.
A collection of my Jupyter notebooks, showcasing my exploration and learning journey in the field of Computer Vision
Promoter detection of DNA sequences using Transformers from scratch and DNABERT
Deep Interpretable Mortality Model for ICU Risk Prediction
NER for Chinese electronic medical records, using doc2vec, self_attention, and multi_attention.
Focal Modulation Network Implementation
Example of stock price forecasting for the S&P 500 index (PyTorch, data crawling with Selenium).
Simple character-level Transformer.
A machine comprehension project for my M.S. degree experiment.
Public repository of our 1st place work at the SIMAH competition held at ECML-PKDD 2019
A collection of the attention modules proposed to date.
Here we try to understand how the Transformer works and replicate the architecture from the published paper. We also train a simple model on a dummy dataset.
A Transformer model for text summarization, trained on news articles to produce concise summaries.
Investigating the effect of brain-inspired features on CNNs (for the task of image classification).
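Most of the projects listed above build on scaled dot-product self-attention as described in "Attention Is All You Need". As a point of reference, here is a minimal single-head sketch in PyTorch; it is a hypothetical illustration, not code taken from any of the repositories above.

import math
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """Single-head self-attention over a sequence of token embeddings."""

    def __init__(self, embed_dim: int):
        super().__init__()
        # Learned projections for queries, keys, and values.
        self.q_proj = nn.Linear(embed_dim, embed_dim)
        self.k_proj = nn.Linear(embed_dim, embed_dim)
        self.v_proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Attention scores scaled by sqrt(d), as in the original paper.
        scores = q @ k.transpose(-2, -1) / math.sqrt(x.size(-1))
        weights = torch.softmax(scores, dim=-1)
        # Each output position is a weighted sum of all value vectors.
        return weights @ v

if __name__ == "__main__":
    attn = SelfAttention(embed_dim=32)
    out = attn(torch.randn(2, 10, 32))  # (batch=2, seq_len=10, dim=32)
    print(out.shape)                    # torch.Size([2, 10, 32])

Multi-head variants, masking, and positional encodings (used in several of the repositories above) extend this same building block.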