ATTENTION
Publications
On the weak link between importance and prunability of attention heads
Aakriti Budhraja, Madhura Pande, Preksha Nema, et al.
In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics.
On the Importance of Local Information in Transformer Based Models
Madhura Pande, Aakriti Budhraja, Preksha Nema, et al.
arXiv preprint arXiv:2008.05828
Towards Transparent and Explainable Attention Models
Akash Kumar Mohankumar, Preksha Nema, Sharan Narasimhan, et al.
In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Association for Computational Linguistics.
Attend, Adapt and Transfer: Attentive Deep Architecture for Adaptive Transfer from Multiple Sources
Janarthanan Rajendran, Aravind Lakshminarayanan, Mitesh M. Khapra, et al.
arXiv preprint arXiv:1510.02879