Attention
On the Weak Link Between Importance and Prunability of Attention Heads
On the Importance of Local Information in Transformer-Based Models
Towards Transparent and Explainable Attention Models
Attend, Adapt and Transfer: Attentive Deep Architecture for Adaptive Transfer from Multiple Sources