Scattered or Connected? An Optimized Parameter-efficient
Type: Paper
Introduction
-
What is the name of the IAA paper?
Scattered or Connected? An Optimized Parameter-efficient Tuning Approach for Information Retrieval
-
What are the main contributions of the IAA paper?
- Shows that parameter-efficient methods for IR are not learning-efficient
- i.e., lower performance or slower convergence
- Gives theoretical reasons for the learning inefficiency
- Proposes an insertion method that smooths the loss landscape
Method
-
IAA bi-encoder results: lags full finetuning on many datasets
-
IAA cross-encoder results: slightly lags full finetuning
- IAA proposes to improve parameter-efficient methods by adding skip connections to improve gradient flow
- “aside” module
- bottleneck architecture
- What are the three IAA variants?
- different ways of inserting bottleneck layers
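The bottleneck-with-skip-connection idea above can be sketched as follows. This is a minimal NumPy illustration of a generic bottleneck adapter with a residual connection, not the paper's actual implementation; the dimensions, ReLU nonlinearity, and zero-initialized up-projection are assumptions for the sketch:

```python
import numpy as np

def bottleneck_adapter(x, W_down, b_down, W_up, b_up):
    """Bottleneck module: project down, apply a nonlinearity, project
    back up, then add the input via a skip connection so gradients
    can flow around the adapter."""
    h = np.maximum(0.0, x @ W_down + b_down)  # ReLU in the bottleneck
    return x + (h @ W_up + b_up)              # residual skip connection

rng = np.random.default_rng(0)
d_model, d_bottleneck = 8, 2                  # toy dimensions (assumed)
x = rng.standard_normal((4, d_model))

W_down = rng.standard_normal((d_model, d_bottleneck)) * 0.1
b_down = np.zeros(d_bottleneck)
# Zero-initializing the up-projection makes the adapter an identity
# function at the start of training (a common adapter trick).
W_up = np.zeros((d_bottleneck, d_model))
b_up = np.zeros(d_model)

out = bottleneck_adapter(x, W_down, b_down, W_up, b_up)
assert out.shape == x.shape
assert np.allclose(out, x)  # identity at initialization
```

Because of the skip connection, the gradient of the loss with respect to `x` always includes a direct identity path, which is the gradient-flow benefit the paper's insertion method targets.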
Results
-
IAA BN method bi-encoder results: gives a tiny improvement
-
IAA result summary
- matches or exceeds full finetuning performance
- faster convergence
- L adapter works best
- Doesn’t beat methods with more advanced hard negative mining and data augmentation
Conclusions
This paper presents some good theoretical analysis. It's clear that if such results were to hold for regimes beyond small encoder models on information retrieval datasets, they would have the potential to make PET vastly more competitive.
Reference
@inproceedings{Ma_2022,
  doi = {10.1145/3511808.3557445},
  url = {https://doi.org/10.1145/3511808.3557445},
  year = {2022},
  month = oct,
  publisher = {{ACM}},
  author = {Xinyu Ma and Jiafeng Guo and Ruqing Zhang and Yixing Fan and Xueqi Cheng},
  title = {Scattered or Connected? An Optimized Parameter-efficient Tuning Approach for Information Retrieval},
  booktitle = {Proceedings of the 31st {ACM} International Conference on Information and Knowledge Management}
}