| Title | Citations | PageRank | Year |
|---|---|---|---|
| bert2BERT: Towards Reusable Pretrained Language Models | 0 | 0.34 | 2022 |
| Improving task-agnostic BERT distillation with layer mapping search | 0 | 0.34 | 2021 |
| DynaBERT: Dynamic BERT with Adaptive Width and Depth | 0 | 0.34 | 2020 |
| HyperText: Endowing FastText with Hyperbolic Geometry | 0 | 0.34 | 2020 |