Abstract
---
In the last year, new models and methods for pretraining and transfer learning have driven striking performance improvements across a range of language understanding tasks. The GLUE benchmark, introduced a little over one year ago, offers a single-number metric that summarizes progress on a diverse set of such tasks, but performance on the benchmark has recently surpassed the level of non-expert humans, suggesting limited headroom for further research. In this paper we present SuperGLUE, a new benchmark styled after GLUE with a new set of more difficult language understanding tasks, a software toolkit, and a public leaderboard. SuperGLUE is available at super.gluebenchmark.com.
Year | Venue | Keywords
---|---|---|
2019 | Advances in Neural Information Processing Systems 32 (NeurIPS 2019) | transfer learning, language understanding
Field | DocType | Volume
---|---|---|
Software engineering, Computer science, Transfer of learning, Software, Artificial intelligence, Natural language processing, Language understanding, General-purpose language | Journal | 32
ISSN | Citations | PageRank
---|---|---|
1049-5258 | 7 | 0.44
References | Authors
---|---|
0 | 8
Name | Order | Citations | PageRank
---|---|---|---|
Alex Wang | 1 | 71 | 5.27 |
Yada Pruksachatkun | 2 | 8 | 1.81 |
Nikita Nangia | 3 | 132 | 6.56 |
Amanpreet Singh | 4 | 109 | 8.34 |
Julian Michael | 5 | 78 | 5.08
Felix Hill | 6 | 346 | 17.90 |
Omer Levy | 7 | 1387 | 56.96 |
Samuel R. Bowman | 8 | 906 | 44.99 |