Machine Translation on WMT2014 English-German
Model | Year | BLEU score | Training compute
---|---|---|---
Transformer Cycle (Rev) | 2021 | 35.14 | -
Noisy back-translation | 2018 | 35.00 | 146 EFLOPs
Transformer+Rep (Uni) | 2021 | 33.89 | -
T5-11B | 2019 | 32.10 | -
BiBERT | 2021 | 31.26 | 1.02 ZFLOPs
Transformer + R-Drop | 2021 | 30.91 | -
BERT-fused NMT | 2020 | 30.75 | 114 EFLOPs
Data Diversification - Transformer | 2019 | 30.70 | -
Mask Attention Network (big) | 2021 | 30.40 | -
Transformer (ADMIN init) | 2020 | 30.10 | -
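
The scores above are corpus-level BLEU on the newstest2014 English-German test set. As a rough illustration of how such a score is computed, here is a minimal sketch using the sacrebleu package with made-up sentences; note that published results often rely on different tokenization and evaluation scripts, so BLEU values from different papers are not always directly comparable.

```python
import sacrebleu

# Hypothetical detokenized system outputs (German), one per source sentence.
hypotheses = [
    "Der Hund rannte über die Straße.",
    "Das Modell wurde auf WMT2014 trainiert.",
]

# One reference stream per reference set; newstest2014 provides a single reference.
references = [[
    "Der Hund lief über die Straße.",
    "Das Modell wurde auf WMT2014 trainiert.",
]]

# Corpus-level BLEU over all sentences (not an average of sentence scores).
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```

For an end-to-end evaluation on the official test set, the sacrebleu command line (`sacrebleu -t wmt14 -l en-de < system_outputs.de`, where `system_outputs.de` is a hypothetical file of detokenized translations) downloads the newstest2014 references and reports the corpus BLEU directly.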