wmt-2020-pl-en
Translate from Polish to English. [ver. 1.0.0]
This is the full list of all submissions; if you want to see only the best ones, open the leaderboard.
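The dev-0 BLEU and test-A BLEU columns are corpus-level BLEU scores of each submission's output against the reference translations. Below is a minimal sketch of how a dev-0 score could be checked locally with sacrebleu, assuming the usual Gonito file layout (dev-0/out.tsv for system output, dev-0/expected.tsv for references); the challenge is scored by its own evaluator, so the numbers may differ slightly from the table.

```python
# Rough local check of a dev-0 BLEU score with sacrebleu.
# Assumes the standard Gonito layout: dev-0/out.tsv (system output)
# and dev-0/expected.tsv (references), one sentence per line.
import sacrebleu

with open("dev-0/out.tsv", encoding="utf-8") as f:
    hypotheses = [line.rstrip("\n") for line in f]
with open("dev-0/expected.tsv", encoding="utf-8") as f:
    references = [line.rstrip("\n") for line in f]

bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"dev-0 BLEU: {bleu.score:.2f}")
```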
| # | submitter | when | ver. | description | dev-0 BLEU | test-A BLEU | |
|---|---|---|---|---|---|---|---|
| 253 | Marcin Kostrzewski | 2023-06-14 22:17 | 1.0.0 | s444409 simple lstm model lstm | 0.00 | 0.00 | |
| 214 | s444465 | 2023-06-14 20:50 | 1.0.0 | 444465 simple lstm model | 0.28 | 0.33 | |
| 191 | Mikołaj Pokrywka | 2023-06-04 15:47 | 1.0.0 | wip | 99.95 | 2.74 | |
| 273 | [anonymized] | 2021-03-13 16:37 | 1.0.0 | my brilliant solution fairseq | N/A | N/A | |
| 272 | [anonymized] | 2021-03-13 15:46 | 1.0.0 | my brilliant solution moses postprocessing | N/A | N/A | |
| 271 | [anonymized] | 2021-03-13 15:29 | 1.0.0 | my brilliant solution moses | N/A | N/A | |
| 170 | [anonymized] | 2021-02-23 18:38 | 1.0.0 | solution moses | 8.02 | 7.22 | |
| 17 | [anonymized] | 2021-02-23 18:30 | 1.0.0 | solution stupid | 33.62 | 33.21 | |
| 55 | kubapok | 2021-02-23 11:14 | 1.0.0 | lightconv_spe | 19.00 | 18.37 | |
| 56 | kubapok | 2021-02-23 10:26 | 1.0.0 | lightconv_bpe_10k | 18.75 | 18.33 | |
| 5 | [anonymized] | 2021-02-20 14:40 | 1.0.0 | with usage of google_trans_new stupid | 33.65 | 33.27 | |
| 105 | [anonymized] | 2021-02-18 20:24 | 1.0.0 | final postprocessing moses postprocessing | 7.45 | 10.30 | |
| 153 | [anonymized] | 2021-02-18 18:57 | 1.0.0 | pp | 7.45 | 8.39 | |
| 64 | [anonymized] | 2021-02-18 13:13 | 1.0.0 | fix moses postprocessing | 14.86 | 15.34 | |
| 270 | [anonymized] | 2021-02-18 10:47 | 1.0.0 | add sourcefile (postprocessing with googletrans and language-tool) moses postprocessing | 14.32 | N/A | |
| 61 | [anonymized] | 2021-02-18 03:39 | 1.0.0 | add source code moses unks | 15.79 | 16.36 | |
| 184 | [anonymized] | 2021-02-17 23:58 | 1.0.0 | first try | 7.20 | 6.18 | |
| 152 | [anonymized] | 2021-02-17 17:04 | 1.0.0 | added time sleep | 7.45 | 8.39 | |
| 151 | [anonymized] | 2021-02-17 17:03 | 1.0.0 | Merge branch 'hg_postprocessing_pl' of ssh://gonito.net/galazkah/wmt-2020-pl-en into hg_postprocessing_pl | 7.45 | 8.39 | |
| 150 | [anonymized] | 2021-02-17 17:00 | 1.0.0 | added postprocessing but google api timed out; will fix the output when it resets | 7.45 | 8.39 | |
| 43 | [anonymized] | 2021-02-17 09:51 | 1.0.0 | Translate done using M2M-100 model fairseq m2m-100 just-inference | 29.50 | 29.45 | |
| 91 | [anonymized] | 2021-02-16 19:24 | 1.0.0 | added simple post processing moses postprocessing | 10.64 | 11.03 | |
| 100 | [anonymized] | 2021-02-16 17:03 | 1.0.0 | moses 2nd solution moses | 10.22 | 10.64 | |
| 252 | [anonymized] | 2021-02-16 12:46 | 1.0.0 | fairseq first training, 14 epochs, max sentence length = 80, 10k bpe units | 0.00 | 0.00 | |
| 220 | [anonymized] | 2021-02-11 18:10 | 1.0.0 | GRU attention with bpe bpe pytorch-nn attention | N/A | 0.13 | |
| 4 | [anonymized] | 2021-02-10 12:23 | 1.0.0 | basic stupid | 33.51 | 33.31 | |
| 57 | kubapok | 2021-02-10 07:29 | 1.0.0 | fairseq transformer bpe10k epoch18 | 16.99 | 16.96 | |
| 211 | [anonymized] | 2021-02-10 06:35 | 1.0.0 | fairseq-short | 0.47 | 0.56 | |
| 209 | [anonymized] | 2021-02-10 06:28 | 1.0.0 | fairseq-long BPE=5000 train-size=100000 | 0.63 | 0.75 | |
| 198 | [anonymized] | 2021-02-09 22:21 | 1.0.0 | Add 34 pytorch-nn fairseq | 1.65 | 2.37 | |
| 204 | [anonymized] | 2021-02-09 22:09 | 1.0.0 | Add bpe bpe pytorch-nn gru | 1.65 | 1.62 | |
| 199 | [anonymized] | 2021-02-09 21:59 | 1.0.0 | better solution fairseq | 2.44 | 2.37 | |
| 62 | kubapok | 2021-02-09 21:51 | 1.0.0 | fairseq bpe 10k beam 50 | 15.52 | 15.90 | |
| 63 | kubapok | 2021-02-09 21:22 | 1.0.0 | fairseq bpe 10k | 15.27 | 15.48 | |
| 207 | [anonymized] | 2021-02-09 13:03 | 1.0.0 | lr 0.2, drop 0.2, input 100k, epochs 3 fairseq | 0.75 | 1.07 | |
| 206 | [anonymized] | 2021-02-09 12:50 | 1.0.0 | fairseq v2 | 0.75 | 1.07 | |
| 251 | [anonymized] | 2021-02-09 11:18 | 1.0.0 | fairseq v1 | 0.00 | 0.00 | |
| 250 | [anonymized] | 2021-02-08 21:32 | 1.0.0 | Basic GRU with attention, longer training bpe pytorch-nn attention | N/A | 0.00 | |
| 249 | [anonymized] | 2021-02-08 21:06 | 1.0.0 | Basic GRU with attention, longer training bpe pytorch-nn attention | N/A | 0.00 | |
| 248 | [anonymized] | 2021-02-08 20:39 | 1.0.0 | Basic GRU with attention bpe pytorch-nn attention | N/A | 0.00 | |
| 247 | [anonymized] | 2021-02-08 16:05 | 1.0.0 | Very simple GRU Attention, updated bpe pytorch-nn attention | N/A | 0.00 | |
| 246 | [anonymized] | 2021-02-08 14:32 | 1.0.0 | Very simple GRU Attention, updated bpe pytorch-nn attention | N/A | 0.00 | |
| 245 | [anonymized] | 2021-02-08 13:33 | 1.0.0 | Very simple GRU Attention bpe pytorch-nn attention | N/A | 0.00 | |
| 210 | [anonymized] | 2021-02-07 19:41 | 1.0.0 | fairseq lr=0.01 lines=200k epochs=2 fairseq | 0.49 | 0.61 | |
| 197 | [anonymized] | 2021-02-07 17:12 | 1.0.0 | lstm lr 0.005 drop 0.35 | N/A | 2.46 | |
| 244 | [anonymized] | 2021-02-07 16:22 | 1.0.0 | LSTM, lr=0.4 | N/A | 0.00 | |
| 243 | [anonymized] | 2021-02-07 15:56 | 1.0.0 | lstm, lr 0.45 | N/A | 0.00 | |
| 200 | [anonymized] | 2021-02-07 14:14 | 1.0.0 | Add fairseq long version | 1.75 | 2.08 | |
| 208 | [anonymized] | 2021-02-07 14:13 | 1.0.0 | Add fairseq short version | 0.63 | 0.75 | |
| 242 | [anonymized] | 2021-02-06 23:59 | 1.0.0 | fairseq v1 | N/A | 0.00 | |
| 218 | [anonymized] | 2021-02-05 15:49 | 1.0.0 | test lstm attention bpe 6 bpe pytorch-nn attention | 0.28 | 0.20 | |
| 241 | [anonymized] | 2021-02-05 00:37 | 1.0.0 | fsd | 0.19 | 0.00 | |
| 240 | [anonymized] | 2021-02-04 22:08 | 1.0.0 | try please | 0.09 | 0.00 | |
| 239 | [anonymized] | 2021-02-04 20:55 | 1.0.0 | try2 | 0.19 | 0.00 | |
| 238 | [anonymized] | 2021-02-04 18:23 | 1.0.0 | try? | 0.17 | 0.00 | |
| 222 | [anonymized] | 2021-02-04 16:08 | 1.0.0 | pl-en-gru | 0.19 | 0.13 | |
| 221 | [anonymized] | 2021-02-03 21:25 | 1.0.0 | pl-en-gru bpe pytorch-nn attention | 0.00 | 0.13 | |
| 219 | [anonymized] | 2021-02-03 13:59 | 1.0.0 | my brilliant solution-11 bpe pytorch-nn attention | 0.00 | 0.17 | |
| 237 | [anonymized] | 2021-02-03 13:43 | 1.0.0 | test lstm attention bpe test 1 | 0.00 | 0.00 | |
| 236 | [anonymized] | 2021-02-03 09:58 | 1.0.0 | gru attention test2 bpe 2 | 0.10 | 0.00 | |
| 235 | [anonymized] | 2021-02-03 08:54 | 1.0.0 | gru attention bpe test | 0.00 | 0.00 | |
| 224 | [anonymized] | 2021-02-03 08:26 | 1.0.0 | final pytorch-nn gru attention | 0.09 | 0.09 | |
| 203 | [anonymized] | 2021-02-03 02:24 | 1.0.0 | task 1 correction bpe pytorch-nn attention | 1.65 | 1.62 | |
| 234 | [anonymized] | 2021-02-03 02:14 | 1.0.0 | task 1 | 0.00 | 0.00 | |
| 226 | [anonymized] | 2021-02-03 01:54 | 1.0.0 | my not so brilliant solution | N/A | 0.00 | |
| 201 | [anonymized] | 2021-02-02 21:54 | 1.0.0 | solution bpe pytorch-nn attention | 1.65 | 1.72 | |
| 202 | p/tlen | 2021-02-02 20:45 | 1.0.0 | BiLSTM encoder-GRU decoder epochs=10 hidden-size=256 max-length=40 vocab-size=10000 self-made bilstm pytorch-nn | 1.65 | 1.62 | |
| 269 | p/tlen | 2021-02-02 20:44 | 1.0.0 | LSTM encoder-GRU decoder epochs=10 hidden-size=256 max-length=40 vocab-size=10000 self-made bilstm pytorch-nn | 1.65 | N/A | |
| 213 | p/tlen | 2021-02-02 20:43 | 1.0.0 | LSTM encoder-GRU decoder epochs=10 hidden-size=256 max-length=40 vocab-size=10000 self-made bilstm pytorch-nn | 1.65 | 0.50 | |
| 205 | [anonymized] | 2021-01-31 00:10 | 1.0.0 | v1 bpe pytorch-nn attention | 0.92 | 1.56 | |
| 233 | [anonymized] | 2021-01-28 07:38 | 1.0.0 | gru attention bpe pytorch-nn attention | 0.27 | 0.00 | |
| 217 | [anonymized] | 2021-01-27 08:11 | 1.0.0 | final pytorch-nn gru | 0.17 | 0.22 | |
| 216 | [anonymized] | 2021-01-27 08:10 | 1.0.0 | test v6 | 0.17 | 0.22 | |
| 215 | [anonymized] | 2021-01-27 08:08 | 1.0.0 | test v5 | 0.00 | 0.22 | |
| 232 | [anonymized] | 2021-01-27 07:41 | 1.0.0 | Right out.tsv lstm pytorch-nn | 0.00 | 0.00 | |
| 231 | [anonymized] | 2021-01-27 07:30 | 1.0.0 | LSTM 632000pairs lstm pytorch-nn | 0.00 | 0.00 | |
| 212 | p/tlen | 2021-01-27 06:04 | 1.0.0 | LSTM encoder-GRU decoder epochs=100 hidden-size=256 max-length=30 vocab-size=10000 self-made lstm pytorch-nn | 0.56 | 0.50 | |
| 268 | p/tlen | 2021-01-27 06:01 | 1.0.0 | LSTM encoder-GRU decoder epochs=100 hidden-size=256 max-length=30 vocab-size=10000 self-made lstm pytorch-nn | 0.56 | N/A | |
| 230 | [anonymized] | 2021-01-27 02:20 | 1.0.0 | test v3 | 0.00 | 0.00 | |
| 229 | [anonymized] | 2021-01-27 02:13 | 1.0.0 | test v2 | 0.00 | 0.00 | |
| 228 | [anonymized] | 2021-01-27 01:22 | 1.0.0 | First test | 0.00 | 0.00 | |
| 52 | [anonymized] | 2021-01-27 00:03 | 1.0.0 | 1st_again stupid | 31.31 | 24.47 | |
| 267 | [anonymized] | 2021-01-26 23:43 | 1.0.0 | 1st_again | N/A | N/A | |
| 266 | [anonymized] | 2021-01-26 13:47 | 1.0.0 | basic | 33.51 | N/A | |
| 36 | [anonymized] | 2021-01-25 13:32 | 1.0.0 | TAU01 googletrans stupid | 31.51 | 31.06 | |
| 225 | [anonymized] | 2021-01-25 12:40 | 1.0.0 | TAU01 stupid | 0.05 | 0.08 | |
| 157 | [anonymized] | 2021-01-12 23:58 | 1.0.0 | moses moses | 1.77 | 8.34 | |
| 38 | [anonymized] | 2021-01-12 18:16 | 1.0.0 | Added dev-0 correctly stupid | 31.28 | 30.77 | |
| 136 | [anonymized] | 2020-12-09 10:52 | 1.0.0 | Fixes. moses postprocessing | 3.47 | 8.71 | |
| 265 | [anonymized] | 2020-12-09 09:45 | 1.0.0 | add dev-0 | 30.56 | N/A | |
| 123 | [anonymized] | 2020-12-08 13:56 | 1.0.0 | moses moses | 30.31 | 9.85 | |
| 122 | [anonymized] | 2020-12-08 13:42 | 1.0.0 | pl to eng translation stupid | 30.31 | 9.85 | |
| 60 | [anonymized] | 2020-12-04 22:37 | 1.0.0 | moses full train simple postprocessing moses postprocessing | 19.32 | 16.40 | |
| 264 | [anonymized] | 2020-12-02 18:23 | 1.0.0 | basic stupid | N/A | N/A | |
| 263 | [anonymized] | 2020-12-02 18:20 | 1.0.0 | basic stupid | N/A | N/A | |
| 262 | [anonymized] | 2020-12-02 09:25 | 1.0.0 | Trigram self-made self-made lm trigram | N/A | N/A | |
| 261 | [anonymized] | 2020-12-02 09:22 | 1.0.0 | Solution based on OOV and using bing train-size=333333 moses postprocessing | N/A | N/A | |
| 161 | [anonymized] | 2020-12-01 23:02 | 1.0.0 | solu moses unks | 8.05 | 8.18 | |
| 162 | [anonymized] | 2020-12-01 22:26 | 1.0.0 | solution moses postprocessing | 8.04 | 8.14 | |
| 80 | [anonymized] | 2020-11-27 11:21 | 1.0.0 | added result | 11.82 | 12.57 | |
| 133 | [anonymized] | 2020-11-26 07:01 | 1.0.0 | Added source files moses postprocessing | 8.96 | 8.83 | |
| 112 | [anonymized] | 2020-11-25 19:59 | 1.0.0 | Add GTrans postprocessing moses postprocessing | 10.11 | 10.13 | |
| 172 | [anonymized] | 2020-11-25 10:52 | 1.0.0 | added file made for postprocessing my words moses postprocessing | 8.34 | 7.22 | |
| 130 | [anonymized] | 2020-11-25 00:23 | 1.0.0 | Add Moses train-size=20000 train-size=20000 moses | 7.30 | 9.47 | |
| 160 | [anonymized] | 2020-11-24 23:28 | 1.0.0 | Add Moses train-size=10000 moses | 6.78 | 8.19 | |
| 94 | [anonymized] | 2020-11-24 20:17 | 1.0.0 | Moses train size 150k (fast align instead of giza) moses fast-align | 10.57 | 10.90 | |
| 171 | [anonymized] | 2020-11-18 12:09 | 1.0.0 | another change in outputs moses postprocessing | 8.34 | 7.22 | |
| 99 | [anonymized] | 2020-11-18 08:32 | 1.0.0 | Postprocessing done with python grammar correction lib moses postprocessing | 10.22 | 10.64 | |
| 3 | [anonymized] | 2020-11-18 07:32 | 1.0.0 | add python translate script that improved BLEU moses postprocessing | 31.51 | 33.33 | |
| 46 | [anonymized] | 2020-11-18 03:15 | 1.0.0 | 2nd try moses postprocessing | 32.93 | 26.07 | |
| 82 | [anonymized] | 2020-11-18 03:02 | 1.0.0 | simple manual postprocessing by detokenization, detruecasing, fixing spaces at line starts, etc., plus googletrans stupid moses unks postprocessing | 15.70 | 12.22 | |
| 45 | [anonymized] | 2020-11-18 02:43 | 1.0.0 | first try | 31.31 | 26.07 | |
| 260 | [anonymized] | 2020-11-18 02:31 | 1.0.0 | first try | 31.31 | N/A | |
| 178 | [anonymized] | 2020-11-18 00:38 | 1.0.0 | last try | 6.02 | 7.14 | |
| 145 | [anonymized] | 2020-11-18 00:07 | 1.0.0 | 30 k - post-processing moses postprocessing | 10.14 | 8.66 | |
| 144 | [anonymized] | 2020-11-18 00:01 | 1.0.0 | added from my branch MK - Marek Aureliusz | 10.13 | 8.66 | |
| 174 | [anonymized] | 2020-11-17 23:44 | 1.0.0 | TAU to logia MK | 8.21 | 7.15 | |
| 175 | [anonymized] | 2020-11-17 23:38 | 1.0.0 | my brilliant solution | 8.09 | 7.15 | |
| 126 | [anonymized] | 2020-11-17 23:05 | 1.0.0 | Add postprocessing moses postprocessing | 30.59 | 9.66 | |
| 121 | [anonymized] | 2020-11-17 23:03 | 1.0.0 | postprocessing_2 moses postprocessing | 30.31 | 9.85 | |
| 142 | [anonymized] | 2020-11-17 22:57 | 1.0.0 | test | 30.59 | 8.69 | |
| 129 | [anonymized] | 2020-11-17 22:52 | 1.0.0 | postprocessing | 30.31 | 9.49 | |
| 190 | [anonymized] | 2020-11-17 22:42 | 1.0.0 | Moses unks drop-unks trainsize 25k lines moses unks | 4.39 | 4.58 | |
| 137 | [anonymized] | 2020-11-17 22:24 | 1.0.0 | Add postprocessing | 30.59 | 8.70 | |
| 135 | [anonymized] | 2020-11-17 22:06 | 1.0.0 | moses moses | 30.31 | 8.71 | |
| 108 | [anonymized] | 2020-11-17 21:40 | 1.0.0 | deleted remaining polish words moses postprocessing | 10.19 | 10.18 | |
| 177 | [anonymized] | 2020-11-17 20:39 | 1.0.0 | result on 10k set after postprocessing pt.2 moses | 6.16 | 7.14 | |
| 104 | [anonymized] | 2020-11-17 20:19 | 1.0.0 | simple postprocess moses postprocessing | 10.37 | 10.53 | |
| 79 | [anonymized] | 2020-11-17 20:02 | 1.0.0 | added postprocessing moses postprocessing | 11.82 | 12.57 | |
| 107 | [anonymized] | 2020-11-17 19:29 | 1.0.0 | fast align on 100000 lines moses fast-align | 9.86 | 10.18 | |
| 134 | [anonymized] | 2020-11-17 19:18 | 1.0.0 | Undo | 30.59 | 8.71 | |
| 20 | [anonymized] | 2020-11-17 18:58 | 1.0.0 | add postprocessing moses postprocessing | 16.00 | 32.41 | |
| 115 | [anonymized] | 2020-11-17 18:39 | 1.0.0 | new_lm_v1.0_42M train-size=165000 lm moses | 9.59 | 10.09 | |
| 93 | [anonymized] | 2020-11-17 17:58 | 1.0.0 | Moses train size 150k (fast align instead of giza) moses fast-align | 10.57 | 10.90 | |
| 169 | [anonymized] | 2020-11-17 16:55 | 1.0.0 | result on 10k set after postprocessing moses postprocessing | 6.61 | 7.22 | |
| 132 | [anonymized] | 2020-11-17 12:27 | 1.0.0 | postprocess moses postprocessing | 8.96 | 8.83 | |
| 156 | [anonymized] | 2020-11-17 09:55 | 1.0.0 | moses10k moses | N/A | 8.34 | |
| 89 | [anonymized] | 2020-11-15 18:56 | 1.0.0 | moses train-size=25000, simple-postprocessing moses simple postprocessing | 10.75 | 11.06 | |
| 102 | [anonymized] | 2020-11-15 18:30 | 1.0.0 | moses train-size=25000, simple-postprocessing moses simple postprocessing | 10.75 | 10.59 | |
| 103 | [anonymized] | 2020-11-15 17:56 | 1.0.0 | moses train-size=25000, simple-postprocessing moses simple postprocessing | 10.75 | 10.59 | |
| 120 | [anonymized] | 2020-11-14 17:33 | 1.0.0 | Bigger lm 1.9mln lines lm moses | 9.64 | 9.92 | |
| 193 | [anonymized] | 2020-11-14 15:51 | 1.0.0 | Added lm based on 18mln lines of data lm moses | 2.12 | 2.72 | |
| 66 | [anonymized] | 2020-11-13 20:16 | 1.0.0 | moses simple postprocessing moses postprocessing | 8.10 | 14.79 | |
| 72 | [anonymized] | 2020-11-13 20:14 | 1.0.0 | moses full training size with dev tuning, resubmitted for the missing dev score moses | 8.03 | 14.28 | |
| 81 | [anonymized] | 2020-11-13 18:43 | 1.0.0 | moses train size=158150, shuffled, without tuning moses | 7.82 | 12.55 | |
| 125 | [anonymized] | 2020-11-11 22:05 | 1.0.0 | moses train-size=25000 moses | 9.64 | 9.71 | |
| 84 | [anonymized] | 2020-11-11 12:24 | 1.0.0 | results after my postprocessing train-size=65500 moses postprocessing | 11.90 | 11.96 | |
| 87 | [anonymized] | 2020-11-11 11:23 | 1.0.0 | results after postprocessing train-size=65500 moses postprocessing | 11.46 | 11.44 | |
| 59 | [anonymized] | 2020-11-10 23:52 | 1.0.0 | Moses - 145422824 - 632600 lm moses | 8.77 | 16.57 | |
| 58 | [anonymized] | 2020-11-10 23:48 | 1.0.0 | Moses - 145422824 - 632600 | 8.77 | 16.57 | |
| 90 | [anonymized] | 2020-11-10 20:58 | 1.0.0 | results before postprocessing train-size=65500 moses postprocessing | 10.96 | 11.06 | |
| 65 | [anonymized] | 2020-11-10 10:15 | 1.0.0 | Moses - 006 - 632600 | 8.06 | 14.81 | |
| 166 | [anonymized] | 2020-11-09 20:18 | 1.0.0 | Solution based on OOV and using bing train-size=333333 moses postprocessing | 7.56 | 7.78 | |
| 88 | [anonymized] | 2020-11-07 10:54 | 1.0.0 | moses fast_align train-size=65500 moses fast-align | 11.21 | 11.17 | |
| 167 | [anonymized] | 2020-11-05 07:04 | 1.0.0 | Solution based on moses and 333333 lines of input. train-size=333333 moses | 7.07 | 7.29 | |
| 2 | [anonymized] | 2020-11-04 20:09 | 1.0.0 | results after some postprocessing moses postprocessing | 31.51 | 33.33 | |
| 148 | [anonymized] | 2020-11-04 17:00 | 1.0.0 | moses 30k moses | 7.51 | 8.45 | |
| 149 | [anonymized] | 2020-11-04 16:53 | 1.0.0 | moses solution 30k lines moses | 7.45 | 8.39 | |
| 141 | [anonymized] | 2020-11-04 12:14 | 1.0.0 | Clean | 30.59 | 8.69 | |
| 140 | [anonymized] | 2020-11-04 11:35 | 1.0.0 | Add pipenv. | 30.59 | 8.69 | |
| 139 | [anonymized] | 2020-11-04 11:07 | 1.0.0 | Add .gitignore | 30.59 | 8.69 | |
| 86 | [anonymized] | 2020-11-04 11:00 | 1.0.0 | moses train-size=100000 moses | 11.67 | 11.64 | |
| 138 | [anonymized] | 2020-11-04 10:57 | 1.0.0 | Fixes in.tsv. Add new translation in moses. moses | 30.59 | 8.69 | |
| 97 | [anonymized] | 2020-11-04 10:43 | 1.0.0 | Moses train size 150k moses | 10.52 | 10.71 | |
| 259 | [anonymized] | 2020-11-04 10:02 | 1.0.0 | Fix translate. | 30.59 | N/A | |
| 96 | [anonymized] | 2020-11-04 09:42 | 1.0.0 | Moses train size 50k moses | 10.76 | 10.81 | |
| 258 | [anonymized] | 2020-11-04 09:19 | 1.0.0 | Add moses result moses | 30.59 | N/A | |
| 176 | [anonymized] | 2020-11-04 09:17 | 1.0.0 | My solution for 10k lines moses | 7.89 | 7.15 | |
| 143 | [anonymized] | 2020-11-04 09:11 | 1.0.0 | My solution for 30k line moses | 9.76 | 8.66 | |
| 159 | [anonymized] | 2020-11-04 05:52 | 1.0.0 | Moses 2 25k lines train moses train | 7.58 | 8.26 | |
| 1 | [anonymized] | 2020-11-04 02:45 | 1.0.0 | v2 with fixed google trans limitations stupid | 33.77 | 33.33 | |
| 173 | [anonymized] | 2020-11-04 02:38 | 1.0.0 | Moses for 50k docs improvement moses | 11.19 | 7.17 | |
| 182 | [anonymized] | 2020-11-04 02:21 | 1.0.0 | Moses for 50k docs moses | 11.19 | 6.41 | |
| 147 | [anonymized] | 2020-11-04 02:10 | 1.0.0 | zadanie-2.1.2 train-size=22000 moses | 7.74 | 8.48 | |
| 127 | [anonymized] | 2020-11-04 01:04 | 1.0.0 | zadanie-2.1.1 train-size=100000 moses | 8.51 | 9.65 | |
| 168 | [anonymized] | 2020-11-03 23:52 | 1.0.0 | moses, 10000 lines moses | 8.02 | 7.22 | |
| 113 | [anonymized] | 2020-11-03 23:48 | 1.0.0 | moses 100000 moses | 9.80 | 10.10 | |
| 10 | [anonymized] | 2020-11-03 23:32 | 1.0.0 | moses | 30.59 | 33.22 | |
| 18 | [anonymized] | 2020-11-03 23:26 | 1.0.0 | moses | 30.31 | 33.18 | |
| 85 | [anonymized] | 2020-11-03 23:10 | 1.0.0 | result on 100k moses | 10.97 | 11.77 | |
| 118 | [anonymized] | 2020-11-03 23:07 | 1.0.0 | result on 30k moses | 9.26 | 10.00 | |
| 23 | [anonymized] | 2020-11-03 22:43 | 1.0.0 | moses, 30000 lines moses | 9.83 | 32.27 | |
| 110 | [anonymized] | 2020-11-03 21:41 | 1.0.0 | moses 30000 moses | 9.84 | 10.15 | |
| 131 | [anonymized] | 2020-11-03 21:16 | 1.0.0 | Moses 1 50k lines training moses train | 8.37 | 8.93 | |
| 192 | [anonymized] | 2020-11-03 20:18 | 1.0.0 | sub 2 moses | 2.07 | 2.72 | |
| 165 | [anonymized] | 2020-11-03 19:37 | 1.0.0 | sub 1 moses | 7.01 | 7.82 | |
| 194 | [anonymized] | 2020-11-03 19:08 | 1.0.0 | Test first approach | 30.59 | 2.65 | |
| 164 | [anonymized] | 2020-11-03 19:04 | 1.0.0 | moses solution 1 moses | 7.99 | 8.13 | |
| 183 | [anonymized] | 2020-11-03 19:03 | 1.0.0 | moses moses | 8.23 | 6.36 | |
| 31 | [anonymized] | 2020-11-03 19:01 | 1.0.0 | Test push | 30.59 | 31.62 | |
| 155 | [anonymized] | 2020-11-03 17:50 | 1.0.0 | moses solution 2 moses | 7.98 | 8.36 | |
| 71 | [anonymized] | 2020-11-03 17:41 | 1.0.0 | moses full training size with dev tuning moses | 30.00 | 14.28 | |
| 68 | [anonymized] | 2020-11-03 14:25 | 1.0.0 | Moses - 13 - 632600 moses | 8.11 | 14.70 | |
| 69 | [anonymized] | 2020-11-03 14:22 | 1.0.0 | Moses - 14 - 120000 | 8.21 | 14.54 | |
| 163 | [anonymized] | 2020-11-03 09:56 | 1.0.0 | 150k solution moses | 8.31 | 8.13 | |
| 196 | [anonymized] | 2020-11-03 09:52 | 1.0.0 | 30k solution moses | 2.02 | 2.53 | |
| 67 | [anonymized] | 2020-11-03 08:01 | 1.0.0 | Moses - 13 - 632600 moses | 8.11 | 14.70 | |
| 73 | [anonymized] | 2020-11-03 01:40 | 1.0.0 | Moses - 12 - 480000 moses | 8.11 | 14.27 | |
| 76 | [anonymized] | 2020-11-02 21:11 | 1.0.0 | Moses - 11 - 240000 moses | 8.26 | 13.44 | |
| 111 | [anonymized] | 2020-11-02 20:54 | 1.0.0 | moses_v1.0_165k train-size=165000 moses | 9.43 | 10.14 | |
| 158 | [anonymized] | 2020-11-02 20:49 | 1.0.0 | moses_v1.0_23k train-size=23000 | 7.18 | 8.28 | |
| 124 | [anonymized] | 2020-11-02 19:15 | 1.0.0 | 120k train size moses | 9.45 | 9.84 | |
| 187 | [anonymized] | 2020-11-02 18:01 | 1.0.0 | Moses train-size=15k moses | 5.12 | 5.79 | |
| 98 | [anonymized] | 2020-11-02 17:36 | 1.0.0 | Moses - 10 - 30000 | 7.47 | 10.65 | |
| 188 | [anonymized] | 2020-11-02 17:02 | 1.0.0 | 50k train-size moses | 4.75 | 5.43 | |
| 77 | [anonymized] | 2020-11-02 17:01 | 1.0.0 | moses solution train-size=297483 moses | 13.08 | 12.96 | |
| 83 | [anonymized] | 2020-11-02 13:48 | 1.0.0 | Moses - 9 - 120000 | 7.96 | 12.05 | |
| 95 | [anonymized] | 2020-11-02 10:45 | 1.0.0 | Moses - 9 - 60000 moses | 7.71 | 10.89 | |
| 114 | [anonymized] | 2020-11-02 10:22 | 1.0.0 | Moses - 8 - 30000 moses | 7.57 | 10.10 | |
| 78 | [anonymized] | 2020-11-01 18:23 | 1.0.0 | moses solution train-size=199980 moses | 12.67 | 12.66 | |
| 128 | [anonymized] | 2020-11-01 17:54 | 1.0.0 | Moses - 7 - 30000 | 7.17 | 9.62 | |
| 109 | [anonymized] | 2020-11-01 16:22 | 1.0.0 | Moses - 7 - 30000 | 7.51 | 10.16 | |
| 119 | [anonymized] | 2020-11-01 15:35 | 1.0.0 | Moses - 6 - 632600 | 7.42 | 9.99 | |
| 116 | [anonymized] | 2020-11-01 14:52 | 1.0.0 | Moses - 5 - 30000 | 7.39 | 10.06 | |
| 75 | [anonymized] | 2020-11-01 09:07 | 1.0.0 | Moses - 4 - 632600 | 8.02 | 14.13 | |
| 74 | [anonymized] | 2020-11-01 09:02 | 1.0.0 | Moses - 4 - 632600 | 8.02 | 14.13 | |
| 106 | [anonymized] | 2020-11-01 02:32 | 1.0.0 | Moses - 3 - 30000 | 7.49 | 10.21 | |
| 117 | [anonymized] | 2020-11-01 01:45 | 1.0.0 | Moses - 2 - 30000 | 7.38 | 10.06 | |
| 181 | [anonymized] | 2020-11-01 01:01 | 1.0.0 | Moses - 1 - 30000 | 4.80 | 6.58 | |
| 180 | [anonymized] | 2020-11-01 01:00 | 1.0.0 | Moses - 1 | 4.80 | 6.58 | |
| 186 | [anonymized] | 2020-10-31 14:02 | 1.0.0 | translated with moses (train-size=30k) moses | 5.36 | 5.98 | |
| 146 | [anonymized] | 2020-10-31 13:53 | 1.0.0 | moses solution train-size=131000 moses | 8.66 | 8.63 | |
| 185 | [anonymized] | 2020-10-31 10:22 | 1.0.0 | Solution based on moses and 333333 lines of input. train-size=333333 moses | 5.34 | 6.05 | |
| 70 | kubapok | 2020-10-30 08:43 | 1.0.0 | moses smt | 15.00 | 14.53 | |
| 51 | [anonymized] | 2020-10-29 18:26 | 1.0.0 | First try | 31.31 | 24.47 | |
| 257 | [anonymized] | 2020-10-29 18:03 | 1.0.0 | 2nd try | 31.31 | N/A | |
| 179 | [anonymized] | 2020-10-29 18:02 | 1.0.0 | Solution based on moses and 111,111 lines of input. train-size=111111 moses | 6.12 | 6.73 | |
| 256 | [anonymized] | 2020-10-29 17:25 | 1.0.0 | First try | N/A | N/A | |
| 255 | [anonymized] | 2020-10-28 20:41 | 1.0.0 | Lab 1 | 2.04 | N/A | |
| 154 | [anonymized] | 2020-10-28 20:34 | 1.0.0 | first test stupid | 11.35 | 8.36 | |
| 92 | [anonymized] | 2020-10-28 20:02 | 1.0.0 | moses solution train-size=65500 moses | 10.93 | 10.95 | |
| 9 | [anonymized] | 2020-10-28 19:03 | 1.0.0 | my brilliant solution stupid | 33.65 | 33.24 | |
| 12 | [anonymized] | 2020-10-28 16:46 | 1.0.0 | add config stupid | 33.39 | 33.21 | |
| 11 | [anonymized] | 2020-10-28 16:25 | 1.0.0 | add tag | 33.39 | 33.21 | |
| 8 | [anonymized] | 2020-10-28 11:05 | 1.0.0 | Task 1 - translation with Google Translator stupid | 33.65 | 33.24 | |
| 28 | [anonymized] | 2020-10-28 10:36 | 1.0.0 | deepl_solution stupid simple | 31.51 | 32.13 | |
| 33 | [anonymized] | 2020-10-28 10:36 | 1.0.0 | add out.tsv - translated with https://www.deepl.com/en/translator stupid | 30.91 | 31.31 | |
| 16 | [anonymized] | 2020-10-28 10:20 | 1.0.0 | pl to eng full stupid | 30.31 | 33.21 | |
| 22 | [anonymized] | 2020-10-28 09:53 | 1.0.0 | Solution based on Bing translate stupid | 33.24 | 32.27 | |
| 24 | [anonymized] | 2020-10-28 07:40 | 1.0.0 | translated via bing | 33.24 | 32.27 | |
| 189 | [anonymized] | 2020-10-28 06:52 | 1.0.0 | MarianNMT-2 stupid | 4.40 | 5.31 | |
| 30 | [anonymized] | 2020-10-28 00:22 | 1.0.0 | Test. stupid | 30.59 | 31.62 | |
| 29 | [anonymized] | 2020-10-28 00:15 | 1.0.0 | Update. | 30.59 | 31.62 | |
| 53 | [anonymized] | 2020-10-28 00:06 | 1.0.0 | tau_1.2 stupid | 26.29 | 24.06 | |
| 39 | [anonymized] | 2020-10-27 23:48 | 1.0.0 | zadanie1 googletrans stupid | N/A | 30.64 | |
| 40 | [anonymized] | 2020-10-27 23:14 | 1.0.0 | Added dev-0 stupid | N/A | 30.55 | |
| 32 | [anonymized] | 2020-10-27 23:04 | 1.0.0 | second_try stupid | 31.14 | 31.47 | |
| 37 | [anonymized] | 2020-10-27 23:02 | 1.0.0 | task 1 stupid | 30.78 | 30.98 | |
| 223 | [anonymized] | 2020-10-27 22:50 | 1.0.0 | tau_1.1 | 0.10 | 0.10 | |
| 42 | [anonymized] | 2020-10-27 22:47 | 1.0.0 | googletrans stupid | 30.00 | 29.56 | |
| 25 | [anonymized] | 2020-10-27 22:43 | 1.0.0 | translation_done stupid | 32.67 | 32.26 | |
| 41 | [anonymized] | 2020-10-27 22:41 | 1.0.0 | googletrans stupid | 30.00 | 29.68 | |
| 7 | [anonymized] | 2020-10-27 22:20 | 1.0.0 | Add Google Translate solution stupid | 33.65 | 33.24 | |
| 227 | [anonymized] | 2020-10-27 22:12 | 1.0.0 | tau_1 | 0.18 | 0.00 | |
| 6 | [anonymized] | 2020-10-27 21:51 | 1.0.0 | TAU assignment! stupid | 33.65 | 33.24 | |
| 21 | [anonymized] | 2020-10-27 20:30 | 1.0.0 | translated via bing stupid | 33.24 | 32.27 | |
| 26 | [anonymized] | 2020-10-27 18:21 | 1.0.0 | using Selenium to open the Google Translate website and translate | 32.54 | 32.25 | |
| 101 | [anonymized] | 2020-10-27 17:21 | 1.0.0 | Add my results. | 11.70 | 10.63 | |
| 35 | [anonymized] | 2020-10-27 16:57 | 1.0.0 | added result.py stupid | 31.79 | 31.26 | |
| 34 | [anonymized] | 2020-10-27 16:50 | 1.0.0 | added out.tsv for dev-0 and test-A | 31.79 | 31.26 | |
| 15 | [anonymized] | 2020-10-27 15:25 | 1.0.0 | solution stupid | 33.62 | 33.21 | |
| 49 | [anonymized] | 2020-10-27 10:16 | 1.0.0 | 1-textblob | 24.51 | 24.86 | |
| 48 | [anonymized] | 2020-10-27 10:15 | 1.0.0 | 1 - textblob | 24.51 | 24.86 | |
| 27 | [anonymized] | 2020-10-27 09:40 | 1.0.0 | Using Selenium to open Google Translate and translate the text | 32.45 | 32.15 | |
| 44 | [anonymized] | 2020-10-26 23:47 | 1.0.0 | MarianMT model stupid simple marian pretrained | 28.82 | 28.81 | |
| 13 | [anonymized] | 2020-10-26 21:17 | 1.0.0 | GPT-2 fine-tuning for two days, just kidding, it's just googletrans stupid | 33.74 | 33.21 | |
| 50 | [anonymized] | 2020-10-26 20:41 | 1.0.0 | Second result with ibm stupid | 25.71 | 24.50 | |
| 254 | [anonymized] | 2020-10-26 20:29 | 1.0.0 | First result with ibm usage | 25.71 | N/A | |
| 54 | [anonymized] | 2020-10-26 13:05 | 1.0.0 | first_try | 28.03 | 22.56 | |
| 47 | [anonymized] | 2020-10-26 12:07 | 1.0.0 | googletrans self-made simple | 28.23 | 25.39 | |
| 14 | [anonymized] | 2020-10-26 08:31 | 1.0.0 | googletrans solution stupid | 33.64 | 33.21 | |
| 195 | [anonymized] | 2020-10-26 08:29 | 1.0.0 | stupid solution | 2.50 | 2.54 | |
| 19 | [anonymized] | 2020-10-26 07:26 | 1.0.0 | first solution stupid | 33.03 | 32.45 |
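Many of the highest-scoring entries above are tagged stupid, marian, or pretrained, i.e. they run an off-the-shelf translator over the Polish input and submit the raw output. Below is a minimal sketch of such a baseline with a pretrained MarianMT model; the checkpoint name Helsinki-NLP/opus-mt-pl-en and the batch size are assumptions, not details recorded in the submissions.

```python
# Hypothetical baseline: translate test-A/in.tsv with a pretrained MarianMT
# model and write one English line per Polish input line to test-A/out.tsv.
from transformers import MarianMTModel, MarianTokenizer

MODEL_NAME = "Helsinki-NLP/opus-mt-pl-en"  # assumed checkpoint

tokenizer = MarianTokenizer.from_pretrained(MODEL_NAME)
model = MarianMTModel.from_pretrained(MODEL_NAME)

with open("test-A/in.tsv", encoding="utf-8") as f:
    src_lines = [line.rstrip("\n") for line in f]

translations = []
for i in range(0, len(src_lines), 16):  # small batches to keep memory bounded
    batch = tokenizer(src_lines[i:i + 16], return_tensors="pt",
                      padding=True, truncation=True)
    generated = model.generate(**batch)
    translations.extend(tokenizer.batch_decode(generated, skip_special_tokens=True))

with open("test-A/out.tsv", "w", encoding="utf-8") as f:
    f.write("\n".join(translations) + "\n")
```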