Dev Builds » 20240307-1853

NCM plays each Stockfish dev build 20,000 times against Stockfish 15. This yields an approximate Elo difference and establishes confidence in the strength of the dev builds.
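The "Standard Elo" columns below follow from the aggregate score via the usual logistic Elo model. A minimal sketch of that standard formula (not NCM's actual code):

```python
import math

def elo_from_score(wins: int, losses: int, draws: int) -> float:
    """Estimate the Elo difference implied by aggregate W/L/D counts,
    using the standard logistic model: score = 1 / (1 + 10^(-elo/400))."""
    games = wins + losses + draws
    score = (wins + 0.5 * draws) / games
    return -400 * math.log10(1 / score - 1)

# Aggregate result from the summary table: 7437 W, 2973 L, 9590 D
print(round(elo_from_score(7437, 2973, 9590), 2))  # ~ +78.88, matching the summary total
```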

Summary

| Host | Duration | Avg Base NPS | Games | W / L / D | Standard Elo | Ptnml(0-2) | Gamepair Elo |
|---|---|---|---|---|---|---|---|
| ncm-dbt-01 | 06:47:12 | 577942 | 4000 | 1491 / 608 / 1901 | +77.98 ± 4.93 | 1, 118, 890, 979, 12 | +162.35 ± 11.36 |
| ncm-dbt-02 | 06:48:13 | 585601 | 4006 | 1501 / 582 / 1923 | +81.15 ± 4.87 | 2, 101, 888, 1000, 12 | +170.07 ± 11.35 |
| ncm-dbt-03 | 06:47:00 | 584475 | 4000 | 1473 / 616 / 1911 | +75.61 ± 4.94 | 2, 120, 909, 957, 12 | +157.02 ± 11.23 |
| ncm-dbt-04 | 06:47:22 | 568765 | 3994 | 1463 / 594 / 1937 | +76.82 ± 5.01 | 0, 129, 888, 962, 18 | +158.14 ± 11.39 |
| ncm-dbt-05 | 06:46:55 | 582682 | 4000 | 1509 / 573 / 1918 | +82.83 ± 5.02 | 1, 121, 835, 1027, 16 | +173.0 ± 11.77 |
| Total | | | 20000 | 7437 / 2973 / 9590 | +78.88 ± 2.22 | 6, 589, 4410, 4925, 70 | +164.07 ± 5.11 |
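The "Gamepair Elo" figures are consistent with collapsing each pentanomial game pair to a single win/draw/loss result (pair score above, equal to, or below 1 point) and applying the same logistic map. A sketch of that reading (an inference from the published numbers, not NCM's documented method):

```python
import math

def gamepair_elo(ptnml: list) -> float:
    """Elo where each game pair counts as one trial: a pair 'wins' if it
    scored more than 1 point (Ptnml bins 3-4), 'draws' at exactly 1 point
    (bin 2), and 'loses' below 1 point (bins 0-1)."""
    losses = ptnml[0] + ptnml[1]
    draws = ptnml[2]
    wins = ptnml[3] + ptnml[4]
    score = (wins + 0.5 * draws) / (wins + losses + draws)
    return -400 * math.log10(1 / score - 1)

# Combined Ptnml(0-2) counts from the summary total row
print(round(gamepair_elo([6, 589, 4410, 4925, 70]), 2))  # ~ +164.07, matching the table
```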

Test Detail

| ID | Host | Base NPS | Games | W / L / D | Standard Elo | Ptnml(0-2) | Gamepair Elo |
|---|---|---|---|---|---|---|---|
| 418711 | ncm-dbt-02 | 586816 | 6 | 3 / 1 / 2 | +120.22 ± 109.7 | 0, 0, 1, 2, 0 | +279.24 ± 576.89 |
| 418710 | ncm-dbt-04 | 566138 | 494 | 171 / 72 / 251 | +70.58 ± 14.89 | 0, 23, 103, 120, 1 | +145.85 ± 33.7 |
| 418709 | ncm-dbt-01 | 576128 | 500 | 194 / 77 / 229 | +82.83 ± 13.7 | 0, 13, 108, 128, 1 | +174.55 ± 32.75 |
| 418708 | ncm-dbt-05 | 578753 | 500 | 187 / 68 / 245 | +84.3 ± 14.47 | 0, 15, 105, 126, 4 | +172.78 ± 33.32 |
| 418707 | ncm-dbt-03 | 585253 | 500 | 192 / 80 / 228 | +79.17 ± 14.05 | 0, 16, 107, 126, 1 | +165.8 ± 33.0 |
| 418706 | ncm-dbt-02 | 585126 | 500 | 185 / 67 / 248 | +83.57 ± 13.41 | 0, 10, 114, 124, 2 | +174.55 ± 31.61 |
| 418705 | ncm-dbt-04 | 570469 | 500 | 190 / 74 / 236 | +82.1 ± 13.68 | 0, 14, 106, 130, 0 | +174.55 ± 33.13 |
| 418704 | ncm-dbt-01 | 577233 | 500 | 181 / 86 / 233 | +66.82 ± 14.2 | 0, 19, 119, 110, 2 | +135.76 ± 31.16 |
| 418703 | ncm-dbt-03 | 584958 | 500 | 171 / 70 / 259 | +71.16 ± 13.34 | 0, 12, 127, 109, 2 | +145.54 ± 29.68 |
| 418702 | ncm-dbt-05 | 585211 | 500 | 188 / 70 / 242 | +83.57 ± 14.45 | 0, 18, 97, 134, 1 | +176.33 ± 34.76 |
| 418701 | ncm-dbt-02 | 586393 | 500 | 188 / 83 / 229 | +74.06 ± 13.59 | 1, 11, 121, 116, 1 | +155.54 ± 30.59 |
| 418700 | ncm-dbt-04 | 569230 | 500 | 181 / 72 / 247 | +76.97 ± 14.56 | 0, 18, 108, 121, 3 | +157.24 ± 32.87 |
| 418699 | ncm-dbt-03 | 584579 | 500 | 187 / 82 / 231 | +74.06 ± 13.44 | 0, 13, 120, 116, 1 | +153.86 ± 30.8 |
| 418698 | ncm-dbt-01 | 577110 | 500 | 190 / 82 / 228 | +76.25 ± 13.05 | 0, 10, 123, 116, 1 | +158.93 ± 30.17 |
| 418697 | ncm-dbt-05 | 579124 | 500 | 181 / 63 / 256 | +83.57 ± 13.56 | 0, 12, 109, 128, 1 | +176.33 ± 32.55 |
| 418696 | ncm-dbt-02 | 585970 | 500 | 180 / 68 / 252 | +79.17 ± 13.45 | 0, 12, 115, 122, 1 | +165.8 ± 31.55 |
| 418695 | ncm-dbt-04 | 569190 | 500 | 180 / 74 / 246 | +74.79 ± 13.76 | 0, 15, 115, 119, 1 | +155.54 ± 31.67 |
| 418694 | ncm-dbt-01 | 577602 | 500 | 189 / 90 / 221 | +69.71 ± 14.31 | 1, 16, 118, 113, 2 | +143.89 ± 31.26 |
| 418693 | ncm-dbt-02 | 583950 | 500 | 189 / 69 / 242 | +85.04 ± 13.44 | 0, 11, 109, 129, 1 | +179.9 ± 32.5 |
| 418692 | ncm-dbt-03 | 585548 | 500 | 185 / 76 / 239 | +76.98 ± 14.27 | 1, 16, 106, 127, 0 | +164.07 ± 33.19 |
| 418691 | ncm-dbt-05 | 587877 | 500 | 188 / 74 / 238 | +80.63 ± 14.23 | 0, 15, 109, 123, 3 | +165.8 ± 32.64 |
| 418690 | ncm-dbt-04 | 566849 | 500 | 186 / 65 / 249 | +85.78 ± 14.36 | 0, 14, 105, 127, 4 | +176.33 ± 33.3 |
| 418689 | ncm-dbt-02 | 586012 | 500 | 192 / 77 / 231 | +81.37 ± 14.4 | 1, 13, 109, 124, 3 | +169.27 ± 32.62 |
| 418688 | ncm-dbt-01 | 578877 | 500 | 179 / 62 / 259 | +82.83 ± 13.7 | 0, 13, 108, 128, 1 | +174.55 ± 32.75 |
| 418687 | ncm-dbt-03 | 584790 | 500 | 174 / 67 / 259 | +75.52 ± 14.65 | 1, 17, 108, 122, 2 | +157.24 ± 32.87 |
| 418686 | ncm-dbt-05 | 582193 | 500 | 191 / 87 / 222 | +73.34 ± 14.85 | 1, 19, 107, 121, 2 | +152.18 ± 33.05 |
| 418685 | ncm-dbt-04 | 569669 | 500 | 197 / 78 / 225 | +84.3 ± 14.33 | 0, 13, 110, 122, 5 | +171.02 ± 32.42 |
| 418684 | ncm-dbt-01 | 578794 | 500 | 192 / 81 / 227 | +78.43 ± 14.17 | 0, 17, 106, 126, 1 | +164.07 ± 33.19 |
| 418683 | ncm-dbt-03 | 584622 | 500 | 196 / 85 / 219 | +78.43 ± 13.42 | 0, 11, 119, 118, 2 | +162.35 ± 30.85 |
| 418682 | ncm-dbt-02 | 581028 | 500 | 187 / 71 / 242 | +82.1 ± 13.98 | 0, 15, 105, 129, 1 | +172.78 ± 33.32 |
| 418681 | ncm-dbt-05 | 582318 | 500 | 190 / 81 / 229 | +76.97 ± 13.98 | 0, 16, 110, 123, 1 | +160.64 ± 32.5 |
| 418680 | ncm-dbt-04 | 568514 | 500 | 170 / 81 / 249 | +62.51 ± 14.03 | 0, 20, 122, 107, 1 | +127.76 ± 30.73 |
| 418679 | ncm-dbt-03 | 583196 | 500 | 191 / 84 / 225 | +75.52 ± 13.94 | 0, 16, 112, 121, 1 | +157.24 ± 32.18 |
| 418678 | ncm-dbt-05 | 585042 | 500 | 185 / 70 / 245 | +81.37 ± 13.81 | 0, 15, 105, 130, 0 | +172.78 ± 33.32 |
| 418677 | ncm-dbt-01 | 579166 | 500 | 184 / 63 / 253 | +85.78 ± 14.51 | 0, 17, 97, 134, 2 | +179.9 ± 34.76 |
| 418676 | ncm-dbt-02 | 587494 | 500 | 194 / 73 / 233 | +85.78 ± 13.77 | 0, 13, 104, 132, 1 | +181.7 ± 33.45 |
| 418675 | ncm-dbt-04 | 570068 | 500 | 188 / 78 / 234 | +77.7 ± 13.71 | 0, 12, 119, 116, 3 | +158.93 ± 30.91 |
| 418674 | ncm-dbt-01 | 578630 | 500 | 182 / 67 / 251 | +81.36 ± 13.81 | 0, 13, 111, 124, 2 | +169.27 ± 32.25 |
| 418673 | ncm-dbt-03 | 582861 | 500 | 177 / 72 / 251 | +74.06 ± 14.6 | 0, 19, 110, 118, 3 | +150.51 ± 32.56 |
| 418672 | ncm-dbt-02 | 587622 | 500 | 183 / 73 / 244 | +77.7 ± 14.15 | 0, 16, 110, 122, 2 | +160.64 ± 32.5 |
| 418671 | ncm-dbt-05 | 580945 | 500 | 199 / 60 / 241 | +99.2 ± 14.15 | 0, 11, 93, 142, 4 | +209.91 ± 35.49 |

Commit

Commit ID: bd579ab5d1a931a09a62f2ed33b5149ada7bc65f
Author: Linmiao Xu
Date: 2024-03-07 18:53:48 UTC
Update default main net to nn-1ceb1ade0001.nnue

Created by retraining the previous main net `nn-b1a57edbea57.nnue` with:
- some of the same options as before:
  - ranger21, more WDL skipping, 15% more loss when Q is too high
- removal of the huge 514G pre-interleaved binpack
- removal of SF-generated dfrc data (dfrc99-16tb7p-filt-v2.min.binpack)
- interleaving many binpacks at training time
- training with some bestmove capture positions where SEE < 0
- increased usage of torch.compile to speed up training by up to 40%

```yaml
experiment-name: 2560--S10-dfrc0-to-dec2023-skip-more-wdl-15p-more-loss-high-q-see-ge0-sk28
nnue-pytorch-branch: linrock/nnue-pytorch/r21-more-wdl-skip-15p-more-loss-high-q-skip-see-ge0-torch-compile-more
start-from-engine-test-net: True
early-fen-skipping: 28
training-dataset:
  # similar, not the exact same as:
  # https://github.com/official-stockfish/Stockfish/pull/4635
  - /data/S5-5af/leela96.v2.min.binpack
  - /data/S5-5af/test60-2021-11-12-novdec-12tb7p.v6-dd.min.binpack
  - /data/S5-5af/test77-2021-12-dec-16tb7p.v6-dd.min.binpack
  - /data/S5-5af/test78-2022-01-to-05-jantomay-16tb7p.v6-dd.min.binpack
  - /data/S5-5af/test78-2022-06-to-09-juntosep-16tb7p.v6-dd.min.binpack
  - /data/S5-5af/test79-2022-04-apr-16tb7p.v6-dd.min.binpack
  - /data/S5-5af/test79-2022-05-may-16tb7p.v6-dd.min.binpack
  - /data/S5-5af/test80-2022-06-jun-16tb7p.v6-dd.min.unmin.binpack
  - /data/S5-5af/test80-2022-07-jul-16tb7p.v6-dd.min.binpack
  - /data/S5-5af/test80-2022-08-aug-16tb7p.v6-dd.min.binpack
  - /data/S5-5af/test80-2022-09-sep-16tb7p.v6-dd.min.unmin.binpack
  - /data/S5-5af/test80-2022-10-oct-16tb7p.v6-dd.min.binpack
  - /data/S5-5af/test80-2022-11-nov-16tb7p.v6-dd.min.binpack
  - /data/S5-5af/test80-2023-01-jan-16tb7p.v6-sk20.min.binpack
  - /data/S5-5af/test80-2023-02-feb-16tb7p.v6-dd.min.binpack
  - /data/S5-5af/test80-2023-03-mar-2tb7p.min.unmin.binpack
  - /data/S5-5af/test80-2023-04-apr-2tb7p.binpack
  - /data/S5-5af/test80-2023-05-may-2tb7p.min.dd.binpack
  # https://github.com/official-stockfish/Stockfish/pull/4782
  - /data/S6-1ee1aba5ed/test80-2023-06-jun-2tb7p.binpack
  - /data/S6-1ee1aba5ed/test80-2023-07-jul-2tb7p.min.binpack
  # https://github.com/official-stockfish/Stockfish/pull/4972
  - /data/S8-baff1edbea57/test80-2023-08-aug-2tb7p.v6.min.binpack
  - /data/S8-baff1edbea57/test80-2023-09-sep-2tb7p.binpack
  - /data/S8-baff1edbea57/test80-2023-10-oct-2tb7p.binpack
  # https://github.com/official-stockfish/Stockfish/pull/5056
  - /data/S9-b1a57edbea57/test80-2023-11-nov-2tb7p.binpack
  - /data/S9-b1a57edbea57/test80-2023-12-dec-2tb7p.binpack
num-epochs: 800
lr: 4.375e-4
gamma: 0.995
start-lambda: 1.0
end-lambda: 0.7
```

This particular net was reached at epoch 759. Use of more torch.compile decorators in nnue-pytorch model.py than in the previous main net training run sped up training by up to 40% on Tesla gpus when using recent pytorch compiled with cuda 12:
https://github.com/linrock/nnue-tools/blob/7fb9831/Dockerfile

Skipping positions with bestmove captures where static exchange evaluation is >= 0 is based on the implementation from Sopel's NNUE training & experimentation log:
https://docs.google.com/document/d/1gTlrr02qSNKiXNZ_SuO4-RjK4MXBiFlLE6jvNqqMkAY
Experiment 293 - only skip captures with see>=0

Positions with bestmove captures where score == 0 are always skipped for compatibility with minimized binpacks, since the original minimizer sets scores to 0 for slight improvements in compression.

The trainer branch used was:
https://github.com/linrock/nnue-pytorch/tree/r21-more-wdl-skip-15p-more-loss-high-q-skip-see-ge0-torch-compile-more

Binpacks were renamed to be sorted chronologically by default when sorted by name. The binpack data are otherwise the same as binpacks with similar names in the prior naming convention.

Training data can be found at:
https://robotmoon.com/nnue-training-data/

Passed STC:
https://tests.stockfishchess.org/tests/view/65e3ddd1f2ef6c733362ae5c
LLR: 2.92 (-2.94,2.94) <0.00,2.00>
Total: 149792 W: 39153 L: 38661 D: 71978
Ptnml(0-2): 675, 17586, 37905, 18032, 698

Passed LTC:
https://tests.stockfishchess.org/tests/view/65e4d91c416ecd92c162a69b
LLR: 2.94 (-2.94,2.94) <0.50,2.50>
Total: 64416 W: 16517 L: 16135 D: 31764
Ptnml(0-2): 38, 7218, 17313, 7602, 37

closes https://github.com/official-stockfish/Stockfish/pull/5090

Bench: 1373183
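The "LLR: 2.92 (-2.94,2.94)" lines in the commit report fishtest's sequential probability ratio test: the run stops once the log-likelihood ratio leaves the stated bounds. Those ±2.94 bounds are what Wald's classic SPRT formulas give for error rates alpha = beta = 0.05 (a standard-textbook check, not fishtest's exact implementation, which uses a generalized SPRT):

```python
import math

# Wald SPRT stopping bounds: accept H1 when LLR >= upper, H0 when LLR <= lower.
alpha = beta = 0.05  # false-positive and false-negative rates
lower = math.log(beta / (1 - alpha))   # log(1/19) ~ -2.94
upper = math.log((1 - beta) / alpha)   # log(19)   ~ +2.94
print(round(lower, 2), round(upper, 2))
```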
Copyright 2011–2025 Next Chess Move LLC