Dev Build 20230922-1726


NCM plays each Stockfish dev build 20,000 times against Stockfish 15. This yields an approximate Elo difference and establishes confidence in the strength of the dev builds.
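As a rough guide to how such a win/loss/draw record translates into an Elo difference, the sketch below applies the standard logistic Elo model. This is an illustrative reading, not documentation of NCM's exact aggregation; the example numbers are the grand totals from the Summary table below.

```python
import math

def elo_from_wld(wins: int, losses: int, draws: int) -> float:
    """Elo difference implied by a match score under the standard
    logistic model: elo = 400 * log10(p / (1 - p))."""
    games = wins + losses + draws
    p = (wins + 0.5 * draws) / games   # expected score per game
    return 400 * math.log10(p / (1 - p))

# Grand totals from the Summary table: 6771 wins, 3562 losses, 9667 draws.
print(round(elo_from_wld(6771, 3562, 9667), 2))   # ~ +56.23, in line with the reported total
```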

Summary

Host Duration Avg Base NPS Games WLD Standard Elo Ptnml(0-2) Gamepair Elo
ncm-dbt-01 06:54:57 584735 4000 1369 713 1918 +57.5 ± 5.03 2 180 985 826 7 +117.36 ± 10.8
ncm-dbt-02 06:55:49 586763 4012 1379 715 1918 +58.04 ± 5.11 2 196 949 854 5 +118.92 ± 11.04
ncm-dbt-03 06:57:20 587828 4016 1382 731 1903 +56.82 ± 5.02 1 188 983 831 5 +116.08 ± 10.82
ncm-dbt-04 06:55:49 572186 3996 1305 692 1999 +53.72 ± 5.06 1 194 1005 787 11 +108.23 ± 10.68
ncm-dbt-05 06:55:58 583577 3976 1336 711 1929 +55.07 ± 5.08 2 190 985 803 8 +111.9 ± 10.8
Total 20000 6771 3562 9667 +56.23 ± 2.26 8 948 4907 4101 36 +114.49 ± 4.84
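In these tables, Ptnml(0-2) counts game pairs by points scored (0, 0.5, 1, 1.5 or 2), and Gamepair Elo appears to rate each pair as a single win/draw/loss result. Under that reading, the sketch below closely reproduces the totals row above:

```python
import math

def logistic_elo(p: float) -> float:
    """Elo difference implied by an expected score p."""
    return 400 * math.log10(p / (1 - p))

# Pentanomial totals from the Summary row: pairs scoring 0, 0.5, 1, 1.5, 2 points.
ptnml = [8, 948, 4907, 4101, 36]
pairs = sum(ptnml)

# Per-game expected score (each pair contributes two games).
p_game = sum(n * s for n, s in zip(ptnml, (0.0, 0.25, 0.5, 0.75, 1.0))) / pairs
print(round(logistic_elo(p_game), 2))   # ~ +56.23, the "Standard Elo" total

# Collapse each pair to win/draw/loss: more than 1 point is a win, exactly 1 a draw.
pair_wins, pair_draws = ptnml[3] + ptnml[4], ptnml[2]
p_pair = (pair_wins + 0.5 * pair_draws) / pairs
print(round(logistic_elo(p_pair), 2))   # ~ +114.5, the "Gamepair Elo" total
```

The ± figures in the tables are consistent with 95% confidence intervals derived from the spread of the per-pair scores.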

Test Detail

ID Host Base NPS Games WLD Standard Elo Ptnml(0-2) Gamepair Elo
403506 ncm-dbt-02 584748 12 5 3 4 +58.39 ± 68.56 0 0 4 2 0 +120.33 ± 161.99
403505 ncm-dbt-03 586985 16 6 5 5 +21.74 ± 73.63 0 1 5 2 0 +43.66 ± 157.0
403504 ncm-dbt-05 582778 476 149 86 241 +46.26 ± 13.38 0 19 137 82 0 +94.21 ± 28.33
403503 ncm-dbt-04 573042 496 153 92 251 +42.95 ± 13.23 0 22 143 83 0 +87.25 ± 27.8
403502 ncm-dbt-01 585717 500 175 91 234 +58.93 ± 13.72 0 19 129 101 1 +119.89 ± 29.69
403501 ncm-dbt-02 587962 500 164 86 250 +54.65 ± 14.63 0 28 116 106 0 +112.14 ± 31.71
403500 ncm-dbt-03 589155 500 168 81 251 +61.07 ± 13.96 0 20 124 105 1 +124.6 ± 30.44
403499 ncm-dbt-05 583782 500 178 88 234 +63.23 ± 14.61 1 21 116 111 1 +130.94 ± 31.66
403498 ncm-dbt-04 573527 500 160 90 250 +48.96 ± 14.34 0 28 124 98 0 +99.95 ± 30.58
403497 ncm-dbt-01 584034 500 172 91 237 +56.78 ± 14.86 0 28 114 107 1 +115.23 ± 32.0
403496 ncm-dbt-02 584243 500 174 95 231 +55.36 ± 15.06 0 31 109 110 0 +113.68 ± 32.72
403495 ncm-dbt-03 586520 500 191 97 212 +66.1 ± 13.3 0 15 126 109 0 +137.37 ± 29.98
403494 ncm-dbt-05 579166 500 169 87 244 +57.5 ± 14.36 0 24 121 104 1 +116.78 ± 30.96
403493 ncm-dbt-04 571310 500 171 74 255 +68.27 ± 15.21 0 25 106 116 3 +137.37 ± 33.21
403492 ncm-dbt-01 580033 500 173 89 238 +58.93 ± 12.83 0 14 138 98 0 +121.46 ± 28.19
403491 ncm-dbt-02 587962 500 185 102 213 +58.21 ± 13.83 1 17 131 100 1 +119.89 ± 29.37
403490 ncm-dbt-03 586181 500 154 98 248 +39.08 ± 14.44 0 32 131 86 1 +77.71 ± 29.68
403489 ncm-dbt-05 584453 500 162 90 248 +50.38 ± 14.68 0 30 118 102 0 +102.97 ± 31.44
403488 ncm-dbt-04 572276 500 158 83 259 +52.51 ± 13.98 0 24 127 99 0 +107.54 ± 30.1
403487 ncm-dbt-01 584495 500 169 91 240 +54.65 ± 14.89 1 26 118 104 1 +112.14 ± 31.42
403486 ncm-dbt-02 587707 500 175 90 235 +59.64 ± 14.73 0 26 114 109 1 +121.46 ± 32.0
403485 ncm-dbt-03 587070 500 175 95 230 +56.07 ± 14.83 0 28 115 106 1 +113.68 ± 31.86
403484 ncm-dbt-04 572759 500 176 97 227 +55.36 ± 14.39 0 24 125 99 2 +110.6 ± 30.39
403483 ncm-dbt-01 583698 500 173 90 237 +58.21 ± 14.53 1 22 121 105 1 +119.89 ± 30.94
403482 ncm-dbt-05 581902 500 168 95 237 +51.09 ± 13.9 0 24 129 97 0 +104.49 ± 29.82
403481 ncm-dbt-02 585548 500 160 80 260 +56.07 ± 14.29 0 24 123 102 1 +113.68 ± 30.67
403480 ncm-dbt-03 587749 500 169 76 255 +65.38 ± 14.56 0 24 109 117 0 +135.76 ± 32.75
403479 ncm-dbt-04 570869 500 165 92 243 +51.09 ± 15.11 0 29 123 94 4 +98.44 ± 30.73
403478 ncm-dbt-01 584874 500 173 85 242 +61.79 ± 13.71 0 18 127 104 1 +126.18 ± 29.94
403477 ncm-dbt-02 588132 500 190 78 232 +79.17 ± 13.75 1 11 114 123 1 +167.53 ± 31.71
403476 ncm-dbt-05 586943 500 158 103 239 +38.37 ± 14.13 0 31 133 86 0 +77.71 ± 29.4
403475 ncm-dbt-03 588302 500 183 86 231 +68.27 ± 13.39 0 15 123 112 0 +142.26 ± 30.43
403474 ncm-dbt-04 570669 500 155 78 267 +53.93 ± 14.05 1 19 134 94 2 +109.07 ± 29.01
403473 ncm-dbt-05 580613 500 177 71 252 +74.79 ± 14.2 0 17 112 119 2 +153.86 ± 32.2
403472 ncm-dbt-02 587622 500 152 92 256 +41.89 ± 14.87 0 35 120 95 0 +85.04 ± 31.17
403471 ncm-dbt-01 586350 500 173 81 246 +64.66 ± 14.26 0 21 117 111 1 +132.54 ± 31.5
403470 ncm-dbt-03 587579 500 161 99 240 +43.3 ± 14.42 1 27 132 89 1 +88.0 ± 29.49
403469 ncm-dbt-04 573042 500 167 86 247 +56.78 ± 14.05 0 23 123 104 0 +116.78 ± 30.65
403468 ncm-dbt-05 588984 500 175 91 234 +58.93 ± 15.22 1 24 119 102 4 +116.78 ± 31.26
403467 ncm-dbt-02 586943 500 174 89 237 +59.64 ± 14.46 0 24 118 107 1 +121.46 ± 31.39
403466 ncm-dbt-01 588686 500 161 95 244 +46.13 ± 14.85 0 32 121 96 1 +92.46 ± 31.03
403465 ncm-dbt-03 590911 500 175 94 231 +56.78 ± 14.6 0 26 118 105 1 +115.23 ± 31.41

Commit

Commit ID 70ba9de85cddc5460b1ec53e0a99bee271e26ece
Author Linmiao Xu
Date 2023-09-22 17:26:16 UTC
Update NNUE architecture to SFNNv8: L1-2560 nn-ac1dbea57aa3.nnue

Creating this net involved:
- a 6-stage training process from scratch. The datasets used in stages 1-5 were fully minimized.
- permuting L1 weights with https://github.com/official-stockfish/nnue-pytorch/pull/254

A strong epoch after each training stage was chosen for the next. The 6 stages were:

```
1. 400 epochs, lambda 1.0, default LR and gamma
   UHOx2-wIsRight-multinet-dfrc-n5000 (135G)
     nodes5000pv2_UHO.binpack
     data_pv-2_diff-100_nodes-5000.binpack
     wrongIsRight_nodes5000pv2.binpack
     multinet_pv-2_diff-100_nodes-5000.binpack
     dfrc_n5000.binpack

2. 800 epochs, end-lambda 0.75, LR 4.375e-4, gamma 0.995, skip 12
   LeelaFarseer-T78juntoaugT79marT80dec.binpack (141G)
     T60T70wIsRightFarseerT60T74T75T76.binpack
     test78-junjulaug2022-16tb7p.no-db.min.binpack
     test79-mar2022-16tb7p.no-db.min.binpack
     test80-dec2022-16tb7p.no-db.min.binpack

3. 800 epochs, end-lambda 0.725, LR 4.375e-4, gamma 0.995, skip 20
   leela93-v1-dfrc99-v2-T78juntosepT80jan-v6dd-T78janfebT79aprT80aprmay.min.binpack
     leela93-filt-v1.min.binpack
     dfrc99-16tb7p-filt-v2.min.binpack
     test78-juntosep2022-16tb7p-filter-v6-dd.min-mar2023.binpack
     test80-jan2023-3of3-16tb7p-filter-v6-dd.min-mar2023.binpack
     test78-janfeb2022-16tb7p.min.binpack
     test79-apr2022-16tb7p.min.binpack
     test80-apr2022-16tb7p.min.binpack
     test80-may2022-16tb7p.min.binpack

4. 800 epochs, end-lambda 0.7, LR 4.375e-4, gamma 0.995, skip 24
   leela96-dfrc99-v2-T78juntosepT79mayT80junsepnovjan-v6dd-T80mar23-v6-T60novdecT77decT78aprmayT79aprT80may23.min.binpack
     leela96-filt-v2.min.binpack
     dfrc99-16tb7p-filt-v2.min.binpack
     test78-juntosep2022-16tb7p-filter-v6-dd.min-mar2023.binpack
     test79-may2022-16tb7p.filter-v6-dd.min.binpack
     test80-jun2022-16tb7p.filter-v6-dd.min.binpack
     test80-sep2022-16tb7p.filter-v6-dd.min.binpack
     test80-nov2022-16tb7p.filter-v6-dd.min.binpack
     test80-jan2023-3of3-16tb7p-filter-v6-dd.min-mar2023.binpack
     test80-mar2023-2tb7p.v6-sk16.min.binpack
     test60-novdec2021-16tb7p.min.binpack
     test77-dec2021-16tb7p.min.binpack
     test78-aprmay2022-16tb7p.min.binpack
     test79-apr2022-16tb7p.min.binpack
     test80-may2023-2tb7p.min.binpack

5. 960 epochs, end-lambda 0.7, LR 4.375e-4, gamma 0.995, skip 28
   Increased max-epoch to 960 near the end of the first 800 epochs
   5af11540bbfe dataset: https://github.com/official-stockfish/Stockfish/pull/4635

6. 1000 epochs, end-lambda 0.7, LR 4.375e-4, gamma 0.995, skip 28
   Increased max-epoch to 1000 near the end of the first 800 epochs
   1ee1aba5ed dataset: https://github.com/official-stockfish/Stockfish/pull/4782
```

L1 weights permuted with:

```bash
python3 serialize.py $nnue $nnue_permuted \
  --features=HalfKAv2_hm \
  --ft_optimize \
  --ft_optimize_data=/data/fishpack32.binpack \
  --ft_optimize_count=10000
```

Speed measurements from 100 bench runs at depth 13 with profile-build x86-64-avx2:

```
sf_base =    1329051 +/- 2224 (95%)
sf_test =    1163344 +/- 2992 (95%)
diff    =    -165706 +/- 4913 (95%)
speedup = -12.46807% +/- 0.370% (95%)
```
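As a quick check on the bench summary above, assuming the reported speedup is simply (sf_test - sf_base) / sf_base, the quoted figures are self-consistent; the one-unit difference in diff presumably comes from averaging per-run differences before rounding.

```python
# Sanity check on the bench summary, assuming
# speedup = (sf_test - sf_base) / sf_base.
sf_base = 1329051
sf_test = 1163344

diff = sf_test - sf_base      # -165707 (the commit quotes -165706)
speedup = diff / sf_base      # about -0.1247
print(f"diff = {diff}, speedup = {speedup:.5%}")   # ~ -12.47%
```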
Training data can be found at: https://robotmoon.com/nnue-training-data/

Local elo at 25k nodes per move (vs. L1-2048 nn-1ee1aba5ed4c.nnue)
ep959 : 16.2 +/- 2.3

Failed 10+0.1 STC:
https://tests.stockfishchess.org/tests/view/6501beee2cd016da89abab21
LLR: -2.92 (-2.94,2.94) <0.00,2.00>
Total: 13184 W: 3285 L: 3535 D: 6364
Ptnml(0-2): 85, 1662, 3334, 1440, 71

Failed 180+1.8 VLTC:
https://tests.stockfishchess.org/tests/view/6505cf9a72620bc881ea908e
LLR: -2.94 (-2.94,2.94) <0.00,2.00>
Total: 64248 W: 16224 L: 16374 D: 31650
Ptnml(0-2): 26, 6788, 18640, 6650, 20

Passed 60+0.6 th 8 VLTC SMP (STC bounds):
https://tests.stockfishchess.org/tests/view/65084a4618698b74c2e541dc
LLR: 2.95 (-2.94,2.94) <0.00,2.00>
Total: 90630 W: 23372 L: 23033 D: 44225
Ptnml(0-2): 13, 8490, 27968, 8833, 11

Passed 60+0.6 th 8 VLTC SMP:
https://tests.stockfishchess.org/tests/view/6501d45d2cd016da89abacdb
LLR: 2.95 (-2.94,2.94) <0.50,2.50>
Total: 137804 W: 35764 L: 35276 D: 66764
Ptnml(0-2): 31, 13006, 42326, 13522, 17

closes https://github.com/official-stockfish/Stockfish/pull/4795

bench 1246812
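Each fishtest entry above reports a sequential probability ratio test: the quoted LLR is compared against the bounds in parentheses, while the <0.00,2.00> and <0.50,2.50> figures give the hypothesised Elo gains being discriminated. The ±2.94 bounds match what the usual α = β = 0.05 error rates imply (an assumption about fishtest's configuration, not something stated in the commit):

```python
import math

# SPRT acceptance bounds implied by alpha = beta = 0.05 (assumed, see note above).
alpha = beta = 0.05
upper = math.log((1 - beta) / alpha)   # ~ +2.944: the test passes when LLR reaches it
lower = math.log(beta / (1 - alpha))   # ~ -2.944: the test fails when LLR falls to it
print(f"bounds = ({lower:.2f}, {upper:.2f})")   # (-2.94, 2.94)
```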