Dev Builds » 20230606-1917

You are viewing an old NCM Stockfish dev build test. The most recent dev build tests, which use Stockfish 15 as the baseline, are available on the current dev builds page.

NCM plays 20,000 games between each Stockfish dev build and Stockfish 7. This yields an approximate Elo difference and establishes confidence in the strength of the dev builds.

Summary

| Host | Duration | Avg Base NPS | Games | Wins | Losses | Draws | Elo |
|---|---|---|---|---|---|---|---|
| ncm-et-3 | 09:28:36 | 1955145 | 3337 | 2861 | 3 | 473 | +444.68 ± 15.66 |
| ncm-et-4 | 09:28:33 | 1950941 | 3322 | 2862 | 5 | 455 | +449.39 ± 15.98 |
| ncm-et-9 | 09:28:31 | 1953062 | 3343 | 2837 | 4 | 502 | +433.25 ± 15.19 |
| ncm-et-10 | 09:28:35 | 1945926 | 3332 | 2838 | 7 | 487 | +435.98 ± 15.44 |
| ncm-et-13 | 09:28:41 | 1952481 | 3337 | 2879 | 1 | 457 | +452.65 ± 15.92 |
| ncm-et-15 | 09:28:43 | 1956364 | 3329 | 2864 | 9 | 456 | +446.2 ± 15.97 |
| Total | | | 20000 | 17141 | 29 | 2830 | +443.57 ± 6.39 |
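
The Elo figures above follow from the raw win/loss/draw counts via the standard logistic model. The following is a minimal sketch, not NCM's actual code, which reproduces the totals row under the assumption that the ± value is a symmetric 95% confidence interval from a normal approximation of the per-game score (an assumption that matches the reported numbers):

```python
import math

def elo_with_ci(wins, losses, draws, z=1.96):
    """Logistic Elo difference and (assumed 95%) confidence margin from a W/L/D record."""
    games = wins + losses + draws
    score = (wins + 0.5 * draws) / games                   # mean score per game
    # per-game score variance (win = 1, draw = 0.5, loss = 0)
    var = (wins * 1.0 + draws * 0.25) / games - score ** 2
    stderr = math.sqrt(var / games)

    def to_elo(s):
        return -400 * math.log10(1 / s - 1)

    elo = to_elo(score)
    margin = (to_elo(min(score + z * stderr, 1 - 1e-12))
              - to_elo(max(score - z * stderr, 1e-12))) / 2
    return elo, margin

# Totals row above: 17141 wins, 29 losses, 2830 draws over 20000 games
print(elo_with_ci(17141, 29, 2830))   # ≈ (443.6, 6.4), i.e. +443.57 ± 6.39
```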

Test Detail

| ID | Host | Started (UTC) | Duration | Base NPS | Games | Wins | Losses | Draws | Elo |
|---|---|---|---|---|---|---|---|---|---|
| 191230 | ncm-et-4 | 2023-06-09 05:38 | 00:54:50 | 1947883 | 322 | 275 | 0 | 47 | +441.54 ± 50.63 |
| 191227 | ncm-et-15 | 2023-06-09 05:36 | 00:56:06 | 1955308 | 329 | 277 | 1 | 51 | +422.99 ± 48.59 |
| 191225 | ncm-et-10 | 2023-06-09 05:36 | 00:56:22 | 1945405 | 332 | 291 | 0 | 41 | +472.67 ± 54.49 |
| 191223 | ncm-et-13 | 2023-06-09 05:35 | 00:57:29 | 1944710 | 337 | 297 | 0 | 40 | +480.0 ± 55.23 |
| 191222 | ncm-et-3 | 2023-06-09 05:34 | 00:58:14 | 1946070 | 337 | 289 | 1 | 47 | +442.27 ± 50.76 |
| 191221 | ncm-et-9 | 2023-06-09 05:34 | 00:58:21 | 1955265 | 343 | 298 | 1 | 44 | +457.37 ± 52.58 |
| 191219 | ncm-et-10 | 2023-06-09 04:12 | 01:23:21 | 1971735 | 500 | 430 | 1 | 69 | +446.7 ± 41.57 |
| 191218 | ncm-et-4 | 2023-06-09 04:12 | 01:25:18 | 1952687 | 500 | 427 | 0 | 73 | +441.5 ± 40.29 |
| 191217 | ncm-et-15 | 2023-06-09 04:10 | 01:25:43 | 1955720 | 500 | 429 | 1 | 70 | +444.09 ± 41.26 |
| 191215 | ncm-et-9 | 2023-06-09 04:09 | 01:24:12 | 1956526 | 500 | 433 | 1 | 66 | +454.76 ± 42.55 |
| 191214 | ncm-et-3 | 2023-06-09 04:09 | 01:24:46 | 1960157 | 500 | 431 | 0 | 69 | +452.04 ± 41.5 |
| 191212 | ncm-et-13 | 2023-06-09 04:07 | 01:27:19 | 1944946 | 500 | 422 | 0 | 78 | +429.05 ± 38.92 |
| 191208 | ncm-et-10 | 2023-06-09 02:46 | 01:25:09 | 1946731 | 500 | 420 | 1 | 79 | +421.93 ± 38.73 |
| 191207 | ncm-et-4 | 2023-06-09 02:45 | 01:26:00 | 1956178 | 500 | 436 | 0 | 64 | +466.03 ± 43.18 |
| 191206 | ncm-et-13 | 2023-06-09 02:44 | 01:22:23 | 1960338 | 500 | 424 | 0 | 76 | +433.94 ± 39.45 |
| 191205 | ncm-et-3 | 2023-06-09 02:43 | 01:24:47 | 1956050 | 500 | 430 | 0 | 70 | +449.35 ± 41.19 |
| 191204 | ncm-et-9 | 2023-06-09 02:43 | 01:25:45 | 1951955 | 500 | 421 | 1 | 78 | +424.28 ± 38.99 |
| 191203 | ncm-et-15 | 2023-06-09 02:42 | 01:27:26 | 1957133 | 500 | 433 | 4 | 63 | +446.7 ± 43.57 |
| 191196 | ncm-et-10 | 2023-06-09 01:20 | 01:25:21 | 1951125 | 500 | 423 | 1 | 76 | +429.05 ± 39.52 |
| 191195 | ncm-et-15 | 2023-06-09 01:18 | 01:23:07 | 1964149 | 500 | 438 | 0 | 62 | +471.92 ± 43.9 |
| 191194 | ncm-et-3 | 2023-06-09 01:17 | 01:25:40 | 1956942 | 500 | 438 | 0 | 62 | +471.92 ± 43.9 |
| 191193 | ncm-et-4 | 2023-06-09 01:17 | 01:27:34 | 1937281 | 500 | 419 | 2 | 79 | +417.32 ± 38.77 |
| 191192 | ncm-et-13 | 2023-06-09 01:16 | 01:27:24 | 1949343 | 500 | 437 | 0 | 63 | +468.95 ± 43.54 |
| 191191 | ncm-et-9 | 2023-06-09 01:16 | 01:26:33 | 1945911 | 500 | 421 | 0 | 79 | +426.65 ± 38.66 |
| 191184 | ncm-et-10 | 2023-06-08 23:54 | 01:25:21 | 1946456 | 500 | 429 | 0 | 71 | +446.7 ± 40.89 |
| 191183 | ncm-et-3 | 2023-06-08 23:53 | 01:23:36 | 1961119 | 500 | 426 | 1 | 73 | +436.43 ± 40.36 |
| 191182 | ncm-et-15 | 2023-06-08 23:53 | 01:24:51 | 1953159 | 500 | 431 | 0 | 69 | +452.04 ± 41.5 |
| 191181 | ncm-et-13 | 2023-06-08 23:50 | 01:25:04 | 1953337 | 500 | 438 | 0 | 62 | +471.92 ± 43.9 |
| 191180 | ncm-et-4 | 2023-06-08 23:50 | 01:26:07 | 1944845 | 500 | 437 | 1 | 62 | +466.04 ± 43.97 |
| 191179 | ncm-et-9 | 2023-06-08 23:49 | 01:25:41 | 1939403 | 500 | 423 | 0 | 77 | +431.48 ± 39.18 |
| 191172 | ncm-et-10 | 2023-06-08 22:28 | 01:25:42 | 1943642 | 500 | 427 | 2 | 71 | +436.43 ± 40.99 |
| 191171 | ncm-et-3 | 2023-06-08 22:27 | 01:25:32 | 1958015 | 500 | 424 | 1 | 75 | +431.48 ± 39.8 |
| 191170 | ncm-et-13 | 2023-06-08 22:26 | 01:23:16 | 1960331 | 500 | 422 | 1 | 77 | +426.65 ± 39.25 |
| 191169 | ncm-et-15 | 2023-06-08 22:26 | 01:26:14 | 1950418 | 500 | 426 | 2 | 72 | +433.94 ± 40.69 |
| 191168 | ncm-et-4 | 2023-06-08 22:26 | 01:23:52 | 1959390 | 500 | 431 | 2 | 67 | +446.7 ± 42.25 |
| 191167 | ncm-et-9 | 2023-06-08 22:24 | 01:24:04 | 1965401 | 500 | 425 | 1 | 74 | +433.94 ± 40.08 |
| 191154 | ncm-et-4 | 2023-06-08 21:00 | 01:24:52 | 1958323 | 500 | 437 | 0 | 63 | +468.95 ± 43.54 |
| 191153 | ncm-et-10 | 2023-06-08 21:00 | 01:27:19 | 1916390 | 500 | 418 | 2 | 80 | +415.05 ± 38.52 |
| 191152 | ncm-et-13 | 2023-06-08 21:00 | 01:25:46 | 1954362 | 500 | 439 | 0 | 61 | +474.93 ± 44.28 |
| 191151 | ncm-et-3 | 2023-06-08 21:00 | 01:26:01 | 1947666 | 500 | 423 | 0 | 77 | +431.48 ± 39.18 |
| 191150 | ncm-et-9 | 2023-06-08 21:00 | 01:23:55 | 1956978 | 500 | 416 | 0 | 84 | +415.04 ± 37.43 |
| 191149 | ncm-et-15 | 2023-06-08 21:00 | 01:25:16 | 1958664 | 500 | 430 | 1 | 69 | +446.7 ± 41.57 |
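
Each Summary row is simply the aggregate of that host's runs in the table above. As a quick illustrative check (values copied from the table), summing ncm-et-3's seven runs reproduces its Summary line:

```python
# ncm-et-3's runs from the Test Detail table: (games, wins, losses, draws)
ncm_et_3_runs = [
    (337, 289, 1, 47),
    (500, 431, 0, 69),
    (500, 430, 0, 70),
    (500, 438, 0, 62),
    (500, 426, 1, 73),
    (500, 424, 1, 75),
    (500, 423, 0, 77),
]
games, wins, losses, draws = (sum(col) for col in zip(*ncm_et_3_runs))
print(games, wins, losses, draws)   # 3337 2861 3 473, matching the Summary row
```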

Commit

Commit ID 373359b44d0947cce2628a9a8c9b432a458615a8
Author Linmiao Xu
Date 2023-06-06 19:17:36 UTC

Update default net to nn-0dd1cebea573.nnue

Created by retraining an earlier epoch of the experiment leading to the first SFNNv6 net on a more-randomized version of the nn-e1fb1ade4432.nnue dataset mixed with unfiltered T80 apr2023 data. Trained using early-fen-skipping 28 and max-epoch 960.

The trainer settings and epochs used in the 5-step training sequence leading here were:

1. train from scratch for 400 epochs, lambda 1.0, constant LR 9.75e-4, T79T77-filter-v6-dd.min.binpack
2. retrain ep379, max-epoch 800, end-lambda 0.75, T60T70wIsRightFarseerT60T74T75T76.binpack
3. retrain ep679, max-epoch 800, end-lambda 0.75, skip 28, nn-e1fb1ade4432 dataset
4. retrain ep799, max-epoch 800, end-lambda 0.7, skip 28, nn-e1fb1ade4432 dataset
5. retrain ep439, max-epoch 960, end-lambda 0.7, skip 28, shuffled nn-e1fb1ade4432 + T80 apr2023

This net was epoch 559 of the final (step 5) retraining:

```bash
python3 easy_train.py \
  --experiment-name L1-1536-Re4-leela96-dfrc99-T60novdec-v2-T80juntonovjanfebT79aprmayT78jantosepT77dec-v6dd-T80apr-shuffled-sk28 \
  --training-dataset /data/leela96-dfrc99-T60novdec-v2-T80juntonovjanfebT79aprmayT78jantosepT77dec-v6dd-T80apr.binpack \
  --nnue-pytorch-branch linrock/nnue-pytorch/misc-fixes-L1-1536 \
  --early-fen-skipping 28 \
  --start-lambda 1.0 \
  --end-lambda 0.7 \
  --max_epoch 960 \
  --start-from-engine-test-net False \
  --start-from-model /data/L1-1536-Re3-nn-epoch439.nnue \
  --engine-test-branch linrock/Stockfish/L1-1536 \
  --lr 4.375e-4 \
  --gamma 0.995 \
  --tui False \
  --seed $RANDOM \
  --gpus "0,"
```

During data preparation, most binpacks were unminimized by removing positions with score 32002 (`VALUE_NONE`). This makes the tradeoff of increasing dataset filesize on disk to increase the randomness of positions in interleaved datasets. The code used for unminimizing is at:
https://github.com/linrock/Stockfish/tree/tools-unminify

For preparing the dataset used in this experiment:

```bash
python3 interleave_binpacks.py \
  leela96-filt-v2.binpack \
  dfrc99-16tb7p-eval-filt-v2.binpack \
  filt-v6-dd-min/test60-novdec2021-12tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd-min/test80-aug2022-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd-min/test80-sep2022-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd-min/test80-jun2022-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd/test80-jul2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd/test80-oct2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd/test80-nov2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd-min/test80-jan2023-3of3-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd-min/test80-feb2023-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd/test79-apr2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd/test79-may2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd-min/test78-jantomay2022-16tb7p-filter-v6-dd.min-mar2023.unmin.binpack \
  filt-v6-dd/test78-juntosep2022-16tb7p-filter-v6-dd.binpack \
  filt-v6-dd/test77-dec2021-16tb7p-filter-v6-dd.binpack \
  test80-apr2023-2tb7p.binpack \
  /data/leela96-dfrc99-T60novdec-v2-T80juntonovjanfebT79aprmayT78jantosepT77dec-v6dd-T80apr.binpack
```

T80 apr2023 data was converted using lc0-rescorer with ~2tb of tablebases and can be found at:
https://robotmoon.com/nnue-training-data/

Local elo at 25k nodes per move vs. nn-e1fb1ade4432.nnue (L1 size 1024):
nn-epoch559.nnue : 25.7 +/- 1.6

Passed STC:
https://tests.stockfishchess.org/tests/view/647cd3b87cf638f0f53f9cbb
LLR: 2.95 (-2.94,2.94) <0.00,2.00>
Total: 59200 W: 16000 L: 15660 D: 27540
Ptnml(0-2): 159, 6488, 15996, 6768, 189

Passed LTC:
https://tests.stockfishchess.org/tests/view/647d58de726f6b400e4085d8
LLR: 2.95 (-2.94,2.94) <0.50,2.50>
Total: 58800 W: 16002 L: 15657 D: 27141
Ptnml(0-2): 44, 5607, 17748, 5962, 39

closes https://github.com/official-stockfish/Stockfish/pull/4606

bench 2141197
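
The actual unminimizing and interleaving described in the commit are done with the linked binpack tools. Purely as a conceptual sketch of the idea — drop `VALUE_NONE` placeholder scores so positions can be freely reordered, then mix and shuffle the sources — and using hypothetical in-memory (fen, score) records rather than the real .binpack format:

```python
import random

VALUE_NONE = 32002  # placeholder score the commit removes while unminimizing


def unminimize(records):
    """Drop VALUE_NONE placeholder entries so each remaining position stands alone."""
    return [(fen, score) for fen, score in records if score != VALUE_NONE]


def interleave(datasets, seed=0):
    """Merge several datasets and shuffle, trading file size for position randomness."""
    rng = random.Random(seed)
    merged = [record for dataset in datasets for record in unminimize(dataset)]
    rng.shuffle(merged)
    return merged


# Hypothetical toy stand-ins for the binpacks listed in the commit
test77_dec2021 = [("fen-a", 35), ("fen-b", VALUE_NONE)]
test80_apr2023 = [("fen-c", -12), ("fen-d", 104)]
mixed = interleave([test77_dec2021, test80_apr2023])
```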