Dev Builds » 20220514-1047


NCM plays each Stockfish dev build 20,000 times against Stockfish 15. This yields an approximate Elo difference and establishes confidence in the strength of the dev builds.

Summary

| Host | Duration | Avg Base NPS | Games | W / L / D | Standard Elo | Ptnml(0-2) | Gamepair Elo |
|------|----------|--------------|-------|-----------|--------------|------------|--------------|
| ncm-dbt-01 | 06:52:11 | 583167 | 4000 | 1076 / 967 / 1957 | +9.47 ± 5.17 | 5, 391, 1097, 504, 3 | +19.3 ± 10.23 |
| ncm-dbt-02 | 06:52:17 | 585017 | 4000 | 1051 / 1012 / 1937 | +3.39 ± 5.13 | 5, 415, 1120, 456, 4 | +6.95 ± 10.1 |
| ncm-dbt-03 | 06:52:42 | 585046 | 3998 | 1091 / 988 / 1919 | +8.95 ± 5.24 | 5, 406, 1072, 513, 3 | +18.27 ± 10.37 |
| ncm-dbt-04 | 06:52:27 | 568733 | 4002 | 1056 / 1007 / 1939 | +4.25 ± 5.17 | 3, 425, 1096, 474, 3 | +8.51 ± 10.24 |
| ncm-dbt-05 | 06:51:33 | 579853 | 4000 | 1074 / 1004 / 1922 | +6.08 ± 5.36 | 7, 436, 1042, 510, 5 | +12.51 ± 10.54 |
| Total | | | 20000 | 5348 / 4978 / 9674 | +6.43 ± 2.33 | 25, 2073, 5427, 2457, 18 | +13.1 ± 4.6 |
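The Standard Elo and Gamepair Elo columns are consistent with the usual logistic score-to-Elo conversion: Standard Elo is computed from the raw W/L/D totals, while Gamepair Elo scores each game pair as a single result from its pentanomial bucket (Ptnml(0-2) counts pairs scoring 0, 0.5, 1, 1.5 and 2 points). The Python sketch below illustrates that reading; it is not NCM's actual code. It reproduces the point estimates for the ncm-dbt-01 row, while the ± intervals shown in the tables come from NCM's own tooling and are not derived here.

```python
import math

def logistic_elo(wins: int, losses: int, draws: int) -> float:
    """Convert a W/L/D record into a logistic Elo difference."""
    score = (wins + 0.5 * draws) / (wins + losses + draws)
    return -400 * math.log10(1 / score - 1)

def gamepair_elo(ptnml: list[int]) -> float:
    """Score each game pair as one result: pairs scoring above 1 point
    count as wins, below 1 as losses, exactly 1 as draws."""
    pair_wins = ptnml[3] + ptnml[4]
    pair_losses = ptnml[0] + ptnml[1]
    pair_draws = ptnml[2]
    return logistic_elo(pair_wins, pair_losses, pair_draws)

# ncm-dbt-01 row from the summary table
print(round(logistic_elo(1076, 967, 1957), 2))         # 9.47
print(round(gamepair_elo([5, 391, 1097, 504, 3]), 2))  # 19.3
```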

Test Detail

| ID | Host | Base NPS | Games | W / L / D | Standard Elo | Ptnml(0-2) | Gamepair Elo |
|----|------|----------|-------|-----------|--------------|------------|--------------|
| 432723 | ncm-dbt-04 | 568832 | 2 | 0 / 1 / 1 | -189.7 ± 55.98 | 0, 1, 0, 0, 0 | -1129.65 ± 376.02 |
| 432722 | ncm-dbt-01 | 583866 | 500 | 144 / 123 / 233 | +14.6 ± 14.14 | 0, 44, 141, 65, 0 | +29.25 ± 28.47 |
| 432721 | ncm-dbt-03 | 583992 | 498 | 135 / 123 / 240 | +8.37 ± 15.09 | 0, 54, 130, 64, 1 | +15.36 ± 29.9 |
| 432720 | ncm-dbt-02 | 585464 | 500 | 136 / 123 / 241 | +9.04 ± 14.71 | 1, 48, 139, 61, 1 | +18.08 ± 28.75 |
| 432719 | ncm-dbt-05 | 580696 | 500 | 125 / 115 / 260 | +6.95 ± 15.03 | 0, 56, 128, 66, 0 | +13.9 ± 30.16 |
| 432718 | ncm-dbt-04 | 565981 | 500 | 130 / 123 / 247 | +4.86 ± 14.6 | 0, 54, 135, 61, 0 | +9.73 ± 29.28 |
| 432717 | ncm-dbt-01 | 584201 | 500 | 135 / 115 / 250 | +13.91 ± 14.21 | 1, 42, 143, 64, 0 | +29.25 ± 28.2 |
| 432716 | ncm-dbt-03 | 585464 | 500 | 135 / 118 / 247 | +11.82 ± 15.55 | 1, 53, 125, 70, 1 | +23.66 ± 30.53 |
| 432715 | ncm-dbt-05 | 582110 | 500 | 137 / 129 / 234 | +5.56 ± 14.92 | 2, 50, 136, 62, 0 | +13.9 ± 29.15 |
| 432714 | ncm-dbt-04 | 569310 | 500 | 142 / 141 / 217 | +0.69 ± 15.11 | 0, 60, 130, 59, 1 | -0.0 ± 29.91 |
| 432713 | ncm-dbt-02 | 585337 | 500 | 123 / 130 / 247 | -4.86 ± 14.86 | 0, 63, 131, 56, 0 | -9.73 ± 29.79 |
| 432712 | ncm-dbt-03 | 584832 | 500 | 142 / 118 / 240 | +16.69 ± 14.17 | 0, 42, 143, 64, 1 | +32.05 ± 28.2 |
| 432711 | ncm-dbt-01 | 583405 | 500 | 141 / 118 / 241 | +15.99 ± 15.14 | 0, 50, 128, 71, 1 | +30.65 ± 30.16 |
| 432710 | ncm-dbt-05 | 576414 | 500 | 131 / 138 / 231 | -4.86 ± 14.48 | 0, 60, 137, 53, 0 | -9.73 ± 29.02 |
| 432709 | ncm-dbt-04 | 569430 | 500 | 130 / 117 / 253 | +9.04 ± 14.71 | 0, 52, 133, 65, 0 | +18.08 ± 29.53 |
| 432708 | ncm-dbt-02 | 586858 | 500 | 140 / 117 / 243 | +15.99 ± 14.51 | 0, 46, 135, 69, 0 | +32.05 ± 29.26 |
| 432707 | ncm-dbt-03 | 585126 | 500 | 151 / 126 / 223 | +17.39 ± 15.49 | 1, 50, 122, 77, 0 | +36.26 ± 30.91 |
| 432706 | ncm-dbt-05 | 572235 | 500 | 143 / 122 / 235 | +14.6 ± 14.78 | 1, 46, 134, 69, 0 | +30.65 ± 29.39 |
| 432705 | ncm-dbt-01 | 584958 | 500 | 133 / 128 / 239 | +3.47 ± 14.22 | 1, 47, 150, 50, 2 | +5.56 ± 27.29 |
| 432704 | ncm-dbt-02 | 584706 | 500 | 132 / 125 / 243 | +4.86 ± 15.1 | 2, 51, 136, 60, 1 | +11.12 ± 29.15 |
| 432703 | ncm-dbt-04 | 567086 | 500 | 130 / 129 / 241 | +0.69 ± 14.87 | 0, 58, 134, 57, 1 | -0.0 ± 29.41 |
| 432702 | ncm-dbt-03 | 587664 | 500 | 133 / 132 / 235 | +0.7 ± 14.36 | 3, 46, 148, 53, 0 | +5.56 ± 27.57 |
| 432701 | ncm-dbt-02 | 585253 | 500 | 121 / 137 / 242 | -11.12 ± 13.71 | 1, 56, 151, 42, 0 | -20.87 ± 27.13 |
| 432700 | ncm-dbt-05 | 582778 | 500 | 128 / 124 / 248 | +2.78 ± 15.89 | 2, 58, 126, 62, 2 | +5.56 ± 30.41 |
| 432699 | ncm-dbt-01 | 582235 | 500 | 115 / 117 / 268 | -1.39 ± 15.42 | 1, 62, 125, 62, 0 | -1.39 ± 30.53 |
| 432698 | ncm-dbt-04 | 569629 | 500 | 136 / 110 / 254 | +18.08 ± 13.75 | 0, 39, 146, 65, 0 | +36.26 ± 27.78 |
| 432697 | ncm-dbt-03 | 586520 | 500 | 140 / 121 / 239 | +13.21 ± 14.02 | 0, 44, 143, 63, 0 | +26.46 ± 28.21 |
| 432696 | ncm-dbt-05 | 579124 | 500 | 137 / 126 / 237 | +7.64 ± 15.09 | 1, 52, 133, 63, 1 | +15.3 ± 29.53 |
| 432695 | ncm-dbt-02 | 582903 | 500 | 129 / 128 / 243 | +0.7 ± 13.96 | 1, 49, 148, 52, 0 | +2.78 ± 27.57 |
| 432694 | ncm-dbt-01 | 583908 | 500 | 133 / 121 / 246 | +8.34 ± 14.78 | 1, 50, 135, 64, 0 | +18.08 ± 29.27 |
| 432693 | ncm-dbt-04 | 568037 | 500 | 117 / 128 / 255 | -7.64 ± 14.46 | 0, 62, 137, 51, 0 | -15.3 ± 29.02 |
| 432692 | ncm-dbt-03 | 582277 | 500 | 138 / 134 / 228 | +2.78 ± 15.05 | 0, 59, 128, 63, 0 | +5.56 ± 30.16 |
| 432691 | ncm-dbt-05 | 582485 | 500 | 135 / 124 / 241 | +7.65 ± 15.58 | 1, 57, 122, 70, 0 | +16.69 ± 30.9 |
| 432690 | ncm-dbt-04 | 570148 | 500 | 146 / 143 / 211 | +2.09 ± 14.23 | 2, 47, 147, 54, 0 | +6.95 ± 27.7 |
| 432689 | ncm-dbt-02 | 584201 | 500 | 140 / 127 / 233 | +9.03 ± 14.32 | 0, 47, 145, 56, 2 | +15.3 ± 27.96 |
| 432688 | ncm-dbt-01 | 579992 | 500 | 133 / 115 / 252 | +12.51 ± 14.36 | 0, 47, 138, 65, 0 | +25.06 ± 28.88 |
| 432687 | ncm-dbt-01 | 582778 | 500 | 142 / 130 / 228 | +8.34 ± 14.65 | 1, 49, 137, 63, 0 | +18.08 ± 29.01 |
| 432686 | ncm-dbt-03 | 584495 | 500 | 117 / 116 / 267 | +0.69 ± 14.74 | 0, 58, 133, 59, 0 | +1.39 ± 29.53 |
| 432685 | ncm-dbt-04 | 570148 | 500 | 125 / 115 / 260 | +6.95 ± 15.03 | 1, 52, 134, 62, 1 | +13.9 ± 29.4 |
| 432684 | ncm-dbt-05 | 582987 | 500 | 138 / 126 / 236 | +8.34 ± 15.51 | 0, 57, 126, 65, 2 | +13.9 ± 30.41 |
| 432683 | ncm-dbt-02 | 585421 | 500 | 130 / 125 / 245 | +3.47 ± 14.61 | 0, 55, 135, 60, 0 | +6.95 ± 29.28 |

Commit

Commit ID c079acc26f93acc2eda08c7218c60559854f52f0
Author Tomasz Sobczyk
Date 2022-05-14 10:47:22 UTC
Update NNUE architecture to SFNNv5. Update network to nn-3c0aa92af1da.nnue.

Architecture changes:
- Duplicated activation after the 1024->15 layer with squared crelu (so 15->15*2). As proposed by vondele.

Trainer changes:
- Added bias to L1 factorization, which was previously missing (no measurable improvement but at least neutral in principle).
- For retraining, linearly reduce the lambda parameter from 1.0 at epoch 0 to 0.75 at epoch 800.
- Reduce max_skipping_rate from 15 to 10 (compared to vondele's outstanding PR).

Note: This network was trained with a ~0.8% error in quantization regarding the newly added activation function. This will be fixed in the released trainer version. Expect a trainer PR tomorrow.

Note: The inference implementation cuts a corner to merge results from two activation functions. This could possibly be resolved nicer in the future. AVX2 implementation likely not necessary, but NEON is missing.

First training session invocation:

    python3 train.py \
        ../nnue-pytorch-training/data/nodes5000pv2_UHO.binpack \
        ../nnue-pytorch-training/data/nodes5000pv2_UHO.binpack \
        --gpus "$3," \
        --threads 4 \
        --num-workers 8 \
        --batch-size 16384 \
        --progress_bar_refresh_rate 20 \
        --random-fen-skipping 3 \
        --features=HalfKAv2_hm^ \
        --lambda=1.0 \
        --max_epochs=400 \
        --default_root_dir ../nnue-pytorch-training/experiment_$1/run_$2

Second training session invocation:

    python3 train.py \
        ../nnue-pytorch-training/data/T60T70wIsRightFarseerT60T74T75T76.binpack \
        ../nnue-pytorch-training/data/T60T70wIsRightFarseerT60T74T75T76.binpack \
        --gpus "$3," \
        --threads 4 \
        --num-workers 8 \
        --batch-size 16384 \
        --progress_bar_refresh_rate 20 \
        --random-fen-skipping 3 \
        --features=HalfKAv2_hm^ \
        --start-lambda=1.0 \
        --end-lambda=0.75 \
        --gamma=0.995 \
        --lr=4.375e-4 \
        --max_epochs=800 \
        --resume-from-model /data/sopel/nnue/nnue-pytorch-training/data/exp367/nn-exp367-run3-epoch399.pt \
        --default_root_dir ../nnue-pytorch-training/experiment_$1/run_$2

Passed STC:
LLR: 2.95 (-2.94,2.94) <0.00,2.50>
Total: 27288 W: 7445 L: 7178 D: 12665
Ptnml(0-2): 159, 3002, 7054, 3271, 158
https://tests.stockfishchess.org/tests/view/627e8c001919125939623644

Passed LTC:
LLR: 2.95 (-2.94,2.94) <0.50,3.00>
Total: 21792 W: 5969 L: 5727 D: 10096
Ptnml(0-2): 25, 2152, 6294, 2406, 19
https://tests.stockfishchess.org/tests/view/627f2a855734b18b2e2ece47

closes https://github.com/official-stockfish/Stockfish/pull/4020

Bench: 6481017
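For illustration, here is a minimal sketch of the duplicated-activation idea described in the architecture change above, assuming "squared crelu" means squaring the clipped-ReLU output and that the two activations are simply concatenated. The function names and the NumPy formulation are illustrative, not Stockfish's or the trainer's actual code.

```python
import numpy as np

def clipped_relu(x: np.ndarray) -> np.ndarray:
    # standard NNUE clipped ReLU: clamp activations to [0, 1]
    return np.clip(x, 0.0, 1.0)

def duplicated_activation(x: np.ndarray) -> np.ndarray:
    # the 15 outputs of the 1024->15 layer pass through both a clipped ReLU
    # and its square, and the results are concatenated: 15 -> 15*2 = 30
    c = clipped_relu(x)
    return np.concatenate([c, c * c], axis=-1)

# hypothetical 15-dim layer output, just to show the resulting shape
layer_out = np.random.randn(15).astype(np.float32)
print(duplicated_activation(layer_out).shape)  # (30,)
```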
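The STC and LTC lines report a sequential probability ratio test: the (-2.94, 2.94) interval is consistent with the usual Wald SPRT stopping thresholds for error rates alpha = beta = 0.05, and the bracketed pair gives the Elo hypotheses being tested (0.00 vs 2.50 at STC, 0.50 vs 3.00 at LTC). A minimal sketch of where those thresholds come from, for illustration only and not fishtest's implementation:

```python
import math

def sprt_bounds(alpha: float = 0.05, beta: float = 0.05) -> tuple[float, float]:
    """Wald SPRT stopping thresholds for the log-likelihood ratio (LLR):
    the test fails when LLR falls below the lower bound and passes when
    it rises above the upper bound."""
    lower = math.log(beta / (1 - alpha))
    upper = math.log((1 - beta) / alpha)
    return lower, upper

print(sprt_bounds())  # approximately (-2.94, 2.94)
```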