Dev Builds » 20220514-1047

You are viewing an archived NCM Stockfish dev build test. The most recent dev build tests use Stockfish 15 as the baseline.


NCM plays each Stockfish dev build 20,000 times against Stockfish 7. This yields an approximate Elo difference relative to that baseline and a confidence interval for the dev build's strength.
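The reported Elo difference can be recovered from the aggregate win/loss/draw counts via the standard logistic Elo model, inverting the expected-score formula score = 1 / (1 + 10^(-elo/400)). A minimal sketch, using the totals from the Summary table below:

```python
import math

def elo_diff(wins, losses, draws):
    """Estimate the Elo difference from game results using the
    logistic model: score = 1 / (1 + 10**(-elo/400))."""
    games = wins + losses + draws
    score = (wins + 0.5 * draws) / games  # average score per game
    return -400 * math.log10(1 / score - 1)

# Aggregate totals from the Summary table: 17146 wins, 46 losses, 2808 draws
print(round(elo_diff(17146, 46, 2808), 2))  # ≈ +442.79
```

This matches the +442.79 figure in the Summary table's Total row; the ± term is a separate confidence-interval estimate.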

Summary

Host       Duration   Avg Base NPS   Games   Wins    Losses   Draws   Elo
ncm-et-3   11:23:22   1960509        3375    2883    5        487     +439.89 ± 15.44
ncm-et-4   11:23:32   1904843        3255    2840    13       402     +461.04 ± 17.03
ncm-et-9   11:23:39   1960948        3346    2839    7        500     +431.95 ± 15.24
ncm-et-10  11:15:43   1956523        3300    2827    3        470     +443.77 ± 15.71
ncm-et-13  11:23:33   1963177        3369    2869    10       490     +434.71 ± 15.40
ncm-et-15  11:23:16   1959784        3355    2888    8        459     +447.26 ± 15.92
Total                                20000   17146   46       2808    +442.79 ± 6.42

Test Detail

ID Host Started (UTC) Duration Base NPS Games Wins Losses Draws Elo
158479 ncm-et-10 2022-05-16 10:11 00:57:17 1962163 281 245 1 35 +460.78 ± 59.37
158478 ncm-et-3 2022-05-16 10:11 00:57:20 1963381 289 249 0 40 +451.48 ± 55.14
158477 ncm-et-4 2022-05-16 10:10 00:58:29 1646063 236 204 0 32 +455.31 ± 62.1
158476 ncm-et-13 2022-05-16 10:08 00:59:49 1964459 297 252 2 43 +426.36 ± 53.19
158475 ncm-et-15 2022-05-16 10:08 01:00:24 1962462 296 251 1 44 +429.77 ± 52.52
158474 ncm-et-9 2022-05-16 10:08 01:00:34 1961534 298 250 0 48 +423.01 ± 50.01
158473 ncm-et-15 2022-05-16 08:28 01:39:47 1964301 500 438 1 61 +468.96 ± 44.35
158472 ncm-et-4 2022-05-16 08:27 01:41:36 1967236 500 435 2 63 +457.52 ± 43.63
158471 ncm-et-9 2022-05-16 08:27 01:39:46 1961859 500 416 1 83 +412.8 ± 37.74
158470 ncm-et-13 2022-05-16 08:27 01:40:23 1962303 500 429 1 70 +444.09 ± 41.26
158469 ncm-et-10 2022-05-16 08:27 01:43:01 1953281 500 428 0 72 +444.08 ± 40.59
158468 ncm-et-3 2022-05-16 08:27 01:42:43 1956485 500 425 1 74 +433.94 ± 40.08
158425 ncm-et-4 2022-05-15 20:55 00:04:51 1892364 19 17 0 2 +501.94 ± 428.28
158424 ncm-et-10 2022-05-15 20:54 00:05:04 1952523 20 17 0 3 +436.32 ± 451.5
158423 ncm-et-9 2022-05-15 20:49 00:10:24 1963691 48 40 0 8 +416.51 ± 139.54
158422 ncm-et-15 2022-05-15 20:46 00:13:18 1952997 59 50 0 9 +433.23 ± 129.44
158421 ncm-et-13 2022-05-15 20:44 00:15:15 1963844 72 62 0 10 +450.81 ± 121.28
158420 ncm-et-3 2022-05-15 20:41 00:18:38 1961227 86 72 0 14 +420.99 ± 98.08
158419 ncm-et-4 2022-05-15 19:11 01:42:46 1956030 500 425 5 70 +424.28 ± 41.24
158418 ncm-et-10 2022-05-15 19:11 01:42:44 1958775 500 415 0 85 +412.8 ± 37.2
158417 ncm-et-9 2022-05-15 19:05 01:43:39 1957399 500 432 2 66 +449.35 ± 42.58
158416 ncm-et-15 2022-05-15 19:03 01:42:32 1957273 500 430 1 69 +446.7 ± 41.57
158415 ncm-et-13 2022-05-15 19:03 01:41:07 1962921 500 429 3 68 +438.95 ± 41.92
158414 ncm-et-3 2022-05-15 19:00 01:40:24 1962771 500 421 1 78 +424.28 ± 38.99
158413 ncm-et-10 2022-05-15 17:27 01:43:11 1957116 500 434 0 66 +460.32 ± 42.48
158412 ncm-et-4 2022-05-15 17:26 01:44:53 1943031 500 441 2 57 +474.93 ± 45.97
158411 ncm-et-9 2022-05-15 17:21 01:42:54 1962763 500 420 1 79 +421.93 ± 38.73
158410 ncm-et-13 2022-05-15 17:19 01:42:51 1957249 500 416 2 82 +410.58 ± 38.02
158409 ncm-et-3 2022-05-15 17:19 01:40:33 1960317 500 430 1 69 +446.7 ± 41.57
158408 ncm-et-15 2022-05-15 17:18 01:43:49 1958929 500 424 1 75 +431.48 ± 39.8
158407 ncm-et-10 2022-05-15 15:43 01:39:20 1964172 499 432 1 66 +454.39 ± 42.55
158406 ncm-et-4 2022-05-15 15:42 01:42:46 1957717 500 449 0 51 +507.87 ± 48.67
158405 ncm-et-9 2022-05-15 15:39 01:42:00 1960921 500 425 1 74 +433.94 ± 40.08
158404 ncm-et-13 2022-05-15 15:38 01:40:23 1964649 500 424 1 75 +431.48 ± 39.8
158403 ncm-et-3 2022-05-15 15:36 01:42:07 1957553 500 423 1 76 +429.05 ± 39.52
158402 ncm-et-15 2022-05-15 15:36 01:41:50 1962007 500 421 1 78 +424.28 ± 38.99
158401 ncm-et-4 2022-05-15 13:57 01:44:38 1928703 500 431 2 67 +446.7 ± 42.25
158400 ncm-et-9 2022-05-15 13:56 01:41:40 1957251 500 439 1 60 +471.92 ± 44.73
158399 ncm-et-10 2022-05-15 13:56 01:42:38 1954657 500 429 1 70 +444.09 ± 41.26
158398 ncm-et-13 2022-05-15 13:55 01:41:56 1962304 500 431 0 69 +452.04 ± 41.5
158397 ncm-et-3 2022-05-15 13:54 01:41:19 1956341 500 422 1 77 +426.65 ± 39.25
158396 ncm-et-15 2022-05-15 13:54 01:41:27 1959694 500 429 1 70 +444.09 ± 41.26
158395 ncm-et-3 2022-05-15 12:13 01:40:18 1966002 500 441 0 59 +481.09 ± 45.07
158394 ncm-et-15 2022-05-15 12:13 01:40:09 1960614 500 445 2 53 +487.45 ± 47.75
158393 ncm-et-10 2022-05-15 12:13 01:42:28 1949502 500 427 0 73 +441.5 ± 40.29
158392 ncm-et-13 2022-05-15 12:13 01:41:49 1967694 500 426 1 73 +436.43 ± 40.36
158391 ncm-et-9 2022-05-15 12:13 01:42:42 1962168 500 417 1 82 +415.05 ± 37.98
158390 ncm-et-4 2022-05-15 12:13 01:43:33 1947607 500 438 2 60 +466.04 ± 44.75

Commit

Commit ID c079acc26f93acc2eda08c7218c60559854f52f0
Author Tomasz Sobczyk
Date 2022-05-14 10:47:22 UTC
Update NNUE architecture to SFNNv5. Update network to nn-3c0aa92af1da.nnue.

Architecture changes:

    Duplicated activation after the 1024->15 layer with squared crelu (so 15->15*2). As proposed by vondele.

Trainer changes:

    Added bias to L1 factorization, which was previously missing (no measurable improvement but at least neutral in principle).
    For retraining, linearly reduce the lambda parameter from 1.0 at epoch 0 to 0.75 at epoch 800.
    Reduce max_skipping_rate from 15 to 10 (compared to vondele's outstanding PR).

Note: This network was trained with a ~0.8% error in quantization regarding the newly added activation function. This will be fixed in the released trainer version. Expect a trainer PR tomorrow.

Note: The inference implementation cuts a corner to merge results from the two activation functions. This could possibly be resolved more cleanly in the future. AVX2 implementation likely not necessary, but NEON is missing.

First training session invocation:

    python3 train.py \
        ../nnue-pytorch-training/data/nodes5000pv2_UHO.binpack \
        ../nnue-pytorch-training/data/nodes5000pv2_UHO.binpack \
        --gpus "$3," \
        --threads 4 \
        --num-workers 8 \
        --batch-size 16384 \
        --progress_bar_refresh_rate 20 \
        --random-fen-skipping 3 \
        --features=HalfKAv2_hm^ \
        --lambda=1.0 \
        --max_epochs=400 \
        --default_root_dir ../nnue-pytorch-training/experiment_$1/run_$2

Second training session invocation:

    python3 train.py \
        ../nnue-pytorch-training/data/T60T70wIsRightFarseerT60T74T75T76.binpack \
        ../nnue-pytorch-training/data/T60T70wIsRightFarseerT60T74T75T76.binpack \
        --gpus "$3," \
        --threads 4 \
        --num-workers 8 \
        --batch-size 16384 \
        --progress_bar_refresh_rate 20 \
        --random-fen-skipping 3 \
        --features=HalfKAv2_hm^ \
        --start-lambda=1.0 \
        --end-lambda=0.75 \
        --gamma=0.995 \
        --lr=4.375e-4 \
        --max_epochs=800 \
        --resume-from-model /data/sopel/nnue/nnue-pytorch-training/data/exp367/nn-exp367-run3-epoch399.pt \
        --default_root_dir ../nnue-pytorch-training/experiment_$1/run_$2

Passed STC:
LLR: 2.95 (-2.94,2.94) <0.00,2.50>
Total: 27288 W: 7445 L: 7178 D: 12665
Ptnml(0-2): 159, 3002, 7054, 3271, 158
https://tests.stockfishchess.org/tests/view/627e8c001919125939623644

Passed LTC:
LLR: 2.95 (-2.94,2.94) <0.50,3.00>
Total: 21792 W: 5969 L: 5727 D: 10096
Ptnml(0-2): 25, 2152, 6294, 2406, 19
https://tests.stockfishchess.org/tests/view/627f2a855734b18b2e2ece47

closes https://github.com/official-stockfish/Stockfish/pull/4020

Bench: 6481017
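The "duplicated activation with squared crelu" in the commit above can be illustrated with a small sketch. This is an illustration only, not Stockfish's actual inference code: it assumes clipped ReLU with range [0, 1] and shows how the 15 outputs of the 1024->15 layer are passed through both a plain and a squared clipped ReLU and concatenated, giving 15*2 = 30 activations.

```python
def crelu(x):
    # clipped ReLU: clamp the activation to [0, 1]
    return min(1.0, max(0.0, x))

def duplicated_activation(xs):
    """Sketch of the SFNNv5 change: each layer output feeds two
    activations, crelu(x) and crelu(x)**2, concatenated so the
    15 layer outputs become 30 features. (Illustrative only;
    the [0, 1] clipping range is an assumption.)"""
    clipped = [crelu(x) for x in xs]
    return clipped + [c * c for c in clipped]

out = duplicated_activation([-0.5, 0.25, 0.5, 2.0])
print(out)  # [0.0, 0.25, 0.5, 1.0, 0.0, 0.0625, 0.25, 1.0]
```

Squaring the clipped activation gives the network a cheap second nonlinearity over the same layer outputs at the cost of doubling that layer's width.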
Copyright 2011–2024 Next Chess Move LLC