
Last update: January 6, 2020.

 

Benchmarks:

 


The strongest Android engine so far on Blitz (3 minutes + 3 seconds):


CorChess 121820 wins JCER - Android Engines Tournament (Chess Engines Diary) 2021.01.17-18

CorChess wins the recent JCER tournament for Android engines. You can find details and games here.

 

 

---------------○---------------

 

 


SugaR AI 1.00 vs Stockfish Polyglot:


 

Participants:

- SugaR.AI.1.00.bmi2 (Marco Zerbinati compile)
- Stockfish Polyglot 010121, x64 BMI2

Both engines run with default settings except for the number of threads; SugaR uses the experience file found in its sources (170 MB).

Graphical Interface: Komodo 14.

Operating system: Windows 10 Professional.

Tournament settings:
2 threads per engine
Hashtable: 1 GB for each engine (not shared)
Syzygy 6-men endgame tablebases
Ponder on
Time per game: 1 minute + 1 second per move
No opening book.
NNUE activated on every engine.
80 games to play.

Hardware:
CPU: Intel i7-4771
Hard Disk: SSD Evo 512 GB
RAM: 32 GB DDR3 RipJaws
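
For reference, the settings above boil down to a handful of standard UCI options. Here is a minimal python-chess sketch of how the same configuration could be applied from a script; the engine and tablebase paths are placeholders, the option names assume a Stockfish-derived engine, and the actual matches were of course played through the Komodo 14 interface rather than a script.

# Minimal sketch, assuming python-chess and a Stockfish-derived UCI engine.
# Paths are placeholders; the real tournament ran inside the Komodo 14 GUI.
import chess
import chess.engine

ENGINE_PATH = "SugaR.AI.1.00.bmi2.exe"   # placeholder binary name
SYZYGY_PATH = "C:/tablebases/syzygy"     # placeholder 6-men tablebase folder

engine = chess.engine.SimpleEngine.popen_uci(ENGINE_PATH)
engine.configure({
    "Threads": 2,               # 2 threads per engine
    "Hash": 1024,               # 1 GB hash, not shared
    "SyzygyPath": SYZYGY_PATH,  # Syzygy 6-men endgame tablebases
    # "Use NNUE": True,         # NNUE is on by default in these builds
})

board = chess.Board()           # no opening book: play from the start position
# 1 minute + 1 second per move; ponder=True mirrors "Ponder on" in the GUI.
result = engine.play(
    board,
    chess.engine.Limit(white_clock=60, black_clock=60, white_inc=1, black_inc=1),
    ponder=True,
)
print(result.move)
engine.quit()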

 

Results:


SugaR AI 1.00 vs Stockfish Polyglot 010121

 

Download the games in ChessBase and PGN format here.

 

 

---------------○---------------

 

 


Engines Tournament December 2020:


Participants:

- CF_EXT 081220 (Chessman compile)
- Cfish 091220
- CorChess NNUE 1.3 291120 (IIvec compile)
- BrainLearn 12.1 (amchess compile)
- CiChess 021220 (Chessman compile)
- Stockfish Polyglot 211120
- Raubfisch X44_nn_sl

Every engine runs with default settings except for the number of threads; BrainLearn additionally has Self Q-Learning activated, using my 1.5 MB experience file.

Graphical Interface: Komodo 14.

Operating system: Windows 10 Professional.

Tournament settings:
3 threads per engine
Hashtable: 1 GB for each engine (not shared)
Syzygy 6-men endgame tablebases
Ponder off (I'll have to put up with this until I replace the old, worn-out hardware)
Time per game: 3 minutes
No opening book.
NNUE activated on every engine.
420 games to play.

Hardware:
CPU: Intel i7-4771
Hard Disk: SSD Evo 512 GB
RAM: 32 GB DDR3 RipJaws
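
A quick note on the game count: assuming a round-robin schedule (the format is not stated explicitly), 7 engines give 21 pairings, so 420 games works out to 20 games per pairing and 120 games per engine. The small Python sketch below just spells out that arithmetic.

# Sketch of the pairing arithmetic, assuming a plain round-robin format.
from itertools import combinations

engines = [
    "CF_EXT 081220", "Cfish 091220", "CorChess NNUE 1.3 291120",
    "BrainLearn 12.1", "CiChess 021220", "Stockfish Polyglot 211120",
    "Raubfisch X44_nn_sl",
]

pairings = list(combinations(engines, 2))                   # 21 unordered pairs
games_per_pairing = 420 // len(pairings)                    # 420 / 21 = 20
games_per_engine = games_per_pairing * (len(engines) - 1)   # 20 * 6 = 120

print(len(pairings), games_per_pairing, games_per_engine)   # 21 20 120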

 

Results:


Chess Engines Tournament December 2020

 

Download the games in ChessBase and PGN format here.

 

CF_EXT managed to recover towards the end and surprisingly reached first place! I was expecting CorChess to win... well, they finished with the same score, so I can say both of them are winners. What the table suggests is that the more recently an engine has been updated, the stronger it is, mostly because of a better NNUE net, I guess. If I ran a 2,000-game tournament the differences would be more evident and, maybe, another engine would be the winner.

 

 

---------------○---------------

 

 

Stockfish normal vs Stockfish NNUE:

I must say I was shocked by the gap between the NNUE version (using the current best of Sergio's networks as of 08-17-2020: 20200720-1017.bin) and the normal Stockfish. Before anything else I must specify that these two engines were compiled by me: one uses the normal, standard alpha-beta chess analysis, and the other one uses the NNUE neural-network evaluation on top of (I guess) the same alpha-beta search. In other words: the same engine, but with different NNUE settings.

The difference has been noticeable both in fast games and at 5 minutes per game, which is unusual for engines based on neural networks (they often lose on time towards the end of the game).

The reason behind this leap by the NNUE engines is unknown to me.
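
If you want to see the gap on a single position yourself, the following python-chess sketch analyses the starting position twice with the same binary, once with the classical evaluation and once with the NNUE network. The binary path is a placeholder, and "Use NNUE" / "EvalFile" are the UCI option names of the Stockfish development builds of that period.

# Sketch: classical vs NNUE evaluation of the same position with one binary.
# ENGINE_PATH is a placeholder; the net file is the one mentioned above.
import chess
import chess.engine

ENGINE_PATH = "stockfish-beta.exe"   # placeholder: a self-compiled beta build
NNUE_NET = "20200720-1017.bin"       # Sergio's network named in the text

board = chess.Board()
engine = chess.engine.SimpleEngine.popen_uci(ENGINE_PATH)

engine.configure({"Use NNUE": False})                       # classical evaluation
classical = engine.analyse(board, chess.engine.Limit(time=5.0))

engine.configure({"Use NNUE": True, "EvalFile": NNUE_NET})  # NNUE evaluation
nnue = engine.analyse(board, chess.engine.Limit(time=5.0))

print("classical:", classical["score"], "| NNUE:", nnue["score"])
engine.quit()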

I tried them about 10 days ago on Playchess with the Goi book, but they played terribly, with blunders like 0.00 suddenly becoming +2.50 (with me playing Black) or 0.00 becoming -1.97 (with me playing White), so I thought it was just another attempt by some major chess site to attract attention, as in the case of Lc0 (which underperformed on Playchess and played very badly in both Blitz games and analysis). But in the meantime something has apparently been fixed at the Stockfish house, and the use of the neural network has resulted in an astonishing jump in strength. ... Just when you thought the opening lines couldn't improve that much anymore...

The book I used is the one-move dummy.ctg, based on move #1 of the three main White openings from the Goi book. You'll find dummy.ctg in the games package (be aware that it is just a one-move opening book, not any sort of Goi). I didn't play many games because the difference was so large that it was already evident.

 

Stockfish Beta normal vs Stockfish Beta NNUE test, 1 minute.

 

Stockfish Beta normal vs Stockfish Beta NNUE test, 2 minutes.

 

Stockfish Beta normal vs Stockfish Beta NNUE test, 5 minutes.

 

Games + dummy.ctg archive.

 

 

---------------○---------------

 

 

Goi 6.2.3 ABK vs Perfect 2019 ABK:

Time per game: 3 minutes
Processor: Intel i7-4771, 4 threads for each engine
Hashtable size: 1024 MB
Permanent brain (ponder): OFF
Interface used: Arena 3.5.1
Engine used: BrainFish 240720 64 BMI2

 

Goi 6.2.3   - Perfect2019 : 28.5/50  (12 wins, 5 losses, 35 draws)   57%   +35 Elo
Perfect2019 - Goi 6.2.3   : 21.5/50  (5 wins, 12 losses, 35 draws)   43%   -35 Elo
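
For reference, the percentages above come straight from the score (a win counts 1 point, a draw half a point), and a score fraction can also be turned into a rough Elo difference with the textbook logistic formula. Keep in mind that Arena prints its own Elo estimate, which does not have to coincide with this simple approximation.

# Sketch: score percentage and the textbook logistic score-to-Elo mapping.
# Arena's displayed Elo figure is its own estimate and may differ.
import math

def percentage(score, games):
    return 100.0 * score / games

def elo_diff(score, games):
    p = score / games                          # expected-score fraction
    return -400.0 * math.log10(1.0 / p - 1.0)  # standard logistic model

print(percentage(28.5, 50))        # 57.0, the 57% shown for Goi 6.2.3
print(round(elo_diff(28.5, 50)))   # rough rating-gap estimate in Elo points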

Download the Arena files of the tournament here.