Leela Chess Zero

Leela Chess Zero (abbreviated LCZero or lc0) is a free, open-source, neural-network-based chess engine and distributed computing project. Development has been spearheaded by programmer Gary Linscott, who is also a developer for the Stockfish chess engine. Leela Chess Zero was adapted from the Leela Zero Go engine,[1] which in turn was based on Google's AlphaGo Zero project,[2] in part to verify that the methods of the AlphaZero paper apply to the game of chess.

Leela Chess Zero
Original author(s): Gian-Carlo Pascutto, Gary Linscott
Developer(s): Gary Linscott, Alexander Lyashuk, Folkert Huizinga, others
Initial release: 9 January 2018
Stable release: v0.25.1 / 15 March 2020
Repository:
Written in: C++
Operating system: Windows, macOS, Linux, Android
Type: Chess engine
License: GPL-3.0
Website: lczero.org

Like Leela Zero and AlphaGo Zero, Leela Chess Zero starts with no intrinsic chess-specific knowledge other than the basic rules of the game.[1] Leela Chess Zero then learns how to play chess by reinforcement learning from repeated self-play, using a distributed computing network coordinated at the Leela Chess Zero website.

As of 2020, Leela Chess Zero had played over 300 million games against itself,[3] and plays at a level comparable to Stockfish, the leading conventional chess program.[4][5]

History

The Leela Chess Zero project was first announced on TalkChess.com on January 9, 2018.[1][6] The announcement presented Leela Chess Zero as an open-source, self-learning chess engine with the goal of becoming a strong engine.[7] Within the first few months of training, Leela Chess Zero had already reached grandmaster level, surpassing the strength of early releases of Rybka, Stockfish, and Komodo, despite evaluating orders of magnitude fewer positions while using Monte Carlo tree search (MCTS).

In December 2018, the AlphaZero team published a new paper in Science magazine revealing previously undisclosed details of the architecture and training parameters used for AlphaZero.[8] These changes were soon incorporated into Leela Chess Zero and increased both its strength and training efficiency.[9]

The work on Leela Chess Zero has informed the similar AobaZero project for shogi.[10]

The engine has been rewritten and carefully iterated upon since its inception, and now runs on multiple backends, allowing it to effectively utilize different types of hardware, both CPU and GPU.[11]

The engine supports the Fischer Random Chess variant, and as of May 2020 a network was being trained to test the viability of such play.[11]

Program and use

The method used by its designers to make Leela Chess Zero self-learn and play chess above human level is reinforcement learning. This machine-learning approach, mirrored from AlphaZero, is used by the Leela Chess Zero training binary to maximize reward through self-play.[1][8] As an open-source distributed computing project, volunteer users run Leela Chess Zero to play hundreds of millions of games, which are fed to the reinforcement learning algorithm.[3] To contribute to the advancement of the Leela Chess Zero engine, the latest non-release-candidate (non-rc) version of the engine as well as the client must be downloaded. The client is needed to connect to the Leela Chess Zero server, where all of the information from the self-play chess games is stored, in order to obtain the latest network, generate self-play games, and upload the training data back to the server.[12]

To play against the Leela Chess Zero engine on a machine, two components are needed: the engine binary and a network. (The engine binary is distinct from the client; the client is used as a training platform for the engine.) The network contains Leela Chess Zero's evaluation function, which is needed to evaluate positions.[12] Older networks can also be downloaded and used by placing them in the folder with the lc0 binary.

Self-Play Elo

Self-play Elo is used to gauge relative network strength, to look for anomalies and general changes in network strength, and as a diagnostic tool when significant changes occur. Through test match games played with minimal temperature-based variation, lc0 engine clients test the most recent version of a network against other recent versions from the same run; the results are then sent to the training server to create an overall Elo assessment.

Standard Elo formulas are used to calculate the relative Elo strength between the two players. More recent self-play Elo calculations use match game results against multiple network versions to compute a more accurate Elo value.
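The standard Elo formulas referred to above can be sketched as follows. This is a generic illustration of the Elo model, not lc0's actual server code; the K-factor of 20 and the helper names are assumptions for the example.

```python
import math

def expected_score(r_a, r_b):
    """Expected score of player A against player B under the Elo model."""
    return 1.0 / (1.0 + 10.0 ** ((r_b - r_a) / 400.0))

def update_elo(r_a, r_b, score_a, k=20):
    """Update both ratings after one game (score_a: 1 win, 0.5 draw, 0 loss)."""
    e_a = expected_score(r_a, r_b)
    return r_a + k * (score_a - e_a), r_b + k * ((1.0 - score_a) - (1.0 - e_a))

def elo_difference(score_fraction):
    """Invert the expected-score formula: match score fraction -> Elo gap."""
    return -400.0 * math.log10(1.0 / score_fraction - 1.0)
```

For instance, a new network scoring 60% against its predecessor over a long match corresponds to an estimated edge of roughly 70 Elo points.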

The self-play approach to gauging strength has several unintended consequences:

  • Differing scales of initial Elo inflation between training runs, due to periods of lower or higher self-improvement and adversarial play.
  • Strength measured this way is relative to previous networks rather than objective, allowing an illusion of gained strength, since networks are trained to beat and anticipate the actions of their past selves.
  • Overfitting against previous network versions of lc0 continuously adds small amounts of self-play Elo to the cumulative measured Elo; this kind of overfitting is generally clearer when training smaller networks.
  • There is no direct one-to-one correlation between self-play Elo and strength against alpha-beta engines, and no known correlation to strength against humans.
  • Behavioral changes in networks between runs affect inflation.

An example of self-play Elo inflation is the Test 71.4 run (networks named 714xxx), a Fischer Random Chess run that reached nearly 4000 cumulative self-play Elo only 76 networks after the start of its run. Comparing self-play Elo estimates across runs shows the impracticality of pure cumulative self-play Elo: network 63000 from the Test 60 run, about 3000 networks into that run, can consistently beat network 714070 in head-to-head matches at most, if not all, "fair" time controls, yet 63000 has a self-play Elo of around 2900 while early Test 71.4 networks are already near 4000. This contradiction supports the claim that self-play Elo is neither an objective measure of strength nor one that allows easy comparison of network strength to human strength.

Self-play ratings could be used as a rough approximation of conventional human Elo ratings; however, no universal conversion formula exists, for many reasons. These include, but are not limited to, the differing scales of initial and late-run self-play Elo inflation between training runs, differing time controls, differing Elo measurement systems across chess tournament platforms, the resources allocated to the engine, network size and structure, a network's training data set, and the multiple engine-binary factors that contribute to playing strength.

Setting up the engine to search a single node, with ``--minibatch-size=1`` and ``go nodes 1`` for each played move, produces deterministic play. Self-play Elo on such settings will always yield the same result between two copies of the same network from the same start position: always a win, always a loss, or always a draw. Self-play Elo is therefore not reliable for determining strength in these deterministic circumstances.
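The degeneracy described above can be illustrated with a toy simulation. This is a hypothetical stand-in, not lc0 code: it only models the fact that when a game's outcome is a pure function of the two networks and the start position, every rematch repeats the same result, so repeated games add no statistical information.

```python
def deterministic_game(net_a, net_b, start_position):
    """Toy stand-in for a single-node, minibatch-size-1 game:
    the result depends only on the inputs, never on chance."""
    if net_a == net_b:
        return 0.5  # two copies of the same network: identical moves, a draw
    # Arbitrary but fixed outcome for distinct networks (illustration only).
    return 1.0 if hash((net_a, net_b, start_position)) % 2 else 0.0

def match_score(net_a, net_b, start_position, games=100):
    """Average score of net_a over a match of repeated identical games."""
    total = sum(deterministic_game(net_a, net_b, start_position)
                for _ in range(games))
    return total / games
```

However many games are played, the match score comes out as exactly 0.0, 0.5, or 1.0; mapping such a degenerate score through the Elo formula gives either zero or an unbounded rating difference, which is why self-play Elo is uninformative under these settings.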

Variants

In season 15 of the Top Chess Engine Championship, the engine AllieStein competed alongside Leela. AllieStein is a combination of two different spinoffs from Leela: Allie, which uses the same evaluation network as Leela but has a unique search algorithm for exploring different lines of play, and Stein, an evaluation network trained with supervised learning on existing game data featuring other engines (as opposed to the reinforcement learning from self-play that Leela uses). While neither of these projects would be admitted to TCEC separately because of their similarity to Leela, the combination of Allie's search algorithm with the Stein network, called AllieStein, is unique enough to warrant it competing alongside mainstream lc0. (The TCEC rules require that a neural-network-based engine have at least two unique components out of three essential features: the code that evaluates a network, the network itself, and the search algorithm. While AllieStein uses the same code as lc0 to evaluate its network, the other two components are fresh, so AllieStein is considered a distinct engine.)[13]

Competition results

In April 2018, Leela Chess Zero became the first neural network engine to enter the Top Chess Engine Championship (TCEC), during season 12 in the lowest division, division 4.[14] Leela did not perform well: in 28 games, it won one, drew two, and lost the remainder; its sole victory came from a position in which its opponent, Scorpio 2.82, crashed in three moves.[15] However, it improved quickly. In July 2018, Leela placed seventh out of eight competitors at the 2018 World Computer Chess Championship.[16] In August 2018, it won division 4 of TCEC season 13 with a record of 14 wins, 12 draws, and 2 losses.[17][18] In Division 3, Leela scored 16/28 points, finishing third behind Ethereal, who scored 22.5/28 points, and Arasan on tiebreak.[19][17]

By September 2018, Leela had become competitive with the strongest engines in the world. In the 2018 Chess.com Computer Chess Championship (CCCC),[20] Leela placed fifth out of 24 entrants. The top eight engines advanced to round 2, where Leela placed fourth.[21][22] Leela then won the 30-game match against Komodo to secure third place in the tournament.[23][24] Concurrently, Leela participated in the TCEC cup, a new event in which engines from different TCEC divisions can play matches against one another. Leela defeated higher-division engines Laser, Ethereal and Fire before finally being eliminated by Stockfish in the semi-finals.[25]

In October and November 2018, Leela participated in the Chess.com Computer Chess Championship Blitz Battle.[26] Leela finished third behind Stockfish and Komodo.[27]

In December 2018, Leela participated in season 14 of the Top Chess Engine Championship. Leela dominated divisions 3, 2, and 1, easily finishing first in all of them. In the premier division, Stockfish dominated while Houdini, Komodo and Leela competed for second place. It came down to a final-round game where Leela needed to hold Stockfish to a draw with black to finish second ahead of Komodo. It successfully managed this and therefore contested the superfinal against Stockfish. It narrowly lost the superfinal against Stockfish with a 49.5-50.5 final score.[28]

In February 2019, Leela scored its first major tournament win when it defeated Houdini in the final of the second TCEC cup. Leela did not lose a game the entire tournament.[29][30] In April 2019, Leela won the Chess.com Computer Chess Championship 7: Blitz Bonanza, becoming the first neural-network project to take the title.[31]

In May 2019, Leela defended its TCEC cup title, this time defeating Stockfish in the final 5.5-4.5 (+2 =7 -1) after Stockfish blundered a 7-man tablebase draw.[32] Leela also won the Superfinal of season 15 of the Top Chess Engine Championship 53.5-46.5 (+14 -7 =79) versus Stockfish.[33][34]

Season 16 of TCEC saw Leela finish in 3rd place in the premier division, missing qualification for the superfinal behind Stockfish and the new neural-network engine AllieStein. Leela did not suffer any losses in the premier division, the only engine to do so, and defeated Stockfish in one of the six games they played. However, Leela managed only 9 wins, while AllieStein and Stockfish both scored 14. This inability to defeat weaker engines left Leela in 3rd place, half a point behind AllieStein and a point behind Stockfish.[35] In the fourth TCEC cup, Leela was seeded first as the defending champion, which placed it in the opposite half of the bracket from AllieStein and Stockfish. Leela qualified for the final, where it faced Stockfish. After seven draws, Stockfish won the eighth game to win the match.[36]

In Season 17 of TCEC, held in January-April 2020, Leela regained the championship by defeating Stockfish 52.5-47.5.[37]

Results summary

Top Chess Engine Championship (TCEC)[38]
Season    | Division 4 | Division 3 | Division 2 | Division 1 | Division P | Superfinal
12 (2018) | 8th        |            |            |            |            |
13 (2018) | 1st        | 3rd        |            |            |            |
14 (2018) |            | 1st        | 1st        | 1st        | 2nd        | 2nd
15 (2019) |            |            |            |            | 2nd        | 1st
16 (2019) |            |            |            |            | 3rd        |
17 (2020) |            |            |            |            | 1st        | 1st
Chess.com Computer Chess Championship (CCCC)
Event                   | Year | Time controls        | Result | Ref
CCC 1: Rapid Rumble     | 2018 | 15+5                 | 3rd    | [39]
CCC 2: Blitz Battle     | 2018 | 5+2                  | 3rd    | [40]
CCC 3: Rapid Redux      | 2019 | 30+5                 | 2nd    | [41]
CCC 4: Bullet Brawl     | 2019 | 1+2                  | 2nd    | [42]
CCC 5: Escalation       | 2019 | 10+5                 | 2nd    | [43]
CCC 6: Winter Classic   | 2019 | 10+10                | 2nd    | [44]
CCC 7: Blitz Bonanza    | 2019 | 5+2                  | 1st    | [31]
CCC 8: Deep Dive        | 2019 | 15+5                 | 2nd    | [5]
CCC 9: The Gauntlet     | 2019 | 5+2, 10+5            | 3rd    | [45]
CCC 10: Double Digits   | 2019 | 10+3                 | 3rd    | [46]
CCC 11                  | 2019 | 30+5                 | 1st    | [47]
CCC 12: Bullet Madness! | 2020 | 1+1                  | 1st    | [48]
CCC 13                  | 2020 | 3+2, 5+5, 10+5, 15+5 | 1st    | [49][50]

References

  1. "Leela Chess Zero: Full Elo Graph". Lczero.org. 7 March 2019. Retrieved 7 March 2019.
  2. "leela-zero". GitHub. Retrieved 27 April 2018.
  3. "LCZero". lczero.org. Retrieved 2019-05-28.
  4. "Lc0 Wins Computer Chess Championship, Makes History". Chess.com. Retrieved 2019-05-29.
  5. Pete (pete). "Stockfish Strikes Back, Tops Lc0 In Computer Chess Championship". Chess.com. Retrieved 2019-05-29.
  6. "Announcing lczero". TalkChess.com. Retrieved 11 June 2018.
  7. "Announcing lczero - TalkChess.com". www.talkchess.com. Retrieved 2019-03-21.
  8. Silver, David; Hubert, Thomas; Schrittwieser, Julian; et al. (6 December 2018). "A general reinforcement learning algorithm that masters chess, shogi, and Go through self-play" (PDF). Science. 362 (6419): 1140–1144. doi:10.1126/science.aar6404.
  9. "AlphaZero paper, and Lc0 v0.19.1". 7 December 2018. Retrieved 14 February 2019.
  10. Kobayashi, Yuki (15 September 2019). "kobanium/aobazero: Aoba Zero". GitHub. Retrieved 25 September 2019.
  11. "leela-chess-zero". GitHub. Retrieved 11 May 2020.
  12. "LeelaChessZero/lc0: The rewritten engine, originally for tensorflow. Now all other backends have been ported here". GitHub. 20 March 2019. Retrieved 21 March 2019.
  13. "Allie+Stein, the new neural network based engine entering TCEC S15".
  14. "Breaking: Leela Chess Zero enters TCEC Season 12". Chessdom. 18 April 2018.
  15. See the season 12 archives at http://tcec.chessdom.com/archive.php Archived 2015-05-03 at the Wayback Machine
  16. "World Computer Chess Championship 2018". ICGA. Retrieved 19 July 2018.
  17. See the season 13 archives at http://tcec.chessdom.com/archive.php Archived 2015-05-03 at the Wayback Machine
  18. "Leela Chess Zero wins the gold medal in TCEC Div 4 | Chessdom". Retrieved 2019-03-21.
  19. "Ethereal chess engine wins the gold at TCEC Div 3 | Chessdom". Retrieved 2019-03-21.
  20. "Chess.com Computer Chess Championship".
  21. "CCCC stage 2 ended. Leela 4th with a good performance! Stockfish undefeated!". LCZero Blog. 26 September 2018. Retrieved 26 September 2018.
  22. Cilento, Pete (26 September 2018). "Stockfish, Houdini Battle For Computer Chess Championship; Komodo vs Lc0 For 3rd". Chess.com. Retrieved 9 October 2018.
  23. "Leela wins the match series against Komodo and wins a Pawn odds game against Stockfish!". LCZero Blog. 3 October 2018. Retrieved 9 October 2018.
  24. Cilento, Pete (4 October 2018). "Stockfish Wins Computer Chess Championship Rapid; Lc0 Finishes 3rd". Chess.com. Retrieved 9 October 2018.
  25. See the TCEC Cup 1 archives at http://tcec.chessdom.com/archive.php Archived 2015-05-03 at the Wayback Machine
  26. Cilento, Pete (11 October 2018). "Computer Chess Championship Returns For Blitz Battle". Chess.com. Retrieved 22 November 2018.
  27. Cilento, Pete (19 November 2018). "Stockfish Wins Computer Chess Championship Blitz". Chess.com. Retrieved 22 November 2018.
  28. See the season 14 archives at http://tcec.chessdom.com/archive.php Archived 2015-05-03 at the Wayback Machine
  29. See the TCEC Cup 2 archives at http://legacy-tcec.chessdom.com/archive.php
  30. "Leela won the TCEC CUP!". LCZero Blog. 4 February 2019. Retrieved 12 February 2019.
  31. Cilento, Pete (17 April 2019). "Lc0 Wins Computer Chess Championship, Makes History". Chess.com. Retrieved 18 April 2019.
  32. See the game score at https://cd.tcecbeta.club/archive.html?season=cup3&round=fl&game=9%5B%5D
  33. "Lc0 won TCEC 15". LCZero Blog. 28 May 2019. Retrieved 28 May 2019.
  34. Högy, Kevin (2 June 2019). "A new age in computer chess? Lc0 beats Stockfish!". chess24. Retrieved 25 June 2019.
  35. "Season 16, Div P archive". Retrieved 30 September 2019.
  36. "TCEC Cup 4 archive". Retrieved 18 November 2019.
  37. "TCEC final report".
  38. https://www.tcec-chess.com/archive.html
  39. Cilento, Pete. "Stockfish Wins Computer Chess Championship Rapid; Lc0 Finishes 3rd". Chess.com. Retrieved 2019-06-20.
  40. Cilento, Pete. "Stockfish Wins Computer Chess Championship Blitz". Chess.com. Retrieved 2019-06-20.
  41. Cilento, Pete. "Stockfish Wins Rapid Computer Championship Over Lc0; Bullet Chess Next". Chess.com. Retrieved 2019-06-20.
  42. Cilento, Pete. "Stockfish Wins Computer Chess Championship Bullet; 'Escalation' Next". Chess.com. Retrieved 2019-06-20.
  43. Cilento, Pete. "Computer Chess Championship Plays Blitz After Stockfish Defends Title". Chess.com. Retrieved 2019-06-20.
  44. Cilento, Pete (7 August 2019). "Stockfish Wins Computer Chess Championship As Neural Networks Play Catch-Up". Chess.com. Retrieved 19 September 2019.
  45. "Computer Chess Championship". Chess.com. Retrieved 19 September 2019.
  46. "Computer Chess Championship". Chess.com. Retrieved 23 December 2019.
  47. "Computer Chess Championship". Chess.com. Retrieved 23 January 2020.
  48. "Computer Chess Championship". Chess.com. Retrieved 14 April 2020.
  49. Doggers, Peter (18 April 2020). "Leela Chess Zero Beats Stockfish 106-94 In 13th Chess.com Computer Chess Championship". Chess.com. Retrieved 24 April 2020.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.