I don't have a stake in either engine, so I don't care which one is better. In the end, as a poor chess player, it won't change anything for me :) What's actually interesting is comparing how the two programs evolved and how they got to where they are.
Stockfish is much older, and it took a lot of hand tuning to reach its current level. It is (or was) full of carefully tested heuristics that guide the search. It would be very difficult to build an engine like Stockfish in a short span of time.
Leela got there very, very quickly. Even if it was not able to win in October, the fact that it became competitive and forced the field to adopt drastic changes in such a short time is impressive. It seems to be a good example of how not using the "best" solution can still be a win: getting good results after a few months against something that took ten years of work.
This is their most recent ongoing head-to-head: https://www.chess.com/events/2021-tcec-20-superfinal
Current result: 9 draws, one win with Stockfish as White, and one win with Leela as White. Drawn.
There is also this match from a couple of years ago: https://www.chess.com/news/view/computer-chess-championship-... “Lc0 defeated Stockfish in their head-to-head match, four wins to three”. Stockfish did get more wins against the other engines, so it won the round-robin, but in their head-to-head games Leela came out ahead.