MCTS Algorithm Visualizer

Learn Monte Carlo Tree Search through interactive visualization

🎮 Launch Interactive Visualizer →

What is Monte Carlo Tree Search?

Monte Carlo Tree Search (MCTS) is a powerful decision-making algorithm used in artificial intelligence, particularly famous for its role in AlphaGo, the AI that defeated world champions in the ancient game of Go.

Unlike traditional game AI that requires handcrafted evaluation functions, MCTS learns through simulated experience — playing out thousands of random games to discover which moves lead to victory.

How MCTS Works

MCTS builds a search tree through repeated iterations, each consisting of four phases:

🔍

1. Selection

Start at the root and traverse the tree, selecting the most promising child at each step using the UCB1 formula, until reaching a node that still has unexplored moves (or a terminal state).

2. Expansion

Add a new child node to the tree by selecting one of the unexplored moves from the current position.

🎲

3. Simulation

From the new node, play out a complete random game (rollout) until reaching a terminal state to get a win, loss, or draw result.

⬆️

4. Backpropagation

Propagate the simulation result back up the tree, updating visit counts and win statistics for all nodes along the path.
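The four phases above can be sketched in a few dozen lines of Python. This is a minimal illustration, not the visualizer's actual implementation: it uses a toy Nim-style game (players alternately take 1 or 2 stones; whoever takes the last stone wins), and the names `Node`, `mcts`, and the constant `C` are chosen here for clarity.

```python
import math
import random

C = 1.41  # exploration constant (~sqrt(2) is a common default)

class Node:
    def __init__(self, stones, player, parent=None, move=None):
        self.stones = stones              # stones remaining in the pile
        self.player = player              # player to move at this node (1 or 2)
        self.parent = parent
        self.move = move                  # the move that led here
        self.children = []
        self.untried = [m for m in (1, 2) if m <= stones]
        self.visits = 0
        self.wins = 0.0                   # wins for the player who moved INTO this node

    def ucb1(self):
        return (self.wins / self.visits
                + C * math.sqrt(math.log(self.parent.visits) / self.visits))

def mcts(root, iterations=5000):
    for _ in range(iterations):
        node = root
        # 1. Selection: descend via UCB1 until a node with untried moves
        while not node.untried and node.children:
            node = max(node.children, key=Node.ucb1)
        # 2. Expansion: add one child for an untried move
        if node.untried:
            m = node.untried.pop()
            child = Node(node.stones - m, 3 - node.player, node, m)
            node.children.append(child)
            node = child
        # 3. Simulation: random rollout until the pile is empty
        stones, player = node.stones, node.player
        while stones > 0:
            stones -= random.choice([m for m in (1, 2) if m <= stones])
            player = 3 - player
        winner = 3 - player               # whoever just took the last stone
        # 4. Backpropagation: update stats along the path back to the root
        while node:
            node.visits += 1
            if node.parent and winner != node.player:
                node.wins += 1
            node = node.parent
    # Final move choice: the most-visited child (a common, robust criterion)
    return max(root.children, key=lambda c: c.visits).move
```

With a pile of 4 stones, the winning move is to take 1 (leaving the opponent a losing pile of 3), and after a few thousand iterations the visit counts concentrate on that child.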

The UCB1 Formula

MCTS balances exploitation (choosing known good moves) and exploration (trying less-explored moves) using:

UCB1 = (wins / visits) + c × √(ln(parent_visits) / visits)

The first term favors high win rates; the second term encourages exploring rarely-visited nodes.
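As a small illustration, the formula translates directly into a helper function (the constant c = 1.41 ≈ √2 is a common default, not a value prescribed by the formula itself):

```python
import math

def ucb1(wins, visits, parent_visits, c=1.41):
    """UCB1 score: exploitation term plus exploration bonus."""
    return wins / visits + c * math.sqrt(math.log(parent_visits) / visits)

# A well-explored node with a good observed win rate...
a = ucb1(wins=60, visits=100, parent_visits=150)   # ~0.92
# ...versus a barely-explored node with a worse win rate.
b = ucb1(wins=1, visits=3, parent_visits=150)      # ~2.16
```

Even though the second node's win rate is lower (1/3 vs. 0.6), its large exploration bonus gives it the higher score, so MCTS will visit it next rather than prematurely committing to the first node.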

Why MCTS is Powerful

No Evaluation Function

Learns what's good through experience, not hand-coded rules

⏱️

Anytime Algorithm

Can be stopped at any time and still return the best move found so far

🎯

Asymmetric Growth

Focuses resources on promising branches automatically

🌍

Scales Well

Effective even with enormous game trees

Applications

MCTS has been successfully applied to:

  • Board games (Go, Chess, Shogi, Hex)
  • Video games (RTS, general game playing)
  • Planning problems (scheduling, optimization)
  • Robot navigation (path planning)
  • Resource allocation
  • Medical diagnosis

About This Visualizer

This interactive tool lets you watch MCTS in action, step-by-step. You can:

  • Watch the search tree grow in real-time as MCTS explores the game space
  • See visit counts, win rates, and UCB values for every node
  • Step through each phase manually to understand the algorithm deeply
  • Adjust iteration counts, speed, and simulation strategies
  • Play against MCTS or watch it play against random moves

Perfect for: Students, educators, AI enthusiasts, and anyone curious about how modern game AI works!

🚀 Start Learning with the Visualizer →

Learning Resources