MCTS


Figure: MCTS tree for Grid World, visualized

This package implements the Monte Carlo Tree Search (MCTS) algorithm in Julia for solving Markov decision processes (MDPs). The user should define the problem as a POMDPs.jl MDP model. A simple example of the mountain car problem defined with the QuickPOMDPs package can be found here; additional examples of problem definitions can be found in POMDPModels.jl.

There is also a BeliefMCTSSolver that solves a POMDP by converting it to an MDP in the belief space.
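A minimal sketch of how this might be used (assuming the BeliefMCTSSolver constructor takes an inner DPWSolver and a belief updater, and using a particle filter from ParticleFilters.jl and the BabyPOMDP model from POMDPModels.jl as stand-ins):

using POMDPs
using POMDPModels # for the BabyPOMDP example problem
using MCTS
using ParticleFilters

pomdp = BabyPOMDP()                                   # any POMDPs.jl POMDP
updater = BootstrapFilter(pomdp, 1000)                # particle-filter belief updater
solver = BeliefMCTSSolver(DPWSolver(depth=10), updater)
planner = solve(solver, pomdp)                        # the planner acts on beliefs
b0 = initialize_belief(updater, initialstate(pomdp))  # initial belief
a = action(planner, b0)                               # plan from the current belief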

Special thanks to Jon Cox for writing the original version of this code.

For reference, see the UCT algorithm in this paper: Kocsis, Levente, and Csaba Szepesvári. "Bandit Based Monte-Carlo planning." European Conference on Machine Learning. Springer, Berlin, Heidelberg, 2006.

Installation

In the Julia REPL, press ] to enter the package manager prompt and run: add MCTS

Documentation

Documentation can be found on the following site: juliapomdp.github.io/MCTS.jl/latest/

Usage

If mdp is an MDP defined with the POMDPs.jl interface, the MCTS solver can be used to find an optimized action, a, for the MDP in state s as follows:

using POMDPs
using POMDPModels # for the SimpleGridWorld problem
using MCTS
using StaticArrays

mdp = SimpleGridWorld()                        # define the MDP
solver = MCTSSolver(n_iterations=50, depth=20, exploration_constant=5.0) # configure the search
planner = solve(solver, mdp)                   # construct the MCTS planner (policy)
a = action(planner, SA[1,2])                   # run the tree search from state (1,2) and return the chosen action

See this notebook for an example of how to visualize the search tree.
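A rough sketch of that workflow (assuming the enable_tree_vis solver option, the action_info function from POMDPTools, and the D3Trees.jl package for rendering; see the notebook for the authoritative version):

using POMDPs
using POMDPModels
using MCTS
using StaticArrays
using POMDPTools # for action_info
using D3Trees

mdp = SimpleGridWorld()
solver = MCTSSolver(n_iterations=50, depth=20, exploration_constant=5.0,
                    enable_tree_vis=true)      # record the search tree for visualization
planner = solve(solver, mdp)
a, info = action_info(planner, SA[1,2])        # info[:tree] holds the recorded tree
D3Tree(info[:tree])                            # displays interactively, e.g. in a Jupyter notebook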

See this notebook for examples of customizing solver behavior, specifically the Rollouts section for using heuristic rollout policies.
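For example, a heuristic rollout policy can be supplied through the estimate_value option (a hedged sketch, assuming estimate_value accepts a RolloutEstimator wrapping any policy; here a FunctionPolicy from POMDPTools that always moves right):

using POMDPs
using POMDPModels
using MCTS
using StaticArrays
using POMDPTools # for FunctionPolicy

mdp = SimpleGridWorld()

# heuristic used only inside value-estimation rollouts, not for the final action choice
rollout_policy = FunctionPolicy(s -> :right)

solver = MCTSSolver(n_iterations=50, depth=20, exploration_constant=5.0,
                    estimate_value=RolloutEstimator(rollout_policy))
planner = solve(solver, mdp)
a = action(planner, SA[1,2])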