Probabilistic and Reinforcement Learning Track

2023 International Planning Competition

Welcome to the International Planning Competition 2023: Probabilistic and Reinforcement Learning Track

The International Probabilistic Planning Competition is organized in the context of the International Conference on Planning and Scheduling (ICAPS). It empirically evaluates state-of-the-art planning systems on a number of benchmark problems. The goals of the IPC are to promote planning research, highlight challenges in the planning community and provide new and interesting problems as benchmarks for future research.

Since 2004, probabilistic tracks have been part of the IPC under different names (as the International Probabilistic Planning Competition or as part of the uncertainty tracks). Following the 2004, 2006, 2008, 2011, 2014, and 2018 editions, the 7th IPPC will be held in 2023, concluding together with ICAPS in July 2023 in Prague, Czech Republic. This time it is organized by Ayal Taitler and Scott Sanner.


Please forward the following calls to all interested parties:

We invite interested competitors to join the competition discussion:

Preliminary Schedule

Event Date
Infrastructure release with sample domains October 2022
Call for domains and participants October 2022
Final domains announcement March 2, 2023
Competitor registration deadline March 15, 2023
Planner abstract submission May 1, 2023
Contest run June 5-8, 2023
Results announced July 12, 2023


This year’s competition will be using the generic pyRDDLGym - a tool that automatically generates Gym environments from RDDL textual descriptions.

RDDLGym diagram

More information about the infrastructure, how to use it, and how to add user-defined domains can be found in the following short guide

pyRDDLGym also comes with a set of auxiliary utils and baseline methods:


We provide a sample of RDDL domains here and include a list of the eight Final Competition Domains further below. We still encourage the community to contribute user-defined domains, or ideas you think the community should be aware of. While these will not make it into the current competition, they will help enrich the problem database and can mature enough to be included in future competitions.

In addition to the original domains, we have recreated some of the classic control domains in RDDL, illustrating how easy it is to generate domains in pyRDDLGym:

Note that there are additional domains from past competitions (IPPC 2011, IPPC 2014), which can also be used with pyRDDLGym:

Past competitions were entirely discrete; the focus of this year’s competition is on continuous and mixed discrete-continuous problems. However, everyone is welcome to take advantage of the existing domains. All previous competition domains are available through the rddlrepository package/git.


Registration for the competition is now closed.

A record number of groups have registered; we are expecting an amazing competition.


Documentation and Source Code

All competitors must submit a (maximum) 2-page abstract plus unlimited references, describing their method. Competitors must also submit the source code of their method, which will be examined (and may be run) by the organizers of the competition. Please format submissions in AAAI style (see instructions in the Author Kit), using the camera-ready version with names and affiliations.

An important requirement for IPC 2023 competitors is to give the organizers the right to post their paper and the source code of their learners/planners on the official IPC 2023 web site, and the source code of submitted planners must be released under a license allowing free non-commercial use.

Abstract submission due May 1, 2023.

Final Competition Domains

The competition will include:

Domain pyRDDLGym name
Race Car RaceCar
Reservoir Control Reservoir continuous
Recommender Systems RecSim
UAV UAV continuous
Power Generation PowerGen continuous
Mountain Car MountainCar
Mars Rover MarsRover

Competition Logistics

The competition week will take place Monday-Thursday, June 5-8, 2023. Starting June 5, at the beginning of each day for four consecutive days, 5 instances of varying size and difficulty with a maximum horizon of 100 will be released for 2 selected domains per day. Competitors will then have 24 hours for any autonomous learning or tuning, and must submit a container of at most 2 GB per domain before the end of the 24-hour window.

Competitors should self-report training specifications (how many machines and machine specifications).

Manual encoding of domain knowledge is prohibited – the 24 hour period is a training phase intended for reinforcement learning competitors and competitors who otherwise need to tune planner hyperparameters.

Remark: we will use the same five instances at evaluation time in this edition of the competition in order to facilitate reinforcement learning competitors who may need to learn per-instance.

Detailed instructions on how to upload the containers will be released closer to the competition date.

Evaluation and Scoring

After June 8, 2023, competitor container submissions will be evaluated using an 8-core CPU (no GPU) with 32 GB of RAM (exact specifications TBD) on 50 randomized trials for each of the 8 competition domains and the 5 instances released during the competition.

The average over all 50 trials will be taken as the raw score for each instance.

Normalized [0,1] instance scores will be computed according to the following lower and upper bounds:

A planner that does worse than 0 on this normalized scale will receive a 0. Each trial has a 2-minute time limit; failure to execute any trial (e.g., a crash) for an instance, or exceeding the time limit in any trial for an instance, will lead to an overall normalized score of 0 for that instance. No competitor can exceed a score of 1, by definition.

Normalized domain scores will be computed as an average of normalized instance scores.

An overall competition score will be computed as an average of normalized domain scores.
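The scoring pipeline above can be sketched as follows. This is a hedged illustration only: it assumes linear min-max normalization between the published bounds, and the `lower`/`upper` parameters are placeholders for the per-instance bounds the organizers will release.

```python
def normalized_instance_score(raw_score, lower, upper, failed=False):
    """Map a raw instance score into [0, 1] given the published lower and
    upper bounds (assumed linear min-max here); crashes or timeouts score 0."""
    if failed or upper == lower:
        return 0.0
    score = (raw_score - lower) / (upper - lower)
    # Clip: worse than the lower bound scores 0; no one can exceed 1.
    return max(0.0, min(1.0, score))

def domain_score(instance_scores):
    """Normalized domain score: average of normalized instance scores."""
    return sum(instance_scores) / len(instance_scores)

def competition_score(domain_scores):
    """Overall competition score: average of normalized domain scores."""
    return sum(domain_scores) / len(domain_scores)
```

For example, a raw score of 50 against bounds [0, 100] normalizes to 0.5, a trial that crashes scores 0 regardless of its raw score, and domain scores of 0.75 and 0.25 average to an overall competition score of 0.5.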

Results Announcement

By the competition registration deadline, we will release code for computing all scores above from competition traces for each competitor.

At the ICAPS conference (July 8-13), we will announce winners per domain (by domain score) and an overall winner (by competition score). At this time, we will also publicly release traces of all trials for the competitors to allow reproduction and verification of score computations and to facilitate trace analysis of competitors.



Contact us: