sail-sg / rosmo

Codes for "Efficient Offline Policy Optimization with a Learned Model", ICLR2023

Home Page: https://arxiv.org/abs/2210.05980

License: Apache License 2.0

Languages: Python 99.28%, Makefile 0.72%
Topics: atari, model-based-rl, muzero, offline-reinforcement-learning, reinforcement-learning, rl-unplugged, jax, dm-haiku, arcade-learning-environment, bsuite

rosmo's Introduction

ROSMO



Table of Contents

  • Introduction
  • Installation
  • Usage
  • Citation
  • License
  • Acknowledgement
  • Disclaimer

Introduction

This repository contains the implementation of ROSMO, a Regularized One-Step Model-based algorithm for Offline RL, introduced in our paper "Efficient Offline Policy Optimization with a Learned Model". We provide training code for both the Atari and BSuite experiments, and the reproduced results on Atari MsPacman are publicly available at W&B.
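As a rough illustration of the idea behind a regularized one-step policy improvement, the hypothetical sketch below forms an advantage-weighted policy target from a single model evaluation of every action. This is not the code in this repository; the exact target, regularizer, and temperature handling used by ROSMO are defined in the paper.

import jax
import jax.numpy as jnp

def one_step_policy_target(prior_logits, q_values, temperature=1.0):
    # Re-weight the prior policy by exponentiated one-step advantages.
    prior = jax.nn.softmax(prior_logits)      # prior policy pi(a|s)
    value = jnp.sum(prior * q_values)         # v(s) = E_pi[q(s, a)]
    advantages = q_values - value             # one-step advantages
    return jax.nn.softmax(prior_logits + advantages / temperature)

# Toy usage with 4 discrete actions.
print(one_step_policy_target(jnp.array([0.1, 0.2, -0.3, 0.0]), jnp.array([1.0, 0.5, 0.2, 1.5])))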

Installation

Please follow the installation guide.

Usage

BSuite

To run the BSuite experiments, please ensure you have downloaded the datasets and placed them in the directory defined by CONFIG.data_dir in experiment/bsuite/config.py.

  1. Debug run.
python experiment/bsuite/main.py -exp_id test -env cartpole
  2. Enable W&B logger and start training.
python experiment/bsuite/main.py -exp_id test -env cartpole -nodebug -use_wb -user ${WB_USER}
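Before launching, it may help to sanity-check that the directory referenced by CONFIG.data_dir actually contains the downloaded data. A tiny, hypothetical check (the path below is a placeholder, not the repository's default):

import os

data_dir = os.path.expanduser("~/datasets/bsuite")  # placeholder; use your CONFIG.data_dir value
if not os.path.isdir(data_dir):
    raise FileNotFoundError(f"BSuite dataset directory not found: {data_dir}")
print(f"Found {len(os.listdir(data_dir))} entries under {data_dir}")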

Atari

The following commands show how to train (1) a ROSMO agent with the exact policy target, (2) its sampling variant, and (3) a MuZero Unplugged (MZU) baseline on the game MsPacman; a toy sketch of the sampled policy target follows the commands.

  1. Train ROSMO with exact policy target.
python experiment/atari/main.py -exp_id rosmo -env MsPacman -nodebug -use_wb -user ${WB_USER}
  2. Train ROSMO with sampled policy target (N=4).
python experiment/atari/main.py -exp_id rosmo-sample-4 -sampling -env MsPacman -nodebug -use_wb -user ${WB_USER}
  3. Train MuZero Unplugged for benchmark (N=20).
python experiment/atari/main.py -exp_id mzu-sample-20 -algo mzu -num_simulations 20 -env MsPacman -nodebug -use_wb -user ${WB_USER}
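The sketch below is a rough, hypothetical illustration (again not the repository's code) of what the sampling variant does conceptually: instead of evaluating every action, only N actions drawn from the prior are evaluated by the learned model, and the improved target is formed over that sampled subset. The baseline and temperature choices here are placeholders.

import jax
import jax.numpy as jnp

def sampled_policy_target(rng, prior_logits, q_fn, num_samples=4, temperature=1.0):
    # Draw N actions from the prior, evaluate only those with a q estimate,
    # and form an advantage-weighted target over the sampled subset.
    actions = jax.random.categorical(rng, prior_logits, shape=(num_samples,))
    q_sampled = jax.vmap(q_fn)(actions)
    weights = jax.nn.softmax((q_sampled - jnp.mean(q_sampled)) / temperature)
    return actions, weights

# Toy usage: a made-up q table over 6 discrete actions stands in for the learned model.
q_table = jnp.array([0.1, 0.4, -0.2, 0.9, 0.3, 0.0])
actions, weights = sampled_policy_target(
    jax.random.PRNGKey(0), jnp.zeros(6), lambda a: q_table[a], num_samples=4)
print(actions, weights)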

Citation

If you find this work useful for your research, please consider citing

@inproceedings{liu2023rosmo,
  title={Efficient Offline Policy Optimization with a Learned Model},
  author={Zichen Liu and Siyi Li and Wee Sun Lee and Shuicheng Yan and Zhongwen Xu},
  booktitle={International Conference on Learning Representations},
  year={2023},
  url={https://arxiv.org/abs/2210.05980}
}

License

ROSMO is distributed under the terms of the Apache License 2.0.

Acknowledgement

We thank the following projects which provide great references:

Disclaimer

This is not an official Sea Limited or Garena Online Private Limited product.

rosmo's People

Contributors

lkevinzc


rosmo's Issues

Can you provide an image?

Hello, can you provide an image that can be used to run this implementation? I encountered the following error when running this code in my image:
File "experiment/atari/main.py", line 28, in
from acme import EnvironmentLoop
File "/opt/conda/lib/python3.8/site-packages/acme/init.py", line 35, in
from acme.environment_loop import EnvironmentLoop
File "/opt/conda/lib/python3.8/site-packages/acme/environment_loop.py", line 26, in
from acme.utils import signals
File "/opt/conda/lib/python3.8/site-packages/acme/utils/signals.py", line 22, in
import launchpad
File "/opt/conda/lib/python3.8/site-packages/launchpad/init.py", line 36, in
from launchpad.nodes.courier.node import CourierHandle
File "/opt/conda/lib/python3.8/site-packages/launchpad/nodes/courier/node.py", line 21, in
import courier
File "/opt/conda/lib/python3.8/site-packages/courier/init.py", line 26, in
from courier.python.client import Client # pytype: disable=import-error
File "/opt/conda/lib/python3.8/site-packages/courier/python/client.py", line 30, in
from courier.python import py_client
ImportError: libpython3.8.so.1.0: cannot open shared object file: No such file or directory
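For context on the ImportError above (this is a hypothetical diagnostic, not a repository-specific fix): the error means the dynamic loader cannot resolve libpython3.8.so.1.0. The snippet below only reports whether that shared library is findable in the current environment and where CPython expects it to live.

import ctypes
import sysconfig

print("Expected libpython location (LIBDIR):", sysconfig.get_config_var("LIBDIR"))
try:
    ctypes.CDLL("libpython3.8.so.1.0")
    print("libpython3.8.so.1.0 is resolvable by the dynamic loader")
except OSError as err:
    print("libpython3.8.so.1.0 is NOT resolvable:", err)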

Google Colab Unrecognized split format error

Hi,

I have followed the installation guide for setting up ROSMO to work on GPU in Google Colab.

I have run the algorithm on the Breakout environment using:

!python experiment/atari/main.py -exp_id test -env Breakout -nodebug

and have received the following error:

Dataset rlu_atari_checkpoints_ordered downloaded and prepared to ./datasets/rl_unplugged/tensorflow_datasets/rlu_atari_checkpoints_ordered/Breakout_run_1/1.1.0. Subsequent calls will reuse this data.
I0305 17:10:38.168606 140094298748736 logging_logger.py:49] Constructing tf.data.Dataset rlu_atari_checkpoints_ordered for split , from ./datasets/rl_unplugged/tensorflow_datasets/rlu_atari_checkpoints_ordered/Breakout_run_1/1.1.0
Traceback (most recent call last):
File "experiment/atari/main.py", line 268, in
app.run(main)
File "/usr/local/lib/python3.8/dist-packages/absl/app.py", line 308, in run
_run_main(main, args)
File "/usr/local/lib/python3.8/dist-packages/absl/app.py", line 254, in _run_main
sys.exit(main(argv))
File "experiment/atari/main.py", line 188, in main
env, dataloader = get_env_data_loader(cfg)
File "experiment/atari/main.py", line 105, in get_env_data_loader
environment, dataset = atari_env_loader(
File "/content/drive/MyDrive/Colab_Notebooks/rosmo/rosmo/data/rlu_atari.py", line 301, in env_loader
return environment(game=env_name, stack_size=stack_size), create_atari_ds_loader(
File "/content/drive/MyDrive/Colab_Notebooks/rosmo/rosmo/data/rlu_atari.py", line 188, in create_atari_ds_loader
dataset = _uniformly_subsampled_atari_data(
File "/content/drive/MyDrive/Colab_Notebooks/rosmo/rosmo/data/rlu_atari.py", line 142, in _uniformly_subsampled_atari_data
return tfds.load(
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/logging/init.py", line 169, in call
return function(*args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/load.py", line 629, in load
ds = dbuilder.as_dataset(**as_dataset_kwargs)
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/logging/init.py", line 169, in call
return function(*args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/dataset_builder.py", line 827, in as_dataset
all_ds = tree_utils.map_structure(build_single_dataset, split)
File "/usr/local/lib/python3.8/dist-packages/tree/init.py", line 435, in map_structure
[func(*args) for args in zip(*map(flatten, structures))])
File "/usr/local/lib/python3.8/dist-packages/tree/init.py", line 435, in
[func(*args) for args in zip(*map(flatten, structures))])
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/dataset_builder.py", line 845, in _build_single_dataset
ds = self._as_dataset(
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/dataset_builder.py", line 1298, in _as_dataset
return reader.read(
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/reader.py", line 413, in read
return tree_utils.map_structure(_read_instruction_to_ds, instructions)
File "/usr/local/lib/python3.8/dist-packages/tree/init.py", line 435, in map_structure
[func(*args) for args in zip(*map(flatten, structures))])
File "/usr/local/lib/python3.8/dist-packages/tree/init.py", line 435, in
[func(*args) for args in zip(*map(flatten, structures))])
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/reader.py", line 404, in _read_instruction_to_ds
file_instructions = splits_dict[instruction].file_instructions
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/splits.py", line 400, in getitem
instructions = _make_file_instructions(
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/splits.py", line 501, in _make_file_instructions
absolute_instructions = _make_absolute_instructions(
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/splits.py", line 455, in _make_absolute_instructions
instruction = AbstractSplit.from_spec(instruction)
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/splits.py", line 552, in from_spec
instructions = [_str_to_relative_instruction(s) for s in subs]
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/splits.py", line 552, in
instructions = [_str_to_relative_instruction(s) for s in subs]
File "/usr/local/lib/python3.8/dist-packages/tensorflow_datasets/core/splits.py", line 685, in _str_to_relative_instruction
raise ValueError(err_msg)
ValueError: Error parsing split ''. See format at: https://www.tensorflow.org/datasets/splits
Unrecognized split format: ''. See format at https://www.tensorflow.org/datasets/splits

Any help would be greatly appreciated.

Thanks
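For reference on the split format the error message points to (this is not a fix specific to this repository, and the dataset name below is purely illustrative), valid tfds split specifications look like the following; the error above was raised because an empty split string '' was passed.

import tensorflow_datasets as tfds

ds = tfds.load("mnist", split="train")         # an entire named split
ds = tfds.load("mnist", split="train[:1%]")    # the first 1% of a split
ds = tfds.load("mnist", split="train[10:20]")  # examples 10 through 19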
