
Watching a replay (smac) · 29 comments · CLOSED

oxwhirl commented on May 29, 2024
Watching a replay


Comments (29)

nwayt001 commented on May 29, 2024

I had a similar issue where I was getting the error on line 305 of pysc2/lib/features.py. The problem is that the code tries to look up a color index that is higher than what exists in the color palette. On line 304 there is an if statement, "if self.clip:", that clips the values in the variable 'plane' so that this error can't happen. It looks like self.clip was never set to True, so the code doesn't actually clip anything. My solution was to simply set clip to True so that the values are clipped and the IndexError does not occur. I'm not sure whether this is the best fix, but it worked for me. I don't see why you wouldn't always want to clip anyway, to prevent this error from happening.
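
A minimal, self-contained toy demo of the failure mode and the clipping fix described above (the array sizes are taken from the IndexError in the log further down; this is an illustration, not PySC2's actual code):

import numpy as np

# Toy palette and feature plane mimicking the failing lookup in features.py.
palette = np.zeros((1962, 3), dtype=np.uint8)  # palette with 1962 entries (size from the traceback)
plane = np.array([[3, 1971]])                  # contains the out-of-range value 1971

# palette[plane] would raise IndexError, exactly as in the traceback below.
# With the clipping branch enabled (the fix above), the indices are clamped first:
clipped = np.clip(plane, 0, len(palette) - 1)
colors = palette[clipped]                      # no IndexError
print(colors.shape)                            # -> (1, 2, 3)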

samvelyan commented on May 29, 2024

SMAC doesn't use the RGB features that are present in PySC2, since SMAC uses only feature vectors as observations. Therefore, replays with the kind of "deteriorated graphics" you get in pysc2 are unfortunately not available.

Have you tried the following command, which is suggested in the pymarl repository?

python -m pysc2.bin.play --norender --rgb_minimap_size 0 --replay NAME.SC2Replay

It works for me on Mac.

PiggyGenius commented on May 29, 2024

This command works, but I don't get the point: it doesn't visualize anything. Is it useful for computing test-time indicators or things like that by changing the behaviour of pysc2.bin.play?
Thank you for the reply, by the way (and for the amazing work). I'll probably be reaching out more in the future :)

samvelyan commented on May 29, 2024

Which OS are you using? On Mac, it opens an SC2 window and replays the episodes that have been saved.

PiggyGenius commented on May 29, 2024

I am in an Ubuntu Docker container on a Red Hat machine. It doesn't show anything; it just dumps a few log messages (opening replay, playing the game) until the replay is done playing (I assume). I only have the Linux version of the game (from the Blizzard repo) installed, by the way. I guess I should play the replay with the official game release under Wine?

samvelyan commented on May 29, 2024

I guess I should play the replay with the official game release under Wine?

Yes, please do that until I take a close look at the Linux version.

douglasrizzo commented on May 29, 2024

I am also on Linux and had the same problem.

What I can do

  • Visualize pysc2 games and replays, for example:
python -m pysc2.bin.agent --map Simple64
python -m pysc2.bin.play --render --rgb_minimap_size 0 --replay DefeatRoaches_replay.SC2Replay
  • Run a SMAC agent:
python -m smac.examples.random_agents
  • Run a replay from a SMAC map, without rendering anything:
python -m pysc2.bin.play --norender --rgb_minimap_size 0 --replay 2s3z_replay.SC2Replay

What I am unable to do

Replay a game played on a SMAC map and visualize it using the --render option:

python -m pysc2.bin.play --render --rgb_minimap_size 0 --replay 2s3z_replay.SC2Replay

Since the SMAC tutorial explicitly says to use the --norender flag, I'm going to go on a hunch here and say that the inability to render is intended behavior, correct?

My settings

  • Manjaro Linux
  • Python 3.7.5
  • StarCraft II 4.10
  • PySC2 v3.0

Log of failure

pygame 1.9.6
Hello from the pygame community. https://www.pygame.org/contribute.html
I0302 20:05:25.010760 140085184051008 sc_process.py:135] Launching SC2: /opt/StarCraftII/Versions/Base75689/SC2_x64 -listen 127.0.0.1 -port 22190 -dataDir /opt/StarCraftII/ -tempDir /tmp/sc-qov54kig/ -dataVersion B89B5D6FA7CBF6452E721311BFBC6CB2
I0302 20:05:25.016530 140085184051008 remote_controller.py:167] Connecting to: ws://127.0.0.1:22190/sc2api, attempt: 0, running: True
Version: B75689 (SC2.4.10)
Build: Aug 12 2019 17:16:57
Command Line: '"/opt/StarCraftII/Versions/Base75689/SC2_x64" -listen 127.0.0.1 -port 22190 -dataDir /opt/StarCraftII/ -tempDir /tmp/sc-qov54kig/ -dataVersion B89B5D6FA7CBF6452E721311BFBC6CB2'
Starting up...
Startup Phase 1 complete
I0302 20:05:26.021589 140085184051008 remote_controller.py:167] Connecting to: ws://127.0.0.1:22190/sc2api, attempt: 1, running: True
Startup Phase 2 complete
Creating stub renderer...
Listening on: 127.0.0.1:22190
Startup Phase 3 complete. Ready for commands.
I0302 20:05:27.023804 140085184051008 remote_controller.py:167] Connecting to: ws://127.0.0.1:22190/sc2api, attempt: 2, running: True
ConnectHandler: Request from 127.0.0.1:45872 accepted
ReadyHandler: 127.0.0.1:45872 ready
Could not find map name for file: /tmp/sc-qov54kig/StarCraft II/TempReplayInfo.SC2Replay
----------------------- Replay info ------------------------
map_name: "Just Another StarCraft II Map"
local_map_path: "SMAC_Maps/2s3z.SC2Map"
player_info {
  player_info {
    player_id: 1
    type: Participant
    race_requested: Terran
    race_actual: Terran
    player_name: "Local Player"
  }
  player_result {
    player_id: 1
    result: Tie
  }
  player_apm: 2258
}
game_duration_loops: 424
game_duration_seconds: 18.92989158630371
game_version: "4.10.0.75689"
data_build: 75689
base_build: 75689
data_version: "B89B5D6FA7CBF6452E721311BFBC6CB2"

------------------------------------------------------------
Configuring interface options
Configure: raw interface enabled
Configure: feature layer interface enabled
Configure: score interface enabled
Configure: render interface disabled
Launching next game.
Next launch phase started: 2
Next launch phase started: 3
Next launch phase started: 4
Next launch phase started: 5
Next launch phase started: 6
Next launch phase started: 7
Next launch phase started: 8
Starting replay 'TempStartReplay.SC2Replay'
Game has started.
Using default stable ids, none found at: /opt/StarCraftII/stableid.json
Successfully loaded stable ids: GameData\stableid.json
Exception in thread Renderer:
Traceback (most recent call last):
  File "/home/user/.anaconda3/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/home/user/.anaconda3/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "/home/user/.anaconda3/lib/python3.7/site-packages/pysc2/lib/renderer_human.py", line 1706, in render_thread
    self.render_obs(obs)
  File "/home/user/.anaconda3/lib/python3.7/site-packages/pysc2/lib/renderer_human.py", line 69, in _with_lock
    return func(*args, **kwargs)
  File "/home/user/.anaconda3/lib/python3.7/site-packages/pysc2/lib/stopwatch.py", line 212, in _stopwatch
    return func(*args, **kwargs)
  File "/home/user/.anaconda3/lib/python3.7/site-packages/pysc2/lib/renderer_human.py", line 1724, in render_obs
    surf.draw(surf)
  File "/home/user/.anaconda3/lib/python3.7/site-packages/pysc2/lib/renderer_human.py", line 669, in <lambda>
    lambda surf: self.draw_feature_layer(surf, feature))
  File "/home/user/.anaconda3/lib/python3.7/site-packages/pysc2/lib/stopwatch.py", line 212, in _stopwatch
    return func(*args, **kwargs)
  File "/home/user/.anaconda3/lib/python3.7/site-packages/pysc2/lib/renderer_human.py", line 1655, in draw_feature_layer
    surf.blit_np_array(feature.color(layer))
  File "/home/user/.anaconda3/lib/python3.7/site-packages/pysc2/lib/stopwatch.py", line 212, in _stopwatch
    return func(*args, **kwargs)
  File "/home/user/.anaconda3/lib/python3.7/site-packages/pysc2/lib/features.py", line 305, in color
    return self.palette[plane]
IndexError: index 1971 is out of bounds for axis 0 with size 1962

PiggyGenius commented on May 29, 2024

The problem is known, and I guess fixing it is not easy. I suggest you do the same thing I did: install Wine and the official game. You will be able to watch the replays with that version of the game.

douglasrizzo commented on May 29, 2024

For future reference, these are the lines.

douglasrizzo commented on May 29, 2024

Hi there, I have revisited this issue and decided to provide the fix suggested by @nwayt001 (which I have tested and which works) in a pysc2 fork of my own. It is only a single commit ahead of v3.0.0 and master, so it should still be compatible with everything.

A quick installation can be done like so:

pip install git+https://github.com/douglasrizzo/pysc2.git@smac-view

I have tested it with a replay file generated by the example agents on the 8m map:

python -m pysc2.bin.play --render --replay 8m_replay.SC2Replay

[screenshot of the rendered replay]

GoingMyWay commented on May 29, 2024

@douglasrizzo Wonderful! Does it work on SMAC envs while using SMAC to train agents?

douglasrizzo commented on May 29, 2024

@GoingMyWay I don't think so, since the StarCraft2Env provided by SMAC doesn't implement the render() method in the first place.

def render(self):
    """Not implemented."""
    pass

GoingMyWay commented on May 29, 2024

@GoingMyWay I don't think so, since the StarCraft2Env provided by SMAC doesn't implement the render() method in the first place.

def render(self):
    """Not implemented."""
    pass

Thanks, and sorry for my late reply. These days I am trying to record a video of the play while evaluating.

douglasrizzo commented on May 29, 2024

@GoingMyWay my advice would be to run the evaluation, save the replay, then play the replay back either in the game on Windows (for the best graphics) or on Linux, and capture the screen with a screen recorder app.
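
For example, one possible way to do the screen capture on Linux is with ffmpeg's x11grab input (a hedged sketch; it assumes ffmpeg is installed and an X display at :0.0, and the resolution, duration, and output name are placeholders):

import subprocess

# Record the desktop while the replay plays back; adjust size and duration as needed.
subprocess.run([
    "ffmpeg", "-y",
    "-f", "x11grab",
    "-framerate", "30",
    "-video_size", "1920x1080",
    "-i", ":0.0",
    "-t", "60",               # stop after 60 seconds
    "replay_capture.mp4",
])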

GoingMyWay commented on May 29, 2024

@GoingMyWay my advice would be to run the evaluation, save the replay, then play the replay back either in the game on Windows (for the best graphics) or on Linux, and capture the screen with a screen recorder app.

Thanks. My laptop is a MacBook Pro; I can evaluate the model while training but cannot record the video. If you can share some step-by-step guidelines for video recording, that would be very nice. Or you could make a PR to SMAC; I think it would be a very good enhancement for SMAC.

GoingMyWay commented on May 29, 2024

@douglasrizzo I am using your code; here is my simple script to save the replay:

import time
import argparse

from smac.env import StarCraft2Env
import numpy as np


def main(args):
    env = StarCraft2Env(map_name=args.map_name)
    env_info = env.get_env_info()

    n_actions = env_info["n_actions"]
    n_agents = env_info["n_agents"]

    n_episodes = 10

    for e in range(n_episodes):
        env.reset()
        terminated = False
        episode_reward = 0

        while not terminated:
            obs = env.get_obs()
            state = env.get_state()

            time.sleep(1)

            actions = []
            for agent_id in range(n_agents):
                avail_actions = env.get_avail_agent_actions(agent_id)
                avail_actions_ind = np.nonzero(avail_actions)[0]
                action = np.random.choice(avail_actions_ind)
                actions.append(action)

            reward, terminated, _ = env.step(actions)
            episode_reward += reward
            env.save_replay()  # here is the saving replay operation

        print("Total reward in episode {} = {}".format(e, episode_reward))

    env.close()

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='run demo')
    parser.add_argument('--map-name', type=str, default='3m')
    args = parser.parse_args()
    main(args=args)

After running for one episode, I got many replay files. How can I merge or save all the replays of one episode into a single replay file? I couldn't find a way to do this; the save_replay function saves replays of all previous episodes in the buffer.

douglasrizzo commented on May 29, 2024

@GoingMyWay don't call save_replay() inside your loop; call it only after the episode is finished.

GoingMyWay commented on May 29, 2024

@GoingMyWay don't call save_replay() inside your loop; call it only after the episode is finished.

Thanks. Another question: while using SMAC, I want to evaluate the MARL model for a few episodes after some training time steps and save the replays of those evaluation episodes. Does it also save the replay of the last evaluation, or only the replays of the current evaluation episodes?

douglasrizzo commented on May 29, 2024

I believe it saves everything from the moment you call env.reset() to the moment you call env.save_replay(). Since you have to call env.reset() after an episode ends, each replay file will only keep data from a single episode.

GoingMyWay commented on May 29, 2024

I believe it saves everything from the moment you call env.reset() to the moment you call env.save_replay(). Since you have to call env.reset() after an episode ends, each replay file will only keep data from a single episode.

It looks like it saves all the previous episodes one by one, even when calling reset() after each episode.

You can try this code and look at the output replays:

import time
import argparse

from smac.env import StarCraft2Env
import numpy as np


def main(args):
    env = StarCraft2Env(map_name=args.map_name)
    env_info = env.get_env_info()

    n_actions = env_info["n_actions"]
    n_agents = env_info["n_agents"]

    n_episodes = 10

    for e in range(n_episodes):
        env.reset()
        terminated = False
        episode_reward = 0

        while not terminated:
            obs = env.get_obs()
            state = env.get_state()

            time.sleep(1)

            actions = []
            for agent_id in range(n_agents):
                avail_actions = env.get_avail_agent_actions(agent_id)
                avail_actions_ind = np.nonzero(avail_actions)[0]
                action = np.random.choice(avail_actions_ind)
                actions.append(action)

            reward, terminated, _ = env.step(actions)
            episode_reward += reward
        
        # save after each episode
        env.save_replay()  # here is the saving replay operation

        print("Total reward in episode {} = {}".format(e, episode_reward))

    env.close()

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='run demo')
    parser.add_argument('--map-name', type=str, default='3m')
    args = parser.parse_args()
    main(args=args)

plutonic88 commented on May 29, 2024

I am using StarCraft II version 5.0.4.
It does not create any replay file when I run the following command:

python -m smac.examples.random_agents

Can the version be a problem?

However, I was able to create a replay file using the code above by @GoingMyWay. To replay it, I had to use a different script that skips the version checking.

plutonic88 commented on May 29, 2024

I had to explicitly call env.save_replay() to create the replay files in the following example:

smac.examples.random_agents

Otherwise, the replay files are not created.

plutonic88 commented on May 29, 2024

OK, so I was able to play the replay files (from the SMAC env) by modifying the pysc2 play script to skip the version checking.

douglasrizzo commented on May 29, 2024

@plutonic88 SMAC does not save replays automatically; you do have to call save_replay() when you want to record a replay file.
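
For instance, a minimal sketch of adding that call to a random-agents-style loop (the map name and episode count are arbitrary; the action selection mirrors the scripts above):

import numpy as np
from smac.env import StarCraft2Env

env = StarCraft2Env(map_name="3m")
n_agents = env.get_env_info()["n_agents"]

for episode in range(2):          # a couple of episodes, purely illustrative
    env.reset()
    terminated = False
    while not terminated:
        actions = []
        for agent_id in range(n_agents):
            avail = np.nonzero(env.get_avail_agent_actions(agent_id))[0]
            actions.append(np.random.choice(avail))
        _, terminated, _ = env.step(actions)
    env.save_replay()             # explicit call, once per finished episode
env.close()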

samvelyan commented on May 29, 2024

I have added a section in the README file on how to save and watch replays. I hope this answers your questions. Please let me know if something is unclear.

GoingMyWay commented on May 29, 2024

@plutonic88 Hi, you can also use the SC2Switcher to open the replay files without running any Python code.

[screenshot of SC2Switcher]

douglasrizzo commented on May 29, 2024

I have added a section in the README file on how to save and watch replays. I hope this answers your questions. Please let me know if something is unclear.

@samvelyan I am trying to watch replays generated on Linux through the Windows game. I have a few replays generated on Linux. I was able to install the game using Wine (as mentioned in the SMAC README) and access the replay files from inside the game, but I get an "Unable to open map" error when double-clicking the replay, also from inside the game.

I'm not sure whether I need to provide the SMAC map files to the game. I looked around, but it seems the game doesn't have a "Maps" folder into which we can drop the map files. Any clues?

wang88256187 commented on May 29, 2024

The problem is known, and I guess fixing it is not easy. I suggest you do the same thing I did: install Wine and the official game. You will be able to watch the replays with that version of the game.

But the official version of SC2 is too new now; Wine no longer works for watching the replays.

wang88256187 commented on May 29, 2024

I believe it saves everything from the moment you call env.reset() to the moment you call env.save_replay(). Since you have to call env.reset() after an episode ends, each replay file will only keep data from a single episode.

It looks like it saves all the previous episodes one by one, even when calling reset() after each episode.

You can try this code and look at the output replays:

(quoted code omitted; identical to the script above)

I found the same issue as yours. How did you solve it?
