# wrappers/video_compat_wrapper.py
# From sgoodfriend's ppo-MountainCarContinuous-v0 (PPO playing MountainCarContinuous-v0):
# https://github.com/sgoodfriend/rl-algo-impls/tree/5598ebc4b03054f16eebe76792486ba7bcacfc5c
import gym
import numpy as np


class VideoCompatWrapper(gym.Wrapper):
    """Ensures frames rendered as "rgb_array" are uint8 so video recorders can encode them."""

    def __init__(self, env: gym.Env) -> None:
        super().__init__(env)

    def render(self, mode="human", **kwargs):
        r = super().render(mode=mode, **kwargs)
        # Video encoders expect uint8 frames; cast if the env returns another dtype.
        if mode == "rgb_array" and isinstance(r, np.ndarray) and r.dtype != np.uint8:
            r = r.astype(np.uint8)
        return r
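

# Usage sketch (not part of the original file): a minimal demo assuming the older
# gym API used by this repo, where render() accepts a `mode` argument. The env id is
# taken from the repository name; any env with an "rgb_array" render mode would do.
if __name__ == "__main__":
    env = VideoCompatWrapper(gym.make("MountainCarContinuous-v0"))
    env.reset()
    frame = env.render(mode="rgb_array")
    # If an ndarray came back, the wrapper guarantees it is uint8.
    assert not isinstance(frame, np.ndarray) or frame.dtype == np.uint8
    env.close()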