ppo-LunarLander-v2 / results.json
Retrain PPO model for LunarLander-v2 v3
23d5286
164 Bytes
{"mean_reward": 290.8454147555641, "std_reward": 20.005885358340613, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-04T21:48:10.951025"}
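The fields above summarize an evaluation run: `mean_reward` and `std_reward` over `n_eval_episodes = 10` deterministic episodes. A minimal sketch of how such a summary is produced, assuming per-episode returns are collected first (the `episode_rewards` values below are hypothetical placeholders, not the actual returns behind this file):

```python
import json
import statistics

# Hypothetical per-episode returns from 10 evaluation rollouts;
# the real episode returns are not stored in results.json.
episode_rewards = [295.1, 288.4, 310.2, 270.9, 301.5,
                   284.7, 292.3, 276.8, 305.0, 289.6]

results = {
    "mean_reward": statistics.fmean(episode_rewards),
    # Population standard deviation (ddof=0), matching numpy's np.std default.
    "std_reward": statistics.pstdev(episode_rewards),
    "is_deterministic": True,
    "n_eval_episodes": len(episode_rewards),
}
print(json.dumps(results))
```

Writing the dict with `json.dumps` yields a compact single-line record like the one above.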