Isaac Gym Environments for Legged Robots

Overview


This repository provides the environment used to train ANYmal (and other robots) to walk on rough terrain using NVIDIA's Isaac Gym. It includes all components needed for sim-to-real transfer: actuator network, friction & mass randomization, noisy observations and random pushes during training.
Maintainer: Nikita Rudin
Affiliation: Robotic Systems Lab, ETH Zurich
Contact: [email protected]

Useful Links

Project website: https://leggedrobotics.github.io/legged_gym/
Paper: https://arxiv.org/abs/2109.11978

Installation

  1. Create a new python virtual env with python 3.6, 3.7 or 3.8 (3.8 recommended)
  2. Install pytorch 1.10 with cuda-11.3:
    • pip3 install torch==1.10.0+cu113 torchvision==0.11.1+cu113 torchaudio==0.10.0+cu113 -f https://download.pytorch.org/whl/cu113/torch_stable.html
  3. Install Isaac Gym
    • Download and install Isaac Gym Preview 3 (Preview 2 will not work!) from https://developer.nvidia.com/isaac-gym
    • cd isaacgym_lib/python && pip install -e .
    • Try running an example: python examples/1080_balls_of_solitude.py
    • For troubleshooting, check the docs at isaacgym/docs/index.html
  4. Install rsl_rl (PPO implementation)
    • Clone https://github.com/leggedrobotics/rsl_rl
    • cd rsl_rl && pip install -e .
  5. Install legged_gym
    • Clone this repository
    • cd legged_gym && git checkout develop && pip install -e .
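
A quick sanity check of the installation is to import Isaac Gym and PyTorch from the new environment. The sketch below is a hypothetical helper script, not part of the repository; note that Isaac Gym has to be imported before torch.

    # check_install.py -- minimal sanity check (hypothetical helper script, not part of the repo)
    from isaacgym import gymapi  # Isaac Gym must be imported before torch
    import torch

    print("PyTorch:", torch.__version__)
    print("CUDA available:", torch.cuda.is_available())
    print("Isaac Gym API module:", gymapi.__name__)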

Code Structure

  1. Each environment is defined by an env file (legged_robot.py) and a config file (legged_robot_config.py). The config file contains two classes: one containing all the environment parameters (LeggedRobotCfg) and one for the training parameters (LeggedRobotCfgPPO).
  2. Both env and config classes use inheritance.
  3. Each non-zero reward scale specified in cfg will add a function with a corresponding name to the list of terms that are summed to get the total reward (see the sketch after this list).
  4. Tasks must be registered using task_registry.register(name, EnvClass, EnvConfig, TrainConfig). This is done in envs/__init__.py, but can also be done from outside of this repository.
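
For illustration, the sketch below shows how a config subclass, its reward scales and the registration call fit together. Everything named MyRobotCfg / "my_robot" is made up for this example; the module paths are assumptions based on the layout described above, and the _reward_<scale_name> naming reflects point 3.

    # Hypothetical sketch: MyRobotCfg and "my_robot" are placeholders; module paths assumed from the repo layout.
    from legged_gym.envs.base.legged_robot import LeggedRobot
    from legged_gym.envs.base.legged_robot_config import LeggedRobotCfg, LeggedRobotCfgPPO
    from legged_gym.utils.task_registry import task_registry

    class MyRobotCfg(LeggedRobotCfg):
        class rewards(LeggedRobotCfg.rewards):
            class scales(LeggedRobotCfg.rewards.scales):
                tracking_lin_vel = 1.0  # non-zero: the env sums scale * _reward_tracking_lin_vel()
                feet_air_time = 0.0     # zero: this reward term is dropped from the sum

    # Registration, as normally done in envs/__init__.py (point 4); it can also live outside this repository.
    task_registry.register("my_robot", LeggedRobot, MyRobotCfg(), LeggedRobotCfgPPO())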

Usage

  1. Train:
    python isaacgym_anymal/scripts/train.py --task=anymal_c_flat
    • To run on CPU add the following arguments: --sim_device=cpu, --rl_device=cpu (sim on CPU and rl on GPU is possible).
    • To run headless (no rendering) add --headless.
    • Important: To improve performance, once the training starts press v to stop the rendering. You can then enable it later to check the progress.
    • The trained policy is saved in isaacgym_anymal/logs/<experiment_name>/<date_time>_<run_name>/model_<iteration>.pt, where <experiment_name> and <run_name> are defined in the train config (see the sketch after this list for what the script does under the hood).
    • The following command line arguments override the values set in the config files:
    • --task TASK: Task name.
    • --resume: Resume training from a checkpoint
    • --experiment_name EXPERIMENT_NAME: Name of the experiment to run or load.
    • --run_name RUN_NAME: Name of the run.
    • --load_run LOAD_RUN: Name of the run to load when resume=True. If -1: will load the last run.
    • --checkpoint CHECKPOINT: Saved model checkpoint number. If -1: will load the last checkpoint.
    • --num_envs NUM_ENVS: Number of environments to create.
    • --seed SEED: Random seed.
    • --max_iterations MAX_ITERATIONS: Maximum number of training iterations.
  2. Play a trained policy:
    python isaacgym_anymal/scripts/play.py --task=anymal_c_flat
    • By default the loaded policy is the last model of the last run of the experiment folder.
    • Other runs/model iterations can be selected by setting load_run and checkpoint in the train config.
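
Both scripts are thin wrappers around the task registry. The sketch below shows roughly what the training entry point does; the helper names (get_args, make_env, make_alg_runner) are assumptions based on the shipped scripts and may differ between versions.

    # Rough sketch of a training entry point; helper names are assumptions (see note above).
    import isaacgym                    # must be imported before torch
    from legged_gym.envs import *      # importing the envs package triggers task registration
    from legged_gym.utils import get_args, task_registry

    def train(args):
        # build the vectorized environment and the PPO runner for the requested task
        env, env_cfg = task_registry.make_env(name=args.task, args=args)
        ppo_runner, train_cfg = task_registry.make_alg_runner(env=env, name=args.task, args=args)
        # run PPO; checkpoints end up in the logs folder described above
        ppo_runner.learn(num_learning_iterations=train_cfg.runner.max_iterations,
                         init_at_random_ep_len=True)

    if __name__ == '__main__':
        train(get_args())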

Adding a new environment

The base environment legged_robot implements a rough terrain locomotion task. The corresponding cfg does not specify a robot asset (URDF/MJCF) and defines no reward scales.

  1. Add a new folder to envs/ with <your_env>_config.py, which inherits from an existing environment cfg.
  2. If adding a new robot:
    • Add the corresponding assets to resources/.
    • In cfg set the asset path, define body names, default_joint_positions and PD gains. Specify the desired train_cfg and the name of the environment (python class); see the config sketch after this list.
    • In train_cfg set experiment_name and run_name
  3. (If needed) implement your environment in <your_env>.py, inherit from an existing environment, overwrite the desired functions and/or add your reward functions.
  4. Register your env in isaacgym_anymal/envs/__init__.py.
  5. Modify/Tune other parameters in your cfg, cfg_train as needed. To remove a reward set its scale to zero. Do not modify parameters of other envs!
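
As a purely illustrative example, a new robot's config pair could look like the sketch below. All robot-specific names, joint names and values are placeholders; the nested-class layout and attribute names follow the existing cfgs and should be checked against legged_robot_config.py.

    # my_robot_config.py -- hypothetical sketch; every robot-specific name and value is a placeholder.
    from legged_gym.envs.base.legged_robot_config import LeggedRobotCfg, LeggedRobotCfgPPO

    class MyRobotCfg(LeggedRobotCfg):
        class asset(LeggedRobotCfg.asset):
            file = '{LEGGED_GYM_ROOT_DIR}/resources/robots/my_robot/urdf/my_robot.urdf'
            foot_name = 'FOOT'                      # substring identifying the feet bodies
            terminate_after_contacts_on = ['base']  # reset the episode on base contact

        class init_state(LeggedRobotCfg.init_state):
            pos = [0.0, 0.0, 0.6]                                         # initial base position [m]
            default_joint_angles = {'HAA': 0.0, 'HFE': 0.4, 'KFE': -0.8}  # joint targets [rad] at zero action

        class control(LeggedRobotCfg.control):
            stiffness = {'HAA': 80.0, 'HFE': 80.0, 'KFE': 80.0}  # PD stiffness [N*m/rad]
            damping = {'HAA': 2.0, 'HFE': 2.0, 'KFE': 2.0}       # PD damping [N*m*s/rad]

    class MyRobotCfgPPO(LeggedRobotCfgPPO):
        class runner(LeggedRobotCfgPPO.runner):
            experiment_name = 'my_robot'
            run_name = 'first_try'

A custom environment class (step 3) would then live in my_robot.py, inherit from LeggedRobot or another existing env, add or override _reward_* functions, and be registered together with these cfgs as in step 4.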

Troubleshooting

  1. If you get the following error: ImportError: libpython3.8m.so.1.0: cannot open shared object file: No such file or directory, do: sudo apt install libpython3.8

Known Issues

  1. The contact forces reported by net_contact_force_tensor are unreliable when simulating on GPU with a triangle mesh terrain. A workaround is to use force sensors, but the forces are propagated through the sensors of consecutive bodies, resulting in undesirable behaviour. However, for a legged robot it is possible to add sensors to the feet/end effector only and get the expected results. When using the force sensors make sure to exclude gravity from the reported forces with sensor_options.enable_forward_dynamics_forces. Example:
    sensor_pose = gymapi.Transform()
    for name in feet_names:
        sensor_options = gymapi.ForceSensorProperties()
        sensor_options.enable_forward_dynamics_forces = False # for example gravity
        sensor_options.enable_constraint_solver_forces = True # for example contacts
        sensor_options.use_world_frame = True # report forces in world frame (easier to get vertical components)
        index = self.gym.find_asset_rigid_body_index(robot_asset, name)
        self.gym.create_asset_force_sensor(robot_asset, index, sensor_pose, sensor_options)
    (...)

    sensor_tensor = self.gym.acquire_force_sensor_tensor(self.sim)
    self.gym.refresh_force_sensor_tensor(self.sim)
    force_sensor_readings = gymtorch.wrap_tensor(sensor_tensor)
    self.sensor_forces = force_sensor_readings.view(self.num_envs, 4, 6)[..., :3]
    (...)

    self.gym.refresh_force_sensor_tensor(self.sim)
    contact = self.sensor_forces[:, :, 2] > 1.
Owner
Robotic Systems Lab - Legged Robotics at ETH Zürich
The Robotic Systems Lab investigates the development of machines and their intelligence to operate in rough and challenging environments.