Final-project-robokeeper created by GitHub Classroom

Overview

RoboKeeper!

Jonny Bosnich, Joshua Cho, Lio Liang, Marco Morales, Cody Nichoson


[Image: Robokeeper being a boss]

Demonstration Videos

Equipment

Hardware:
  • HDT Global Adroit Manipulator Arm
  • Intel RealSense Camera
Software:
  • Robot Operating System (ROS)
  • MoveIt!
  • OpenCV
  • AprilTag

Quickstart Guide

  1. Install ROS Noetic on Ubuntu 20.04
  2. Create catkin workspace
    $ source /opt/ros/noetic/setup.bash
    $ mkdir -p ~/catkin_ws/src
    $ cd ~/catkin_ws/
    $ catkin_make
    
  3. Copy this repository into src folder
    $ cd ~/catkin_ws/src
    $ git clone git@github.com:ME495-EmbeddedSystems/final-project-robokeeper.git
    
  4. Install required packages and build
    $ cd ~/catkin_ws
    $ rosdep install --from-paths src --ignore-src -r -y
    $ catkin_make
    $ source devel/setup.bash
    

Running the package

  1. First, start the main launchfile. To run the program on the real robot, use the command below.

    roslaunch robokeeper robokeeper_go.launch
    
  2. If using a simulation, add the sim:=true argument when running the main launchfile.

    roslaunch robokeeper robokeeper_go.launch sim:=true
    
  3. The robot now needs to pick up its paddle, which is done with two services. First, call the above_paddle service.

    rosservice call /above_paddle
    
  4. Next, call the retrieve_paddle service.

    rosservice call /retrieve_paddle
    
  5. Call the reset service to move the robot in front of the goal.

    rosservice call /reset
    
  6. Call start_keeping to enable the goal keeping component of the project.

    rosservice call /start_keeping
    
  7. When finished, call the stop_keeping service.

    rosservice call /stop_keeping 
    

Launchfiles

robokeeper_go.launch

This is the main launchfile used to operate robokeeper. It starts by launching robokeeper_moveit.launch, which loads the necessary URDF file and hardware configuration as well as the main MoveIt! executable. It then launches intel_cam.launch, which starts the Intel RealSense camera. It also starts a transforms node, which handles the calculation of transformations between the various frames in the world. Finally, the launchfile starts a motion_control node that publishes appropriate joint state messages to actuate the arm.

robokeeper_moveit.launch

This launchfile loads the robot description for the Adroit 6-DOF manipulator arm, as well as its hardware and controller configuration, from the hdt_6dof_a24_pincer_description package. It also includes move_group.launch from the hdt_6dof_a24_pincer_moveit package, which starts the move group that MoveIt! uses to plan the motion of the arm.

intel_cam.launch

This launchfile starts the Intel RealSense camera by launching rs_camera.launch from the realsense2_camera package. It then launches AprilTag_detection.launch for AprilTag integration.

AprilTag_detection.launch

This launchfile loads parameters necessary for integrating AprilTag detection, which is crucial for detecting the position of the robot relative to the camera. It starts apriltag_ros_continuous_node from the apriltag_ros package.

Nodes

perception

The perception node is responsible for handling the data collected from the Intel RealSense camera, which is used to identify and locate the objects our robot is tasked with blocking. It contains a CV bridge to enable OpenCV integration with ROS, subscribes to the RealSense's camera data, and ultimately publishes the 3D coordinates of the centroid of the object of interest (a red ball for our purposes).

In order to identify the ball, video frames are iteratively thresholded for a range of HSV values that closely match those of our ball. Once the area of interest is located, a contour is created around its edges and the centroid of the contour is computed. This centroid can then be treated as the location of the ball in the camera frame and published appropriately.
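
A minimal sketch of this pipeline is shown below. The topic names (/camera/color/image_raw, /ball_coordinates) and HSV bounds are illustrative assumptions rather than the package's actual values, and the depth lookup used to produce 3D coordinates from the RealSense is omitted, so the sketch stops at the pixel centroid.

    #!/usr/bin/env python3
    # Sketch only: topic names and HSV bounds are assumptions, and the RealSense
    # depth lookup used to get 3D coordinates is omitted.
    import rospy
    import cv2
    import numpy as np
    from cv_bridge import CvBridge
    from sensor_msgs.msg import Image
    from geometry_msgs.msg import PointStamped

    class PerceptionSketch:
        def __init__(self):
            self.bridge = CvBridge()
            self.pub = rospy.Publisher("/ball_coordinates", PointStamped, queue_size=1)
            rospy.Subscriber("/camera/color/image_raw", Image, self.image_cb)

        def image_cb(self, msg):
            frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            # Threshold for HSV values that roughly match the red ball.
            mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            if not contours:
                return
            m = cv2.moments(max(contours, key=cv2.contourArea))
            if m["m00"] == 0:
                return
            # Centroid of the largest contour, treated as the ball's location in the image.
            ball = PointStamped()
            ball.header = msg.header
            ball.point.x = m["m10"] / m["m00"]
            ball.point.y = m["m01"] / m["m00"]
            self.pub.publish(ball)

    if __name__ == "__main__":
        rospy.init_node("perception_sketch")
        PerceptionSketch()
        rospy.spin()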

transforms

Knowing where the ball is relative to the camera is great, but it doesn't help the robot locate the ball. In order to accomplish this, transformations between the camera frame and the robot frame are necessary. This node subscribes to both the ball coordinates from the perception node and AprilTag detections, and publishes the transformed ball coordinates in the robot frame.

In order to complete the relationship between the two frames, an AprilTag with a known transformation between itself and the baselink of the robot (positioned on the floor next to the robot) was used. Using the RealSense, the transformation between the camera frame and the AprilTag can then also be determined. Using these three frames and their relationships, the transformation between coordinates in the camera frame and coordinates in the robot frame can finally be determined.
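
The transformation itself can be handled with tf2, as in the sketch below. The robot frame name ("base_link") and the topic names are assumptions for illustration, not necessarily the names used in the package.

    #!/usr/bin/env python3
    # Sketch only: the robot frame name and topic names are assumptions.
    import rospy
    import tf2_ros
    import tf2_geometry_msgs  # importing this registers PointStamped support with tf2
    from geometry_msgs.msg import PointStamped

    class TransformsSketch:
        def __init__(self):
            self.buffer = tf2_ros.Buffer()
            self.listener = tf2_ros.TransformListener(self.buffer)
            self.pub = rospy.Publisher("/ball_in_robot_frame", PointStamped, queue_size=1)
            rospy.Subscriber("/ball_coordinates", PointStamped, self.ball_cb)

        def ball_cb(self, msg):
            try:
                # The AprilTag beside the robot closes the chain camera -> tag -> robot,
                # so tf2 can resolve a camera-frame point directly into the robot frame.
                tform = self.buffer.lookup_transform("base_link", msg.header.frame_id,
                                                     rospy.Time(0), rospy.Duration(0.1))
            except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
                    tf2_ros.ExtrapolationException):
                return
            self.pub.publish(tf2_geometry_msgs.do_transform_point(msg, tform))

    if __name__ == "__main__":
        rospy.init_node("transforms_sketch")
        TransformsSketch()
        rospy.spin()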

motion_control

This node provides the core functionality of the robokeeper. It subscribes to the topic containing the ball coordinates in the robot frame and provides a number of services used to interact with the environment in several ways.

The main service used is /start_keeping. As the name suggests, this service allows the robot to begin interpreting the ball coordinates and attempting to intercept the ball at the goal line. Appropriate joint trajectory commands are sent to the robot through a mix of MoveIt! and direct joint publishing (depending on the service called) in order to accomplish the task. This node also keeps track of goals scored by determining whether the ball has entered the net.
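
A rough sketch of the keeping loop using moveit_commander is shown below. The planning group name ("arm"), the topic name, the std_srvs/Empty service type, and the goal-width limits are illustrative assumptions rather than the package's actual configuration.

    #!/usr/bin/env python3
    # Sketch only: group name, topic name, service types, and goal-width limits
    # are assumptions, not the package's actual configuration.
    import sys
    import rospy
    import moveit_commander
    from geometry_msgs.msg import PointStamped
    from std_srvs.srv import Empty, EmptyResponse

    class KeeperSketch:
        def __init__(self):
            moveit_commander.roscpp_initialize(sys.argv)
            self.group = moveit_commander.MoveGroupCommander("arm")  # assumed group name
            self.keeping = False
            rospy.Subscriber("/ball_in_robot_frame", PointStamped, self.ball_cb)
            rospy.Service("/start_keeping", Empty, self.start_cb)
            rospy.Service("/stop_keeping", Empty, self.stop_cb)

        def start_cb(self, _req):
            self.keeping = True
            return EmptyResponse()

        def stop_cb(self, _req):
            self.keeping = False
            self.group.stop()
            return EmptyResponse()

        def ball_cb(self, msg):
            if not self.keeping:
                return
            # Clamp the target to the width of the goal; keep the paddle on the goal line.
            pose = self.group.get_current_pose().pose
            pose.position.y = max(-0.4, min(0.4, msg.point.y))
            self.group.set_pose_target(pose)
            self.group.go(wait=False)

    if __name__ == "__main__":
        rospy.init_node("motion_control_sketch")
        KeeperSketch()
        rospy.spin()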

Services

  1. The reset service moves the Adroit arm directly in front of its base and the goal.

    rosservice call /reset
    
  2. The keep service moves the robotic arm to a pose determined only by the requested y-value along the goal line. An example of calling the service follows.

    rosservice call /keep "pos: 0.0"
    
  3. above_paddle is a service that moves the arm directly above the paddle holster to get in a position for consistent retrieval.

    rosservice call /above_paddle
    
  4. To retrieve the paddle, the retrieve_paddle service can be called. It moves the arm to a position where it can grip the paddle, closes the gripper, and finally returns to the same position as above_paddle.

    rosservice call /retrieve_paddle
    
  5. The start_keeping service enables the robot to block the red ball from entering the goal.

    rosservice call /start_keeping
    
  6. To stop the robot from moving and tracking the ball, call the stop_keeping service.

    rosservice call /stop_keeping 
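
For convenience, the setup portion of this sequence can also be scripted from Python rather than calling each service by hand. The sketch below assumes these services use std_srvs/Empty; check the package's service definitions before relying on it.

    #!/usr/bin/env python3
    # Sketch only: assumes the setup services use std_srvs/Empty.
    import rospy
    from std_srvs.srv import Empty

    def call(name):
        # Block until the service is advertised, then call it with an empty request.
        rospy.wait_for_service(name)
        rospy.ServiceProxy(name, Empty)()
        rospy.loginfo("called %s", name)

    if __name__ == "__main__":
        rospy.init_node("robokeeper_setup")
        for service in ("/above_paddle", "/retrieve_paddle", "/reset", "/start_keeping"):
            call(service)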
    

Additional Notes

There are some features within this code that were partially developed but not completed due to time constraints. Because of this, you may notice certain things in the source code that are not mentioned here.

An example of this is the scoreboard feature. The original plan was to include both a goal counter and a block counter when playing with the robot and display these stats to the user in order to create a game. The goal counter was successfully created, but we did not have time to complete the block counter. The goal counter is located within the motion_control node, and the infrastructure for displaying the actual scoreboard using the tkinter library is located in a node called scorekeeper.
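
For illustration, a minimal version of the kind of tkinter scoreboard the scorekeeper node was intended to provide might look like the sketch below. The /goal_count topic name and std_msgs/Int32 message type are assumptions, not the package's actual interface.

    #!/usr/bin/env python3
    # Sketch only: the /goal_count topic name and Int32 type are assumptions.
    import rospy
    import tkinter as tk
    from std_msgs.msg import Int32

    class Scoreboard:
        def __init__(self):
            self.goals = 0
            rospy.Subscriber("/goal_count", Int32, self.goal_cb)
            self.root = tk.Tk()
            self.root.title("RoboKeeper Scoreboard")
            self.label = tk.Label(self.root, text="Goals: 0", font=("Helvetica", 48))
            self.label.pack(padx=20, pady=20)
            self.poll()

        def goal_cb(self, msg):
            self.goals = msg.data

        def poll(self):
            # tkinter must be updated from the main thread, so poll the latest
            # value from the ROS callback instead of updating the label in it.
            self.label.config(text="Goals: %d" % self.goals)
            if rospy.is_shutdown():
                self.root.destroy()
            else:
                self.root.after(100, self.poll)

    if __name__ == "__main__":
        rospy.init_node("scorekeeper_sketch")
        Scoreboard().root.mainloop()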
