
Setup

Installation

Follow the instructions from the Workstation Setup, the Simulator installation, the MoveIt tutorial, and the PyKDL tutorial:

  1. Set up your sources.list
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu trusty main" > /etc/apt/sources.list.d/ros-latest.list'
  2. Set up your keys
wget http://packages.ros.org/ros.key -O - | sudo apt-key add -
  3. Verify Latest Debians
sudo apt-get update
  4. Install All Dependencies (ROS Indigo, rosinstall, SDK, Simulator, MoveIt, PyKDL)
sudo apt-get install ros-indigo-desktop-full python-rosinstall git-core python-argparse python-wstool python-vcstools python-rosdep ros-indigo-control-msgs ros-indigo-joystick-drivers gazebo2 ros-indigo-qt-build ros-indigo-driver-common ros-indigo-gazebo-ros-control ros-indigo-gazebo-ros-pkgs ros-indigo-ros-control ros-indigo-control-toolbox ros-indigo-realtime-tools ros-indigo-ros-controllers ros-indigo-xacro ros-indigo-tf-conversions ros-indigo-kdl-parser ros-indigo-moveit ros-indigo-moveit-full-pr2 ros-indigo-moveit-full ros-indigo-python-orocos-kdl ros-indigo-orocos-kinematics-dynamics ros-indigo-kdl-conversions ros-indigo-orocos-kdl
  5. Initialize rosdep
sudo rosdep init
rosdep update
  6. Create ROS Workspace
mkdir -p ros/src
  7. Source ROS Setup
source /opt/ros/indigo/setup.bash
  8. Build and Install
cd ros
catkin_make
catkin_make install
  9. Download the baxter script
wget https://github.com/RethinkRobotics/baxter/raw/master/baxter.sh
chmod u+x baxter.sh
  10. Install the Baxter SDK, Simulator, MoveIt, PyKDL, and our project
cd src
wstool init .
wstool merge https://raw.githubusercontent.com/RethinkRobotics/baxter/master/baxter_sdk.rosinstall
wstool merge https://raw.githubusercontent.com/RethinkRobotics/baxter_simulator/master/baxter_simulator.rosinstall
git clone https://github.com/ros-planning/moveit_robots.git
git clone https://github.com/RethinkRobotics/baxter_pykdl.git
git clone https://github.com/Ardillen66/GarbageBot.git
wstool update
  11. Build and Install
cd ..
catkin_make
catkin_make install

Run the environment

Open a terminal in the ROS workspace and connect to the Simulator:

./baxter.sh sim

Every command that interacts with the simulator must be run from such a shell.

To launch Gazebo with the simulator:

roslaunch baxter_gazebo baxter_world.launch

You should wait until the following message appears:

Gravity compensation was turned off

Enable the robot

The robot must be enabled before it can move its arms, use its sensors, etc. You can either enable the robot directly, by running the following command in a terminal:

rosrun baxter_tools enable_robot.py -e

Or, you can enable it programmatically, for example by running the following Python commands:

import rospy
import baxter_interface
rospy.init_node('enable_robot')  # any node name works; baxter_interface needs an initialized ROS node
baxter_interface.RobotEnable().enable()

Run an example

Open another simulator shell, and type:

rosrun garbage_bot example.py

This is the same as running:

python src/GarbageBot/src/planning/example.py

ROS packages are defined by both package.xml and CMakeLists.txt in our project folder. The Python binding is done in setup.py.
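
For reference, a minimal catkin setup.py looks roughly like this (a sketch, not necessarily this project's exact file; the package name below is an assumption):

from distutils.core import setup
from catkin_pkg.python_setup import generate_distutils_setup

# pull package metadata from package.xml; 'planning' is an assumed package under src/
setup_args = generate_distutils_setup(
    packages=['planning'],
    package_dir={'': 'src'},
)
setup(**setup_args)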

Try playing with MoveIt

First of all, you need to enable the robot. Then you need a joint trajectory controller: open a separate terminal and run:

rosrun baxter_interface joint_trajectory_action_server.py

You can interact directly with the robot using the RViz editor. In yet another terminal, run:

roslaunch baxter_moveit_config baxter_grippers.launch

You can drag the robot's arms to a desired position, then click Execute to see them move there.

Programmatically, take a look at the Python interface tutorial.
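
As a minimal sketch of that interface (the group name 'left_arm' comes from Baxter's MoveIt configuration; the target coordinates are illustrative):

import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node('moveit_example')
group = moveit_commander.MoveGroupCommander('left_arm')  # Baxter also provides 'right_arm'
group.set_position_target([0.7, 0.2, 0.1])  # end-effector x, y, z in meters (illustrative values)
group.go(wait=True)  # plan and execute
moveit_commander.roscpp_shutdown()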

If you get a boost error like this:

'boost::exception_detail::clone_impl<boost::exception_detail::error_info_injector<boost::lock_error> >'
what(): boost: mutex lock failed in pthread_mutex_lock: Invalid argument

Ignore it for now: the whole script runs anyway, and we have not found a fix yet (by the way, "just ignore it" seems to be the main proposed solution).

Garbage bot

Garbage bot sorts items either to the left or to the right, according to their features. The features we used are color (red, green, blue) and shape (cube, sphere, triangle).

Dependencies

The whole project is written in Python (2.7, but it should also work with 3+), making use of both MoveIt's Python interface and ROS's Python interface (rospy).

Besides that, you need to install a number of dependencies for the decision trees, the neural nets, and the image processing (an example pip command follows the list):

  • numpy
  • six
  • scipy
  • tqdm
  • scikit-learn
  • opencv-python
  • imgaug
  • tensorflow
  • tflearn
  • pandas
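
All of these are available from PyPI, so something like the following should work (since the project targets Python 2.7, you may need correspondingly older versions):

pip install numpy six scipy tqdm scikit-learn opencv-python imgaug tensorflow tflearn pandas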

Structure

The project is composed of 3 main parts:

  • decision_api: used to train a decision tree on a set of samples, each possessing a specific color and shape (a sketch of the idea follows this list)
  • detecting: used to train a neural network to extract features from images. The training dataset we used is provided, as well as our model's weights. In model.py, you can either train the network or use an existing one to extract features from an image. The other files are stand-alone examples: take_images.py will, given a set of features, take a bunch of images for each arm (we later used these as a training set), and predict_images.py uses the robot to take a snapshot and extract its features.
  • planning: this module focuses on the robot itself: moving the arms and taking snapshots. All of this is done in baxter.py. If the robot runs in a simulator, a Scene needs to be specified, which is done in scene.py.
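
To illustrate the decision_api idea, here is a sketch using scikit-learn (the real module's API may differ; the feature encoding and labels are illustrative):

from sklearn.tree import DecisionTreeClassifier

# encode each sample as (color, shape):
# color: 0=red, 1=green, 2=blue; shape: 0=cube, 1=sphere, 2=triangle
X = [[0, 0], [0, 1], [1, 2], [2, 0], [2, 1]]
y = ['left', 'left', 'right', 'right', 'left']  # which side each item was sorted to

tree = DecisionTreeClassifier().fit(X, y)
print(tree.predict([[1, 1]]))  # e.g. a green sphere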

All parts are merged together in baxter_agent.py.
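
For the snapshot side of planning, grabbing a single image from a Baxter hand camera can be sketched as follows (the topic name is Baxter's standard one; the output file name is illustrative):

import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

rospy.init_node('snapshot_example')
msg = rospy.wait_for_message('/cameras/left_hand_camera/image', Image)
img = CvBridge().imgmsg_to_cv2(msg, 'bgr8')  # convert the ROS Image to an OpenCV BGR array
cv2.imwrite('snapshot.png', img)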

Executables

If you want to run a file using rosrun, two requirements must be met: the file must be executable (chmod +x <file>), and the interpreter must be declared in a shebang on the first line of the file (#!/usr/bin/python). We provide a number of examples that can be executed using rosrun garbage_bot <file> (a minimal skeleton follows the list):

  • predict_images moves the robot arm to take a snapshot and extract its features
  • take_images takes, for each arm, snapshots from random (but nearby) vantage points and saves them to disk
  • example moves an arm in the simulator to pick up an item and sort it
  • initialize simply takes a snapshot and saves it to disk
  • demo runs the whole process. First, the robot trains: a human specifies, for each presented item, which arm should sort it. Then the training phase ends, and the robot sorts the following items itself.
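
As a minimal sketch, an executable that meets both requirements looks like this (the node name and log message are illustrative):

#!/usr/bin/python
import rospy

def main():
    rospy.init_node('garbage_bot_example')  # illustrative node name
    rospy.loginfo('node started')

if __name__ == '__main__':
    main()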