
530.707 Robot Systems Programming Course Home Page – Spring 2018

Course Description

This course introduces students to the open-source software tools available today for building complex experimental and fieldable robotic systems. The course is organized into four sections, each building on the previous in complexity and specificity: tools and frameworks supporting robotics research, robotics-specific software frameworks, integration of complete robotic systems, and a culminating independent project of the student’s own design using small mobile robots or other robots in the lab. Students will need to provide a computer (with at least a few GB of memory and a few tens of GB of disk space) running Ubuntu 16.04 LTS Xenial Xerus (http://releases.ubuntu.com/16.04) or one of its variants such as Xubuntu 16.04 LTS (http://xubuntu.org/getxubuntu) and ROS Kinetic Kame (http://wiki.ros.org/kinetic) – note that these specific versions of Linux and ROS are required! Students should have an understanding of intermediate programming in C/C++ (including data structures and dynamic memory allocation), familiarity with Linux programming, familiarity with software version control systems (e.g. subversion, mercurial, git), and linear algebra. Required Course Background: familiarity with robot kinematics and algorithms such as those covered in EN.530.646 Robot Devices, Kinematics, Dynamics, and Control and EN.600.636 Algorithms for Sensor Based Robotics, and 601.220 Intermediate Programming in C++ on Linux.

 

Course-Related Web Pages

Instructors

Faculty

Louis L. Whitcomb
Professor
Department of Mechanical Engineering
G.W.C. Whiting School of Engineering
Johns Hopkins University
office: 115 Hackerman Hall
Phone: 410-516-6724
email: [email]
https://git-teach.lcsr.jhu.edu user id: lwhitco1

Teaching Assistants

  1. Mr. Tyler Paine, [email], https://git-teach.lcsr.jhu.edu user id: tpaine1
  2. Mr. Soham Patkar,  [email], https://git-teach.lcsr.jhu.edu user id: spatkar1
  3. Ms. Rachel Hegeman, [email],  https://git-teach.lcsr.jhu.edu user id: rhegema1

Class Meeting Schedule

Times:  Tuesdays and Thursdays 4:30-6:00PM
Dates:   Spring 2018
Location: 210 Hodson

Office Hours

  • Mondays 10-11AM, Louis Whitcomb, 115 Hackerman
  • Mondays 1-2PM, Soham Patkar,  111 Hackerman
  • Tuesdays 2:30-3:30PM, Tyler Paine, 111 Hackerman

EduMIP Robots

In this course we will use the EduMIP mobile robots – see ROS on the EduMIP Mobile Robot for more information.

The EduMIP robot was created by Prof. Thomas Bewley and his students at UCSD. It is available from Renaissance Robotics.

Textbooks

Although there is no required text for the course, if you are new to ROS we recommend that you get and read one or both of the following introductory ROS texts:

Electronic books available through the Milton Eisenhower Library are accessible from on-campus IP addresses. If you are off-campus, you will need to VPN into the campus network to access these electronic books.

Prerequisites

Prerequisite Courses

You must have taken both of the “core” robotics courses 530.646 Robot Devices, Kinematics, Dynamics, and Control and 601.463/601.663 Algorithms for Sensor-Based Robotics.  It is OK, if you have previously taken one of these two courses, to take the other concurrently while you are taking 530.707.  It is NOT OK to be taking both of these courses while you are taking 530.707.

You must have already taken 601.220 Intermediate Programming in C++ or an equivalent course.  This is a prerequisite, not a co-requisite.  You must have completed 601.220 or the equivalent before you take 530.707.  No exceptions.

Required Computer

Students will need to provide their own computer (with at least a few GB of memory and at least ~30 GB of disk space) running Ubuntu 16.04 LTS (Xenial Xerus) (https://wiki.ubuntu.com/XenialXerus/ReleaseNotes) or one of its variants (such as Xubuntu).  You will also need to install ROS Kinetic Kame (http://wiki.ros.org/kinetic).  Your computer can be dual boot.  Linux installations in a VirtualBox are NOT an acceptable substitute.

Prerequisite Knowledge:

This course will benefit you the most if you are already familiar with the following subjects:

  • Kinematics & Dynamics
  • Basic Machine Vision
  • Basic Probability Theory and Random Processes
  • Data Structures
  • Linear Algebra

This course will require you to:

  • Use the Linux Operating System
  • Use the following programming languages:
    • Intermediate C++ programming including data structures (absolutely required)
    • bash
    • Python (optional)
  • Use the following markup languages:
    • XML
  • Use the following software tools:
    • Git
    • CMake

Notes on Installing Ubuntu on your PC

Problems installing Ubuntu on some Macs

Note that several students have reported difficulty installing Ubuntu Linux on Mac notebooks.

What Desktop Environment Should I Use?

We recommend the Xubuntu Xfce Desktop, which is based on Xfce, as being superior to the default Unity Desktop that comes with Ubuntu.  To install the Xubuntu Xfce Desktop after you have installed Ubuntu, first run “sudo apt-get install xubuntu-desktop”, then log out of your desktop, select the “Xubuntu Session” (there should be three options: “Ubuntu”, “Xfce”, and “Xubuntu Session” in the desktop selector menu on the login page), and log in.

Xubuntu Xfce Desktop Screenshot – click for higher resolution image.


Unity Desktop Screenshot- click for higher resolution image.

 

WiFi in Class

In class I will have a WiFi access point set up with SSIDs “turtlebot_2.4gHz” and “turtlebot_5gHz” and WiFi password “turtlebot707”. If you connect your PC to this WiFi network you can type commands in real time during class.

The turtlebots do not presently deal gracefully with enterprise WiFi authentication, so this access point is preferred for working with the EduMIP.

Your PC’s ~/.bashrc should set up the environment variables for ROS as follows. Set ROS_MASTER_URI to http://192.168.10.101:11311 (the IP address of Louis’s PC), and set ROS_IP to the IP address of your PC (192.168.10.XXX), which you can determine with the command-line command “ifconfig”.  The last part of your ~/.bashrc file should look something like this (remember to leave comments for yourself preceded by “#”):

# ----------------------------------------------------------------------
# ROS kinetic environment variables
source /opt/ros/kinetic/setup.bash

# ----------------------------------------------------------------------
# chain local workspace
source ~/catkin_ws/devel/setup.bash

# ----------------------------------------------------------------------
# set ROS_MASTER_URI to IP address and port of PC running roscore
export ROS_MASTER_URI=http://192.168.10.101:11311

# ----------------------------------------------------------------------
# set ROS_HOSTNAME or ROS_IP to IP address of YOUR local PC as
# determined by "ifconfig"
export ROS_IP=192.168.10.123

After you edit ~/.bashrc be sure to “source ~/.bashrc” in all interactive shells, or kill and restart your shells.

Youtube Videos of Course Presentations

Here is a youtube channel where I will post screen captures of the lectures for spring 2018 530.707 Robot Systems Programming:

https://www.youtube.com/channel/UC_HYa5JUr2yKZAM0rYtAiFQ?

As an experiment I posted the screen+audio capture videos of my 6 overview presentations for the ROS short-course that I gave in November 2017 at WHOI (Modules 1-6) on YouTube here:

https://www.youtube.com/channel/UCNvvEcyAZnZa183rhMmuq4Q

The home page for this ros short course is here: https://dscl.lcsr.jhu.edu/home/courses/ros_short_course_fall_2017

The production values are not great – just a screen-capture with audio. I am curious if they are at all useful for you (or not!) – please send me feedback!

Robotics Teaching Lab – Wyman 170

We have 6 computers set up as dual boot with Windows and 64-bit Xubuntu Linux 16.04 LTS with ROS Kinetic (aka ROS Kinetic Kame) installed, and five TurtleBot 2 mobile robots in the Wyman 170 Lab. By Feb 1 or so we hope to have everything set up so that you can boot the computers into Linux and log in with your JHED ID and password.

  • If you have problems logging in for the First Time on a Workstation: There is a BUG in likewise that sometimes crops up: The first time you log in to a workstation using likewise, the graphical login may fail. Workaround: for your very first (and only the first) login with likewise on a machine:
      • 1. Type CTRL-ALT-F1 to get an ASCII console.
      • 2. Log in with your JHED ID and password. Your home directory on the machine will be named by your JHED ID; in my case it is /home/lwhitco1.
      • 3. Log out.
      • 4. Type CTRL-ALT-F7 to get the X-windows login screen.
      • 5. Log in normally using the graphical login screen with your JHED ID and password.
      • 6. For all future logins on this machine, you can log in normally using the graphical login screen with your JHED ID and password.

Wyman 170 Lab Etiquette:

    • Your account has sudo privileges so BE CAREFUL WHEN USING sudo! The only sudo command you should ever use is “sudo apt-get install ros-kinetic-packagename” where “packagename” is the name of a ros package.
    • Do not edit or modify any ROS files under /opt/ros. If you want to modify a standard ROS package, then download the kinetic version of the source code for the package into your catkin workspace, and modify your local copy. ROS will automatically detect your local copy and use it instead of the system version in /opt/ros.
    • The only version of the operating system we will support is Xubuntu 16.04 64-bit.
    • DO NOT upgrade the operating system to more recent releases.
    • The only version of ROS we will support is ROS kinetic
    • DO NOT install other versions of ROS other than ROS kinetic.
    • Leave the lab spotless.
    • Never “lock” a workstation with a screen-saver. Locked computers will be powered off.
    • No Backup!: The files on these computers are NOT BACKED UP. Any files that you leave in your account on these computers can disappear at any time. Use https://git-teach.lcsr.jhu.edu to save your projects during and after every work session.
    • When you are finished with a work session, log off the computer.
    • If you encounter a problem: Notify Prof. Whitcomb and the TAs if you have any problems with the lab or its equipment. Put “530.707” in the subject line of any email regarding this course.

Git Repository for Handing in Assignments

All assignments and projects for this course will use the git repository https://git-teach.lcsr.jhu.edu.  You may not submit assignments with other git repositories.  

You should already have an active account on https://git-teach.lcsr.jhu.edu.

Test your account by browsing to https://git-teach.lcsr.jhu.edu and logging in with your JHED credentials.

Your account should already have a default quota of 25 projects, which should be plenty for this course.  If you need to have more than 25 projects, email “lcsr-it@jhu.edu” with your JHED ID and ask Mr. Anton Deguet to increase your project quota beyond 25.

If you have problems with your account on https://git-teach.lcsr.jhu.edu, please send an email addressed to both of (a) the course instructor (llw@jhu.edu) and (b) Mr. Anton Deguet (lcsr-it@jhu.edu).

When you create your assignment projects, please use precisely the project name specified in the assignment, and add the course instructor (lwhitco1), and the TAs (tpaine1 and spatkar1) as members of your projects, all with DEVELOPER access.

Breaking News!: As of Feb 2, 2018 https://git-teach.lcsr.jhu.edu is only accessible from within the campus network.  If you are off-campus, you must VPN into campus to access this repository.  I have requested that this be changed, and WSE IT agreed.  Soon https://git-teach.lcsr.jhu.edu will be accessible directly from off campus.  I will announce it when this is completed.

Course Grade Policy

Course grade will be based upon class participation (~10-15%), weekly assignments (~45%), and a final independent course project (~45%).

Weekly Assignments

Weekly assignments are due at 4:30PM Tuesday each week.  No credit for late assignments or late pushes to git-teach.lcsr.jhu.edu.   Lowest score is dropped.

Independent Project  Demonstration, Monday-Tuesday May 7, 2018

Each team will choose a 30 minute time slot to demonstrate their project to the Instructors, TAs, and other students, faculty, staff, and possibly some local press.

Here is the demo SCHEDULE: https://docs.google.com/spreadsheets/d/1Z2i-C25W6wR9N3rAR2NMUpXqnuQA-iIdFT0fPVsVzCI/edit?usp=sharing

Independent Project Poster Session, 10-12AM Saturday, May 12, 2018

Syllabus

Module 1:  Week of Jan 30, 2018: Course Overview and ROS Basics

NOTE: in this course we will exclusively use Ubuntu 16.04 LTS (or an equivalent release such as Xubuntu 16.04 LTS) and the stable ROS Kinetic Kame release.

Robot of the Week: The EduMIP

  • The EduMIP (Educational Mobile Inverted Pendulum)
    • EduMIP ROS Page: https://dscl.lcsr.jhu.edu/home/courses/edumip_ros
    • Thomas Bewley’s EduMIP Page: https://www.ucsdrobotics.org/edumip

Topics

  • Course Overview
    • Course Description
    • Prerequisites
    • Assignments
    • Class Project
    • Ethics
  • Background of Robot Software Frameworks and The Open-Source Robotics Community
  • Development Tools, Managing Code, Environments, Installing Software, Building Code
  • The ROS core systems: Packaging, Buildsystems, Messaging, CLI & GUI tools, creating packages, nodes, topics, services, and parameters.
  • Writing a Simple Publisher and Subscriber (C++)
  • Examining the Simple Publisher and Subscriber
  • Writing a Simple Service and Client (C++)
  • Examining the Simple Service and Client

Reading

Assignments for This Module

Tutorials

  • Install Ubuntu 16.04 LTS (or an equivalent release such as Xubuntu 16.04 LTS).
  • Install ROS Kinetic Kame
  • Complete these Tutorials
    • Installing and Configuring Your ROS Environment. NOTE: in this course we will exclusively use Ubuntu 16.04 LTS Xenial Xerus (http://releases.ubuntu.com/16.04) or one of its variants such as Xubuntu 16.04 LTS (http://xubuntu.org/getxubuntu) and ROS Kinetic Kame (http://wiki.ros.org/kinetic).
    • Navigating the ROS Filesystem
    • Creating a ROS Package
    • Building a ROS Package
    • Understanding ROS Nodes
    • Understanding ROS Topics
    • Understanding ROS Services and Parameters
    • Using rqt_console and roslaunch
    • Using rosed to edit files in ROS
    • Creating a ROS msg and a ROS srv
    • Writing a Simple Publisher and Subscriber (C++)
    • Examining the Simple Publisher and Subscriber
    • Writing a service and client in C++
    • Examining the Simple Service and Client
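For reference, here is a minimal sketch of the publisher/subscriber pattern that these tutorials cover in detail. The node names (“talker”, “listener”) and the topic name (“chatter”) follow the tutorial conventions and are not requirements; this is a sketch, not a complete solution.

// talker.cpp -- minimal publisher sketch following the beginner_tutorials pattern
#include "ros/ros.h"
#include "std_msgs/String.h"
#include <sstream>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "talker");                 // register this node with the ROS master
  ros::NodeHandle nh;
  ros::Publisher pub = nh.advertise<std_msgs::String>("chatter", 10);
  ros::Rate rate(10);                              // publish at 10 Hz
  int count = 0;
  while (ros::ok())
  {
    std_msgs::String msg;
    std::ostringstream ss;
    ss << "hello world " << count++;
    msg.data = ss.str();
    pub.publish(msg);
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}

// listener.cpp -- minimal subscriber sketch
#include "ros/ros.h"
#include "std_msgs/String.h"

void chatterCallback(const std_msgs::String::ConstPtr& msg)
{
  ROS_INFO("I heard: [%s]", msg->data.c_str());    // print each received message
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "listener");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("chatter", 10, chatterCallback);
  ros::spin();                                     // hand control to ROS; the callback fires as messages arrive
  return 0;
}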

Assignment #1 – Due 4:30PM Tuesday Feb 6, 2018.

  • Write and test a ROS package named “beginner_tutorials” comprised of
    • A C++ publisher node that publishes a TOPIC and
    • A C++ subscriber node that subscribes to this TOPIC.
    • A C++ server node that provides a SERVICE.
    • A C++ client node that calls the server node’s SERVICE
  • Hand in your code project “beginner_tutorials” on https://git-teach.lcsr.jhu.edu
    • Login to your https://git-teach.lcsr.jhu.edu account.
    • Create a project called “beginner_tutorials” on https://git-teach.lcsr.jhu.edu ( When you name the ros project that you will hand in each week, please use EXACTLY the name specified in the assignment – this week’s project, for example, should be entitled “beginner_tutorials”),
    • Add the instructor (lwhitco1), and the TAs (tpaine1 and spatkar1) as members of your project, all with DEVELOPER access.
    • Initialize your project “beginner_tutorials” as a git repository
    • Add the files to the repo
    • Commit them to the repo
    • Add the remote repository
    • Push your files
    • Push the repo to the server
    • Email the TAs and instructor when done, with “530.707 RSP Assignment 1” in the subject line.
    • See us with questions.
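The SERVICE and CLIENT nodes follow the same pattern. To keep this sketch self-contained it uses the standard std_srvs/Trigger service type as a stand-in for the custom srv type covered in the “Creating a ROS msg and a ROS srv” tutorial; the service name “trigger” is just an example, and your package would need std_srvs (or your own srv package) listed as a dependency.

// trigger_server.cpp -- minimal service server sketch (std_srvs/Trigger used as a
// stand-in for the tutorial's custom srv type)
#include "ros/ros.h"
#include "std_srvs/Trigger.h"

bool handleTrigger(std_srvs::Trigger::Request&, std_srvs::Trigger::Response& res)
{
  res.success = true;
  res.message = "service was called";
  return true;                                     // true tells roscpp the call was handled
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "trigger_server");
  ros::NodeHandle nh;
  ros::ServiceServer server = nh.advertiseService("trigger", handleTrigger);
  ros::spin();
  return 0;
}

// trigger_client.cpp -- minimal service client sketch
#include "ros/ros.h"
#include "std_srvs/Trigger.h"

int main(int argc, char **argv)
{
  ros::init(argc, argv, "trigger_client");
  ros::NodeHandle nh;
  ros::ServiceClient client = nh.serviceClient<std_srvs::Trigger>("trigger");
  std_srvs::Trigger srv;
  if (client.call(srv))                            // blocks until the server responds
    ROS_INFO("response: %s", srv.response.message.c_str());
  else
    ROS_ERROR("failed to call service /trigger");
  return 0;
}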

Module 2:  Week of February 6, 2018:  Roslaunch, Nodes, tf, Parameters, and Rosbag

Robots of the Week: Nereid Under-Ice (NUI) and Boston Dynamics Big Dog

Nereid Under-Ice (NUI):

  • Homepage: http://www.whoi.edu/main/nereid-under-ice
  • Press Release: https://www.whoi.edu/news-release/NereidUI
  • Scientific American video featuring Dr. Christopher German: https://www.youtube.com/watch?v=PHJCI7M4knQ

Boston Dynamics Big Dog:

  • Big Dog Video: https://www.youtube.com/watch?v=W1czBcnX1Ww
  • Big Dog Spoof Video: https://www.youtube.com/watch?v=mXI4WWhPn-U

 

Topics

  • rosbag
  • roswtf
  • ROS.org
  • tf
  • RVIZ
  • Getting and setting parameters in roslaunch and C++ nodes

Reading

Assignments for This Module

Tutorials

Assignment #2 – Due 4:30PM Tuesday Feb 13, 2018.

  • Write and test a ROS package named “learning_tf” containing your source code and launch files for
    •  TF
      •  tf broadcaster node
      • tf listener node
      • adding a tf frame, either a moving frame or a “time traveled” frame like the tutorials.
    • ROS Parameters
      • A simple C++ ROS node that reads some parameters on startup and uses them
      • A roslaunch file that sets the parameters and runs the C++ node
  • Hand in your code project “learning_tf” on https://git-teach.lcsr.jhu.edu and share it with the instructors.
  • Email the TAs and instructor when done, with “530.707 RSP Assignment 2” in the subject line.
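Here is a minimal sketch of the parameter-reading node for this assignment. The parameter names (“robot_name”, “rate_hz”) and their defaults are only illustrations; use whatever parameters your launch file actually sets (for private parameters, put <param> tags inside the <node> element of your launch file).

// param_demo.cpp -- minimal sketch of a node that reads parameters on startup
// ("robot_name" and "rate_hz" are example parameter names, not required ones)
#include "ros/ros.h"
#include <string>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "param_demo");
  ros::NodeHandle nh("~");                         // private handle: looks up ~robot_name, ~rate_hz

  std::string robot_name;
  double rate_hz;
  nh.param<std::string>("robot_name", robot_name, "edumip");  // third argument is the default value
  nh.param("rate_hz", rate_hz, 10.0);

  ROS_INFO("robot_name = %s, rate_hz = %.1f", robot_name.c_str(), rate_hz);

  ros::Rate rate(rate_hz);
  while (ros::ok())
  {
    // ... use the parameters here ...
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}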

 

Module 3: Week of February 13, 2018: EduMIPs, Joy, and ROS Node development in C++

Robots of the Week:  Boston Dynamics Spot Mini and Harvest Automation HV-100

Boston Dynamics Spot Mini:

Harvest Automation HV-100:

  • https://www.public.harvestai.com
  • https://www.youtube.com/watch?v=OPIyGyBNwAo&t=24s
  • https://www.youtube.com/watch?v=PXRpZDV4bCY

Topics

  • Assembling the EduMIP mobile robot
  • Installing and testing joysticks
  • Publishing /joy topic with the ROS joy package
  • Joystick tutorials – including teleoperating a robot from a joystick
  • ROS timers
  • Writing your own package to subscribe to the joystick /joy and publish a geometry_msgs/Twist topic to command the EduMIP.
  • Writing launch files for same.
  • Running ROS systems spanning more than one computer.

Reading

Assignments for This Module:

Assignment #3: Due 4:30PM Tuesday Feb 20, 2018

  • Assemble and Test the EduMIP as described here: https://dscl.lcsr.jhu.edu/home/courses/edumip_ros
  • Set Up Your Beaglebone Black Wireless Board for Your EduMIP: Install an 8GB Ubuntu 16.04 LTS image pre-loaded with ROS Kinetic and support for the Robotics Cape on a Micro-SD card on your BBBW. Follow the instructions here: https://dscl.lcsr.jhu.edu/home/courses/edumip_ros
  • Joystick Assignment #1 of 3: Install your joystick and test it (nothing to hand in for this part of the assignment)
    • Plug in and test your USB joystick
      • List the USB devices on your computer with the “lsusb” command, both with the joystick’s USB cable connected and with it disconnected.
      • See that the device /dev/input/js0 appears when your joystick is connected, and that this device vanishes when the joystick is disconnected
      • Use the command “jstest /dev/input/js0” to test your joystick. This utility gives text output of the joystick data.
      • Alternatively, test the joystick with the graphical widget “jstest-gtk”.
        • Install this utility with the command “sudo apt-get install jstest-gtk”
        • Run this utility it with the command “jstest-gtk”.
  • Joystick Assignment #2 of 3: Complete the tutorial  Configuring and Using a Linux-Supported Joystick with ROS.   Do not hand in anything for this tutorial, it is just to get you started on running ROS across multiple computers.
    • Notes on this tutorial for most Ubuntu 16.04 installations:
      • The default joystick is /dev/input/js0 (where “0” is the numeral zero, not the letter O).
      • The permissions for /dev/input/js0 are already OK, i.e. you do NOT need to change the permissions for /dev/input/js0 with the command “sudo chmod a+rw /dev/input/js0”.
      • The ROS joy_node automatically looks for the device /dev/input/js0. You do NOT need to set the parameter with the command “rosparam set joy_node/dev /dev/input/js0”.
    • Run “roscore” in one terminal, then run “rosrun joy joy_node” and look at the topic /joy
    • Be sure to use the commands “rosnode list”, “rostopic list”, and “rostopic echo /joy” to explore the /joy topic messages.
  • Joystick Assignment #3 of 3 (to hand in) (Git Project name: joy_twist ):
    • Create a ROS C++ package entitled “joy_twist”, with dependencies to roscpp, std_msgs, geometry_msgs, and sensor_msgs with the command “catkin_create_pkg joy_twist roscpp std_msgs geometry_msgs sensor_msgs”.
    • In this package create a C++ node entitled joy_twist.cpp that subscribes to a sensor_msgs/Joy joystick topic entitled “/joy” and publishes a geometry_msgs/Twist topic named “/edumip/cmd”. We suggest you use a ROS Timer callback function to publish the Twist messages at 10 Hz – see the ROS Timer documentation for details (a minimal sketch of such a node appears at the end of this assignment).
      • Your node should assign joystick axis 1 to twist.linear.x, and joystick axis 0 to twist.angular.z — BUT YOU CAN CHOOSE A DIFFERENT MAPPING IF YOU LIKE — you may need to change a sign in the assignment so that pushing the joystick forward makes twist.linear.x positive, and pushing the joystick to the right makes the twist.angular.z positive.
    • In this package create a launch file entitled joy_twist.launch  in the joy_twist/launch directory that
      1. Launches a joy node from the ROS joy package, which opens and reads the USB joystick values and publishes them as sensor_msgs/Joy messages on the topic /joy
      2. Launches your joy_twist node which subscribes to sensor_msgs/Joy messages on the /joy topic and publishes geometry_msgs/Twist messages on the /edumip/cmd topic.
    • Be sure to use the commands “rosnode list”, “rostopic list”, “rostopic echo”, “rostopic type”, and “rostopic hz” to explore the /joy and /edumip/cmd topics.
    • Run rqt_graph to see the nodes and topics graphically.
  • Joystick Control of the EduMIP and multi-computer ROS Programming: You will “hand in” this part of the assignment by demonstrating joystick control of your EduMIP to one of the instructors.
  1. Running ROS Across Multiple Machines: Complete the tutorial Running ROS Across Multiple Machines. Do not hand in anything for this tutorial; it is just to get you started on running ROS across multiple computers.
  2. Configure WiFi on your EduMIP: Follow the instructions here: https://dscl.lcsr.jhu.edu/home/courses/edumip_ros
  3. Determine the IP addresses of your PC and your EduMIP:
    • ifconfig: Use the command “ifconfig” to see all configured network interfaces (Ethernet, WiFi, USB, etc) on your machine.
    • iwconfig: Use the command “iwconfig” to see all configured WiFi network interfaces on your machine.
    • Who am I? The easiest way to determine the IP address (or addresses) of a Linux machine is to log into it and use the command “ifconfig”.
  4. In this example I will assume the following IP addresses (YOURS WILL BE DIFFERENT):
    • My PC has IP address 192.168.10.101
    • My EduMIP BBBW has IP address 192.168.10.102
  5. On your PC set the ROS environment variables to look for the ros master (roscore) on the PC with the .bashrc commands (add these commands to the end of your  .bashrc file):

“export ROS_MASTER_URI=http://192.168.10.101:11311” <- this tells ROS the IP address of the machine that is running the ros master (roscore).

“export ROS_IP=192.168.10.101” <- this tells ROS the IP address of this machine (your PC).

  6. On your EduMIP set the ROS environment variables to look for the ros master (roscore) on the PC with the .bashrc commands (add these commands to the end of your .bashrc file):

“export ROS_MASTER_URI=http://192.168.10.101:11311” <- this tells ROS the IP address of the machine that is running the ros master (roscore).

“export ROS_IP=192.168.10.102” <- this tells ROS the IP address of this machine (your EduMIP).

  7. Test your configuration:
    1. On your PC, in a new shell, run roscore.
    2. On your PC, in a new shell, run “rostopic list”, you should see the standard default topics from the roscore on your PC.
    3. On your EduMIP, in a new shell, run “rostopic list”, you should see the standard default topics from the roscore on your PC. Yay!
    4. You can now publish a topic on one machine, and subscribe to the topic on the other machine. For example
      1. On your EduMip, publish a topic with the command: rostopic pub -r 1 my_topic std_msgs/String "hello there"
      2. On your PC, subscribe to this topic with “rostopic echo /my_topic”
      3. On your PC, run “rqt_graph” to visualize the nodes and topics.

 

rqt_graph diagram depicting the nodes, topics, and data flow between the nodes via the topics for Assignment #3. Click for higher resolution image.
  8. Now you can control your EduMIP from a joystick on your PC:
    1. On your PC run your joy_twist launch file with the command “roslaunch joy_twist joy_twist.launch” to run roscore, joy and joy_twist nodes. Recall that:
      • The joy node publishes sensor_msgs/Joy messages on the topic /joy
      • The joy_twist node subscribes to sensor_msgs/Joy messages on the topic /joy and publishes geometry_msgs/Twist messages on the /edumip/cmd topic.
    2. On your EduMIP run edumip_balance_ros with the command “roslaunch edumip_balance_ros edumip_balance_ros”. Recall that:
      • The edumip_balance_ros node subscribes to geometry_msgs/Twist messages on the /edumip/cmd topic, and
      • The edumip_balance_ros node publishes edumip_balance_ros/EduMipState messages on the topic /edumip/state
    3. Stand up your EduMIP and take it for a drive.
    4. Explore the ROS topics, and rqt_graph.
  9. Hand in your code project “joy_twist” on https://git-teach.lcsr.jhu.edu and share it with the instructors.
  10. Email the TAs and instructor when done, with “530.707 RSP Assignment 3” in the subject line.
  11. DEMONSTRATE YOUR ROBOT UNDER JOYSTICK CONTROL TO YOUR INSTRUCTORS during either office hours or at the beginning or end of class on Tuesday February 20, 2018.
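Here is the minimal sketch of a joy_twist node referred to above. It publishes the most recent Twist command from a 10 Hz ROS Timer callback; the axis indices and signs follow the suggested mapping and may need to be changed for your joystick.

// joy_twist.cpp -- sketch of a joy -> twist node with a 10 Hz ROS Timer
// (axis indices and signs follow the suggested mapping; change them to suit your joystick)
#include "ros/ros.h"
#include "sensor_msgs/Joy.h"
#include "geometry_msgs/Twist.h"

class JoyTwist
{
public:
  JoyTwist()
  {
    pub_   = nh_.advertise<geometry_msgs::Twist>("/edumip/cmd", 10);
    sub_   = nh_.subscribe("/joy", 10, &JoyTwist::joyCallback, this);
    timer_ = nh_.createTimer(ros::Duration(0.1), &JoyTwist::timerCallback, this);  // 10 Hz
  }

private:
  void joyCallback(const sensor_msgs::Joy::ConstPtr& joy)
  {
    if (joy->axes.size() < 2)
      return;                                      // not enough axes reported
    twist_.linear.x  =  joy->axes[1];              // fore/aft stick -> forward speed
    twist_.angular.z = -joy->axes[0];              // left/right stick -> turn rate (flip signs if reversed)
  }

  void timerCallback(const ros::TimerEvent&)
  {
    pub_.publish(twist_);                          // republish the latest command at the timer rate
  }

  ros::NodeHandle nh_;
  ros::Publisher  pub_;
  ros::Subscriber sub_;
  ros::Timer      timer_;
  geometry_msgs::Twist twist_;
};

int main(int argc, char **argv)
{
  ros::init(argc, argv, "joy_twist");
  JoyTwist joy_twist;
  ros::spin();
  return 0;
}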

Module 4: Week of February 20, 2018: URDF and Robot State Publisher

Topics

  • Unified Robot Description Format (URDF)
  • Robot State Publisher

Reading

  • Notes from Class: In class we downloaded the urdf_tutorial package into my catkin workspace. Here are a few notes on the steps taken to do this. In this example we download a local copy of the urdf_tutorial ROS package into the catkin workspace at ~/catkin_ws/src/urdf_tutorial. You can edit the local copy in your workspace. The system copy is located at /opt/ros/kinetic/share/urdf_tutorial, but you cannot edit those files because they are protected system files. It is better to edit your own local copy in your catkin workspace rather than mucking with the system copy. This is an example of a workspace overlay, where we create a package in a local workspace that ROS will use in preference to the default system package of the same name. Each Linux command below is followed by a comment in parentheses.
    • cd ~/catkin_ws/src (cd to ~/catkin_ws/src)
    • git clone https://github.com/ros/urdf_tutorial.git (clone the git tutorial from github. Note that this creates the directory ~/catkin_ws/src/urdf_tutorial and associated subdirectories.)
    • cd ~/catkin_ws (cd to ~/catkin_ws)
    • rm -r devel build (remove the catkin_ws/devel and catkin_ws/build directory trees, which deletes ~/catkin_ws/devel/setup.bash)
    • catkin_make (Builds everything in my workspace from scratch, including generating a new ~/catkin_ws/devel/setup.bash)
    • source devel/setup.bash (Source the newly created file ~/catkin_ws/devel/setup.bash to add this new workspace to the ROS bash environment variables, in particular it will add the present workspace to the ROS_PACKAGE_PATH environment variable)
    • echo $ROS_PACKAGE_PATH (Look again at the ROS_PACKAGE_PATH environment variable that was set by the previous command. It should NOW be a string with your catkin workspace listed as the first element, followed by the standard package path like this: ROS_PACKAGE_PATH=/home/llw/catkin_ws/src:/opt/ros/kinetic/share)
    • rospack profile (This command forces rospack to rebuild the cache of the ros package path that is used by roscd. The cache is the text file ~/.ros/rospack_cache.)
    • roscd urdf_tutorial (Now roscd will take me to my local copy of the urdf tutorial in ~/catkin_ws/src/urdf_tutorial instead of taking me to the system copy located at /opt/ros/kinetic/share/urdf_tutorial)
    • roslaunch urdf_tutorial display.launch model:=urdf/01-myfirst.urdf (Now I can run the tutorial exercise and edit the local URDF files in ~/catkin_ws/src/urdf_tutorial/urdf)
    • Note that the later tutorial urdf files such as urdf/05-visual.urdf refer to PR2 gripper model mesh files. If you get error messages from RVIZ like “[ERROR] [1393792989.872632824]: Could not load resource [package://pr2_description/meshes/gripper_v0/l_finger.dae]: Unable to open file package://pr2_description/meshes/gripper_v0/l_finger.dae”, then you need to install the PR2 mesh files. You can install the PR2 model files with the command sudo apt-get install ros-kinetic-pr2-common
    • Note that beginning with urdf/05-visual.urdf RVIZ throws lots of warnings like “TIFFFieldWithTag: Internal error, unknown tag 0x829a.” but the program runs OK.

Assignments for This Module

Tutorials

  • Learning URDF Step by Step
    • 1. Building a Visual Robot Model with URDF from Scratch
    • 2. Building a Movable Robot Model with URDF
    • 3. Adding Physical and Collision Properties to a URDF Model
    • 4. Using Xacro to Clean Up a URDF File
  • Learning URDF (including C++ API)
    • 1. Create your own urdf file
    • 2. Parse a urdf file
    • 3. Using the robot state publisher on your own robot
    • 4. Start using the KDL parser (You can skip this tutorial for now if you like, it is not required for this module’s assignment.)
    • 5. Using urdf with robot_state_publisher
Assignment #4 – Due 4:30PM Tuesday Feb 27, 2018

  • On your PC clone (with “git clone …”) the package edumip_msgs into your ~/catkin_ws/src directory from this repo: https://git.lcsr.jhu.edu/lwhitco1/edumip_msgs.git  This package defines the message EduMipState.msg.  After you have downloaded this package into your catkin_ws/src directory, run catkin_make to create the message files on your PC.

Here is the definition found in the edumip_msgs/msg/EduMipState.msg message definition file:

float32 setpoint_phi_dot # commanded average wheel vel ~ trans vel
float32 setpoint_gamma_dot # commanded steering angle vel ~ angualr vel
float32 setpoint_phi # commanded average wheel pos
float32 phi # average wheel pos
float32 setpoint_gamma # commanded steering angle
float32 gamma # steering angle
float32 setpoint_theta # commanded body tilt
float32 theta # body tilt
float32 d1_u # control command for balnce loop
float32 d3_u # control command for steering loop
float32 dutyL # left motor duty cycle
float32 dutyR # right motor duty cycle

# 2017-02-22 LLW Added odometry data 
float32 wheel_angle_L # total rotation of left wheel (radians) (+ is forward)
float32 wheel_angle_R # total rotation of right wheel (radians) (+ is forward)
float32 body_frame_easting # displacemnt of body frame (m) (+ is East )
float32 body_frame_northing # displacemnt of body frame (m) (+ is North)
float32 body_frame_heading # compass heading (radians)

float32 vBatt # battery voltage in volts
bool armed # controllers are active
bool running # balance program is running
  • Develop a ROS package named edumip_my_robot for your EduMIP.

rqt_graph diagram depicting the nodes, topics, and data flow between the nodes via the topics for Assignment #4. Click for higher resolution image.
  • Your package should consist of at least the following:
    • A URDF file named urdf/edumip_my_robot.urdf (or better yet a xacro file urdf/edumip_my_robot.xacro) describing the robot links and joints. Your link and joint names should be precisely the following:
      • A body link named “edumip_body”
      • A left wheel link named “wheelL”
      • A right wheel link named “wheelR”
      • A left continuous joint named “jointL” with parent link “edumip_body” and child link “wheelL”.
      • A right continuous joint named “jointR” with parent link “edumip_body” and child link “wheelR”.
      • Here are the measured parameters I used in my xacro file – units are meters and radians:
<!-- Numerical Macros - units are meters and radians -->
<xacro:property name="wheel_separation" value="0.070" /> 
<xacro:property name="wheel_radius" value="0.034" /> 
<xacro:property name="wheel_width"  value="0.010" />
<xacro:property name="body_height"  value="0.1" /> 
<xacro:property name="body_width"   value="0.050" /> 
<xacro:property name="body_length"  value="0.035" />
      •  A C++ node named src/edumip_my_robot_publisher.cpp
      • Your node should subscribe to the /edumip/state message that has recently been expanded to include wheel joint angles and odometry data (i.e. robot X, Y, and heading).
      • Your node should publish the following:
        • sensor_msgs/JointState messages for this robot on the topic /joint_states. Look at the message definition file edumip_msgs/msg/EduMipState.msg to see comments on the state message fields.
        • A tf transform from the fixed “world” frame to this robot’s “robot_base” frame that specifies the moving position and orientation of the robot with respect to the fixed “world” frame.
        • NOTE: Your node should have a SINGLE callback function for subscribing to the /edumip/state topic, and within this callback function it should publish to the /joint_states topic and the /tf topic (a minimal sketch of such a node appears at the end of this assignment).

          RVIZ screen shot for Assignment #4 showing visualization of your robot_model (defined by your urdf or xacro file) and visualization of your tf frames. Click for higher resolution image.
    • A RVIZ initialization file called “rviz/edumip_my_robot.rviz” that displays your robot_model and tf frames.
    • A launch file named launch/edumip_my_robot.launch that
      1. Launches a joy node from the system-defined joy ROS package. Recall that the joy node publishes sensor_msgs/Joy messages on the topic /joy
      2. Launches your joy_twist node from the joy_twist package that you wrote for last module’s assignment. Recall that the joy_twist node subscribes to sensor_msgs/Joy messages on the topic /joy and publishes geometry_msgs/Twist messages on the /edumip/cmd topic.
      3. Launches your custom edumip_my_robot_publisher C++ node from your edumip_my_robot package that you wrote for this module’s assignment. Recall that this node subscribes to the /edumip/state topic and publishes on the /joint_states topic and the /tf topic as described earlier in this assignment.
      4. Launches a standard robot_state_publisher node from the robot_state_publisher package. Recall that this node subscribes to the /joint_states topic, reads the robot_description parameter, and publishes /tf frames for the robot based on your URDF and the joint states.
      5. Sets the parameter “robot_description” to load your urdf/edumip_my_robot.urdf (or .xacro) that you wrote for this module to model your EduMIP.
      6. Launches RVIZ specifying the RVIZ initialization file rviz/edumip_my_robot.rviz that you create to visualize your robot_model (defined by your urdf or xacro file) and your tf frames.
      • Now you should be able to drive your robot around with your joystick and see your robot model drive around in RVIZ complete with depiction of the robot and its coordinate frames.
      • NOTE that RVIZ will not display your robot model correctly until it receives valid /tf transforms for all of the robot links. If the link /tf transforms are not valid then RVIZ will show errors in the “robot_model” RVIZ GUI, and the robot links will appear white instead of the colors specified in your urdf/xacro file.
  • Push the finished package edumip_my_robot to https://git-teach.lcsr.jhu.edu and share it with the course instructors.
  • DEMO IT TO YOUR INSTRUCTORS!
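Here is the minimal sketch of the single-callback publisher node referred to above. It uses the joint names from this assignment (jointL, jointR) and the URDF link name “edumip_body” as the tf child frame so that RVIZ can resolve your robot model; the tilt and heading sign conventions are only a starting point, so check them against your robot. The package would need roscpp, sensor_msgs, tf, and edumip_msgs as dependencies.

// edumip_my_robot_publisher.cpp -- sketch of the single-callback node described above:
// subscribe to /edumip/state, publish /joint_states and a world -> edumip_body tf transform.
// Joint and frame names follow this assignment; adjust them to match your own URDF,
// and verify the tilt/heading sign conventions on your robot.
#include "ros/ros.h"
#include "sensor_msgs/JointState.h"
#include "edumip_msgs/EduMipState.h"
#include <tf/transform_broadcaster.h>

ros::Publisher joint_pub;

void stateCallback(const edumip_msgs::EduMipState::ConstPtr& state)
{
  // 1. Publish the wheel joint angles on /joint_states.
  sensor_msgs::JointState js;
  js.header.stamp = ros::Time::now();
  js.name.push_back("jointL");
  js.name.push_back("jointR");
  js.position.push_back(state->wheel_angle_L);
  js.position.push_back(state->wheel_angle_R);
  joint_pub.publish(js);

  // 2. Broadcast the world -> edumip_body transform from the odometry fields.
  static tf::TransformBroadcaster broadcaster;
  tf::Transform transform;
  transform.setOrigin(tf::Vector3(state->body_frame_easting,
                                  state->body_frame_northing,
                                  0.0));
  tf::Quaternion q;
  q.setRPY(0.0, state->theta, -state->body_frame_heading);   // body tilt and compass heading; verify signs
  transform.setRotation(q);
  broadcaster.sendTransform(tf::StampedTransform(transform, ros::Time::now(),
                                                 "world", "edumip_body"));
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "edumip_my_robot_publisher");
  ros::NodeHandle nh;
  joint_pub = nh.advertise<sensor_msgs::JointState>("/joint_states", 10);
  ros::Subscriber sub = nh.subscribe("/edumip/state", 10, stateCallback);
  ros::spin();
  return 0;
}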

Module 5: Week of February 27, 2018: Gazebo Intro, SDF, and worlds

Topics

  • Simulating robots, their environments, and robot-environment interaction with Gazebo
  • Gazebo ROS integration.
  • NOTE: You do not need to install Gazebo. Your full ROS kinetic desktop installation will have installed Gazebo V7.0.0, so DO NOT follow the installation instructions on gazebosim.org. If for some reason the ros gazebo packages are not installed, install them with “sudo apt-get install ros-kinetic-gazebo-ros ros-kinetic-gazebo-ros-pkgs ros-kinetic-gazebo-ros-control”
    • You can verify that Gazebo is installed by issuing the command “gazebo” on the command line – a gazebo window should open after a few seconds delay.
  • NOTE: Gazebo is CPU-intensive, and will not run very well in virtual boxes.

Reading

  • Gazebo Overview
  • Gazebo API
  • SDFormat Specification: The SDF XML file format is a superset of URDF.  SDF files are how Gazebo defines robots and the environment. You can generate SDF from URDF or XACRO on-the-fly, so in practice it is easier to maintain a single XACRO file and generate URDF and SDF from it on the fly.

Assignments for This Module

Tutorials

  • Gazebo Version 7.0 Tutorials
    • Note: In this next set of Gazebo tutorials you will use the  command-line “gazebo”, not the ROS gazebo package.
    • Beginner – First-time Gazebo Users
    • Get Started
      • Skip “Install”. Do not install Gazebo; it was installed when you installed the full ROS kinetic desktop.
      • Quick Start: How to run Gazebo with a simple environment.
      • Gazebo Components: This page provides an overview of the basic components in a Gazebo simulation.
      • Gazebo Architecture: Overview of Gazebo’s code structure.
      • Screen Shots
    • Build a Robot
      • Model Structure and Requirements: How Gazebo finds and loads models, and requirements for creating a model.
      • Skip this one for now: How to contribute a model.
      • Make a model: Describes how to make models. Read this one, but there is no exercise to do, so quickly move on to the next tutorial…
      • Make a mobile robot: How to make a model for a 2-wheeled mobile robot.
      • Import Meshes
      • Attach Meshes
      • Add a Sensor to the Robot
      • Make a simple gripper
      • Attach gripper to robot.
    • Build a World
      • Build a world
      • Modifying a world
      • Skip “Digital Elevation Models”, “Population of models”, and “Building Editor” for now; you can return to them at a later date when and if you need them.  Digital elevation models can be particularly useful in providing realistic simulated terrain for a simulated robot to explore.
    • Friction: How to set friction properties. Be sure to experiment with the more exhaustive friction example linked at the very end of this tutorial. This is the example that I showed in class with sliding blocks. Modify the gravity, friction, and initial position/orientation of the objects to observe different dynamics.
  • Connect to ROS: ROS integration
    • Note: In these ROS Gazebo tutorials you will use the  ROS Gazebo package (“rosrun gazebo_ros gazebo”), not the command-line “gazebo”.
    • Ros Overview
    • Skip this: Which combination of ROS/Gazebo versions to use. You can skip this, as you will use the default version 7.0.0 that comes with ROS kinetic; see the gazebo ROS installation notes above.
    • Installing gazebo_ros_pkgs
      • Note that in this tutorial, in addition to apt-get installing some gazebo binary packages, you will also clone the KINETIC branch of https://github.com/ros-simulation/gazebo_ros_pkgs.git into your ROS workspace with the command “git clone https://github.com/ros-simulation/gazebo_ros_pkgs.git -b kinetic-devel”.
    • Using roslaunch: Using roslaunch to start Gazebo, world files and URDF models
    • URDF in Gazebo: Using a URDF in Gazebo
      • Note that you will clone https://github.com/ros-simulation/gazebo_ros_demos.git into your ROS workspace (~/catkin_ws/src) in this tutorial.

Assignment #5  Due 4:30PM Tuesday March 6, 2018

Gazebo screen shot from Assignment #5 showing the EduMIP robot model derived from a XACRO file. Click thumbnail for higher resolution image.
  • Create a new Gazebo ROS package named edumip_my_robot_gazebo_ros with dependencies to at least the following packages: roscpp tf std_msgs sensor_msgs geometry_msgs edumip_msgs gazebo_msgs gazebo_ros.
    • DO not use the directory structure specified in the Creating your own Gazebo ROS Package tutorial and exemplified in the URDF in Gazebo RRBot package that you downloaded and used in this tutorial.
  • Your project should have at least the following sub-directories:
    • edumip_my_robot_gazebo_ros/urdf for your xacro file
    • edumip_my_robot_gazebo_ros/launch for your launch file
    • edumip_my_robot_gazebo_ros/worlds for your world file
  • Create a XACRO file urdf/edumip_my_robot.xacro for your EduMip – you can begin with the XACRO file you created for HW #4. If you did not create a XACRO file for HW #4, then do so now.
    • You can use a xacro file in your launch file to set the robot_description parameter with the launch file command <param name="robot_description" command="$(find xacro)/xacro --inorder $(find edumip_my_robot_gazebo_ros)/urdf/edumip_my_robot.xacro" />   NOTE: the “inorder” argument is preceded by two dashes!
    • Add additional statements to your XACRO file so that it can be automatically translated to SDF format for use by Gazebo.
      • Your robot should have one link named “edumip_body” with at least the following attributes:
        • <visual> with
          • <origin>
          • <geometry>
          • <material>
        • <collision>
          • <origin>
          • <geometry>
        • <inertial> with
          • <origin>
          • <mass>
          • <inertia>
      • Your robot should have two wheel links named “wheelL” and “wheelR” with at least these attributes:
        • <visual> with
          • <origin>
          • <geometry>
          • <material>
        • <collision>
          • <origin>
          • <geometry>
        • <inertial> with
          • <origin>
          • <mass>
          • <inertia>
      • Your robot should have two joints named “jointL” and “jointR”, each with
        • <parent>
        • <child>
        • <origin>
        • <axis>
    • When you simulate your edumip in gazebo, it will not balance.  It will fall over because gravity is acting on it and at present it has no control program like “edumip_balance_ros” to make it balance.  Next week we will use a gazebo plugin to make the simulated robot balance!
    • Your XACRO file should specify physically realistic numbers for the robot link masses and moments of inertia.  My robot body has a mass of about 0.180 kg, and my wheels have a mass of about 0.030 kg each.

Here is how I specified the mass and moments of inertia for my edumip body:

<link name="edumip_body">
 <visual>
 <origin xyz="0 0 ${0.5*body_height}" rpy="0 -0.20 0"/>
 <geometry>
 <box size="${body_length} ${body_width} ${body_height}"/>
 </geometry>
 <material name="Blue">
 <color rgba="0 0.0 1.0 0.5"/>
 </material>
 </visual>

<collision>
 <origin xyz="0 0 ${0.5*body_height}" rpy="0 -0.20 0"/> 
 <geometry>
 <box size="${body_length} ${body_width} ${body_height}"/> 
 </geometry>
 </collision>

 <inertial>
 <origin xyz="0 0 ${0.5*body_height}" rpy="0 0 0"/>
 <mass value="0.180"/>
 <inertia ixx="6.0e-4" ixy="0" ixz="0" iyy="6.0e-4" iyz="0" izz="6.0e-4"/>
 </inertial>

 </link>

Here is how I specified the mass and moments of inertia for my robot’s right wheel:

<link name="wheelR">

<visual>
 <origin xyz="0 0 0" rpy="1.57079 0 0" />
 <geometry>
 <cylinder length="${wheel_width}" radius="${wheel_radius}"/>
 </geometry>
 <material name="Green">
 <color rgba="0.0 1.0 0.0 0.5"/>
 </material>
 </visual>

<collision>
 <origin xyz="0 0 0" rpy="1.57079 0 0" />
 <geometry>
 <cylinder length="${wheel_width}" radius="${wheel_radius}"/>
 </geometry>
 </collision>

<inertial>
 <origin xyz="0 0 0" rpy="0 0 0"/>
 <mass value="0.030"/>
 <inertia ixx="1.75e-5" ixy="0" ixz="0" iyy="1.75e-5" iyz="0" izz="1.75e-5"/>
 </inertial>

</link>
  • Your robot should have nice happy colors.
  • Colors are specified differently for RVIZ (urdf) and Gazebo (sdf). Coulomb friction is specified with the dimensionless <mu1> and <mu2> parameters. You can add these lines to your XACRO file, which will generate SDF-compatible color and friction specifications when the XACRO file is translated to SDF:
 <gazebo reference="edumip_body">
   <material>Gazebo/Blue</material>
   <mu1>0.2</mu1>
   <mu2>0.2</mu2>    
 </gazebo>
 <gazebo reference="wheelR">
   <material>Gazebo/Green</material>
   <mu1>0.2</mu1>
   <mu2>0.2</mu2>    
 </gazebo>
 <gazebo reference="wheelL">
   <material>Gazebo/Red</material>
   <mu1>0.2</mu1>
   <mu2>0.2</mu2>    
 </gazebo>
  • Create a gazebo world file named worlds/edumip_my_robot.world which provides at least a horizontal plane, gravity, and some objects or buildings. Your world file can contain your robot model, or you can spawn your robot model in your launch file. In Gazebo you should be able to cause the robot to move on the plane by applying torques to the wheel (or leg) joints.
  • Create a launch file named launch/edumip_my_robot_gazebo.launch which launches your robot world with your robot in it — recall that you did this in the assigned tutorial section on roslaunch with gazebo. Your launch file should do the following:
    • Set some parameters – see how the “$(find…” command is used in the tutorial launch files.
      • Set the parameter robot_description to the contents of your xacro file with the command <param name="robot_description" command="$(find xacro)/xacro --inorder $(find edumip_my_robot_gazebo_ros)/urdf/edumip_my_robot.xacro" />   NOTE: the “inorder” argument is preceded by two dashes!
      • Set the parameter world_name to the name of the world file using similar syntax.
    • Spawn a model of your EduMip in the roslaunch file using the “spawn_model” node in the “gazebo_ros” package with arguments args="-param robot_description -urdf -model edumip_my_robot". Read the documentation of this package for details.
  • Run your launch file with roslaunch. It should launch gazebo with your robot model at the origin.
    • Make your robot move around in the Gazebo world:
      • Select your robot with the mouse
      • On the right hand side of the gazebo window, with your mouse swipe open the “Joint” pane.
      • Apply some torque to the joints to make your robot move.
    • Introspect on the topics with “rostopic list” and “rostopic echo”.
  • Push the finished package edumip_my_robot_gazebo_ros as a new repo on https://git-teach.lcsr.jhu.edu and share it with the course instructors.

Module 6: Week of March 6, 2018: Gazebo physical simulation, ROS Integration

Topics

  • Simulating robots, their environments, and robot-environment interaction with Gazebo
  • Gazebo ROS integration.
  • Gazebo Intermediate Concepts

Reading

Assignments to do for This Module

Tutorials

  • Gazebo Tutorials

    Rviz screen shot showing the RRBot camera and laser-scanner sensor data. Click for higher resolution image.

Gazebo screen shot showing the RRBot camera and laser-scanner sensor data. Click for higher resolution image.

 

  • Connect to Ros: ROS Integration
    • Gazebo plugins in ROS
      • Note: To visualize the laser in gazebo as shown in the Gazebo figure above and in the tutorial  here you will also need to set the visualize property of the hokuyo laser plugin to true (i.e. “<visualize>true</visualize>”).
      • Note: As mentioned in class, if you are running a PC or VirtualBox without a GPU graphics adapter then the Gazebo laser scanner plugin may not simulate the laser scanner properly, or may crash when you run the launch file with an error message beginning something like “[gazebo-1] process has died [pid 3207, exit code 139, cmd /opt/ros/indigo/lib/gazebo_ros/gzserver…” – so you will need to modify the rrbot.gazebo sdf file to use the non-GPU hokuyo laser plugin as follows:
        • Replace <sensor type=”gpu_ray” name=”head_hokuyo_sensor”> with <sensor type=”ray” name=”head_hokuyo_sensor”>
        • Replace <plugin name=”gazebo_ros_head_hokuyo_controller” filename=”libgazebo_ros_gpu_laser.so”> with <plugin name=”gazebo_ros_head_hokuyo_controller” filename=”libgazebo_ros_laser.so”>
    • ROS control: Before you do this tutorial, be sure that you have the kinetic ros-control packages installed with the command “sudo apt-get install ros-kinetic-ros-control ros-kinetic-ros-controllers”. If these packages are not installed on your system, you will get error messages like “[ERROR] [WallTime: 1395611537.772305] [8.025000] Failed to load joint_state_controller” and “[ERROR] [WallTime: 1395611538.776561] [9.025000] Failed to load joint1_position_controller”.
    • ROS communication with Gazebo
    • ROS Plugins
      • Note: If you previously downloaded the gazebo_ros_demos tutorial package from this tutorial then you do not need to create a new custom package named “gazebo_tutorials” for this tutorial, since the “gazebo_tutorials” package is already present in the directory ~/catkin_ws/src/gazebo_ros_demos/custom_plugin_tutorial.
    • Advanced ROS integration

Assignment #6  Due 4:30PM Tuesday March 13, 2018

Expand upon your package named edumip_my_robot_gazebo_ros that you created for the previous module’s assignment.

  • In your robot’s xacro file urdf/edumip_my_robot.xacro add a plugin from the package edumip_balance_ros_gazebo_plugin that will actively balance your robot.
    • Clone and build the plugin from this public repository into your ros workspace src directory: https://git.lcsr.jhu.edu/lwhitco1/edumip_balance_ros_gazebo_plugin.git
    • Compile the plugin with catkin_make
    • Rebuild your package cache with “rospack profile” just to be on the safe side.
    • This Gazebo plugin does the following:
      • Balances the robot with active feedback control
      • Subscribes to twist messages on the topic (e.g. /edumip/cmd )
      • Publishes an EduMipState messages on the topic (e.g. /edumip/state)
      • You need to provide information in your xacro file specifying the
        • ros debug level
        • controller update rate
        • robot base link name
        • wheel joint names
        • wheel separation and diameter
        • twist command topic to subscribe to
        • EduMipState topic to publish
    • Here is the code I used in my xacro file for the EduMip Gazebo plugin:
<gazebo>
 <plugin name="edumip_balance_ros_gazebo_plugin" filename="libedumip_balance_ros_gazebo_plugin.so">
 <rosDebugLevel>3</rosDebugLevel>
 <updateRate>100</updateRate>
 <robotBaseFrame>edumip_body</robotBaseFrame>
 <leftJoint>jointL</leftJoint>
 <rightJoint>jointR</rightJoint>
 <wheelSeparation>${wheel_separation}</wheelSeparation>
 <wheelDiameter>${wheel_radius*2.0}</wheelDiameter>
 <commandTopic>/edumip/cmd</commandTopic>
 <stateTopic>/edumip/state</stateTopic> 
 </plugin>
</gazebo>
  • Once you have added the plugin xml code to your urdf/edumip_my_robot.xacro file,  test it by running your launch file edumip_my_robot_gazebo.launch to launch gazebo with your world file from last week and spawn a robot model.
    • Your model should balance.
    • You should see a topic /edumip/state, and the plugin should be publishing EduMipState messages on this topic at 10 Hz.
    • If your robot does not balance, then check the mass and inertial parameters for the edumip body and wheels that I suggested you use in the previous assignment (see HW #5 above).
  • After you have successfully added and tested the edumip_balance_ros_gazebo_plugin: In your robot’s xacro file urdf/edumip_my_robot.xacro add a camera link and a camera plugin
    1. In your robot’s xacro file add a new link on top of the robot named camera_link.
    2. In your robot’s xacro file add a new fixed joint named camera_joint with parent link edumip_body and child link camera_link
    3. In your robot’s xacro file add a camera plugin
      1. The camera should be located on the camera_link
      2. The name of the camera should be camera1
      3. The camera plugin should publish images on the topic /edumip/camera1
    • Here is the code that I used in my xacro file for the camera plugin:
<gazebo reference="camera_link">
 <sensor type="camera" name="camera1">
   <update_rate>30.0</update_rate>
   <camera name="head">
     <horizontal_fov>1.3962634</horizontal_fov>
     <image>
       <width>800</width>
       <height>800</height>
       <format>R8G8B8</format>
     </image>
     <clip>
       <near>0.02</near>
       <far>300</far>
     </clip>
     <noise>
       <type>gaussian</type>
       <mean>0.0</mean>
       <stddev>0.007</stddev>
     </noise>
   </camera>
   <plugin name="camera_controller" filename="libgazebo_ros_camera.so">
     <robotNamespace>edumip</robotNamespace>
     <alwaysOn>true</alwaysOn>
     <updateRate>10.0</updateRate>
     <cameraName>camera1</cameraName>
     <imageTopicName>image_raw</imageTopicName>
     <cameraInfoTopicName>camera_info</cameraInfoTopicName>
     <frameName>camera_link</frameName>
     <hackBaseline>0.07</hackBaseline>
     <distortionK1>0.0</distortionK1>
     <distortionK2>0.0</distortionK2>
     <distortionK3>0.0</distortionK3>
     <distortionT1>0.0</distortionT1>
     <distortionT2>0.0</distortionT2>
   </plugin>
 </sensor>
</gazebo>

 

Gazebo screen shot for HW #6 showing the EduMIP robot at the gas station. Click for higher resolution image.

 

 

RVIZ screen shot for HW #6 showing the EduMIP visualized, and showing the robot camera image generated by the camera plugin, and showing the odometry data published by the differential drive controller plugin. Click for higher resolution image.
  • Use your launch file from the previous module (launch/edumip_my_robot_gazebo.launch) to launch gazebo – you should be able to see the camera link in the gazebo window.  Check to verify that this launch file:
    • runs gazebo with your world file from your package’s world directory file specified as an argument.
    • loads the robot_description parameter from your xacro file located in your package’s urdf directory.
    • spawns your robot model with the “spawn_model” executable from the package “gazebo_ros”.
  • Use rqt_image_view and select the topic /edumip/camera1/image_raw to see the images generated by the camera plugin.
  • After you have successfully added the camera plugin to your xacro file and successfully tested it: Add two C++ nodes that you previously developed to this ROS package to make it stand-alone:
    • Copy the source files edumip_my_robot_state_publisher.cpp and joy_twist_node.cpp from your previous assignments into this package’s src directory.
    • Edit your CMakeLists.txt to compile these executables to new and unique names, for example:
add_executable(edumip_my_robot_state_publisher_hw6 src/edumip_my_robot_state_publisher.cpp)
target_link_libraries(edumip_my_robot_state_publisher_hw6 ${catkin_LIBRARIES})

add_executable(joy_twist_node_hw6 src/joy_twist_node.cpp)
target_link_libraries(joy_twist_node_hw6 ${catkin_LIBRARIES})
  • Create a RVIZ initialization file in the rviz directory of your package named rviz/edumip_gazebo_rviz_hw6.rviz. This rviz initialization file should include visualization of the following data:
    1. RobotModel of parameter robot_description
    2. TF of topic /tf
    3. Image display of topic /edumip/camera1/image_raw
  • Create a new launch file named launch/edumip_my_robot_rviz.launch that does the following:
    1. Loads the robot_description parameter from your xacro file urdf/edumip_my_robot.xacro.
    2. Launches your edumip_my_robot_state_publisher_hw6 node like you did in HW #4.
    3. Launches a robot_state_publisher node from the robot_state_publisher package (like you did in HW #4) that subscribes to the topic /joint_states and publishes on /tf. This is a system-defined node; you do not need to write it.
    4. Launches RVIZ with the initialization file rviz/edumip_gazebo_rviz_hw6.rviz that you created for this assignment.
  • Create a new launch file named  launch/edumip_my_robot_joy.launch that does the following:
    • runs a joy node
    • runs the joy_twist_node_hw6 node
  • Run your three launch files –
    • launch/edumip_my_robot_joy.launch launches a joy node and your joy_twist_node C++ node (from your previous assignment HW #3).
    • launch/edumip_my_robot_gazebo.launch launches Gazebo with your world loaded and spawns your EduMIP model.
    • launch/edumip_my_robot_rviz.launch launches RVIZ, your robot joint state publisher C++ node (from previous assignment HW #4), and a robot_state_publisher.
    • You should be able to drive your robot around in Gazebo with your joystick, and see the robot visualized simultaneously in both Gazebo and RVIZ. In RVIZ you should be able to see the robot, its odometry trail, and the camera image.
    • Kill everything when you are done.
  • Once you have successfully implemented and tested the above: Add a simulated laser scanner to your robot:
    • Add a link to the top of your robot xacro file named “hokuyo_link” and connect it to the edumip_body link with a fixed joint.
    • Add a laser scanner Gazebo ROS plugin – follow the example in the rrbot tutorial.
    • Run your gazebo, joy, and rviz launch files and see the simulated laser scanner beams in the Gazebo window (remember to set “visualize” to “true” as noted above).
    • Add a LaserScan display in RVIZ to visualize the laser scan data. Save your new rviz/edumip_gazebo_rviz_hw6.rviz file.
    • Drive the robot around in Gazebo and see the robot, camera image, and laser scan visualized in RVIZ.
    • Kill everything when you are done.
  • Modify your robot joint state publisher C++ node (from previous assignment HW #4) so that, in addition to publishing the robot joint states and the edumip_body tf transform (as you did in HW #4), it also publishes an odometry message for the robot edumip_base frame on the topic /edumip/odometry. You only need to populate the frame IDs and the Pose data in the Odometry message; you do not need to populate the covariances or velocity data.
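
Below is a minimal sketch (not a complete node) of this odometry-publishing addition. It assumes your state publisher already computes the robot’s planar pose – called x, y, and theta here – from the wheel joint states as in HW #4; the frame names “world” and “edumip_body” and the variable names are placeholders for whatever your own node uses. You will also need nav_msgs as a dependency in your package.xml and CMakeLists.txt if it is not already there.

// Sketch only: publish a nav_msgs::Odometry message on /edumip/odometry,
// populating only the frame IDs and the pose as the assignment requires.
// Frame names and the x, y, theta pose variables are assumptions -- replace
// them with the names used in your own state-publisher node.
#include <ros/ros.h>
#include <nav_msgs/Odometry.h>
#include <tf/transform_datatypes.h>

void publishOdometry(ros::Publisher &odom_pub, double x, double y, double theta)
{
  nav_msgs::Odometry odom;
  odom.header.stamp = ros::Time::now();
  odom.header.frame_id = "world";        // parent (fixed) frame of your tf tree
  odom.child_frame_id  = "edumip_body";  // robot body frame
  odom.pose.pose.position.x = x;
  odom.pose.pose.position.y = y;
  odom.pose.pose.position.z = 0.0;
  odom.pose.pose.orientation = tf::createQuaternionMsgFromYaw(theta);
  // Covariances and twist are left at their default (zero) values.
  odom_pub.publish(odom);
}

// In your node, create the publisher once, for example:
//   ros::Publisher odom_pub = nh.advertise<nav_msgs::Odometry>("/edumip/odometry", 10);
// and call publishOdometry() from the same callback that publishes your /tf data.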

 

 

 

 

rqt_graph plot of the topic transport for the simulated EduMIP robot with simulated camera and laser scanner topics, and a C++ node publishing edumip_base tf frames, joint_states, and Odometry messages.

 

rqt_graph plot of the topic transport for the real EduMIP and a C++ node publishing edumip_base tf frames, joint_states, and Odometry messages.

  • Test it in the real world with your EduMIP: Now run your joy and rviz launch files, but do not launch Gazebo. Instead of Gazebo, launch your actual edumip balance program on your EduMIP. Drive it around and see that it works just like the simulation, minus the simulated sensors.
  • Check your project directory structure and files:  Your project should have the following directories and files:
    • edumip_my_robot_gazebo_ros/
      • CMakeLists.txt
      • package.xml
      • launch/
        • edumip_my_robot_gazebo.launch
        • edumip_my_robot_joy.launch
        • edumip_my_robot_rviz.launch
      • rviz/
        • edumip_gazebo_rviz_hw6.rviz
      • src/
        • edumip_my_robot_state_publisher.cpp (revised to publish Odometry messages on the /edumip/odometry topic)
        • joy_twist_node.cpp
      • urdf/
        • edumip_my_robot.xacro
      • worlds/
        • edumip_my_robot.world
  • Push It: Push your finished package edumip_my_robot_gazebo_ros to https://git-teach.lcsr.jhu.edu and share it with the course instructors. This is the same git repo that you used for the previous assignment – just update it with this module’s work (add, commit, push).
  • DEMO YOUR ROBOT SIMULATION IN GAZEBO with RVIZ DISPLAY TO YOUR INSTRUCTORS

Screenshot of the HW #6 RVIZ visualization of the simulated EduMIP showing the EduMIP model, Odometry markers, the image stream from the simulated camera, and red markers showing the data from the simulated laser scanner.

Module 7:   Week of March 13, 2018: Turtlebot-2 Simulation in Gazebo, SLAM Navigation, Adaptive Monte-Carlo Localization

This assignment demonstrates a larger robot system built from sophisticated existing ROS packages: you will simulate a mobile robot, use Simultaneous Localization and Mapping (SLAM) to construct a 2D map, and then use that map to perform 2D adaptive Monte Carlo localization (AMCL) navigation of the robot.

Topics

Reading

Assignments to do for This Module

Tutorials

  • NOTE that the turtlebot tutorials are not all updated to ROS Kinetic, but the tutorials for ROS Jade should work just fine when kinetic-specific tutorials are unavailable.
  • Read 2.1 Turtlebot Developer Habits and 2.2 Interacting with your Turtlebot at this link: http://wiki.ros.org/Robots/TurtleBot
  • Install the turtlebot packages and the turtlebot Gazebo simulator and RViz packages.
    • Update your Linux system and ROS packages and Install the turtlebot ROS packages as follows:
      1. Update your PC’s Linux system and ROS packages with the commands first sudo apt-get update and then sudo apt-get dist-upgrade. THIS IS IMPORTANT, DO NOT SKIP THIS STEP! If you have not updated recently, these commands may pull down several hundred MB from the Ubuntu and ROS repositories.
      2. Install the turtlebot ROS packages as described in the Turtlebot Installation Tutorial – item 3.1 Turtlebot Installation: Installing software onto the turtlebot at this link. Do the “Ubuntu Package Install” with this command for the KINETIC version of these packages:
sudo apt-get install ros-kinetic-turtlebot ros-kinetic-turtlebot-apps ros-kinetic-turtlebot-interactions ros-kinetic-turtlebot-simulator ros-kinetic-kobuki-ftdi ros-kinetic-rocon-multimaster ros-kinetic-ar-track-alvar-msgs
    • NOTE 1: After you install the above packages, kill your login shells and start clean shells so that the additional required Gazebo environment variables are set.
    • NOTE 2: You DO NOT need to do a source installation
    • NOTE 3: You do not need to run the command rosrun kobuki_ftdi create_udev_rules because your notebook computer will not be physically connected to the kobuki base with a USB cable. The notebook computers that are physically on the turtlebots are connected to the kobuki base with a USB cable. Your computer will communicate with the turtlebot’s on-board netbook via WiFi.
  • Read about the Turtlebot simulator package for ROS kinetic.
  • Do the “(Section) 6: Simulation” Tutorials for Gazebo (do not do the “Stage” simulator tutorials).
    • Gazebo Bringup Guide: See the simulated turtlebot in Gazebo.
      • When you run the command roslaunch turtlebot_gazebo turtlebot_world.launch if you get the error “Invalid <arg> tag: environment variable ‘TURTLEBOT_GAZEBO_WORLD_FILE’ is not set.”, refer to the NOTE 1 item on installation above.
    • Explore the Gazebo world: Cruise around in the Gazebo world and use RViz to “see” what’s in it.
      • NOTE 4: Do not attempt to install ROS Indigo or Jade packages; we are using ROS Kinetic, so install only ROS Kinetic packages.
      • NOTE 4.1: You may find it convenient to clone these two repositories of turtlebot files into your catkin_ws/src so that you can edit the demo files: https://github.com/turtlebot/turtlebot_interactions.git  and https://github.com/turtlebot/turtlebot_simulator.git  After you clone the repository, be sure to run “catkin_make” and then “rospack profile”, and then kill all of your open shells/terminals, and open clean terminals.
      • NOTE 5: Use this command to get keyboard control to drive the robot “roslaunch kobuki_keyop keyop.launch”, it is easier to use (supports arrow keys) than the alternative command “roslaunch turtlebot_teleop keyboard_teleop.launch”
      • NOTE 5.1: The tutorial’s RVIZ initialization file has some errors in the topic names. For example: when displaying the “DepthCloud” visualization in RVIZ, set its “Color Image Topic” to /camera/rgb/image_raw – this will overlay the Kinect camera image on the Kinect’s depth cloud.
    • Make a map and navigate with it: Use the navigation stack to create a map of the Gazebo world and start navigation based on it.
      • NOTE 6: There is a bug in the ROS Kinetic package turtlebot_gazebo in the package’s launch file gmapping_demo.launch – its full path is /opt/ros/kinetic/share/turtlebot_gazebo/launch/gmapping_demo.launch (what is the bug?). We can work around this bug by pulling down a copy of this package source into our own ros catkin workspace with the commands:
        • cd ~/catkin_ws/src (varies depending on the name and location of your catkin workspace).
        • git clone https://github.com/turtlebot/turtlebot_simulator
        • cd ..
        • catkin_make
        • source devel/setup.bash
        • rospack profile
        • Check that you have overlaid this package with your local copy. Now when you give the command roscd turtlebot_gazebo your default directory should be set to ~/catkin_ws/src/turtlebot_simulator/turtlebot_gazebo.
      • To command the turtlebot motion from the keyboard, this node allows you to use the arrow-keys (up, down, left, right): roslaunch kobuki_keyop safe_keyop.launch
      • Save your map to files with your name as part of the filename – for example I might use the command rosrun map_server map_saver -f map_by_louis to generate the map occupancy grid data file map_by_louis.pgm and its associated meta-data file map_by_louis.yaml.
        • Examine the contents of the .pgm map file with an image viewer such as gimp.
        • Examine the contents of the .yaml map metadata file with a text editor or the more command.
      • Note 7: To run the turtlebot AMCL navigation demo, you need to specify the full path name of the .yaml map file, e.g. roslaunch turtlebot_gazebo amcl_demo.launch map_file:=/home/llw/my_first_map.yaml
      • NOTE 8: Under ROS Kinetic the turtlebot AMCL navigation demo seems to have some wonky parameters that cause the robot to rotate (twirl) considerably when given a 2-D navigation goal. (For context, a sketch of what a 2-D navigation goal actually is appears after this tutorial list.) My workaround was to run rqt_reconfigure (see below for command) and tune some parameters:
        • I set the node move_base parameter “recovery_behavior_enabled” to false (unchecked).
        • I set the node move_base parameter “clearing_rotation_allowed” to false (unchecked).
        • I set the node move_base navigation parameter speed_lim_w to 1.0 (was 5.4).
        • Note that setting these parameters with rqt_reconfigure is ephemeral – the parameters return to the default values specified in the AMCL package when the package is re-launched.
      • Be sure to
        • List and examine the nodes, topics, and services
        • Echo some of the published topics
        • Run rqt_graph to visualize the topic data paths
        • Run rqt_reconfigure with the command rosrun rqt_reconfigure rqt_reconfigure to see some of the configuration parameters of packages that you are running.
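
Not required for this module, but for context: the RViz “2D Nav Goal” tool simply sends a goal pose to the move_base action server, and the AMCL-localized navigation stack then tries to reach it. The sketch below shows that same interaction programmatically, following the standard ROS navigation “sending simple goals” pattern; the goal coordinates are arbitrary placeholders, and the fixed frame is assumed to be named “map”.

// Sketch only: send one navigation goal to move_base -- the same thing the
// RViz "2D Nav Goal" tool does. The goal pose here is an arbitrary placeholder.
#include <ros/ros.h>
#include <actionlib/client/simple_action_client.h>
#include <move_base_msgs/MoveBaseAction.h>

typedef actionlib::SimpleActionClient<move_base_msgs::MoveBaseAction> MoveBaseClient;

int main(int argc, char **argv)
{
  ros::init(argc, argv, "send_nav_goal");
  MoveBaseClient ac("move_base", true);        // connect to the move_base action server
  ac.waitForServer();

  move_base_msgs::MoveBaseGoal goal;
  goal.target_pose.header.frame_id = "map";    // goal pose expressed in the map frame
  goal.target_pose.header.stamp = ros::Time::now();
  goal.target_pose.pose.position.x = 1.0;      // arbitrary example position
  goal.target_pose.pose.orientation.w = 1.0;   // facing along the map +x axis

  ac.sendGoal(goal);
  ac.waitForResult();
  if (ac.getState() == actionlib::SimpleClientGoalState::SUCCEEDED)
    ROS_INFO("Goal reached");
  else
    ROS_WARN("move_base did not reach the goal");
  return 0;
}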

Assignment #7   Due 4:30PM Tuesday March 27, 2018

Make a SLAM map and navigate with it with a simulated robot and world in Gazebo! Do this assigned tutorial completely and carefully: Make a SLAM map and navigate with it

  • Be sure to
    • List and examine the nodes, topics, and services
    • Echo some of the published topics
    • Run rqt_graph to visualize the topic data paths
    • Save your map to files with your name as part of the filename – for example I might use the command rosrun map_server map_saver -f map_by_louis to generate the map occupancy grid data file map_by_louis.pgm and its associated meta-data file map_by_louis.yaml.
      • Examine the contents of the .pgm map file with an image viewer such as gimp.
      • Examine the contents of the .yaml map metadata file with a text editor or the more command.
  • Note: To run the AMCL mapping demo, you need to specify the full path name of the .yaml map file, e.g. roslaunch turtlebot_gazebo amcl_demo.launch map_file:=/home/llw/my_first_map.yaml
  • Email the two map files (.pgm and .yaml) that you generated to the instructor and the TAs with the subject line “530.707 RSP HW#7 by your_firstname your_lastname”.
  • Demonstrate to the TAs or instructor using your map to navigate the playground world with amcl particle filter navigation (you did this in the last part of the SLAM navigation tutorial).

Module 8: Week of March 27, 2018: Independent Project

Week 1 of 5: Project Proposal: Week of April 3, 2018

Topics

  • Formulate your class project
  • Form your team
  • Write your project proposal.
  • Get your project proposal approved by Prof. Whitcomb.

Reading

Project reports, Articles, and Photos of Previous Independent Class Projects

Assignments to do this week

Plan your independent class project and form your team

Here are the project rules:

  • Form a project team of no more than 3-4 members. You can work independently if you prefer.
  • Formulate a project that
    • Performs at least two specific tasks
    • Uses at least two sensors
    • At least one of the tasks must be performed autonomously or semi-autonomously (i.e. not just pure teleoperation)
    • Your project MUST include several original ROS nodes that you program from scratch.
  • Review your preliminary project formulation with the Instructor and TAs during regular office hours and after classes this week. Ask them questions.
  • Determine the availability of all required hardware.
  • Conduct tests of any critical required existing ROS packages.
  • IMPORTANT NOTE: These rules can be waived with the permission of the instructor if you have a good rationale for doing something different.

Homework #8: Assignment to hand in this week – Due 4:30PM Tuesday April 3, 2018

  • Project Proposal – create A SINGLE PDF description of your proposed project and submit it to the Instructor and TAs.
  • ONE of your team members should create a GIT repository entitled 530_707_Independent_Project and share it with the team members  with developer access.
    • Create a text file containing the name of your project and the names of the team members
    • Create a directory entitled “reports”
    • Commit your project proposal to the repository with the title 01_Project_Proposal.pdf
  • Email the Instructor and TAs to notify them, with “530.707 Proposal for Project YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.
  • Submit a single project proposal document for each project team.
  • Your proposal should include
    • Project Title – give your project a NAME
    • Project Author(s) (i.e. team members),
    • Date
    • Description of proposed project. What does it do? Describe! How does a user interact with it?
    • Software
      • List and description of the NEW ROS packages you will program from scratch to accomplish your project.
      • List of the major existing ROS packages you will utilize
      • List what testing you have done on these existing ROS packages.
    • Hardware and infrastructure
      • List and description of any new or additional hardware and infrastructure needed to implement your project.
      • List of existing hardware and infrastructure needed to implement your project.
      • Describe availability of all required hardware
      • List what testing you have done on existing available hardware.
      • Include a detailed list of all new hardware required to be purchased or borrowed, including quantity, description, model number, source/vendor, unit cost, total cost, etc.
    • Safety Plan
      • Are there any safety risks?
      • If so, can they be managed and/or mitigated to an acceptable level of safety?
      • If so, what is the plan for risk management and/or mitigation?
    • References – list of any references cited in your text
    • Project Timeline: For each week, give the hardware and software development and testing goals for the week.
      1. Week 1 hardware and software development and testing goals.
      2. Week 2 hardware and software development and testing goals.
      3. Week 3 hardware and software development and testing goals.
      4. Week 4 hardware and software development and testing goals.
      5. Week 5 hardware and software development and testing goals. Your goal should be to have your project completed by the end of Week 5 – the last week of classes. This should include your project poster for the class poster session on May 15, and also your project report.
  • Your submitted proposal will be reviewed by the Instructor and TAs, and you will be notified of the result. It will either be “approved” or “not approved” – if “not approved” you will be asked to revise and resubmit the proposal.

Week 2 of 5: Project Implementation and Testing & Project Weekly Progress Report #1: Week of April 10, 2018

  • Weekly progress reports are due at 4:30PM Tuesdays each week until the end of classes.  Weekly progress reports are due as follows:
    1. 4:30PM Tuesday April 10, 2018
    2. 4:30PM Tuesday April 17, 2018
    3. 4:30PM Tuesday April 24, 2018
    4. 4:30PM Tuesday May 1, 2018
  • Project Weekly Progress Report – create A SINGLE PDF report of your progress.
  • Your Project Weekly Progress Report should include
    • Heading:
      • Project Title
      • Weekly Progress Report #1
      • Project Author(s) (i.e. team members),
      • Date
    • The body should contain at least the following sections:
      • 1. This Week’s Goals
        • State this past week’s goals quoted from the “Project Timeline” of your Project Proposal.
      • 2. This Week’s Progress
        • State the goals you HAVE accomplished this week.  Describe each briefly.   When possible include a few representative figures with captions (photos, screenshots, or data plots) illustrating your system in  operation.
        • State the goals you HAVE NOT accomplished this week.  Describe each briefly.  Discuss what the obstacles were.
      • 3. Changes in Project Scope/Goals
        • Discuss briefly any changes in the project scope that are different from those stated in the Project Proposal.  You must get changes in scope approved by Prof. Whitcomb!
      • 4. Lessons Learned
        • Discuss important lessons learned this past week.
      • 5. Next Week’s Goals
        • State next week’s goals quoted from the “Project Timeline” of your Project Proposal.
  • ONE of your team members should have already created a GIT repository entitled 530_707_Independent_Project and shared it with the team members  with developer access.
  • In the directory entitled “reports” of your project repository, commit your project report  with the title 02_Project_Weekly_Report_2018_04_10.pdf
  • Email the Instructor and TAs to notify them, with “530.707 Project Weekly Report for YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.

Week 3 of 5: Project Implementation and Testing & Project Weekly Progress Report #2: Week of April 17, 2018

  • Follow the outline from the first weekly report, above, to prepare your weekly report.
  • In the directory entitled “reports” of your project repository, commit your project report  with the title 03_Project_Weekly_Report_2018_04_17.pdf
  • Email the Instructor and TAs to notify them, with “530.707 Project Weekly Report for YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.

Week 4 of 5: Project Implementation and Testing & Project Weekly Progress Report #3: Week of April 24, 2018

  • Follow the outline from the first weekly report, above, to prepare your weekly report.
  • In the directory entitled “reports” of your project repository, commit your project report  with the title 04_Project_Weekly_Report_2018_04_24.pdf
  • Email the Instructor and TAs to notify them, with “530.707 Project Weekly Report for YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.

Week 5 of 5: Project Implementation and Testing & Project Weekly Progress Report #4: Week of May 1, 2018

 

  • Follow the outline from the first weekly report, above, to prepare your weekly report.
  • In the directory entitled “reports” of your project repository, commit your project report  with the title 05_Project_Weekly_Report_2018_05_01.pdf
  • Email the Instructor and TAs to notify them, with “530.707 Project Weekly Report for YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.

Independent Project  Demonstration, Monday-Tuesday May 7, 2018

Each team will choose a 30 minute time slot to demonstrate their project to the Instructors, TAs, and other students, faculty, staff, and possibly some local press.

Here is the demo SCHEDULE: https://docs.google.com/spreadsheets/d/1Z2i-C25W6wR9N3rAR2NMUpXqnuQA-iIdFT0fPVsVzCI/edit?usp=sharing

Independent Project Poster Session, 10AM-12 Saturday, May 12, 2018

  • Final presentations will be a poster session 10AM-12 Saturday, May 12, 2018 in Hackerman Hall.
  • This poster session is scheduled in the time slot for our class final exam, so it should not conflict with other final exams, per the JHU Registrar’s final exam schedule: https://studentaffairs.jhu.edu/registrar/wp-content/uploads/sites/23/2018/01/Spring-2018-Final-Exam.pdf
  • I posted a poster template for a 48” (wide) x 36” (tall) poster here https://jh.box.com/v/530-707-Dropbox in the file named Robot_Systems_Programming_Poster_Template_Rev_01.pptx
  • We will provide 48”x36” foam-core boards and poster-stands.
  • You need to print your own poster.
  • The printers are both loaded with 42” wide HP Heavyweight Coated Paper, so you can print your poster sideways to save paper.
  • Do not wait until the last day to print your poster!
  • If you can demo your project at the poster session, please do so. If not, showing a video of your project in action is desirable.
  • Please commit the .PPT and .PDF of your poster to the “reports” directory of your project git repository, with file names like these:

    06_Project_Poster_Your_Project_Team_Name.pdf
    06_Project_Poster_Your_Project_Team_Name.ppt

    for example:

    06_Project_Poster_UR5_Plays_Jenga.pdf
    06_Project_Poster_UR5_Plays_Jenga.ppt

Project Videos for upload to the Course YouTube Channel!

PROJECT VIDEOS: Not required, but greatly appreciated. If you made a video and would like me to upload your video to the class YouTube channel (https://www.youtube.com/channel/UC_HYa5JUr2yKZAM0rYtAiFQ), please commit the video file to your project git repo in the “reports” directory with a file name like:

530707_Project_Video_YOUR_PROJECT_NAME.mp4

for example

530707_Project_Video_EduMIP_SLAM_and_UR5_Transporting.mp4
or
530707_Project_Video_EduMIP_SLAM_and_UR5_Transporting.mov

 

Independent Project Final report is due by end of exam period: Thursday May 17, 2018

Here is the outline:

  • Your project report should include
    • Project Title, Author(s) (i.e. team members), Date
    • Description of proposed project goals
    • Software
      • List and description of the new ROS packages you implemented to accomplish your project. Include the git repository URL for each package.
      • List of the major existing ROS packages you utilized
      • Figures as needed.
    • Hardware and infrastructure
      • List and description of any new or additional hardware and infrastructure you needed to implement your project.
      • List of existing hardware and infrastructure you needed to implement your project.
      • Figures/Photos as needed.
    • Sample Data Products (if any) from the project
    • Link to brief online video of your project in action (desired but not required).
    • Project “Lessons Learned”
    • Suggestions for future projects
    • References – list of any references cited in your text
  • Please commit your project final report to the same directory by May 17, 2018, with file names like these:

    07_Project_Final_Report_Your_Project_Team_Name.pdf
    07_Project_Final_Report_Your_Project_Team_Name.ppt

    for example:

    07_Project_Final_Report_UR5_Plays_Jenga.pdf
    07_Project_Final_Report_UR5_Plays_Jenga.ppt

    Please email me when you have uploaded your final report with “530.707 Project Final Report for YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.

HOW TO ORDER COMPONENTS FOR YOUR PROJECT

Some Tips and Notes on Projects and Programming the EduMIP in ROS

Installing ROS Packages on the EduMIP:

The ROS installation in /opt/ros/kinetic on your EduMIP was compiled from source on the EduMIP; it was NOT installed with “apt-get install ros-kinetic-desktop-full”. If you want to install additional ROS packages on your EduMIP, we suggest that you download the KINETIC branch of the package source code into catkin_ws/src and compile it with “catkin_make”.

  • So, for example, if I want to install the ROS “serial” package on my EduMIP, I would need to clone a copy of the source into my EduMIP’s catkin workspace and compile it, for example:
    • cd ~/catkin_ws/src
    • git clone https://github.com/wjwwood/serial.git
    • cd ..
    • catkin_make
    • source devel/setup.bash
  • Now your EduMIP should use the serial package from your workspace.
  • On my PC, in contrast, I can use apt-get to install this binary package with the command “sudo apt-get install ros-kinetic-serial”

ROS Can’t Find my ROS Package

If your ROS environment does not seem to recognize that you have new packages in your catkin_ws/src, try updating your rospack profile with the command “rospack profile”, and update your rosdep cache with the command “rosdep update”.

Compiling and Linking to the Robotics Cape “C” Libraries

The robotics cape library and associated header files are already installed on your EduMIP. The link library is /usr/lib/libroboticscape.so, and the two top-level header files are rc_usefulincludes.h and roboticscape.h.

You can refer to the edumip_balance_ros project (https://git.lcsr.jhu.edu/lwhitco1/edumip_balance_ros) to see how to use the robotics cape C header files and C link library with your C++ ROS Node.

See edumip_balance_ros/src/edumip_balance_ros.cpp to see how to include the C header files in a C++ program with the extern “C” directive.

See edumip_balance_ros/CMakeLists.txt to see how to link your program to the roboticscape C library.
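
For reference, the include pattern looks roughly like the sketch below. This is only an illustration of the pattern, not a substitute for studying the edumip_balance_ros example; check the exact initialization calls and the linking line against the installed headers, /usr/lib/libroboticscape.so, and the example’s CMakeLists.txt.

// Sketch of the extern "C" include pattern for using the Robotics Cape C headers
// from a C++ ROS node (see edumip_balance_ros.cpp for a complete working example).
extern "C"
{
#include <rc_usefulincludes.h>
#include <roboticscape.h>
}

#include <ros/ros.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "my_edumip_node");   // hypothetical node name
  ros::NodeHandle nh;
  // Initialize the cape hardware here using the library's init call (see the
  // edumip_balance_ros example for the exact function names), run your ROS
  // publishers/subscribers, and release the hardware before exiting.
  return 0;
}

// In CMakeLists.txt, link against the installed shared library, for example
// (compare with edumip_balance_ros/CMakeLists.txt):
//   target_link_libraries(my_edumip_node ${catkin_LIBRARIES} roboticscape)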

Edumip_balance_ros Package

EduMIP Horizontal Configuration

Several teams propose to employ the EduMIP in a horizontal configuration with a caster wheel. To do this you will need to replace the “edumip_balance_ros” project with your own new code to command the wheel servos and to read and publish the wheel encoders. As I reviewed in class yesterday, one possible package you could use is the ROS differential_drive package (http://wiki.ros.org/differential_drive). To use the differential_drive package you will need to write a new C++ node for the edumip that (a) Reads the wheel encoders and publishes them on topics and (b) Subscribes to motor command topics and commands effort to the wheel motors.
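
As a starting point, the sketch below shows one possible shape for such a node. It is a hypothetical skeleton, not a course-provided node: the topic names and message types (lwheel/rwheel as std_msgs/Int16 encoder counts, lmotor_cmd/rmotor_cmd as std_msgs/Float32 commands) are assumptions that you must match to the differential_drive nodes you actually launch, and the encoder/motor access is left as comments to be filled in with the Robotics Cape C library calls (wrapped in extern “C” as described above).

// Hypothetical skeleton for driving the EduMIP in the horizontal (caster wheel)
// configuration with the differential_drive package. Topic names, message types,
// and the loop rate are assumptions -- adjust them to your setup.
#include <ros/ros.h>
#include <std_msgs/Int16.h>
#include <std_msgs/Float32.h>

static double left_cmd = 0.0, right_cmd = 0.0;   // latest commanded wheel efforts

void leftCmdCallback(const std_msgs::Float32 &msg)  { left_cmd  = msg.data; }
void rightCmdCallback(const std_msgs::Float32 &msg) { right_cmd = msg.data; }

int main(int argc, char **argv)
{
  ros::init(argc, argv, "edumip_horizontal_driver");
  ros::NodeHandle nh;

  ros::Publisher lwheel_pub = nh.advertise<std_msgs::Int16>("lwheel", 10);
  ros::Publisher rwheel_pub = nh.advertise<std_msgs::Int16>("rwheel", 10);
  ros::Subscriber lcmd_sub  = nh.subscribe("lmotor_cmd", 10, leftCmdCallback);
  ros::Subscriber rcmd_sub  = nh.subscribe("rmotor_cmd", 10, rightCmdCallback);

  // Initialize the Robotics Cape hardware here (see edumip_balance_ros).

  ros::Rate rate(50);   // arbitrary 50 Hz publish/command rate
  while (ros::ok())
  {
    std_msgs::Int16 lticks, rticks;
    // lticks.data = <read the left wheel encoder via the cape library>;
    // rticks.data = <read the right wheel encoder via the cape library>;
    lwheel_pub.publish(lticks);
    rwheel_pub.publish(rticks);

    // <command efforts left_cmd and right_cmd to the wheel motors via the cape library>

    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}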

Interfacing to Serial Port Devices

To interface your C++ code to a serial device such as the USB ultrasonic sensor demonstrated in class, you will need to be able to open, read, and write to the serial ports. A good package for this is the ROS serial package (http://wiki.ros.org/serial). You can clone a copy of the src into your ROS workspace with the command “git clone https://github.com/wjwwood/serial.git” and compile it with “catkin_make”. The “examples” directory has a somewhat complex example of serial port usage. A simpler example is available here: https://github.com/garyservin/serial-example
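
As a concrete illustration, here is a minimal sketch in the spirit of that serial-example repository (not a driver for any particular sensor): it opens a serial port, reads newline-terminated lines, and republishes them as std_msgs/String messages. The port name, baud rate, topic name, and polling rate are placeholder values to adjust for your device, and you will need to add the serial package as a dependency of your own package.

// Minimal sketch: read lines from a serial device and publish them as strings.
// The port name, baud rate, topic name, and rate below are placeholders.
#include <ros/ros.h>
#include <std_msgs/String.h>
#include <serial/serial.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "serial_sensor_node");   // hypothetical node name
  ros::NodeHandle nh;
  ros::Publisher pub = nh.advertise<std_msgs::String>("sensor_raw", 10);

  serial::Serial port;
  try
  {
    port.setPort("/dev/ttyUSB0");                // placeholder device name
    port.setBaudrate(9600);                      // placeholder baud rate
    serial::Timeout timeout = serial::Timeout::simpleTimeout(1000);
    port.setTimeout(timeout);
    port.open();
  }
  catch (const serial::IOException &e)
  {
    ROS_ERROR("Unable to open serial port: %s", e.what());
    return -1;
  }

  ros::Rate rate(10);                            // arbitrary polling rate
  while (ros::ok())
  {
    if (port.available())
    {
      std_msgs::String msg;
      msg.data = port.readline();                // one newline-terminated line
      pub.publish(msg);
    }
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}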