ME 530.707 2019

530.707 Robot Systems Programming Course Home Page – Spring 2019

Course Description

This course introduces students to the open-source software tools available today for building complex experimental and fieldable robotic systems. The course is organized into four sections, each building on the previous in complexity and specificity: tools and frameworks supporting robotics research, robotics-specific software frameworks, integrating complete robotic systems, and a culminating independent project of the student’s own design using small mobile robots or other robots in the lab. Students will need to provide a computer (with at least a few GB of memory and a few tens of GB of disk space) running Ubuntu 16.04 LTS Xenial Xerus (http://releases.ubuntu.com/16.04) or one of its variants such as Xubuntu 16.04 LTS (http://xubuntu.org/getxubuntu) and ROS Kinetic Kame (http://wiki.ros.org/kinetic) – note that these specific versions of Linux and ROS are required! Students should have an understanding of intermediate programming in C/C++ (including data structures and dynamic memory allocation), familiarity with Linux programming, familiarity with software version control systems (e.g. Subversion, Mercurial, Git), and linear algebra. Required Course Background: familiarity with robot kinematics and algorithms such as those covered in EN.530.646 Robot Devices, Kinematics, Dynamics, and Control and EN.600.636 Algorithms for Sensor Based Robotics, and 601.220 Intermediate Programming in C++ on Linux.

Course-Related Web Pages

Contents

Instructors

Faculty

Louis L. Whitcomb
Professor
Department of Mechanical Engineering
G.W.C. Whiting School of Engineering
Johns Hopkins University
office: 115 Hackerman Hall
Phone: 410-516-6724
email: [email]
https://git-teach.lcsr.jhu.edu user id: lwhitco1

Teaching Assistants

  1. Mr. Gabe Baraban, [email], https://git-teach.lcsr.jhu.edu user id: gbaraba1
  2. Mr. Chia-Hung Lin,  [email], https://git-teach.lcsr.jhu.edu user id: clin110
  3. Mr. Han Shi, [email],  https://git-teach.lcsr.jhu.edu user id: hshi17

Class Meeting Schedule

Times:  Tuesday and Thursdays 4:30-6:00PM
Dates:   Spring 2019
Location: 210 Hodson Hall

Office Hours

  • Mondays 10AM-11AM, Han Shi, 111 Hackerman
  • Mondays 3PM-4PM, Chia-Hung Lin, 111 Hackerman
  • Tuesdays 11AM-12PM, Gabe Baraban, 111 Hackerman

EduMIP Robots

In this course we will use the EduMIP mobile robots – see ROS on the EduMIP Mobile Robot for more information.

The EduMIP robot was created by Prof. Thomas Bewley and his students at UCSD. It is available from Renaissance Robotics.

Textbooks

Although there is no required text for the course, if you are new to ROS we recommend that you get and read one or both of the following introductory ROS texts:

Electronic books available through the Milton Eisenhower Library are accessible from on-campus IP addresses. If you are off-campus, you will need to VPN into the campus network to access these electronic books.

Prerequisites

Prerequisite Courses

You must have taken both of the “core” robotics courses 530.646 Robot Devices, Kinematics, Dynamics, and Control and 601.463/601.663 Algorithms for Sensor-Based Robotics. It is OK, if you have previously taken one of these two courses, to take the other concurrently with 530.707. It is NOT OK to be taking both of these courses while you are taking 530.707.

You must have already taken 601.220 Intermediate Programming in C++ or an equivalent course. This is a prerequisite, not a co-requisite. You must have completed 601.220 or the equivalent before you take 530.707. No exceptions.

Required Computer

Students will need to provide their own computer (with at least a few GB of memory and at least ~30 GB of disk space) running Ubuntu 16.04 LTS (Xenial Xerus) (https://wiki.ubuntu.com/XenialXerus/ReleaseNotes) or one of its variants such as Xubuntu. You will also need to install ROS Kinetic Kame (http://wiki.ros.org/kinetic). Your computer can be dual boot. Linux installations in VirtualBox are NOT an acceptable substitute.

Prerequisite Knowledge:

This course will benefit you the most if you are already familiar with the following subjects:

  • Kinematics & Dynamics
  • Basic Machine Vision
  • Basic Probability Theory and Random Processes
  • Data Structures
  • Linear Algebra

This course will require you to:

  • Use the Linux Operating System
  • Use the following programming languages:
    • Intermediate C++ programming including data structures (absolutely required)
    • bash
    • Python (optional)
  • Use the following markup languages:
    • XML
  • Use the following software tools:
    • Git
    • CMake

Notes on Installing Ubuntu on your PC

Problems installing Ubuntu on some Macs

Note that several students have reported difficulty installing Ubuntu Linux on Mac notebooks.

What Desktop Environment Should I Use?

We recommend the Xubuntu desktop, which is based on Xfce, as being superior to the default Unity desktop that comes with Ubuntu. To install the Xubuntu desktop after you have installed Ubuntu, first “sudo apt-get install xubuntu-desktop”, then log out of your desktop, select the “Xubuntu Session” (there should be three options: “Ubuntu”, “Xfce”, and “Xubuntu Session” in the desktop selector menu on the login page), and log in.

Xubuntu Xfce Desktop Screenshot – click for higher resolution image.


Unity Desktop Screenshot- click for higher resolution image.

WiFi in Class

In class I will have set up a WiFi access point with SSIDs “turtlebot_2.4gHz” and “turtlebot_5gHz”, with WiFi password “turtlebot707”. If you connect your PC to this WiFi network you can type commands in real time during the class.

The turtlebots do not presently deal gracefully with enterprise WiFi authentication, so this access point is preferred for working with the EduMIP.

Your PC’s ~/.bashrc should contain the following to set up the ROS environment variables. Set your ROS_MASTER_URI to http://192.168.10.101:11311 (the IP address of Louis’s PC), and set ROS_IP to the IP address of your PC, 192.168.10.XXX, which you can determine with the command-line command “ifconfig”. The last part of your ~/.bashrc file should look something like this (remember to leave comments for yourself preceded by “#”):

# --------------------------------------------------------------- 
# set ROS kinetic environment variables
source /opt/ros/kinetic/setup.bash

# chain local workspace
# source ~/catkin_ws/devel/setup.bash

# set ROS_MASTER_URI to IP address and port of PC running roscore
export ROS_MASTER_URI=http://192.168.10.101:11311

# set ROS_HOSTNAME or ROS_IP to IP address of YOUR local PC as
# determined by "ifconfig"
export ROS_IP=192.168.10.123
# ---------------------------------------------------------------

After you edit ~/.bashrc be sure to “source ~/.bashrc” in all interactive shells, or kill and restart your shells.

Youtube Videos of Course Presentations

Here is a youtube channel where I will post screen captures of a few lectures for spring 2019 530.707 Robot Systems Programming:

https://www.youtube.com/channel/UCc6F_gt9ITAK5JJsWaAGrxw?view_as=subscriber

The production values are not great – just a screen-capture with audio. I am curious if they are at all useful for you (or not!) – please send me feedback!

Robotics Teaching Lab – Wyman 170

We have six computers in the Wyman 170 Lab, dual boot with Windows and 64-bit Xubuntu Linux 16.04 LTS with ROS Kinetic (aka ROS Kinetic Kame) installed, and five TurtleBot 2 mobile robots. By Feb 1 or so we hope to have everything set up so that you can boot the computers into Linux and log in with your JHED ID and password.

  • If you have problems logging in for the First Time on a Workstation: There is a BUG in likewise that sometimes crops up: The first time you log in to a workstation using likewise, the graphical login may fail. Workaround: for your very first (and only the first) login with likewise on a machine:
    • 1. Type CTRL-ALT-F1 to get an ASCII console.
    • 2. Log in with your JHED ID and password. Your home directory on the machine will be named by your JHED ID; in my case it is /home/lwhitco1.
    • 3. Log out.
    • 4. Type CTRL-ALT-F7 to get the X-windows login screen.
    • 5. Log in normally using the graphical login screen with your JHED ID and password.
    • 6. For all future logins on this machine, you can log in normally using the graphical login screen with your JHED ID and password.

Wyman 170 Lab Etiquette:

  • Your account has sudo privileges so BE CAREFUL WHEN USING sudo! The only sudo command you should ever use is “sudo apt-get install ros-kinetic-packagename”, where “packagename” is the name of a ROS package.
  • Do not edit or modify any ROS files under /opt/ros. If you want to modify a standard ROS package, then download the kinetic version of the source code for the package into your catkin workspace and modify your local copy. ROS will automatically detect your local copy and use it instead of the system version in /opt/ros.
  • The only version of the operating system we will support is Xubuntu 16.04 64-bit.
  • DO NOT upgrade the operating system to more recent releases.
  • The only version of ROS we will support is ROS Kinetic.
  • DO NOT install versions of ROS other than ROS Kinetic.
  • Leave the lab spotless.
  • Do not “lock” a workstation with a screen-saver. Locked computers will be powered off.
  • No Backup!: The files on these computers are NOT BACKED UP. Any files that you leave in your account on these computers can disappear at any time. Use https://git-teach.lcsr.jhu.edu to save your projects during and after every work session.
  • When you have finished a work session, log off the computer.
  • If you encounter a problem: Notify Prof. Whitcomb and the TAs if you have any problems with the lab or its equipment. Put “530.707” in the subject line of any email regarding this course.

Git Repository for Handing in Assignments

All assignments and projects for this course will use the git repository https://git-teach.lcsr.jhu.edu. You may not submit assignments via other git repositories.

You should already have an active account on https://git-teach.lcsr.jhu.edu.

Test your account by browsing to https://git-teach.lcsr.jhu.edu and logging in with your JHED credentials.

If you have problems with your account on https://git-teach.lcsr.jhu.edu, please send an email addressed to both of (a) the course instructor (llw@jhu.edu) and (b) Mr. Anton Deguet (lcsr-it@jhu.edu).

When you create your assignment projects, please use precisely the project name specified in the assignment, and add the course instructor (lwhitco1), and the TAs (gbaraba1, clin110, and hshi17) as members of your projects, all with DEVELOPER access.

Course Grade Policy

Your course grade will be based upon class participation (~10-15%), weekly assignments (~45%), and a final independent course project (~45%).

Weekly Assignments

Weekly assignments are due at 4:30PM Tuesday each week.  No credit for late assignments or late pushes to git-teach.lcsr.jhu.edu.   The lowest assignment score is dropped. The TAs are not authorized to accept late assignments.

If you have an extenuating circumstance and need to request an extension for a particular assignment, please request it from Prof. Whitcomb well before the deadline.

Independent Project  Demonstration, Monday-Tuesday May 6-7, 2019

Each team will choose a 30 minute time slot to demonstrate their project to the Instructors, TAs, and other students, faculty, staff, and possibly some local press.

Here is the demo SCHEDULE: https://docs.google.com/spreadsheets/d/1Z2i-C25W6wR9N3rAR2NMUpXqnuQA-iIdFT0fPVsVzCI/edit?usp=sharing

Independent Project Poster Session, 2-4PM Tuesday May 14, 2019

Syllabus

Module 1:  Week of Jan 29, 2019: Course Overview and ROS Basics

NOTE: in this course we will exclusively use Ubuntu 16.04 LTS (or an equivalent release such as Xubuntu 16.04 LTS) and the stable ROS Kinetic Kame release.

Robots of the Week

The EduMIP: Educational Mobile Inverted Pendulum

Topics

  • Course Overview
    • Course Description
    • Prerequisites
    • Assignments
    • Class Project:
      • Video highlights from spring 2018
      • Read the project descriptions and especially the lessons-learned from projects completed in previous years available in the project reports available here: https://jh.box.com/v/530-707-Dropbox.
    • Ethics
  • Background of Robot Software Frameworks and The Open-Source Robotics Community
  • Development Tools, Managing Code, Environments, Installing Software, Building Code
  • The ROS core systems: Packaging, Buildsystems, Messaging, CLI & GUI tools, creating packages, nodes, topics, services, and parameters.
  • Writing a Simple Publisher and Subscriber (C++)
  • Examining the Simple Publisher and Subscriber
  • Writing a Simple Service and Client (C++)
  • Examining the Simple Service and Client

Reading

Assignments for This Module

Tutorials

  • Install Ubuntu 16.04 LTS (or an equivalent release such as Xubuntu 16.04 LTS).
  • Note: I prefer the Xubuntu desktop to the poorly designed Ubuntu Unity desktop. I have had best results by installing the Ubuntu distribution first, and then installing the Xubuntu desktop with “sudo apt-get install xubuntu-desktop”. See notes here.
  • Install ROS Kinetic Kame
  • Complete these Tutorials
    • Installing and Configuring Your ROS Environment. NOTE: in this course we will exclusively use Ubuntu 16.04 LTS Xenial Xerus (http://releases.ubuntu.com/16.04) or one of its variants such as Xubuntu 16.04 LTS (http://xubuntu.org/getxubuntu) and ROS Kinetic Kame (http://wiki.ros.org/kinetic).
    • Navigating the ROS Filesystem
    • Creating a ROS Package
    • Building a ROS Package
    • Understanding ROS Nodes
    • Understanding ROS Topics
    • Understanding ROS Services and Parameters
    • Using rqt_console and roslaunch
    • Using rosed to edit files in ROS
    • Creating a ROS msg and a ROS srv
    • Writing a publisher and subscriber in C++
    • Writing a Simple Publisher and Subscriber (C++)
    • Examining the Simple Publisher and Subscriber
    • Writing a service and client in C++
    • Examining the Simple Service and Client

Assignment #1 – Due 4:30PM Tuesday Feb 5, 2019.

  • Write and test a ROS package named “beginner_tutorials” comprised of
    • A C++ publisher node that publishes a TOPIC and
    • A C++ subscriber node that subscribes to this TOPIC.
    • A C++ server node that provides a SERVICE.
    • A C++ client node that calls the server node’s SERVICE
  • Hand in your code project “beginner_tutorials” on https://git-teach.lcsr.jhu.edu
    • Login to your https://git-teach.lcsr.jhu.edu account.
    • Create a project called “beginner_tutorials” on https://git-teach.lcsr.jhu.edu ( When you name the ros project that you will hand in each week, please use EXACTLY the name specified in the assignment – this week’s project, for example, should be entitled “beginner_tutorials”),
    • Initialize your project “beginner_tutorials” as a git repository
    • Add the files to the repo
    • Commit them to the repo
    • Add the remote repository
    • Push your files
    • Push the repo to the server
    • Add the instructor (lwhitco1), and the TAs (gbaraba1, clin110, and hshi17) as members of your project, all with DEVELOPER access.
    • Keep all your git-teach assignment repositories private; do not make them public.
    • See us with questions.

Module 2:  Week of February 5, 2019:  Roslaunch, Nodes, tf, Parameters, and Rosbag

Robots of the Week:

Boston Dynamics Big Dog:

Boston Dynamics Spot Mini:

Topics

  • rosbag
  • roswtf
  • ROS.org
  • tf
  • RVIZ
  • Getting and setting parameters in roslaunch and C++ nodes

Reading

Assignments for This Module

Tutorials

Assignment #2 – Due 4:30PM Tuesday Feb 12, 2019.

  • Write and test a ROS package named “learning_tf” containing your source code in learning_tf/src, rviz initialization file in learning_tf/rviz, and launch files in learning_tf/launch for
    1. A single simulated turtle, a keyboard input node, and a tf broadcaster node as follows:
      • Your  C++ tf broadcaster node entitled turtle_tf_broadcaster.cpp that subscribes to a /turtleX/pose topic and publishes the world-to-turtleX transform on the tf topic.
      • a launch file entitled start_demo_01.launch that
        • sets the parameter “scale_linear” of type double to a value of 2.5
        • sets the parameter “scale_angular” of type double to a value of 2.5
        • launches one instance of ros turtlesim_node from the package turtlesim, giving it the node name of “turtle_simulation_node”
        • launches one instance of turtle_teleop_key node from package turtlesim, giving it the name “teleop_key_node”, with output=”screen” launch option.
        • launches one instance of your compiled turtle_tf_broadcaster.cpp node from your learning_tf package, giving it the node name of “turtle1_tf_broadcaster_node” and specifying that it should subscribe to the turtle1 pose topic.
        • launches rviz, specifying that it load an rviz initialization file, named “learning_tf/rviz/learning_tf.rviz”. You create this file by manually configuring rviz and saving the initialization file with the File->Save Config As menu item. This rviz initialization file should
          • specify “world” as the fixed frame,
          • display a tf visualizer in rviz with
            • the tf frame “Marker Scale” set to a value of 5.
            • the tf frame “Show Names” parameter checked to display the names of each frame
            • the tf frame “Show Arrows” checked to show a vector between proximal and distal frame origins.
    2. Two simulated turtles, a keyboard input node, two tf broadcaster nodes, and a new tf_listener node, as follows:
      • Your C++ tf broadcaster node entitled turtle_tf_broadcaster.cpp, as used in the previous section.
      • Your C++ tf listener node entitled turtle_tf_listener.cpp that
        • spawns a second simulated turtle
        • and then does the following at 10 Hz:
          • Uses listener.waitForTransform() and listener.lookupTransform() to compute the transform from turtle2 to turtle1.
          • Computes the linear and angular difference between the two frames.
          • Publishes a twist message on the topic /turtle2/cmd_vel to command turtle2 to drive toward turtle1.
      • a launch file entitled start_demo_02.launch that does everything in start_demo_01.launch, and also adds:
        • launches a second instance of your compiled turtle_tf_broadcaster.cpp node from your learning_tf package, giving it the node name of “turtle2_tf_broadcaster_node”.
        • launches one instance of your compiled turtle_tf_listener.cpp node from your learning_tf package, giving it the node name of “turtle_tf_listener_node”.
    3. Two simulated turtles, a keyboard input node, two tf broadcaster nodes, a tf_listener node, and a new frame_tf_broadcaster.cpp node, as follows:
      • Your C++ tf broadcaster node turtle_tf_broadcaster.cpp as previously.
      • Your C++ tf listener node turtle_tf_listener.cpp as previously.
      • A new C++ tf broadcaster node entitled frame_tf_broadcaster.cpp that
        • Publishes at 10 Hz a transform between turtle1 and a new frame named “carrot1”, specifying the carrot1 frame to be 3 meters above (i.e. along the Z axis) the turtle1 frame.
        • Uses a ROS timer callback function (as demonstrated in class, and in assigned reading and tutorials on ROS timers) and ros::spin() in its main(), instead of a rate.sleep() loop in its main().
      • a launch file entitled start_demo_03.launch that
        • includes everything in start_demo_02.launch by reference to launch everything in the previous example
        • launches one instance of your compiled frame_tf_broadcaster.cpp node from your learning_tf package, giving it the node name of “frame_tf_broadcaster_node”.
      • Experiment with what happens if you vary the publishing rate of the carrot1 frame in frame_tf_broadcaster.cpp. Remember to recompile the code and relaunch the start_demo_03.launch file each time you change the rate. (you do not need to hand in anything for these rate experiments, just try it).
        • slow the publishing rate of the carrot1 frame down to, say, 0.5 Hz
        • speed the publishing rate up to, say, 50 Hz

Hand in your code project “learning_tf” on https://git-teach.lcsr.jhu.edu and share it with the instructors.

Add the instructor (lwhitco1), and the TAs (gbaraba1, clin110, and hshi17) as members of your project, all with DEVELOPER access.

Keep all your git-teach assignment repositories private; do not make them public.

Get started on next week’s assignment!

Module 3: Week of February 12, 2019 EduMIPs, Joy, and ROS Node development in C++

Robots of the Week: Nereid Under-Ice (NUI) and Harvest Automation HV-100

Nereid Under-Ice (NUI):

Harvest Automation HV-100:

  • https://www.public.harvestai.com
  • https://www.youtube.com/watch?v=OPIyGyBNwAo&t=24s
  • https://www.youtube.com/watch?v=PXRpZDV4bCY

Topics

  • Assembling the EduMIP mobile robot
  • Installing and testing joysticks
  • Publishing /joy topic with the ROS joy package
  • Joystick tutorials – including teleoperating a robot from a joystick
  • ROS timers
  • Writing your own package to subscribe to the joystick /joy and publish a geometry_msgs/Twist topic to command the EduMIP.
  • Writing launch files for same.
  • Running ROS systems spanning more than one computer.

Reading

Assignments for This Module:

Assignment #3: Due 4:30PM Tuesday Feb 19, 2019

  • Assemble and Test the EduMIP as described here: https://dscl.lcsr.jhu.edu/home/courses/edumip_ros
  • Set Up Your Beaglebone Black Wireless Board for Your EduMIP: Install an 8GB Ubuntu 16.04 LTS image pre-loaded with ROS Kinetic and support for the Robotics Cape on a Micro-SD card on your BBBW. Follow the instructions here: https://dscl.lcsr.jhu.edu/home/courses/edumip_ros
  • Joystick Assignment #1 of 3: Install your joystick and test it (nothing to hand in for this part of the assignment)
    • Plug in and test your USB joystick
      • List the USB devices on your computer with the “lsusb” command, both with the joystick’s USB cable connected and with it disconnected.
      • See that the device /dev/input/js0 appears when your joystick is connected, and that this device vanishes when the joystick is disconnected
      • Use the command “jstest /dev/input/js0” to test your joystick. This utility gives text output of the joystick data.
      • Alternatively, test the joystick with the graphical widget “jstest-gtk”.
        • Install this utility with the command “sudo apt-get install jstest-gtk”
        • Run this utility it with the command “jstest-gtk”.
  • Joystick Assignment #2 of 3: Complete the tutorial  Configuring and Using a Linux-Supported Joystick with ROS.   Do not hand in anything for this tutorial, it is just to get you started on running ROS across multiple computers.
    • Notes on this tutorial for most Ubuntu 16.04 installations:
      • The default joystick is /dev/input/js0 (where “0” is the numeral zero, not the letter O).
      • The permissions for /dev/input/js0 are already OK, i.e. you do NOT need to change the permissions for /dev/input/js0 with the command “sudo chmod a+rw /dev/input/js0”.
      • The ROS joy_node automatically looks for the device /dev/input/js0. You do NOT need to set the parameter with the command “rosparam set joy_node/dev /dev/input/js0”.
    • Run “roscore” in one terminal, then run “rosrun joy joy_node” and look at the topic /joy
    • Be sure to use the commands “rosnode list”, “rostopic list”, and “rostopic echo /joy” to explore the /joy topic messages.
  • Joystick Assignment #3 of 3 (to hand in) (Git Project name: joy_twist ):
    • Create a ROS C++ package entitled “joy_twist”, with dependencies to roscpp, std_msgs, geometry_msgs, and sensor_msgs with the command “catkin_create_pkg joy_twist roscpp std_msgs geometry_msgs sensor_msgs”.
    • In this package create a C++ node entitled joy_twist.cpp that subscribes to a sensor_msgs/Joy joystick topic entitled “/joy” and publishes a geometry_msgs/Twist topic named “/edumip/cmd”. We suggest you use a ROS Timer callback function to publish the Twist messages at 10Hz – see ROS Timer documentation for details.
      • Your node should assign joystick axis 1 to twist.linear.x, and joystick axis 0 to twist.angular.z — BUT YOU CAN CHOOSE A DIFFERENT MAPPING IF YOU LIKE — you may need to change a sign in the assignment so that pushing the joystick forward makes twist.linear.x positive, and pushing the joystick to the right makes the twist.angular.z positive.
    • In this package create a launch file entitled joy_twist.launch  in the joy_twist/launch directory that
      1. Launches a joy node from the ROS joy package, which opens and reads the USB joystick values and publishes them as sensor_msgs/Joy messages on the topic /joy
      2. Launches your joy_twist node, which subscribes to sensor_msgs/Joy messages on the /joy topic and publishes geometry_msgs/Twist messages on the /edumip/cmd topic.
    • Be sure to use the commands “rosnode list”, “rostopic list”, “rostopic echo”, “rostopic type”, and “rostopic hz” to explore the /joy and /edumip/cmd topics.
    • Run rqt_graph to see the nodes and topics graphically.
  • Hand in your code project name “joy_twist” on https://git-teach.lcsr.jhu.edu and share it with the instructors. Add the instructor (lwhitco1), and the TAs (gbaraba1, clin110, and hshi17) as members of your project, all with DEVELOPER access.
    Keep all your git-teach assignment repositories private; do not make them public.
  • DEMONSTRATE YOUR ROBOT UNDER JOYSTICK CONTROL TO YOUR INSTRUCTORS during either office hours or at the beginning or end of class on Tuesday February 19, 2019.

Running ROS Across Multiple Computers – notes and tutorials you will need for this week’s assignment

  • Complete the tutorial Running ROS Across Multiple Machines. Do not hand in anything for this tutorial; it is just to get you started on running ROS across multiple computers.
  • Configure WiFi on your EduMIP: Follow the instructions here: https://dscl.lcsr.jhu.edu/home/courses/edumip_ros
  • Determine the IP addresses of your PC and your EduMIP:
    • ifconfig: Use the command “ifconfig” to see all configured network interfaces (Ethernet, WiFi, USB, etc) on your machine.
    • iwconfig: Use the command “iwconfig” to see all configured WiFi network interfaces on your machine.
    • Who am I? The easiest way to determine the IP address (or addresses) of a Linux machine is to log into it and use the command “ifconfig”.
  • In this example I will assume the following IP addresses (YOURS WILL BE DIFFERENT):
    • My PC has IP address 192.168.10.101
    • My EduMIP BBBW has IP address 192.168.10.102
  • On your PC set the ROS environment variables to look for the ros master (roscore) on the PC with the .bashrc commands (add these commands to the end of your  .bashrc file):
"export ROS_MASTER_URI=http://192.168.10.101:11311" <- this tells ROS the IP address of the machine that is running the ros master (roscore).
"export ROS_IP=192.168.10.101" <- this tells ROS the IP address of this machine (your PC).
  • On your EduMIP set the ROS environment variables to look for the ros master (roscore) on the PC with the .bashrc commands(add these commands to the end of your  .bashrc file):
"export ROS_MASTER_URI=http://192.168.10.101:11311" <- this tells ROS the IP address of the machine that is running the ros master (roscore).
"export ROS_IP=192.168.10.102" <- this tells ROS the IP address of this machine (your EduMIP).
  • Test your configuration:
    1. On your PC, in a new shell, run roscore.
    2. On your PC, in a new shell, run “rostopic list”, you should see the standard default topics from the roscore on your PC.
    3. On your EduMIP, in a new shell, run “rostopic list”, you should see the standard default topics from the roscore on your PC. Yay!
    4. You can now publish a topic on one machine, and subscribe to the topic on the other machine. For example
      1. On your EduMIP, publish a topic with the command “rostopic pub -r 1 /my_topic std_msgs/String 'hello there'”
      2. On your PC, subscribe to this topic with “rostopic echo /my_topic”
      3. On your PC, run “rqt_graph” to visualize the nodes and topics.
rqt_graph diagram depicting the nodes, topics, and data flow between the nodes via the topics for Assignment #3. Click for higher resolution image.
  1. Now you can control your EduMIP from a joystick on your PC:
    1. On your PC run your joy_twist launch file with the command “roslaunch joy_twist joy_twist.launch” to run roscore, joy and joy_twist nodes. Recall that:
      • The joy node publishes sensor_msgs/Joy messages on the topic /joy
      • The joy_twist node subscribes to sensor_msgs/Joy messages on the topic /joy and publishes geometry_msgs/Twist messages on the /edumip/cmd topic.
    2. On your EduMIP run edumip_balance_ros with the command “roslaunch edumip_balance_ros edumip_balance_ros.launch”. Recall that:
      • The edumip_balance_ros node subscribes to geometry_msgs/Twist messages on the /edumip/cmd topic and
      • The edumip_balance_ros node publishes edumip_balance_ros/EduMipState messages on the topic /edumip/state
    3. Stand up your EduMIP and take it for a drive.
    4. Explore the ROS topics, and rqt_graph.

Module 4: Week of February 19, 2019: URDF and Robot State Publisher

Announcements!

  • Need native boot Ubuntu 16.04 – no virtual boxes!
  • Push on time by Tuesday 4:30 ET.
  • Make sure PC and EduMIP are on same SUBNET!
  • Get started early on assignments!
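One quick way to sanity-check the “same SUBNET” requirement is to compare the first three octets of your PC’s and your EduMIP’s IP addresses. The sketch below assumes a /24 netmask, as in the 192.168.10.x addresses used in class; the addresses shown are the class examples, not yours.

```shell
# same_subnet IP1 IP2 -- succeeds if both addresses share their first
# three octets (i.e. the same /24 subnet, as in the class examples)
same_subnet() {
    [ "$(echo "$1" | cut -d. -f1-3)" = "$(echo "$2" | cut -d. -f1-3)" ]
}

# example: PC and EduMIP addresses from the class configuration
same_subnet 192.168.10.101 192.168.10.102 && echo "same subnet"
same_subnet 192.168.10.101 10.0.0.5      || echo "different subnet"
```

Run “ifconfig” on each machine to get the two addresses; if the check fails, your PC and EduMIP are probably on different networks and ROS topics will not flow between them.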

Robots of the Week

The Sentry Autonomous Underwater Vehicle (AUV)

Topics

  • Unified Robot Description Format (URDF)
  • Robot State Publisher

Reading

  • URDF Documentation
  • Joint State Publisher Documentation
  • ROS Workspaces Documentation
  • Notes from Class: In class we downloaded the urdf_tutorial package into my catkin workspace. Here are a few notes on the steps taken to do this. In this example we download a local copy of the urdf_tutorial ROS package into the catkin workspace ~/catkin_ws/src/urdf_tutorial. You can edit the local copy in your workspace. The system copy is located at /opt/ros/kinetic/share/urdf_tutorial, but you cannot edit those files because they are protected system files. Better to edit your own local copy in your catkin workspace than to muck with the system copy. This is an example of a workspace overlay, where we create a package in a local workspace that ROS will use in preference to the default system package of the same name. Linux commands are shown in bold font. Comments are in italic font.
    • cd ~/catkin_ws/src (cd to ~/catkin_ws/src)
    • git clone https://github.com/ros/urdf_tutorial.git (clone the git tutorial from github. Note that this creates the directory ~/catkin_ws/src/urdf_tutorial and associated subdirectories.)
    • cd ~/catkin_ws (cd to ~/catkin_ws)
    • rm -r devel build (remove the catkin_ws/devel and catkin_ws/build directory trees, which deletes ~/catkin_ws/devel/setup.bash)
    • catkin_make (Builds everything in my workspace from scratch, including generate a new ~/catkin_ws/devel/setup.bash)
    • source devel/setup.bash (Source the newly created file ~/catkin_ws/devel/setup.bash to add this new workspace to the ROS bash environment variables, in particular it will add the present workspace to the ROS_PACKAGE_PATH environment variable)
    • echo $ROS_PACKAGE_PATH (Look again at the ROS_PACKAGE_PATH environment variable that was set by the previous command. It should NOW be a string with your catkin workspace listed at the first element, followed by the standard package path like this: ROS_PACKAGE_PATH=/home/llw/catkin_ws/src:/opt/ros/kinetic/share)
    • rospack profile (This command forces rospack to rebuild the cache of the ROS package path that is used by roscd. The cache is the text file ~/.ros/rospack_cache.)
    • roscd urdf_tutorial (Now roscd will take me to my local copy of the urdf_tutorial package in ~/catkin_ws/src/urdf_tutorial instead of taking me to the system copy located at /opt/ros/kinetic/share/urdf_tutorial)
    • roslaunch urdf_tutorial display.launch model:=urdf/01-myfirst.urdf (Now I can run the tutorial exercise and edit the local URDF files in ~/catkin_ws/src/urdf_tutorial/urdf)
    • Note that the later tutorial urdf files such as urdf/05-visual.urdf refer to PR2 gripper model mesh files. If you get error messages from RVIZ like “[ERROR] [1393792989.872632824]: Could not load resource [package://pr2_description/meshes/gripper_v0/l_finger.dae]: Unable to open file “package://pr2_description/meshes/gripper_v0/l_finger.dae””, then you need to install the PR2 mesh files. You can install the PR2 model files with the command sudo apt-get install ros-kinetic-pr2-common
    • Note that beginning with urdf/05-visual.urdf RVIZ throws lots of warnings like “TIFFFieldWithTag: Internal error, unknown tag 0x829a.” but the program runs OK.

Assignments for This Module

Tutorials

  • Learning URDF Step by Step
    • 1. Building a Visual Robot Model with URDF from Scratch
    • 2. Building a Movable Robot Model with URDF
    • 3. Adding Physical and Collision Properties to a URDF Model
    • 4. Using Xacro to Clean Up a URDF File
  • Learning URDF (including C++ API)
    • 1. Create your own urdf file
    • 2. Parse a urdf file
    • 3. Using the robot state publisher on your own robot
    • 4. Skip this one for now: Start using the KDL parser (You can skip this tutorial for now if you like, it is not required for this module’s assignment.)
    • 5. Using urdf with robot_state_publisher

Assignment #4  Due 4:30PM Tuesday Feb 26, 2019

On your PC, clone (with “git clone …”) the package edumip_msgs into your ~/catkin_ws/src directory from this repo: https://git.lcsr.jhu.edu/lwhitco1/edumip_msgs.git This package defines the message EduMipState.msg. After you have downloaded this package into your catkin_ws/src directory, run catkin_make to create the message files on your PC.

Here is the definition found in the edumip_msgs/msg/EduMipState.msg message definition file:

# EduMIP balance controller data
float32 setpoint_phi_dot # commanded average wheel vel (translational vel)
float32 setpoint_gamma_dot # commanded steering angular vel
float32 setpoint_phi # commanded average wheel pos
float32 phi # average wheel pos
float32 setpoint_gamma # commanded steering angle
float32 gamma # steering angle
float32 setpoint_theta # commanded body tilt
float32 theta # body tilt
float32 d1_u # control command for balance loop
float32 d3_u # control command for steering loop
float32 dutyL # left motor duty cycle
float32 dutyR # right motor duty cycle

# 2017-02-22 LLW Added odometry data
float32 wheel_angle_L # total rotation of left wheel (radians) (+ is forward)
float32 wheel_angle_R # total rotation of right wheel (radians) (+ is forward)
float32 body_frame_easting # displacement of body frame (m) (+ is East)
float32 body_frame_northing # displacement of body frame (m) (+ is North)
float32 body_frame_heading # compass heading (radians)
float32 vBatt # battery voltage in volts
bool armed # controllers are active
bool running # balance program is running

  • Develop a ROS package named edumip_my_robot for your EduMIP.

  rqt_graph diagram depicting the nodes, topics, and data flow between the nodes via the topics for Assignment #4. Click for higher resolution image.

  • Your package should consist of at least the following:
    • An URDF file named urdf/edumip_my_robot.urdf (or better yet a xacro file urdf/edumip_my_robot.xacro) describing the robot links and joints. Your link and joint names should be precisely the following:
      • A body link named “edumip_body”
      • A left wheel link named “wheelL”
      • A right wheel link named “wheelR”
      • A left continuous joint named “jointL” with parent link “edumip_body” and child link “wheelL”.
      • A right continuous joint named “jointR” with parent link “edumip_body” and child link “wheelR”.
      • Here are the measured parameters I used in my xacro file – units are meters and radians:
<!-- Numerical Macros - units are meters and radians -->
<xacro:property name="wheel_separation" value="0.070" /> 
<xacro:property name="wheel_radius" value="0.034" /> 
<xacro:property name="wheel_width"  value="0.010" />
<xacro:property name="body_height"  value="0.1" /> 
<xacro:property name="body_width"   value="0.050" /> 
<xacro:property name="body_length"  value="0.035" />
    •  A C++ node named src/edumip_my_robot_publisher.cpp
    • Your node should subscribe to the /edumip/state topic, which has recently been expanded to include wheel joint angles and odometry data (i.e. robot X, Y, and heading).
    • Your node should publish the following:
      • sensor_msgs/JointState messages for this robot on the topic /joint_states. Look at the message definition file edumip_msgs/msg/EduMipState.msg to see comments on the state message fields.
      • A tf transform from the fixed “world” frame to this robot’s “robot_base” frame that specifies the moving position and orientation of the robot with respect to the fixed “world” frame.
      • NOTE: Your node should have a SINGLE callback function for subscribing to the /edumip/state topic, and within this callback function it should publish to the /joint_states topic and the /tf topic.
  • A RVIZ initialization file called “rviz/edumip_my_robot.rviz” that displays your robot_model and tf frames.
  • A launch file named launch/edumip_my_robot.launch that
    1. Launches a joy node from the system-defined joy ROS package. Recall that the joy node publishes sensor_msgs/Joy messages on the topic /joy
    2. Launches your joy_twist node from the joy_twist package that you wrote for last module’s assignment. Recall that the joy_twist node subscribes to sensor_msgs/Joy messages on the topic /joy and publishes geometry_msgs/Twist messages on the /edumip/cmd topic.
    3. Launches your custom edumip_my_robot_publisher C++ node from your edumip_my_robot package that you wrote for this module’s assignment. Recall that this node subscribes to the /edumip/state topic and publishes on the /joint_states topic and the /tf topic as described earlier in this assignment.
    4. Launches a standard robot_state_publisher node from the robot_state_publisher package. Recall that this node subscribes to the /joint_states topic and the robot_description parameter and publishes /tf frames for the robot based on your urdf and the joint_states.
    5. Sets the parameter “robot_description” to load your urdf/edumip_my_robot.urdf (or .xacro) that you wrote for this module to model your edumip.
    6. Launches RVIZ specifying the rviz initialization file rviz/edumip_my_robot.rviz that you create to visualize your robot_model (defined by your urdf or xacro file) and your tf frames.
    • Now you should be able to drive your robot around with your joystick and see your robot model drive around in RVIZ complete with depiction of the robot and its coordinate frames.
    • NOTE that RVIZ will not display your robot model correctly until it receives valid /tf transforms for all of the robot links. If the link /tf transforms are not valid then RVIZ will show errors in the “robot_model” RVIZ GUI, and the robot links will appear white instead of the colors specified in your urdf/xacro file.
    • Hand in your code as a project named “edumip_my_robot” on https://git-teach.lcsr.jhu.edu and share it with the instructors.
    • Add the instructor (lwhitco1), and the TAs (gbaraba1, clin110, and hshi17) as members of your project, all with DEVELOPER access.
    • Keep all your git-teach assignment repositories private; do not make them public.
RVIZ screen shot for Assignment #4 showing visualization of your robot_model (defined by your urdf or xacro file) and visualization of your tf frames.
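For orientation, the six launch-file steps above might be sketched as a single launch file like the following. The pkg/type names for the nodes you wrote yourself (joy_twist, edumip_my_robot_publisher) are illustrative placeholders, not prescribed names:

```xml
<launch>
  <!-- 5. Load the robot model (xacro) into the robot_description parameter -->
  <param name="robot_description"
         command="$(find xacro)/xacro --inorder $(find edumip_my_robot)/urdf/edumip_my_robot.xacro"/>

  <!-- 1. Joystick driver: publishes sensor_msgs/Joy on /joy -->
  <node name="joy_node" pkg="joy" type="joy_node"/>

  <!-- 2. Your joy-to-twist node: /joy in, /edumip/cmd (geometry_msgs/Twist) out -->
  <node name="joy_twist" pkg="joy_twist" type="joy_twist"/>

  <!-- 3. Your node: /edumip/state in, /joint_states and /tf out -->
  <node name="edumip_my_robot_publisher" pkg="edumip_my_robot"
        type="edumip_my_robot_publisher"/>

  <!-- 4. Standard robot_state_publisher: /joint_states + robot_description -> /tf -->
  <node name="robot_state_publisher" pkg="robot_state_publisher"
        type="robot_state_publisher"/>

  <!-- 6. RVIZ with the saved initialization file -->
  <node name="rviz" pkg="rviz" type="rviz"
        args="-d $(find edumip_my_robot)/rviz/edumip_my_robot.rviz"/>
</launch>
```

Treat this as an outline of the required pieces rather than a complete solution; the order of elements in a launch file does not matter to roslaunch.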

Module 5: Week of February 26, 2019: Gazebo Intro, SDF, and worlds

Topics

  • Simulating robots, their environments, and robot-environment interaction with Gazebo
  • Gazebo ROS integration.
  • NOTE: You do not need to install Gazebo. Your full ROS kinetic desktop installation will have installed Gazebo v7.0.0, so DO NOT follow the installation instructions on gazebosim.org. If for some reason the ROS gazebo package is not installed, install it with sudo apt-get install ros-kinetic-gazebo-ros ros-kinetic-gazebo-ros-pkgs ros-kinetic-gazebo-ros-control
    • You can verify that Gazebo is installed by issuing the command “gazebo” on the command line – a gazebo window should open after a few seconds delay.
  • NOTE: Gazebo is CPU-intensive, and will not run very well in virtual boxes.

Reading

  • Gazebo Overview
  • Gazebo API
  • SDFormat Specification: The SDF XML file format is a superset of URDF. SDF files are how Gazebo defines robots and the environment. You can generate SDF from URDF or XACRO on the fly, so in practice it is easier to maintain a single XACRO file and generate URDF and SDF from it on the fly.

Assignments for This Module

Tutorials

  • Gazebo Version 7.0 Tutorials
    • Note: In this next set of Gazebo tutorials you will use the  command-line “gazebo”, not the ROS gazebo package.
    • Beginner – First-time Gazebo Users
    • Get Started
      • Skip “Install”. Do not install Gazebo; it was installed when you installed the full ROS kinetic desktop.
      • Quick Start: How to run Gazebo with a simple environment.
      • Gazebo Components: This page provides an overview of the basic components in a Gazebo simulation.
      • Gazebo Architecture: Overview of Gazebo’s code structure.
      • Screen Shots
    • Build a Robot
      • Model Structure and Requirements: How Gazebo finds and loads models, and requirements for creating a model.
      • Skip this one for now: How to contribute a model.
      • Make a model: Describes how to make models. Read this one, but there is no exercise to do, so quickly move on to the next tutorial…
      • Make a mobile robot: How to make a model of a 2-wheeled mobile robot.
      • Import Meshes
      • Attach Meshes
      • Add a Sensor to the Robot
      • Make a simple gripper
      • Attach gripper to robot.
    • Build a World
      • Build a world
      • Modifying a world
      • Skip these for now: “Digital Elevation Models”, “Population of models”, and “Building Editor” for now — you can return to them at a later date when and if you need them.  Digital elevation models can be particularly useful in providing realistic simulated terrain for a simulated robot to explore.
    • Friction: How to set friction properties. Be sure to experiment with the more exhaustive friction example linked at the very end of this tutorial. This is the example that I showed in class with sliding blocks. Modify the gravity, friction, and initial position/orientation of the objects to observe different dynamics.
  • Connect to ROS: ROS integration
    • Note: In these ROS Gazebo tutorials you will use the  ROS Gazebo package (“rosrun gazebo_ros gazebo”), not the command-line “gazebo”.
    • Ros Overview
    • Skip this: Which combination of ROS/Gazebo versions to use. You can skip this, as you will use the default version 7.0.0 that comes with ROS kinetic (see the Gazebo ROS installation notes above).
    • Installing gazebo_ros_pkgs
      • Note that in this tutorial, in addition to apt-get installing some gazebo binary packages, you will also clone the kinetic-devel branch of https://github.com/ros-simulation/gazebo_ros_pkgs.git into your ROS workspace with the command “git clone https://github.com/ros-simulation/gazebo_ros_pkgs.git -b kinetic-devel”.
    • Using roslaunch: Using roslaunch to start Gazebo, world files and URDF models
    • URDF in Gazebo: Using a URDF in Gazebo
      • Note that you will clone https://github.com/ros-simulation/gazebo_ros_demos.git into your ROS workspace (~/catkin_ws/src) in this tutorial.

Assignment #5  Due 4:30PM Tuesday March 5, 2019

Gazebo screen shot from Assignment #5 showing the EduMIP robot model derived from a XACRO file. Click thumbnail for higher resolution image.

  • Create a new Gazebo ROS package named edumip_my_robot_gazebo_ros with dependencies to at least the following packages: roscpp tf std_msgs sensor_msgs geometry_msgs edumip_msgs gazebo_msgs gazebo_ros.
    • Do NOT use the directory structure specified in the Creating your own Gazebo ROS Package tutorial and exemplified in the URDF in Gazebo RRBot package that you downloaded and used in this tutorial.
  • Your project should have at least the following sub-directories:
    • edumip_my_robot_gazebo_ros/urdf for your xacro file
    • edumip_my_robot_gazebo_ros/launch for your launch file
    • edumip_my_robot_gazebo_ros/worlds for your world file
  • Create a XACRO file urdf/edumip_my_robot.xacro for your EduMip – you can begin with the XACRO file you created for HW #4. If you did not create a XACRO file for HW #4, then do so now.
    • You can use a xacro file in your launch file to set the robot_description parameter with the launch file command <param name="robot_description" command="$(find xacro)/xacro --inorder $(find edumip_my_robot_gazebo_ros)/urdf/edumip_my_robot.xacro" />. NOTE: the “inorder” argument is preceded by two dashes!
    • Add additional statements to your XACRO file so that it can be automatically translated to SDF format for use by Gazebo.
      • Your robot should have one link named “edumip_body” with at least the following attributes:
        • <visual> with
          • <origin>
          • <geometry>
          • <material>
        • <collision>
          • <origin>
          • <geometry>
        • <inertial> with
          • <origin>
          • <mass>
          • <inertia>
      • Your robot should have two wheel links named “wheelL” and “wheelR” with at least these attributes:
        • <visual> with
          • <origin>
          • <geometry>
          • <material>
        • <collision>
          • <origin>
          • <geometry>
        • <inertial> with
          • <origin>
          • <mass>
          • <inertia>
      • Your robot should have two joints named “jointL” and “jointR”, each with
        • <parent>
        • <child>
        • <origin>
        • <axis>
    • When you simulate your edumip in gazebo, it will not balance.  It will fall over because gravity is acting on it and at present it has no control program like “edumip_ros_balance” to make it balance.  Next week we will use a gazebo plugin to make the simulated robot balance!
    • Your XACRO file should specify physically realistic numbers for the robot link masses and moments of inertia. My robot body has a mass of about 0.180 kg, and my wheels each have a mass of about 0.030 kg.

Here is how I specified the mass and moments of inertia for my edumip body:

<link name="edumip_body">  
  <visual>  
    <origin xyz="0 0 ${0.5*body_height}" rpy="0 -0.20 0"/>  
    <geometry>  
      <box size="${body_length} ${body_width} ${body_height}"/>  
    </geometry>  
    <material name="Blue">  
      <color rgba="0 0.0 1.0 0.5"/>  
    </material>  
  </visual> 
  <collision>  
    <origin xyz="0 0 ${0.5*body_height}" rpy="0 -0.20 0"/>   
    <geometry>  
      <box size="${body_length} ${body_width} ${body_height}"/>   
    </geometry>  
  </collision>  
  <inertial>  
    <origin xyz="0 0 ${0.5*body_height}" rpy="0 0 0"/>  
    <mass value="0.180"/>  
    <inertia ixx="6.0e-4" ixy="0" ixz="0" iyy="6.0e-4" iyz="0" izz="6.0e-4"/>  
  </inertial>  
</link>  

Here is how I specified the mass and moments of inertia for my robot’s right wheel:

<link name="wheelR">
  <visual>
    <origin xyz="0 0 0" rpy="1.57079 0 0"/>
    <geometry>
      <cylinder length="${wheel_width}" radius="${wheel_radius}"/>
    </geometry>
    <material name="Green">
      <color rgba="0.0 1.0 0.0 0.5"/>
    </material>
  </visual>
  <collision>
    <origin xyz="0 0 0" rpy="1.57079 0 0"/>
    <geometry>
      <cylinder length="${wheel_width}" radius="${wheel_radius}"/>
    </geometry>
  </collision>
  <inertial>
    <origin xyz="0 0 0" rpy="0 0 0"/>
    <mass value="0.030"/>
    <inertia ixx="1.75e-5" ixy="0" ixz="0" iyy="1.75e-5" iyz="0" izz="1.75e-5"/>
  </inertial>
</link>
  • Your robot should have nice happy colors.
  • Colors are specified differently for RVIZ (urdf) and Gazebo (sdf). Coulomb friction is specified as <mu1> and <mu2> dimensionless parameters. You can add these lines to your XACRO file which will generate SDF-compatible color specifications when the XACRO file is translated to SDF:
 <gazebo reference="edumip_body">
   <material>Gazebo/Blue</material>
   <mu1>0.2</mu1>
   <mu2>0.2</mu2>    
 </gazebo>
 <gazebo reference="wheelR">
   <material>Gazebo/Green</material>
   <mu1>0.2</mu1>
   <mu2>0.2</mu2>    
 </gazebo>
 <gazebo reference="wheelL">
   <material>Gazebo/Red</material>
   <mu1>0.2</mu1>
   <mu2>0.2</mu2>    
 </gazebo>
  • Create a gazebo world file named worlds/edumip_my_robot.world which provides at least a horizontal plane, gravity, and some objects or buildings. Your world file can contain your robot model, or you can spawn your robot model in your launch file. In Gazebo you should be able to cause the robot to move on the plane by applying torques to the wheel (or leg) joints.
  • Create a launch file named launch/edumip_my_robot_gazebo.launch which launches your robot world with your robot in it — recall that you did this in the assigned tutorial section on roslaunch with gazebo. Your launch file should do the following:
    • Set some parameters – see how the “$(find…” command is used in the tutorial launch files.
      • Set the parameter robot_description to the contents of your xacro file with the command <param name="robot_description" command="$(find xacro)/xacro --inorder $(find edumip_my_robot_gazebo_ros)/urdf/edumip_my_robot.xacro" />. NOTE: the “inorder” argument is preceded by two dashes!
      • Set the parameter world_name to the name of the world file using similar syntax.
    • Spawn a model of your EduMip with the roslaunch file using the “spawn_model” node in the “gazebo_ros” package with arguments args=”-param robot_description -urdf -model edumip_my_robot”. Read the documentation of this package for details.
  • Run your launch file with roslaunch. It should launch gazebo with your robot model at the origin.
    • Make your robot move around in the Gazebo world:
      • Select your robot with the mouse.
      • On the right hand side of the gazebo window, with your mouse swipe open the “Joint” pane.
      • Apply some torque to the joints to make your robot move.
    • Introspect on the topics with rostopic list and rostopic echo.
  • Hand in your code as a project named “edumip_my_robot_gazebo_ros” on https://git-teach.lcsr.jhu.edu and share it with the instructors.
  • Add the instructor (lwhitco1), and the TAs (gbaraba1, clin110, and hshi17) as members of your project, all with DEVELOPER access.
  • Keep all your git-teach assignment repositories private; do not make them public.
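A minimal edumip_my_robot.world along the lines requested above might look like the following sketch. The ground_plane and sun models come from Gazebo’s standard model database; the box obstacle is purely illustrative:

```xml
<?xml version="1.0"?>
<sdf version="1.6">
  <world name="edumip_world">
    <!-- Horizontal plane and light from the standard Gazebo model database -->
    <include>
      <uri>model://ground_plane</uri>
    </include>
    <include>
      <uri>model://sun</uri>
    </include>

    <!-- An illustrative static obstacle for the robot to drive around -->
    <model name="box_obstacle">
      <static>true</static>
      <pose>1 0 0.25 0 0 0</pose>
      <link name="link">
        <collision name="collision">
          <geometry><box><size>0.5 0.5 0.5</size></box></geometry>
        </collision>
        <visual name="visual">
          <geometry><box><size>0.5 0.5 0.5</size></box></geometry>
        </visual>
      </link>
    </model>
  </world>
</sdf>
```

Gravity is enabled by default in an SDF world; you can set it explicitly with a <gravity> element under <world> if you wish.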

Module 6: Week of March 5, 2019: Gazebo physical simulation, ROS Integration

Preliminary Project Ideas Due 4:30PM Thursday March 7, 2019

Write up one or more project ideas here: https://docs.google.com/spreadsheets/d/1-3GTmRvwsIyfB7-bqRkJp-Sz1oA1zOZPS9CTMCErq9A/edit?usp=sharing

Topics

  • Simulating robots, their environments, and robot-environment interaction with Gazebo
  • Gazebo ROS integration.
  • Gazebo Intermediate Concepts

Reading

Assignments to do for This Module

Tutorials

Gazebo screen shot showing the RRBot camera and laser-scanner sensor data. Click for higher resolution image.

  • Connect to ROS: ROS Integration
    • Gazebo plugins in ROS
      • Note: To visualize the laser in gazebo as shown in the Gazebo figure above and in the tutorial  here you will also need to set the visualize property of the hokuyo laser plugin to true (i.e. “<visualize>true</visualize>”).
      • Note: As mentioned in class, if you are running a PC or VirtualBox without a GPU graphics adapter then the Gazebo laser scanner plugin may not simulate the laser scanner properly, or may crash when you run the launch file with an error message beginning something like “[gazebo-1] process has died [pid 3207, exit code 139, cmd /opt/ros/indigo/lib/gazebo_ros/gzserver…”, so you will need to modify the rrbot.gazebo sdf file to use the non-GPU hokuyo laser plugin as follows:
        • Replace <sensor type=”gpu_ray” name=”head_hokuyo_sensor”> with <sensor type=”ray” name=”head_hokuyo_sensor”>
        • Replace <plugin name=”gazebo_ros_head_hokuyo_controller” filename=”libgazebo_ros_gpu_laser.so”> with <plugin name=”gazebo_ros_head_hokuyo_controller” filename=”libgazebo_ros_laser.so”>
    • ROS control: Before you do this tutorial, be sure that you have the kinetic ros-control packages installed with the command “sudo apt-get install ros-kinetic-ros-control ros-kinetic-ros-controllers”. If these packages are not installed on your system you will get error messages like [ERROR] [WallTime: 1395611537.772305] [8.025000] Failed to load joint_state_controller and [ERROR] [WallTime: 1395611538.776561] [9.025000] Failed to load joint1_position_controller.
    • ROS communication with Gazebo
    • ROS Plugins
      • Note: If you previously downloaded the gazebo_ros_demos tutorial package from this tutorial then you do not need to create a new custom package named “gazebo_tutorials” for this tutorial, since the “gazebo_tutorials” package is already present in the directory ~/catkin_ws/src/gazebo_ros_demos/custom_plugin_tutorial.
    • Advanced ROS integration

Assignment #6  Due 4:30PM Tuesday March 12, 2019

Expand upon your package named edumip_my_robot_gazebo_ros that you created for the previous module’s assignment.

  • In your robot’s xacro file urdf/edumip_my_robot.xacro add a plugin from the package edumip_balance_ros_gazebo_plugin that will actively balance your robot.
    • Clone and build the plugin from this public repository into your ROS workspace src directory: https://git.lcsr.jhu.edu/lwhitco1/edumip_balance_ros_gazebo_plugin.git
    • There are name clashes between the package edumip_balance_ros_gazebo_plugin and the package gazebo_ros_pkgs that will cause catkin_make to fail, so:
      1. Remove the package gazebo_ros_pkgs from your ROS workspace by deleting it entirely or moving it out of the src directory.
      2. Delete the build and devel directory trees from your ROS workspace with “rm -rf build devel”.
      3. Recompile everything with “catkin_make -j 1”.
    • If you see catkin_make error messages like “CMake Error at /opt/ros/kinetic/share/catkin/cmake/custom_install.cmake:13 (_install): install TARGETS given target “gazebo_ros_utils” which does not exist in this directory.” then you likely neglected to delete gazebo_ros_pkgs from your ROS workspace (see previous point).
  • Compile the plugin with catkin_make
    • Rebuild your package cache with “rospack profile” just to be on the safe side.
    • This Gazebo plugin does the following:
      • Balances the robot with active feedback control
      • Subscribes to twist messages on the topic (e.g. /edumip/cmd )
      • Publishes an EduMipState messages on the topic (e.g. /edumip/state)
      • You need to provide information in your xacro file specifying the following:
        • ros debug level
        • controller update rate
        • robot base link name
        • wheel joint names
        • wheel separation and diameter
        • twist command topic to subscribe to
        • EduMipState topic to publish
    • Here is the code I used in my xacro file for the EduMip Gazebo plugin:
<gazebo>
 <plugin name="edumip_balance_ros_gazebo_plugin" filename="libedumip_balance_ros_gazebo_plugin.so">
 <rosDebugLevel>3</rosDebugLevel>
 <updateRate>100</updateRate>
 <robotBaseFrame>edumip_body</robotBaseFrame>
 <leftJoint>jointL</leftJoint>
 <rightJoint>jointR</rightJoint>
 <wheelSeparation>${wheel_separation}</wheelSeparation>
 <wheelDiameter>${wheel_radius*2.0}</wheelDiameter>
 <commandTopic>/edumip/cmd</commandTopic>
 <stateTopic>/edumip/state</stateTopic> 
 </plugin>
</gazebo>
  • Once you have added the plugin xml code to your urdf/edumip_my_robot.xacro file,  test it by running your launch file edumip_my_robot_gazebo.launch to launch gazebo with your world file from last week and spawn a robot model.
    • Your model should balance.
    • You should see a topic /edumip/state, and the plugin should be publishing EduMipState messages on this topic at 10 Hz.
    • If your robot does not balance, then check the mass and inertial parameters for the edumip body and wheels that I suggested you use in the previous assignment (see HW #5 above).
  • After you have successfully added and tested the edumip_balance_ros_gazebo_plugin: In your robot’s xacro file urdf/edumip_my_robot.xacro add a camera link and a camera plugin:
    1. In your robot’s xacro file add a new link on top of the robot named camera_link.
    2. In your robot’s xacro file add a new fixed joint named camera_joint with parent link edumip_body and child link camera_link.
    3. In your robot’s xacro file add a camera plugin
      1. The camera should be located on the camera_link
      2. The name of the camera should be camera1
      3. The camera plugin should publish images on the topic /edumip/camera1
    • Here is the code that I used in my xacro file for the camera plugin:
<gazebo reference="camera_link">  
 <sensor type="camera" name="camera1">    
  <update_rate>30.0</update_rate>    
  <camera name="head">      
   <horizontal_fov>1.3962634</horizontal_fov>      
   <image>        
    <width>800</width>        
    <height>800</height>        
    <format>R8G8B8</format>      
   </image>      
   <clip>        
    <near>0.02</near>        
    <far>300</far>      
   </clip>      
   <noise>        
    <type>gaussian</type>        
    <mean>0.0</mean>        
    <stddev>0.007</stddev>      
   </noise>    
  </camera>    
  <plugin name="camera_controller" filename="libgazebo_ros_camera.so">      
   <robotNamespace>edumip</robotNamespace>      
   <alwaysOn>true</alwaysOn>      
   <updateRate>10.0</updateRate>      
   <cameraName>camera1</cameraName>      
   <imageTopicName>image_raw</imageTopicName>      
   <cameraInfoTopicName>camera_info</cameraInfoTopicName>      
   <frameName>camera_link</frameName>      
   <hackBaseline>0.07</hackBaseline>      
   <distortionK1>0.0</distortionK1>      
   <distortionK2>0.0</distortionK2>      
   <distortionK3>0.0</distortionK3>      
   <distortionT1>0.0</distortionT1>      
   <distortionT2>0.0</distortionT2>    
  </plugin>  
 </sensor> 
</gazebo> 
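The plugin above attaches to a link named camera_link, which must also be defined in the xacro file. A minimal sketch of that link and its fixed joint follows; the box geometry, color, and placement on top of the body (reusing the body_height property defined earlier) are illustrative guesses, not the instructor’s actual code:

```xml
<!-- Illustrative camera link; choose your own dimensions and material -->
<link name="camera_link">
  <visual>
    <origin xyz="0 0 0" rpy="0 0 0"/>
    <geometry>
      <box size="0.01 0.01 0.01"/>
    </geometry>
    <material name="Red">
      <color rgba="1.0 0.0 0.0 0.5"/>
    </material>
  </visual>
</link>

<!-- Fixed joint attaching the camera to the top of the body -->
<joint name="camera_joint" type="fixed">
  <parent link="edumip_body"/>
  <child link="camera_link"/>
  <origin xyz="0 0 ${body_height}" rpy="0 0 0"/>
</joint>
```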

Gazebo screen shot for HW #6 showing the EduMIP robot at the gas station. Click for higher resolution image.

RVIZ screen shot for HW #6 showing the EduMIP visualized, and showing the robot camera image generated by the camera plugin, and showing the odometry data published by the differential drive controller plugin. Click for higher resolution image.

  • Use your launch file from the previous module (launch/edumip_my_robot_gazebo.launch) to launch gazebo – you should be able to see the camera link in the gazebo window.  Check to verify that this launch file:
    • runs gazebo with your world file (from your package’s worlds directory) specified as an argument.
    • loads the robot_description parameter from your xacro file located in your package’s urdf directory.
    • spawns your robot model with the “spawn_model” executable from the package “gazebo_ros”.
  • Use rqt_image_view and select the topic /edumip/camera1/image_raw to see the images generated by the camera plugin.
  • After you have successfully added the camera plugin to your xacro file and successfully tested it: Add two C++ nodes that you previously developed to this ROS package to make it stand-alone:
    • Copy the source files edumip_my_robot_state_publisher.cpp and joy_twist_node.cpp from your previous assignments into this package’s src directory.
    • Edit your CMakeLists.txt to compile these executables to new and unique names, for example:
add_executable(edumip_my_robot_state_publisher_hw6 src/edumip_my_robot_state_publisher.cpp)
target_link_libraries(edumip_my_robot_state_publisher_hw6 ${catkin_LIBRARIES})
add_executable(joy_twist_node_hw6 src/joy_twist_node.cpp)
target_link_libraries(joy_twist_node_hw6 ${catkin_LIBRARIES})
  • Create a RVIZ initialization file in the rviz directory of your package named rviz/edumip_gazebo_rviz_hw6.rviz. This rviz initialization file should include visualization of the following data:
    1. RobotModel of parameter robot_description
    2. TF of topic /tf
    3. Image display of topic /edumip/camera1/image_raw
  • Create a new launch file named launch/edumip_my_robot_rviz.launch that does the following:
    1. Loads the robot_description parameter from your xacro file urdf/edumip_my_robot.xacro.
    2. Launches your edumip_my_robot_state_publisher_hw6 node like you did in HW #4.
    3. Launches a robot_state_publisher node from the robot_state_publisher package (like you did in HW #4) that subscribes to the topic /joint_states and publishes on /tf. This is a system-defined node; you do not need to write it.
    4. Launches RVIZ with the initialization file rviz/edumip_gazebo_rviz_hw6.rviz that you created for this assignment.
  • Create a new launch file named  launch/edumip_my_robot_joy.launch that does the following:
    • runs a joy node
    • runs the joy_twist_node_hw6 node
  • Run your three launch files –
    • launch/edumip_my_robot_joy.launch launches a joy node and your joy_twist_node C++ node (from your previous assignment HW #3).
    • launch/edumip_my_robot_gazebo.launch launches Gazebo with your world loaded and spawns your EduMip model.
    • launch/edumip_my_robot_rviz.launch launches RVIZ, your robot joint state publisher C++ node (from previous assignment HW#4) , and a robot_state_publisher.
    • You should be able to drive your robot around in Gazebo with your joystick, and see the robot visualized simultaneously in both Gazebo and RVIZ. In RVIZ you should be able to see the robot, its odometry trail, and the camera image.
    • Kill everything when you are done.
  • Once you have successfully implemented and tested the above: Add a simulated laser scanner to your robot:
    • Add a link to the top of your robot xacro named “hokuyo_link” and connect it to the edumip_body link with a fixed joint.
    • Add a laser scanner Gazebo ROS plugin – follow the example in the rrbot tutorial.
    • Run your gazebo, joy, and rviz launch files and see the simulated laser scanner beam in the Gazebo window (remember to set “visualize” to “true” as noted above).
    • Add a laser scan display marker in RVIZ to visualize the laser scan data. Save your new rviz/edumip_gazebo_rviz_hw6.rviz file.
    • Drive the robot around in gazebo and see the robot, camera image, and laser scan visualized in rviz.
    • Kill everything when you are done.
  • Modify your robot joint state publisher C++ node (from previous assignment HW #4) so that, in addition to publishing the robot joint states and the edumip_body tf transform (you did this in HW #4), it also publishes an odometry message for the robot edumip_body frame on the topic /edumip/odometry. You only need to populate the frame IDs and the pose data in the Odometry message; you do not need to populate the covariances or velocity data.
    • See the following pages:
    • Run your gazebo, joy, and rviz launch files.
    • Add an odometry marker to your rviz display. Save your new rviz/edumip_gazebo_rviz_hw6.rviz file.
    • Test it in simulation with Gazebo: Drive the robot around in gazebo and see the robot, odometry markers, camera image, and laser scan visualized in rviz.
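A minimal sketch, in plain C++ and not tied to the assignment's required node structure, of the planar dead-reckoning computation one might use to fill the Pose fields of the Odometry message. The struct and function names are illustrative, and any wheel radius and wheelbase values you pass in should come from your own robot model:

```cpp
#include <cassert>
#include <cmath>

// Planar pose: x, y position (m) and heading theta (rad).
struct Pose2D { double x, y, theta; };

// Advance the pose given incremental left/right wheel rotations (rad),
// using the midpoint-heading approximation for a differential drive.
Pose2D integrateOdometry(Pose2D p, double dphi_left, double dphi_right,
                         double wheel_radius, double wheel_base) {
  const double ds     = wheel_radius * (dphi_left + dphi_right) / 2.0;
  const double dtheta = wheel_radius * (dphi_right - dphi_left) / wheel_base;
  p.x     += ds * std::cos(p.theta + dtheta / 2.0);
  p.y     += ds * std::sin(p.theta + dtheta / 2.0);
  p.theta += dtheta;
  return p;
}
```

In the Odometry message itself, such a pose would populate pose.pose.position, the heading theta would be converted to a quaternion (z = sin(theta/2), w = cos(theta/2)) for pose.pose.orientation, and header.frame_id and child_frame_id carry the frame IDs.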

rqt_graph plot of the topic transport for the simulated EduMip Robot with simulated camera and laser scanner topics, and a C++ node publishing edumip_body tf frames, joint_states, and Odometry messages.

rqt_graph plot of the real EduMip and a C++ node publishing edumip_body tf frames, joint_states, and Odometry messages.

  • Test it in the real-world with your EduMip: Now run your joy and rviz launch files, but do not launch gazebo. Instead of gazebo, launch your actual edumip balance program on your edumip.  Drive it around and see that it works just like the simulation, minus the simulated sensors.
  • Check your project directory structure and files:  Your project should have the following directories and files:
    • edumip_my_robot_gazebo_ros/
      • CMakeLists.txt
      • package.xml
      • launch/
        • edumip_my_robot_gazebo.launch
        • edumip_my_robot_joy.launch
        • edumip_my_robot_rviz.launch
      • rviz/
        • edumip_gazebo_rviz_hw6.rviz
      • src/
        • edumip_my_robot_state_publisher.cpp (revised to publish Odometry messages on the /edumip/odometry topic)
        • joy_twist_node.cpp
      • urdf/
        • edumip_my_robot.xacro
      • worlds/
        • edumip_my_robot.world
  • Push It: Push your finished package edumip_my_robot_gazebo_ros to https://git-teach.lcsr.jhu.edu and share it with the course instructors. This is the same git repo that you used for the previous assignment – just update it with this module’s work (add, commit, push).
  • Add the instructor (lwhitco1), and the TAs (gbaraba1, clin110, and hshi17) as members of your project, all with DEVELOPER access.
  • DEMO YOUR ROBOT SIMULATION IN GAZEBO with RVIZ DISPLAY of your EduMIP robot model and the Odometry markers displayed TO YOUR INSTRUCTORS – you can demo during office hours or after class on Tuesday

Screenshot of HW #6 RVIZ visualization of simulated EduMip showing the EduMIP model, Odometry markers, image stream from the simulated camera, and red markers showing the data from the simulated laser scanner.

Module 7:   Week of March 13, 2019: Turtlebot-2 Simulation in Gazebo, SLAM Navigation, Adaptive Monte-Carlo Localization

This assignment demonstrates a larger robot system that uses sophisticated existing ROS packages to simulate a mobile robot, use Simultaneous Localization and Mapping (SLAM) algorithms to construct a 2D map, and then use that map to do 2D adaptive Monte-Carlo navigation of the robot.

Topics

Reading

Assignments to do for This Module

Screenshot of Gazebo Turtlebot Simulation with command “roslaunch turtlebot_gazebo turtlebot_world.launch”
rqt_graph of the Gazebo simulation with keyop command input

Tutorials

  • NOTE that the turtlebot tutorials are not all updated to ROS Kinetic, but the tutorials for ROS Jade should work just fine when kinetic-specific tutorials are unavailable.
  • Read 2.1 Turtlebot Developer Habits and 2.2 Interacting with your Turtlebot at this link: http://wiki.ros.org/Robots/TurtleBot
  • Install the turtlebot packages and the turtlebot simulator Gazebo and Rviz packages.
    • Update your Linux system and ROS packages and Install the turtlebot ROS packages as follows:
      1. Update your PC’s Linux system and ROS packages with the commands first sudo apt-get update and then sudo apt-get dist-upgrade. THIS IS IMPORTANT, DO NOT SKIP THIS STEP! If you have not updated recently, the command may pull down several hundred MB from the Ubuntu and ROS repositories.
      2. Install the turtlebot ROS packages as described in the Turtlebot Installation Tutorial – item 3.1 Turtlebot Installation: Installing software onto the turtlebot at this link. Do the “Ubuntu Package Install” with this command for the KINETIC version of these packages:
sudo apt-get install ros-kinetic-turtlebot ros-kinetic-turtlebot-apps ros-kinetic-turtlebot-interactions ros-kinetic-turtlebot-simulator ros-kinetic-kobuki-ftdi ros-kinetic-rocon-multimaster ros-kinetic-ar-track-alvar-msgs
  • NOTE 1: After you install the above packages, kill your login shells and start clean shells so that the additional required gazebo environment variables are set.
  • NOTE 2: You DO NOT need to do a source installation
  • NOTE 3: You do not need to run the command rosrun kobuki_ftdi create_udev_rules because your notebook computer will not be physically connected to the kobuki base with a USB cable. The netbook computers that are physically on the turtlebot will be connected to the kobuki base with a USB cable. Your computer will communicate with the turtlebot’s on-board netbook via WiFi.
  • Read about the Turtlebot simulator package for ROS kinetic.
  • Do the “(Section) 6: Simulation” Tutorials for Gazebo (do not do the “Stage” simulator tutorials)
    • Turtlebot Gazebo Bringup Guide: See the simulated turtlebot in Gazebo. When you run the command roslaunch turtlebot_gazebo turtlebot_world.launch, if you get the error “Invalid <arg> tag: environment variable ‘TURTLEBOT_GAZEBO_WORLD_FILE’ is not set.”, refer to the NOTE 1 item on installation above.
    • Explore the Gazebo world: Cruise around in the Gazebo world and use RViz to “see” what’s in it.
      • NOTE 4: Do not attempt to install ROS Indigo or Jade packages; we are using ROS Kinetic, so install only ROS Kinetic packages.
      • NOTE 4.1: You may find it convenient to clone these two repositories of turtlebot files into your catkin_ws/src so that you can edit the demo files: https://github.com/turtlebot/turtlebot_interactions.git  and https://github.com/turtlebot/turtlebot_simulator.git  After you clone the repository, be sure to run “catkin_make” and then “rospack profile”, and then kill all of your open shells/terminals, and open clean terminals.
      • NOTE 5: Use this command to get keyboard control to drive the robot “roslaunch kobuki_keyop keyop.launch”, it is easier to use (supports arrow keys) than the alternative command “roslaunch turtlebot_teleop keyboard_teleop.launch”
      • NOTE 5.1: The tutorial’s RVIZ initialization file has some errors in the topic names. For example: when displaying the “DepthCloud” visualization in RVIZ, set its “Color Image Topic” to /camera/rgb/image_raw – this will overlay the Kinect camera image on the Kinect’s depth cloud.
    • Make a map and navigate with it: Use the navigation stack to create a map of the Gazebo world and start navigation based on it.
      • NOTE 6: The last time I checked, there was a bug in the ROS Kinetic package turtlebot_gazebo in the package’s launch file gmapping_demo.launch — its full path is /opt/ros/kinetic/share/turtlebot_gazebo/launch/gmapping_demo.launch (what is the bug?). If the Gazebo turtlebot simulator crashes continually on launch, you can work around this bug by pulling down a copy of this package source into your own ROS catkin workspace with the commands: cd ~/catkin_ws/src (varies depending on the name and location of your catkin workspace).
        • git clone https://github.com/turtlebot/turtlebot_simulator
        • cd ..
        • catkin_make
        • source devel/setup.bash
        • rospack profile
        • Check that you have overlaid this package with your local copy. Now when you give the command roscd turtlebot_gazebo your default directory should be set to ~/catkin_ws/src/turtlebot_simulator/turtlebot_gazebo.
      • To command the turtlebot motion from the keyboard, this node allows you to use the arrow-keys (up, down, left, right): roslaunch kobuki_keyop safe_keyop.launch
      • Save your map to files with your name as part of the filename – for example I might use the command rosrun map_server map_saver -f map_by_louis to generate the map occupancy grid data file map_by_louis.pgm and its associated meta-data file map_by_louis.yaml.
        • Examine the contents of the .pgm map file with an image viewer such as gimp.
        • Examine the contents of the .yaml metadata file with a text editor or the more command.
      • Note 7: To run the turtlebot AMCL navigation demo, you need to specify the full path name of the .yaml map file, e.g. roslaunch turtlebot_gazebo amcl_demo.launch map_file:=/home/llw/my_first_map.yaml
      • NOTE 8: Under ROS Kinetic the turtlebot AMCL navigation demo seems to have some wonky parameters that cause the robot to rotate (twirl) considerably when given a 2-D navigation goal. My workaround was to run rqt_reconfigure (see below for command) and tune some parameters:
        • I set the node move_base parameter “recovery_behavior_enabled” to false (unchecked).
        • I set the node move_base parameter “clearing_rotation_allowed” to false (unchecked).
        • I set the node move_base navigation parameter max_rot_vel to 1.0 (was 5).
        • Note that setting these parameters with rqt_reconfigure is ephemeral – the parameters return to the default values specified in the AMCL package when the package is re-launched.
      • Be sure to
        • List and examine the nodes, topics, and services
        • Echo some of the published topics
        • Run rqt_graph to visualize the topic data paths
        • Run rqt_reconfigure with the command rosrun rqt_reconfigure rqt_reconfigure to see some of the configuration parameters of packages that you are running.
Turtlebot Gazebo Simulation (left) and Gmapping SLAM mapping visualization in RVIZ (right).
rqt_graph of Turtlebot Gazebo Simulation and Gmapping SLAM mapping visualization.
Turtlebot Gazebo Simulation (left) and Adaptive Monte-Carlo Localization (AMCL) with Move-Base 2-D motion planning visualization in RVIZ (right).
Turtlebot Gazebo Simulation and Adaptive Monte-Carlo Localization (AMCL) with Move-Base 2-D motion planning.
rqt_reconfigure showing some AMCL parameters with dynamic_reconfigure.
rqt_reconfigure showing some move_base and DWAPlannerROS parameters with dynamic_reconfigure.

Assignment #7   Due 5PM Friday March 29, 2019

Make a SLAM map and navigate with it with a simulated robot and world in Gazebo! Do this assigned tutorial completely and carefully: Make a SLAM map and navigate with it

  • Be sure to
    • List and examine the nodes, topics, and services
    • Echo some of the published topics
    • Run rqt_graph to visualize the topic data paths
    • Save your map to files with your name as part of the filename – for example I might use the command rosrun map_server map_saver -f map_by_louis to generate the map occupancy grid data file map_by_louis.pgm and its associated meta-data file map_by_louis.yaml.
      • Examine the contents of the .pgm map file with an image viewer such as gimp.
      • Examine the contents of the .yaml metadata file with a text editor or the more command.
  • Note: To run the AMCL navigation demo, you need to specify the full path name of the .yaml map file, e.g. roslaunch turtlebot_gazebo amcl_demo.launch map_file:=/home/llw/my_first_map.yaml
  • Email the two map files (.pgm and .yaml) that you generated to the instructor and the TAs with the subject line “530.707 RSP HW#7 by your_firstname your_lastname”.
  • Demonstrate to the TAs or instructor using your map to navigate the playground world with amcl particle filter navigation (you did this in the last part of the SLAM navigation tutorial).
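For reference, the .yaml metadata file written by map_saver is short; a representative example for a map saved with -f map_by_louis (the numeric values shown are illustrative, not the values your map will contain):

```yaml
image: map_by_louis.pgm     # the occupancy grid image file
resolution: 0.050000        # meters per pixel
origin: [-10.000000, -10.000000, 0.000000]  # x, y, yaw of the lower-left pixel
negate: 0
occupied_thresh: 0.65       # cells above this probability are occupied
free_thresh: 0.196          # cells below this probability are free
```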

Module 8: Week of March 18, 2019: Independent Project

Week 1 of 6: Project Proposal: Week of March 18, 2019

Topics

  • Formulate your class project and
  • Form your team
  • Write your project proposal.
  • Get your project proposal approved by Prof. Whitcomb.

Reading

Project reports, Articles, and Photos of Previous Independent Class Projects

Assignments to do this week

Plan your independent class project and form your team

Here are the project rules:

  • Form a project team of no more than 3-4 members. You can work independently if you prefer.
  • Formulate a project that
    • Performs at least two specific tasks
    • Uses at least two sensors
    • At least one of the tasks must be performed autonomously or semi-autonomously (i.e. not just pure teleoperation)
    • Your project MUST include several original ROS nodes that you programmed from scratch.
  • Review your preliminary project formulation with the Instructor and TAs during regular office hours and after classes this week. Ask them questions.
  • Determine the availability of all required hardware.
  • Conduct tests of any critical required existing ROS packages.
  • IMPORTANT NOTE: These rules can be waived with the permission of the instructor if you have a good rationale for doing something different.

Assignment #8: – Class Project Proposal Due 5PM Wednesday March 27, 2019

  • Project Proposal – create A SINGLE PDF description of your proposed project and submit it to the Instructor and TAs.
  • ONE of your team members should create a GIT repository entitled 530_707_Independent_Project and share it with the team members  with developer access.
    • Create a text file containing the name of your project and the names of the team members
    • Create a directory entitled “reports”
    • Commit your project proposal to the repository with the title 01_Project_Proposal.pdf
  • Email the Instructor and TAs to notify them, with “530.707 Proposal for Project YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.
  • Submit a single project proposal document for each project team.
  • Your proposal should include
    • Project Title – give your project a NAME
    • Project Author(s) (i.e. team members),
    • Date
    • Description of proposed project. What does it do? Describe! How does a user interact with it?
    • Software
      • List and description of the NEW ros packages you will program from scratch to accomplish your project.
      • List of the major existing ROS packages you will utilize
      • List what testing you have done on these existing ROS packages.
    • Hardware and infrastructure
      • List and description of any new or additional robots, hardware, and infrastructure needed to implement your project.
      • List of existing robots, hardware, and infrastructure needed to implement your project.
      • Describe availability of all required robots, hardware, and infrastructure.
      • List what testing you have done on existing available hardware.
      • Include a detailed list of all new robots, hardware, or infrastructure required to be purchased or borrowed, including quantity, description, model number, source/vendor, unit cost, total cost, etc.
    • Project location of operation and testing. Describe in detail where you will operate and test your project. Be specific – you must provide building name, room numbers, and specific hallways (building and floor). Candidates include the Robotics Teaching Lab 170 Wyman Park Building, and other rooms and hallways in the Wyman Park Building and Hackerman Hall. Some projects will be operated outside on campus – you must specify the specific locations on campus. Be specific!
    • Project Advisors:
      • Required: Course Project Advisor: TAs and/or LLW. Please contact them in advance to ask if they will serve. You must schedule a standing meeting with them every week.
        • Name the advisor (you must have asked them first!).
        • What is the day and time of your standing weekly meeting?
      • Optional: Outside Advisor – if you have additional advisors for your project.
    • Safety Plan
      • Are there any safety risks?
      • If so, can they be managed and/or mitigated to an acceptable level of safety?
      • If so, what is the plan for risk management and/or mitigation?
    • References – list of any references cited in your text
    • Project Timeline: For each week, give the hardware and software development and testing goals for the week.
      1. Week 1 hardware and software development and testing goals.
      2. Week 2 hardware and software development and testing goals.
      3. Week 3 hardware and software development and testing goals.
      4. Week 4 hardware and software development and testing goals.
      5. Week 5 hardware and software development and testing goals. Your goal should be to have your project completed by the end of Week 5 – the last week of classes. This should include your project poster for the class poster session, and also your project report.
  • Your submitted proposal will be reviewed by the Instructor and TAs, and you will be notified of the result. It will either be “approved” or “not approved” – if “not approved” you will be asked to revise and resubmit the proposal.

Week 2 of 6: Project Implementation and Testing & Project Weekly Progress Report #1: Due 5PM Wednesday April 3, 2019

  • Project Weekly Progress Report – create A SINGLE PDF report of your progress.
  • Your Project Weekly Progress Report should include
    • Heading:
      • Project Title
      • Weekly Progress Report #1
      • Project Author(s) (i.e. team members),
      • Date
    • The body should contain at least the following sections:
      • 0. Weekly Meeting with Project Advisor (THIS IS NEW)
        • Date and location of meeting
        • Names of all persons present at the meeting.
        • Names of team members absent from the meeting.
        • Brief description of topics discussed.
      • 1. This Week’s Goals
        • State this past week’s goals quoted from the “Project Timeline” of your Project Proposal.
      • 2. This Week’s Progress (NOTE REVISED FORMAT)
        • For each team member report the following:
        • Teammember Full Name:
          • Week’s Goals Accomplished: State the goals you HAVE accomplished this week.  Describe each briefly.   When possible include a few representative figures with captions (photos, screenshots, or data plots) illustrating your system in  operation.
          • Week’s Goals Not Accomplished: State the goals you HAVE NOT accomplished this week.  Describe each briefly.  Discuss what the obstacles were.
      • 3. Changes in Project Scope/Goals
        • Discuss briefly any changes in the project scope that are different from those stated in the Project Proposal.  You must get changes in scope approved by Prof. Whitcomb!
      • 4. Lessons Learned (NOTE REVISED FORMAT)
        • For each team member list their name and report important lessons they learned this past week.
      • 5. Next Week’s Goals (NOTE REVISED FORMAT)
        • For each team member list their name and state their next week’s goals quoted from the “Project Timeline” of your Project Proposal.
  • ONE of your team members should have already created a GIT repository entitled 530_707_Independent_Project and shared it with the team members  with developer access.
  • In the directory entitled “reports” of your project repository, commit your project report  with the title 02_Project_Weekly_Report_2019_04_03.pdf
  • Email the Instructor and TAs to notify them, with “530.707 Project Weekly Report for YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.

Week 3 of 6: Project Implementation and Testing & Project Weekly Progress Report #2: Due 5PM Wednesday April 10, 2019

  • Follow the outline from the first weekly report, above, to prepare your weekly report.
  • In the directory entitled “reports” of your project repository, commit your project report  with the title 03_Project_Weekly_Report_2019_04_10.pdf
  • Email the Instructor and TAs to notify them, with “530.707 Project Weekly Report for YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.

Week 4 of 6: Project Implementation and Testing & Project Weekly Progress Report #3: Due 5PM Wednesday April 17, 2019

  • Follow the outline from the first weekly report, above, to prepare your weekly report.
  • In the directory entitled “reports” of your project repository, commit your project report  with the title 04_Project_Weekly_Report_2019_04_17.pdf
  • Email the Instructor and TAs to notify them, with “530.707 Project Weekly Report for YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.

Week 5 of 6 Project Implementation and Testing & Project Weekly Progress Report #4: Due 5PM Wednesday April 24, 2019

  • Follow the outline from the first weekly report, above, to prepare your weekly report.
  • In the directory entitled “reports” of your project repository, commit your project report  with the title 05_Project_Weekly_Report_2019_04_24.pdf
  • Email the Instructor and TAs to notify them, with “530.707 Project Weekly Report for YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.

Week 6 of 6: Project Implementation and Testing & Project Weekly Progress Report #5: Due 5PM Wednesday May 1, 2019

  • Follow the outline from the first weekly report, above, to prepare your weekly report.
  • In the directory entitled “reports” of your project repository, commit your project report  with the title 06_Project_Weekly_Report_2019_05_01.pdf
  • Email the Instructor and TAs to notify them, with “530.707 Project Weekly Report for YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.

Independent Project  Demonstration, Monday May 6, 2019 or Tuesday May 14, 2019

Each team will choose a 30 minute time slot to demonstrate their project to the Instructors, TAs, and other students, faculty, staff, and possibly some local press.

Here is the demo SCHEDULE: https://docs.google.com/spreadsheets/d/1Z2i-C25W6wR9N3rAR2NMUpXqnuQA-iIdFT0fPVsVzCI/edit?usp=sharing

Independent Project Poster Session, 2-4PM Tuesday May 14, 2019

  • Final presentations will be in Hackerman Hall 2-4PM Tuesday May 14, 2019.
  • This poster session is scheduled in the time slot for our class final exam, so it should not conflict with other final exams, per the JHU Registrar’s final exam schedule: https://studentaffairs.jhu.edu/registrar/wp-content/uploads/sites/23/2019/02/Spring-2019-Final-Exam-Schedule-Final.pdf
  • I posted a poster template for a 48” (wide) x 36” (tall) poster here https://jh.box.com/v/530-707-Dropbox in the file named Robot_Systems_Programming_Poster_Template_Rev_01.pptx
  • We will provide 48”x36” foam-core boards and poster-stands.
  • You need to print your own poster.
  • The printers are both loaded with 42” wide HP Heavyweight Coated Paper, so you can print your poster sideways to save paper.
  • Do not wait until the last day to print your poster!
  • If you can demo your project at the poster session, please do so. If not, showing a video of your project in action is desirable.
  • Please commit the .PPT and PDF of your poster to the “reports” directory of your project git repository, with file names like these: 06_Project_Poster_Your_Project_Team_Name.pdf and 06_Project_Poster_Your_Project_Team_Name.ppt – for example 06_Project_Poster_UR5_Plays_Jenga.pdf and 06_Project_Poster_UR5_Plays_Jenga.ppt

Project Videos for upload to the Course YouTube Channel!

PROJECT VIDEOS: Not required, but greatly appreciated. If you made a video and would like me to load your video to the class youtube channel (https://www.youtube.com/channel/UC_HYa5JUr2yKZAM0rYtAiFQ), please commit the video file to your project git repo in the “reports” directory with a file name like:

530707_Project_Video_YOUR_PROJECT_NAME.mp4

for example

530707_Project_Video_EduMIP_SLAM_and_UR5_Transporting.mp4
or
530707_Project_Video_EduMIP_SLAM_and_UR5_Transporting.mov

Independent Project Final report is due by end of exam period: Thursday May 16, 2019

Here is the outline:

  • Your project report should include
    • Project Title, Author(s) (i.e. team members), Date
    • A Section on Description of proposed project goals. Be detailed. Use figures.
    • A Section on Software
      • List and description of the new ROS packages you implemented to accomplish your project. List the primary authors. Include the git repository URL for each package.
      • List of the major existing ROS packages you utilized
      • Use Figures as needed.
    • A Section on Hardware and infrastructure
      • List and description of any new or additional hardware and infrastructure you needed to implement your project.
      • List of existing hardware and infrastructure you needed to implement your project.
      • Figures/Photos as needed.
    • A Section on contributions of each team member. For EACH team member by name have a subsection that details for that team member the following:
      • What was the individual team member’s principal contribution to the project?
      • What does individual team member know now that she/he wishes they knew at the start of the project?
      • What are the individual team member’s biggest lessons learned that will help them on future projects?
    • A section with sample Data Products (if any) from the project
    • A section with a link to brief online video of your project in action (desired but not required).
    • Suggestions for future projects
    • References – list of any references cited in your text
  • Please commit your project final report to the same directory by May 16, 2019, with the names 07_Project_Final_Report_Your_Project_Team_Name.pdf and 07_Project_Final_Report_Your_Project_Team_Name.ppt – for example 07_Project_Final_Report_UR5_Plays_Jenga.pdf and 07_Project_Final_Report_UR5_Plays_Jenga.ppt. Please email me when you have uploaded your final report with “530.707 Project Final Report for YOUR PROJECT NAME by NAMES OF ALL TEAM MEMBERS” in the email subject line.

HOW TO ORDER COMPONENTS FOR YOUR PROJECT

Some Tips and Notes on Projects and Programming the EduMIP in ROS

Installing ROS Packages on the EduMIP:

The ROS installation /opt/ros/kinetic on your EduMIP was compiled from source on the EduMIP; it was NOT installed with “apt-get install ros-kinetic-desktop-full”. If you want to install additional ROS packages on your EduMIP, we suggest that you download the KINETIC branch of the package source code into catkin_ws/src and compile it with “catkin_make”.

  • So, for example, if I want to install the ROS “serial” package on my edumip, I would need to clone a copy of the source into my workspace and compile it – e.g. “cd ~/catkin_ws/src”, then “git clone https://github.com/wjwwood/serial.git”, then “cd ~/catkin_ws” and “catkin_make”.
  • Now your EduMIP should use the serial package from your workspace.
  • On my PC, in contrast, I can use apt-get to install this binary package with the command “sudo apt-get install ros-kinetic-serial”

ROS Can’t Find my ROS Package

If your ROS environment does not seem to recognize that you have new packages in your catkin_ws/src, try updating your rospack profile with the command “rospack profile”, and update your rosdep cache with the command “rosdep update”.

Compiling and Linking to the Robotics Cape “C” Libraries

The robotics cape library and associated header files are already installed on your EduMIP. The link library is /usr/lib/libroboticscape.so, and the two top-level header files are rc_usefulincludes.h and roboticscape.h.

You can refer to the edumip_balance_ros project (https://git.lcsr.jhu.edu/lwhitco1/edumip_balance_ros) to see how to use the robotics cape C header files and C link library with your C++ ROS Node.

See edumip_balance_ros/src/edumip_balance_ros.cpp to see how to include the C header files in a C++ program with the extern “C” directive.

See edumip_balance_ros/CMakeLists.txt to see how to link your program to the roboticscape C library.

Edumip_balance_ros Package

EduMIP Horizontal Configuration

Several teams propose to employ the EduMIP in a horizontal configuration with a caster wheel. To do this you will need to replace the “edumip_balance_ros” project with your own new code to command the wheel servos and to read and publish the wheel encoders. As I reviewed in class yesterday, one possible package you could use is the ROS differential_drive package (http://wiki.ros.org/differential_drive). To use the differential_drive package you will need to write a new C++ node for the edumip that (a) Reads the wheel encoders and publishes them on topics and (b) Subscribes to motor command topics and commands effort to the wheel motors.
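As a sketch of the kinematics involved in part (b), the mapping from a commanded body twist to left/right wheel angular velocities can be written as a small pure function. The names here are illustrative, and the wheel radius and wheelbase are parameters you would set from your own robot, not EduMIP values:

```cpp
#include <cassert>
#include <cmath>

// Left/right wheel angular velocities (rad/s).
struct WheelSpeeds { double left, right; };

// Map a commanded body twist (forward speed v in m/s, yaw rate omega in
// rad/s) to wheel angular velocities for a differential-drive base.
WheelSpeeds twistToWheels(double v, double omega,
                          double wheel_radius, double wheel_base) {
  WheelSpeeds w;
  w.left  = (v - omega * wheel_base / 2.0) / wheel_radius;
  w.right = (v + omega * wheel_base / 2.0) / wheel_radius;
  return w;
}
```

Your node would subscribe to the motor command topics, apply a mapping like this (or let the differential_drive package do it), and command effort to the wheel motors accordingly.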

Interfacing to Serial Port Devices

To interface your C++ code to a serial device such as the USB ultrasonic sensor demonstrated in class, you will need to be able to open, read, and write to the serial ports. A good package for this is the ROS serial package (http://wiki.ros.org/serial). You can clone a copy of the src into your ROS workspace with the command “git clone https://github.com/wjwwood/serial.git” and compile it with “catkin_make”. The “examples” directory has a somewhat complex example of serial port usage. A simpler example is available here: https://github.com/garyservin/serial-example

Running catkin_make from within emacs

Add something like the following to your ~/.emacs initialization file. This example binds the compile command to “C-x |”:

; bind C-x | to compile
(global-set-key "\C-x|" 'compile)

; override default compile command
(setq compile-command "cd /home/llw/ros_catkin_ws; catkin_make ")

Ethics

Students are encouraged to work in groups to learn, brainstorm, and collaborate in learning how to solve problems.

Assignments, Code, and ROS Packages: Your final writeups, code, and ROS packages for lab assignments #1-#7 must be done independently without reference to any notes from group sessions, the work of others, or other sources such as the internet.

While working on your final assignments, you may refer to your own class notes, your own laboratory notes, the course web pages, the course ROS tutorials and the text.

Disclosure of Outside Sources: If you use outside sources other than your class notes and your text to solve problems in your assignments (i.e. if you have used sources such as your roommate, study partner, the Internet, another textbook, a file from your office-mate’s files) then you must disclose the outside source and what you took from the source in your writeup/Code.

In this course, we adopt the ethical guidelines articulated by Professor Lester Su for M.E. 530.101 Freshman experiences in mechanical engineering I, which are quoted with permission as follows:

Cheating is wrong. Cheating hurts our community by undermining academic integrity, creating mistrust, and fostering unfair competition. The university will punish cheaters with failure on an assignment, failure in a course, permanent transcript notation, suspension, and/or expulsion.

Offenses may be reported to medical, law or other professional or graduate schools when a cheater applies. Violations can include cheating on exams, plagiarism, reuse of assignments without permission, improper use of the Internet and electronic devices, unauthorized collaboration, alteration of graded assignments, forgery and falsification, lying, facilitating academic dishonesty, and unfair competition. Ignorance of these rules is not an excuse.

On every exam, you will sign the following pledge: “I agree to complete this exam without unauthorized assistance from any person, materials or device. [Signed and dated]”

For more information, see the guide on “Academic Ethics for Undergraduates” and the Ethics Board web site (http://ethics.jhu.edu).

I do want to make clear that I’m aware that the vast majority of students are honest, and the last thing I want to do is discourage students from working together. After all, working together on assignments is one of the most effective ways to learn, both through learning from and explaining things to others. The ethics rules are in place to ensure that the playing field is level for all students. The following examples will hopefully help explain the distinction between what constitutes acceptable cooperation and what is not allowable.

Student 1: Yo, I dunno how to do problem 2 on the homework, can you clue me in?

Student 2: Well, to be brief, I simply applied the **** principle
that is thoroughly explained in Chapter **** in the course text.

Student 1: Dude, thanks! (Goes off to work on problem.)

– This scenario describes an acceptable interaction. There is nothing wrong with pointing someone in the right direction.


Student Y: The homework is due in fifteen minutes and I haven’t
done number 5 yet! Help me!

Student Z: Sure, but I don’t have time to explain it to you, so
here. Don’t just copy it, though.
(Hands over completed assignment.)

Student Y: I owe you one, man. (Goes off to copy number 5.)

– This scenario is a textbook ethics violation on the part of
both students. Student Y’s offense is obvious; student Z is
guilty by virtue of facilitating plagiarism, even if he/she
is unaware of what student Y actually did.


Joe Student: Geez, I am so swamped, I can’t possibly write up the
lab report and do the lab data calculations before it’s all due.

Jane Student: Well, since we were lab partners and collected all
the data together…maybe you could just use my Excel spreadsheet
with the calculations, as long as you did the write-up yourself….

Joe Student: Yeah, that’s a great idea!

– That is not a great idea. By turning in a lab report with Jane’s
spreadsheet included, Joe is submitting something that isn’t his
own work.


Study group member I: All right, since there’s three of us and
there’s six problems on the homework, let’s each do two. I’ll
do one and two and give you copies when I’m done.

Study group member II: Good idea, that’ll save us a lot of work.
I’ll take three and five.

Study group member III: Then I guess I’ll do four and six. Are you
guys sure this is OK? Seems fishy to me.

Study group member I: What’s the problem? It’s not like we’re
copying the entire assignment. Two problems each is still a lot
of work.

– This is clearly wrong. Copying is copying even if it’s only
part of an assignment.


Mike (just before class): Hey, can you help me? I lost my
calculator, so I’ve got all the problems worked out but I
couldn’t get the numerical answers. What is the answer for
problem 1?

Ike: Let’s see (flips through assignment)… I got 2.16542.

Mike: (Writing) Two point one six five four two…what about
number 2?

Ike: For that one… I got 16.0.

Mike: (Writing) Sixteen point oh…great, got it, thanks.
Helping out a friend totally rules!

– Helping out a friend this way does not rule, totally or
partially. As minor as this offense seems, Mike is still
submitting Ike’s work as his own when Mike gets the numerical
answer and copies it in this way.