CISC367 S2023
Latest revision as of 06:23, 10 May 2023

Course information

Title: CISC367-012 Introduction to Mobile Robot Programming
Description: A hands-on approach to implementing mobile robot algorithms on a small wheeled platform, both in simulation and reality. We will review the fundamentals of kinematics, planning, sensing, and control, as well as get acquainted with higher-level concepts related to navigation, tracking, mapping, and learning.
When: Wednesdays and Fridays, 8:40-9:55 am. When there is a homework due, no more than the first 30 minutes of each class will be in lecture format. The rest of the class period (and optionally the subsequent office hours) will be spent working on the robots.
Where: Feb. 8: Willard 215; Feb. 10 and beyond: Smith 211
Instructor: Christopher Rasmussen, 446 Smith Hall, cer@cis.udel.edu
Office hours: Mondays, 9-10:30 am and Wednesdays and Fridays, 9:55-11 am in Smith 211 (starting Feb. 10)
Grading:
  • 90%: 6 programming homeworks
  • 10%: 2 quizzes

Programming assignments will be graded on how many of the subtasks you complete or demonstrate successfully.

For the overall course grade, a preliminary absolute mark will be assigned to each student based on the percentage of the total possible points they earn according to the standard formula: A = 90-100, B = 80-90, C = 70-80, etc., with +'s and -'s given for the upper and lower third of each range, respectively. Based on the distribution of preliminary grades for all students (i.e., "the curve"), the instructor may increase these grades monotonically to calculate final grades. This means that your final grade can't be lower than your preliminary grade, and your final grade won't be higher than that of anyone who had a higher preliminary grade.
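As a rough illustration of how the stated ranges and thirds could work out, here is a small Python sketch of the preliminary mapping. Exactly where the thirds split at the boundaries, and whether an A+ is ever awarded, are my assumptions here, not course policy.

```python
def preliminary_grade(percent):
    """Illustrative mapping of a course percentage to a preliminary letter grade:
    10-point ranges (A = 90-100, B = 80-90, ...), with '+' for the upper third of
    a range and '-' for the lower third. Boundary handling is assumed, not policy."""
    for floor, letter in [(90, 'A'), (80, 'B'), (70, 'C'), (60, 'D')]:
        if percent >= floor:
            position = (percent - floor) / 10.0
            if position >= 2 / 3:
                return letter + '+'   # upper third of the range (assumes A+ exists)
            if position < 1 / 3:
                return letter + '-'   # lower third of the range
            return letter
    return 'F'

print(preliminary_grade(88))   # B+
print(preliminary_grade(91))   # A-
```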

I will try to keep you informed about your standing throughout the semester. If you have any questions about grading or expectations at any time, please feel free to ask me.

Academic policies Programming projects should be demo'd in class on the deadline day and uploaded to Canvas by midnight of that day (with a grace period of a few hours afterward...after sunrise is definitely late). A late homework is a 0 without a valid prior excuse. To give you a little flexibility, you have 3 "late days" to use on homeworks to extend the deadline to the next class period without penalty. No more than one late day may be used per assignment. Late days will automatically be subtracted, but as a courtesy please notify the instructor in an e-mail of your intention to use late days before the deadline.

Assignment submissions should consist of a directory containing all code (your .cpp/.py files, etc.), any output data generated (e.g., images, movies, etc.), and an explanation of your approach, what worked and didn't work, etc. contained in a separate text or HTML file. Do not submit executables or .o files, please! The directory you submit for each assignment should be packaged by tar'ing and gzip'ing it or just zip'ing it.
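One way to do the packaging from Python (a minimal sketch; the directory name hw1_submission is a placeholder, not a required layout):

```python
import shutil

# Package the hw1_submission/ directory (placeholder name) as hw1_submission.tar.gz;
# pass format="zip" instead to produce a .zip archive.
shutil.make_archive("hw1_submission", "gztar", root_dir=".", base_dir="hw1_submission")
```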

Students can discuss problems with one another in general terms, but must work independently or within their teams as specified for each assignment. This also applies to online and printed resources: you may consult them as references (as long as you cite them), but the words and source code you turn in must be yours alone. The University's policies on academic dishonesty are set forth in the student code of conduct.

Instructions/Resources

Robot

Yes, our robot platform is a Roomba. Except it can't clean.

  • Platform: iRobot Create 3 running ROS2 Humble
  • SBC: Raspberry Pi 4 B running Ubuntu 22.04 and ROS2 Humble
  • Lidar: Slamtec RPLIDAR A1
  • Camera: RealSense D435
Operating system

Working backwards from the ROS requirements, there are three options for your laptop:

  • Linux: Ubuntu Desktop 22.04 LTS. Installation instructions
  • Windows: Version 10.
  • MacOS: Mojave 10.14. Several students have had success on HW #1 by installing Ubuntu in a UTM VM. Bridging mode was necessary for Wi-Fi.
ROS2

We are using ROS2, specifically the Humble Hawksbill release. My laptop and the Raspberry Pis on the robots are running Ubuntu, so that is the option with the best support. I have personally installed ROS2 Humble on Windows and it seems to work fine, so I can at least test things in that environment. I have no access to a Mac, and therefore no experience with it and no ability to help troubleshoot issues there. A lot of public sample code and tutorials are available in both C++ and Python. I am agnostic about which of these two languages you use for homeworks and projects, but expect mostly C++ examples from me, and you will get your best support from me in C++.
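To give a rough sense of what a minimal ROS2 node looks like, here is a Python (rclpy) sketch that subscribes to a lidar scan and publishes velocity commands. The topic names (scan, cmd_vel), the 0.5 m obstacle threshold, and the 0.1 m/s speed are assumptions for illustration, not part of any assignment.

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist


class MinimalNode(Node):
    """Toy node: read lidar scans, drive forward slowly, stop if something is close."""

    def __init__(self):
        super().__init__('minimal_node')
        # Topic names below ('scan', 'cmd_vel') are assumptions for illustration.
        self.scan_sub = self.create_subscription(LaserScan, 'scan', self.on_scan, 10)
        self.cmd_pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.timer = self.create_timer(0.1, self.on_timer)   # 10 Hz control loop
        self.too_close = False

    def on_scan(self, msg):
        # Flag an obstacle if any valid range reading is closer than 0.5 m (arbitrary threshold).
        valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
        self.too_close = bool(valid) and min(valid) < 0.5

    def on_timer(self):
        cmd = Twist()
        cmd.linear.x = 0.0 if self.too_close else 0.1   # m/s
        self.cmd_pub.publish(cmd)


def main():
    rclpy.init()
    node = MinimalNode()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

Once a Humble workspace is sourced, a node like this could be added to a Python package and launched with ros2 run; the equivalent C++ (rclcpp) version has the same structure.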

Readings
  • K. Lynch and F. Park, Modern Robotics (MR), 2019
  • P. Corke, Robotics, Vision, and Control (RVC), 2017
  • K. Astrom and R. Murray, Feedback Systems (FS), 2008
  • S. LaValle, Planning Algorithms (PA), 2006
Gazebo

Gazebo is a 3-D robot simulator.


Schedule

Note: Classes met on Wednesdays and Fridays; the Date column in the table below shows which day each session fell on.

# | Date | Topic | Notes | Readings/links | Assignments/slides
1 | Feb. 8 | Introduction | Background, course information | | slides
2 | Feb. 10 | hello robot, hello ROS | ROS basics, Create ROS interface | Creating & building ROS2 packages, ROS2 nodes, ROS2 topics, Create 3 topics | slides; HW #1
3 | Feb. 15 | Kinematics/Dynamics | Degrees of freedom, configuration space; wheeled systems (unicycle/car vs. differential drive) | MR Chap. 2, PA Chap. 13.1.2 | slides
4 | Feb. 17 | Kinematics/Dynamics | URDFs, ROS rviz2 and tf2, odometry | rviz2 user manual (Turtlebot3), Introduction to tf2, MR Chap. 13.4 | slides; HW #1 due
5 | Feb. 22 | More ROS + Controllers | ROS 2 workspaces and packages, subscribing and publishing, and timer callbacks; basic feedback controller concepts | FS Chap. 1 | slides; HW #2
6 | Feb. 24 | Controllers | Waypoint following, line following, trajectory and wall following (pure pursuit) | RVC Chap. 4-4.1.2 | slides
7 | Mar. 1 | HW #2 | Coding time | | Starter code for Python wanderer; slides
8 | Mar. 3 | Costmaps and discrete motion planning | Connecting to the lidar sensor; representing the environment as a map and basic planning | rplidar_ros; MR Chap. 10.4, Abbeel slides (skip SLAM, reflection maps) | slides
9 | Mar. 8 | Estimation | Least-squares line-fitting and outlier rejection with RANSAC for lidar scans (see the sketch after this table) | Scikit-learn ordinary least-squares and RANSAC | slides; HW #2 due
10 | Mar. 10 | Localization | Particle filters, MCL | Thrun particle filtering slides | slides; HW #3
11 | Mar. 15 | HW #3 | Coding time | |
12 | Mar. 17 | HW #3 | Quiz then coding time | | Quiz #1
13 | Mar. 22 | SLAM | ROS slam_toolbox package | slam_toolbox Github, Thrun FastSLAM slides | slides
14 | Mar. 24 | HW #3 | | | HW #3 due
 | Mar. 29 | NO CLASS (Spring break) | | |
 | Mar. 31 | NO CLASS (Spring break) | | |
 | Apr. 5 | NO CLASS (Instructor away) | | |
15 | Apr. 7 | Computer vision | Connecting to the RGB-D camera | realsense-ros Github, basic OpenCV color processing | HW #4 / #5
16 | Apr. 12 | Computer vision | Line finding | Hough lines | slides
17 | Apr. 14 | Computer vision | Tags/fiducials | AprilTag Github, apriltag_ros | slides
18 | Apr. 19 | HW #4 | Coding time | |
19 | Apr. 21 | HW #4 | Coding time | | HW #4 due
20 | Apr. 26 | HW #5 | Coding time | |
21 | Apr. 28 | Computer vision | Getting depth/3-D point cloud from the RealSense | realsense-ros Github, perception_pcl for further analysis | slides; HW #5 due
22 | May 3 | HW #6 | Coding time | | HW #6
23 | May 5 | Computer vision | Object detection | YOLOv5 (Python, lots of requirements), Training YOLOv5 on custom data | slides
24 | May 10 | Learning | Dynamical tasks, reinforcement learning | Gymnasium Github | slides
25 | May 12 | HW #6 | Quiz then coding time | | Quiz #2
 | May 17 | HW #6 demo/competition | | | HW #6 due
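For the Mar. 8 estimation topic (least-squares line fitting with RANSAC outlier rejection, as applied to lidar scans), a minimal scikit-learn sketch might look like the following. The synthetic data and the residual threshold are made up for illustration; this is not the homework solution.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, RANSACRegressor

# Synthetic "lidar" points: a wall along y = 0.5x + 1 plus some injected outliers (illustration only).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 5.0, 100)
y = 0.5 * x + 1.0 + rng.normal(0.0, 0.02, 100)
y[:10] += rng.uniform(0.5, 2.0, 10)            # outliers, e.g. returns from clutter
X = x.reshape(-1, 1)

# Ordinary least squares fits all points, so the outliers drag the line off the wall.
ols = LinearRegression().fit(X, y)

# RANSAC repeatedly fits to random subsets and keeps the model with the most inliers.
ransac = RANSACRegressor(residual_threshold=0.05).fit(X, y)
inliers = ransac.inlier_mask_

print("OLS slope/intercept:   ", ols.coef_[0], ols.intercept_)
print("RANSAC slope/intercept:", ransac.estimator_.coef_[0], ransac.estimator_.intercept_)
print("Inlier count:", inliers.sum(), "of", len(x))
```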