CISC367 S2023

Course information

Title: CISC367-012 Introduction to Mobile Robot Programming
Description: A hands-on approach to implementing mobile robot algorithms on a small wheeled platform, both in simulation and reality. We will review the fundamentals of kinematics, planning, sensing, and control, as well as get acquainted with higher-level concepts related to navigation, tracking, mapping, and learning.
When: Wednesdays and Fridays, 8:40-9:55 am. When there is a homework due, no more than the first 30 minutes of each class will be in lecture format. The rest of the class period (and optionally the subsequent office hours) will be spent working on the robots.
Where: Feb. 8: Willard 215; Feb. 10 and beyond: Smith 211
Instructor: Christopher Rasmussen, 446 Smith Hall, cer@cis.udel.edu
Office hours: Mondays, 9-10:30 am and Wednesdays and Fridays, 9:55-11 am in Smith 211 (starting Feb. 10)
Grading:
  • 90%: 6 programming homeworks
  • 10%: 2 quizzes

Programming assignments will be graded on how many of the subtasks you complete or demonstrate successfully.

For the overall course grade, a preliminary absolute mark will be assigned to each student based on the percentage of the total possible points they earn according to the standard formula: A = 90-100, B = 80-90, C = 70-80, etc., with +'s and -'s given for the upper and lower third of each range, respectively. Based on the distribution of preliminary grades for all students (i.e., "the curve"), the instructor may increase these grades monotonically to calculate final grades. This means that your final grade can't be lower than your preliminary grade, and your final grade won't be higher than that of anyone who had a higher preliminary grade.
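
To make the +/- rule concrete, here is a small Python sketch of the preliminary letter-grade mapping; the 90/80/70/60 cutoffs come from the policy above, while the exact handling of scores that land on a one-third boundary is an assumption made for illustration.

    # Sketch: map a course percentage to a preliminary letter grade.
    # The upper third of each 10-point range earns a '+', the lower third a '-'.
    def preliminary_grade(percent):
        for cutoff, letter in ((90, "A"), (80, "B"), (70, "C"), (60, "D")):
            if percent >= cutoff:
                offset = percent - cutoff
                if offset >= 20 / 3:
                    return letter + "+"
                if offset < 10 / 3:
                    return letter + "-"
                return letter
        return "F"

For example, preliminary_grade(91) returns "A-" and preliminary_grade(88) returns "B+".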

I will try to keep you informed about your standing throughout the semester. If you have any questions about grading or expectations at any time, please feel free to ask me.

Academic policies: Programming projects should be demo'd in class on the deadline day and uploaded to Canvas by midnight of that day (with a grace period of a few hours afterward...after sunrise is definitely late). A late homework is a 0 without a valid prior excuse. To give you a little flexibility, you have 3 "late days" to use on homeworks to extend the deadline to the next class period without penalty. No more than one late day may be used per assignment. Late days will automatically be subtracted, but as a courtesy please notify the instructor in an e-mail of your intention to use late days before the deadline.

Assignment submissions should consist of a directory containing all code (your .cpp/.py files, etc.), any output data generated (e.g., images, movies), and an explanation of your approach, what worked and didn't work, etc., contained in a separate text or HTML file. Do not submit executables or .o files, please! The directory you submit for each assignment should be packaged by tar'ing and gzip'ing it or just zip'ing it.
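
As a hedged example of the packaging step, the Python standard library can produce either archive format; the directory name hw1 and the output name hw1_submission are hypothetical placeholders for your own assignment folder.

    # Sketch: package a submission directory as .tar.gz or .zip.
    # "hw1" and "hw1_submission" are hypothetical names; substitute your own.
    import shutil

    shutil.make_archive("hw1_submission", "gztar", root_dir="hw1")  # creates hw1_submission.tar.gz
    shutil.make_archive("hw1_submission", "zip", root_dir="hw1")    # creates hw1_submission.zip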

Students can discuss problems with one another in general terms, but must work independently or within their teams as specified for each assignment. This also applies to online and printed resources: you may consult them as references (as long as you cite them), but the words and source code you turn in must be yours alone. The University's policies on academic dishonesty are set forth in the student code of conduct (http://www.udel.edu/stuguide/22-23/code.html).

Instructions/Resources

Robot

Yes, our robot platform is a Roomba. Except it can't clean.

  • Platform: iRobot Create 3 running ROS2 Humble
  • SBC: Raspberry Pi 4 B running Ubuntu 22.04 and ROS2 Humble
  • Lidar: Slamtec RPLIDAR A1
  • Camera: RealSense D435

Operating system

Working backwards from the ROS requirements, there are three options for your laptop:

  • Linux: Ubuntu Desktop 22.04 LTS. Installation instructions: https://ubuntu.com/tutorials/install-ubuntu-desktop#1-overview
  • Windows: Version 10.
  • MacOS: Mojave 10.14. Several students have had success on HW #1 by installing Ubuntu in a UTM (https://mac.getutm.app/) VM. Bridging mode was necessary for wifi.

ROS2

We are using ROS2, specifically the Humble Hawksbill release. My laptop and the Raspberry Pis on the robots are running Ubuntu, so you will get the best support for this option. I have personally tried installing ROS2 Humble on Windows and it seems to work, so I can at least test things in that environment. I have no access to a Mac, and therefore no experience with it and no ability to help troubleshoot issues there. A lot of public sample code and tutorials are available in both C++ and Python, but expect mostly C++ examples from me. I am agnostic about which of these two languages you use for homeworks and projects, but you will get the best support from me in C++.
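
For orientation, here is a minimal Python (rclpy) node of the kind the first homeworks build on. It is only a sketch: it assumes that, like the Create 3, your robot listens for geometry_msgs/Twist velocity commands on the cmd_vel topic, so check the Create 3 ROS2 interface documentation linked in the schedule for the exact topic names and message types.

    # Minimal rclpy node sketch: drive forward slowly by publishing Twist messages.
    # The topic name ("cmd_vel") and the 10 Hz rate are assumptions; adjust to your robot.
    import rclpy
    from rclpy.node import Node
    from geometry_msgs.msg import Twist

    class Wanderer(Node):
        def __init__(self):
            super().__init__("wanderer")
            self.pub = self.create_publisher(Twist, "cmd_vel", 10)
            self.timer = self.create_timer(0.1, self.step)  # call step() at 10 Hz

        def step(self):
            msg = Twist()
            msg.linear.x = 0.1   # forward speed in m/s
            msg.angular.z = 0.0  # no turning
            self.pub.publish(msg)

    def main():
        rclpy.init()
        node = Wanderer()
        rclpy.spin(node)
        node.destroy_node()
        rclpy.shutdown()

    if __name__ == "__main__":
        main()

In a full ROS2 package you would register main() as a console entry point and launch it with ros2 run; for a quick test, running the file directly with python3 also works once your ROS2 environment is sourced.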

Readings
  • K. Lynch and F. Park, Modern Robotics (MR), 2019
  • P. Corke, Robotics, Vision, and Control (RVC), 2017
  • K. Astrom and R. Murray, Feedback Systems (FS), 2008
  • S. LaValle, Planning Algorithms (PA), 2006 (http://lavalle.pl/planning/)

Gazebo

This is a 3-D robot simulator.


Schedule

Note: The blue squares in the "#" column below indicate Wednesdays.

#  | Date    | Topic | Notes | Readings/links | Assignments/slides
1  | Feb. 8  | Introduction | Background, course information | | slides
2  | Feb. 10 | hello robot, hello ROS | ROS basics, Create ROS interface | Creating & building ROS2 packages, ROS2 nodes, ROS2 topics, Create 3 topics | slides, HW #1
3  | Feb. 15 | Kinematics/Dynamics | Degrees of freedom, configuration space; wheeled systems (unicycle/car vs. differential drive) | MR Chap. 2, PA Chap. 13.1.2 | slides
4  | Feb. 17 | Kinematics/Dynamics | URDFs, ROS rviz2 and tf2, odometry | rviz2 user manual (Turtlebot3), Introduction to tf2, MR Chap. 13.4 | slides, HW #1 due
5  | Feb. 22 | More ROS + Controllers | ROS 2 workspaces and packages, subscribing and publishing, and timer callbacks; basic feedback controller concepts | FS Chap. 1 | slides, HW #2
6  | Feb. 24 | Controllers | Waypoint following, line following, trajectory and wall following (pure pursuit) | RVC Chap. 4-4.1.2 | slides
7  | Mar. 1  | HW #2 | Coding time | | Starter code for Python wanderer, slides
8  | Mar. 3  | Costmaps and discrete motion planning | Connecting to the lidar sensor; representing the environment as a map and basic planning | rplidar_ros; MR Chap. 10.4, Abbeel slides (skip SLAM, reflection maps) | slides
9  | Mar. 8  | Estimation | Least-squares line-fitting and outlier rejection with RANSAC for lidar scans | Scikit-learn ordinary least-squares and RANSAC | slides, HW #2 due
10 | Mar. 10 | Localization | Particle filters, MCL | Thrun particle filtering slides | slides, HW #3
11 | Mar. 15 | HW #3 | Coding time | |
12 | Mar. 17 | HW #3 | Quiz then coding time | | Quiz #1
13 | Mar. 22 | SLAM | ROS slam_toolbox package | slam_toolbox Github, Thrun FastSLAM slides | slides
14 | Mar. 24 | HW #3 | | | HW #3 due
   | Mar. 29 | NO CLASS | Spring break | |
   | Mar. 31 | NO CLASS | Spring break | |
   | Apr. 5  | NO CLASS | Instructor away | |
15 | Apr. 7  | Computer vision | Connecting to the RGB-D camera | realsense-ros Github, basic OpenCV color processing | HW #4 / #5
16 | Apr. 12 | Computer vision | Line finding | Hough lines | slides
17 | Apr. 14 | Computer vision | Tags/fiducials | AprilTag Github, apriltag_ros | slides
18 | Apr. 19 | HW #4 | Coding time | |
19 | Apr. 21 | HW #4 | Coding time | | HW #4 due
20 | Apr. 26 | HW #5 | Coding time | |
21 | Apr. 28 | Computer vision | Getting depth/3-D point cloud from the RealSense | realsense-ros Github, perception_pcl for further analysis | slides, HW #5 due
22 | May 3   | HW #6 | Coding time | | HW #6
23 | May 5   | Computer vision | Object detection | YOLOv5 (Python, lots of requirements), Training YOLOv5 on custom data | slides
24 | May 10  | Learning | Dynamical tasks, reinforcement learning | Gymnasium Github | slides
25 | May 12  | HW #6 | Quiz then coding time | | Quiz #2
   | May 17  | HW #6 demo/competition | | | HW #6 due