CISC849 S2018

Course information

Title: CISC849 Robot Vision and Learning
Shortened URL: https://goo.gl/ektJij
Description: Survey of image-based 2-D and 3-D sensing algorithms for mobile robot navigation and interaction, including motion estimation, obstacle segmentation, terrain modeling, and object recognition, with a particular focus on deep learning techniques to dramatically improve performance.
When: Tuesdays and Thursdays, 11:00 am-12:15 pm
Where: Smith 102A
Instructor: Christopher Rasmussen, 446 Smith Hall, cer@cis.udel.edu
Office hours: Tuesdays and Thursdays, 3:30-4:15 pm
Grading
  • 20% Oral paper presentation (individual or pairs, 30 minutes)
  • 30% Two programming assignments (individual)
  • 50% Final project (teams of 1-3)
    • 10% = 2-page proposal, including planned methods, citations of relevant papers, data sources, and division of labor
    • 10% = Joint 15-minute presentation on final results, with accompanying slides
    • 30% = Actual results and estimated effort, factoring in difficulty of problem tackled
Academic policies: Programming projects are due by midnight of the deadline day (with a grace period of a few hours afterward; after sunrise is definitely late). A late homework receives a 0 without a valid prior excuse. To give you a little flexibility, you have 6 "late days" to use on homeworks to extend the deadline by one day each without penalty. No more than three late days may be used per assignment. Late days will be subtracted automatically, but as a courtesy please notify the instructor by e-mail of your intention to use late days before the deadline. See submission instructions below.

Students can discuss problems with one another in general terms, but must work independently on programming assignments. This also applies to online and printed resources: you may consult them as references (as long as you cite them), but the words and source code you turn in must be yours alone. The University's policies on academic dishonesty are set forth in the student code of conduct here.

Homeworks: Assignment submissions should consist of a directory containing all code (your .cpp files, makefile, etc.), any output data generated (e.g., images or movies), and an explanation of your approach (what worked, what didn't, etc.) in a separate text or HTML file. Please do not submit executables or .o files! The directory you submit for each assignment should be packaged by tar'ing and gzip'ing it, or just zip'ing it. The resulting file should be submitted through Canvas.

You may develop your C/C++ code in any fashion that is convenient, that is, with any compiler and operating system you want. However, we will grade your homework on a Linux system using a makefile, so you must avoid OS- and hardware-specific functions and provide a makefile that will work for us (like one of the templates above).
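
As a concrete illustration of the kind of makefile expected above, here is a minimal sketch. It is only an example under assumed names: hw1.cpp, the hw1 executable, and the hw1/ directory are hypothetical placeholders, and the compiler flags are just reasonable defaults; adapt everything to your actual assignment.

# Minimal example makefile (sketch only; hw1.cpp and hw1 are placeholder names).
# Note: each recipe line must begin with a tab character, not spaces.
CXX      = g++
CXXFLAGS = -std=c++11 -Wall -O2

hw1: hw1.cpp
	$(CXX) $(CXXFLAGS) -o hw1 hw1.cpp

clean:
	rm -f hw1 *.o

To package the assignment directory for submission, running something like "tar czf hw1.tar.gz hw1/" from the directory's parent (or "zip -r hw1.zip hw1/") produces a single file that can be uploaded to Canvas.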

Possible Papers to Present (not a complete list)

  • "End to End Learning for Self-Driving Cars", Bojarski et al., 2016
  • "Collaborative mapping of an earthquake-damaged building via ground and aerial robots", N. Michael et al., JFR 2012. UAV, UGV, disaster, mapping
  • "Vision Based Victim Detection from Unmanned Aerial Vehicles", M. Andriluka et al., IROS 2010. UAV, person detection
  • "Biped Navigation in Rough Environments using On-board Sensing", J. Chestnutt, Y. Takaoka, K. Suga, K. Nishiwaki, J. Kuffner, and S. Kagami, IROS 2009. Footstep planning, ladar, plane fitting
  • "Real-Time Navigation in 3D Environments Based on Depth Camera Data", D. Maier, A. Hornung, and M. Bennewitz, Humanoids 2012. RGB-D, localization, mapping, planning
  • "Robotic Grasping of Novel Objects using Vision", A. Saxena, J. Driemeyer, A. Ng, IJRR 2008. Grasping, learning
  • "Self-supervised Monocular Road Detection in Desert Terrain", H. Dahlkamp, A. Kaehler, D. Stavens, S. Thrun, and G. Bradski, 2006. DARPA GC, color similarity, segmentation
  • "Multi-Sensor Lane Finding in Urban Road Networks", A. Huang, D. Moore, M. Antone, E. Olson, S. Teller, RSS 2008. DARPA UC, edge detection, robust curve fitting, tracking
  • "High fidelity day/night stereo mapping with vegetation and negative obstacle detection for vision-in-the-loop walking", M. Bajracharya et al., IROS 2013. LS3, dense stereo depth, visual odometry
  • "Autonomous Door Opening and Plugging In with a Personal Robot", W. Meeussen et al., IROS 2010. PR2, grasping

Instructions for Homeworks

Operating system
  • If you don't have a Linux distribution running currently, it's not hard to add one to your machine. The Ubuntu website has comprehensive instructions on installing it from different sources (CD-ROM, USB stick, etc.) on a separate partition (aka "dual booting"). I recommend version 16.04.3 LTS.
  • You will get an account on my lab workstation for the duration of the course.
Software
  • ROS
    • Installation instructions (Kinetic, Ubuntu 16.04, Desktop-Full Install). This will take about 430 MB of space, and it includes PCL for 3-D point cloud processing and OpenCV for computer vision/image processing. We will mainly be using ROS for these included libraries and the visualization functionality of the rviz tool, so don't worry about "learning" ROS. If you're curious, links to more information are below; a small illustrative code sketch also follows this list.
    • "Cheatsheet"
    • Tutorial videos
    • Rviz user guide
  • PCL
    • Do not install separately; we will just use the version (1.7) included in ROS Kinetic
  • OpenCV
    • Do not install separately; we will just use the version (3) included in ROS Kinetic
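
As a rough sketch of how these pieces fit together (not part of any assignment), the example below shows a ROS node that receives point clouds and converts them to PCL for processing. It assumes details not stated above: the topic name /camera/depth/points and node name cloud_listener are hypothetical, and the code would be built inside a catkin package depending on roscpp, sensor_msgs, and pcl_conversions.

// Minimal sketch: subscribe to a ROS point cloud topic and hand the data to PCL.
// The topic "/camera/depth/points" and the node name are hypothetical examples.
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& msg)
{
  // Convert the ROS message into a PCL cloud so that PCL filters,
  // segmentation, etc. can be applied to it.
  pcl::PointCloud<pcl::PointXYZ> cloud;
  pcl::fromROSMsg(*msg, cloud);
  ROS_INFO("Received a cloud with %zu points", cloud.points.size());
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "cloud_listener");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("/camera/depth/points", 1, cloudCallback);
  ros::spin();  // rviz can visualize the same topic while this node runs
  return 0;
}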

Schedule

Note: The blue squares in the "#" column below indicate Tuesdays.

# Date Topic Links/Readings/videos Assignments/slides
1 Feb. 6 Background, introduction to DARPA Robotics Challenge
2 Feb. 8 DRC algorithm components ARGOS challenge overview (8:00)
3 Feb. 13 Introduction to ROS ETH ROS mini course (in particular: overview, RViz, TF), ETH case study "ANYmal at the ARGOS Challenge"
4 Feb. 15 PCL tutorial
5 Feb. 20 (register/add deadline Feb. 19) Plane/obstacle/object segmentation (3-D) HW #1
6 Feb. 22 HW #1 strategies
7 Feb. 27 Image classification background Stanford CS231n Image classification slides
8 Mar. 1 Introduction to convolutional neural networks Stanford CS231n Convolutional Neural Networks slides HW #1 due
9 Mar. 6 Introduction to OpenCV (in ROS), deep learning libraries, detection & segmentation background Stanford CS231n Deep Learning software slides, Detection and Segmentation slides
10 Mar. 8 More on detection, tracking YOLOv2, Redmon and Farhadi (CVPR 2017) HW #2
11 Mar. 13 Localization ETH localization lectures: 1 2
12 Mar. 15 Particle filters, localization "Humanoid Robot Localization in Complex Indoor Environments", A. Hornung, K. Wurm, M. Bennewitz, IROS 2010; Monte Carlo localization; Thrun particle filtering slides
13 Mar. 20 Motion planning background HW #2 due
14 Mar. 22 Perception for stepping "Learning Locomotion over Rough Terrain using Terrain Templates", M. Kalakrishnan, J. Buchli, P. Pastor, and S. Schaal, IROS 2009 Paper presentation choice due Friday, March 23
Mar. 27 NO CLASS (Spring break)
Mar. 29 NO CLASS (Spring break)
15 Apr. 3 Reinforcement learning; project kick-off
16 Apr. 5 DARPA Urban Challenge Urban Challenge highlights, Stanford clips
17 Apr. 10 (withdraw deadline Apr. 9) Student paper presentations Project proposal due Monday, April 9
18 Apr. 12 Student paper presentations
19 Apr. 17 Student paper presentations
20 Apr. 19 Mid-project review; student paper presentation
21 Apr. 24 Student paper presentations
22 Apr. 26 Motion planning background
23 May 1 Miscellaneous
24 May 3 Final project review
25 May 8 "Bonus" material
26 May 10 Miscellaneous
27 May 15 Final project presentations Final project due