CISC849 F2023 Project
The final project is your opportunity to explore a mobile robot topic of your choosing. You may use ROS or APIs/libraries/simulators (with appropriate citations for *any* code written by someone else), as long as you are adding some value -- i.e., not just downloading code and running it, or running somebody's training script on somebody's data. Potential topics:
- Reinforcement learning or imitation learning in simulation (Isaac/Mujoco/Unity ML)
- Deep learning for visual object detection/segmentation
- Use AprilTags in the environment or attached to particular objects for mapping/localization/manipulation (see the first sketch after this list)
- Some demonstration of lidar-based SLAM or localization on one of the robots
- Add a 3-D printed physical part to one of the robots to enable it to carry out a particular task
- Use the pan servo on yoshi to track and follow a person while using the lidar for obstacle avoidance (see the ROS 2 lidar sketch after this list)
- Some other amazing idea that you have...
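The two starter sketches below are optional illustrations, not required approaches. First, for the AprilTag idea above, a minimal Python detection loop, assuming the pupil_apriltags package, OpenCV, and a webcam at index 0; the camera intrinsics and tag size below are placeholders to replace with your own calibration and printed tags.

<pre>
# Hedged starter sketch for AprilTag detection -- not a complete project.
# Assumes: pip-installed pupil_apriltags and opencv-python, camera at index 0.
import cv2
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")   # standard AprilTag family
cap = cv2.VideoCapture(0)

# Placeholder intrinsics (fx, fy, cx, cy) and tag edge length in meters --
# replace with your actual camera calibration and tag size.
camera_params = (600.0, 600.0, 320.0, 240.0)
tag_size = 0.06

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for det in detector.detect(gray, estimate_tag_pose=True,
                               camera_params=camera_params, tag_size=tag_size):
        # pose_t is the tag's translation in the camera frame (meters)
        print(f"tag {det.tag_id}: t = {det.pose_t.ravel()}")
    cv2.imshow("tags", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
</pre>

Second, for the lidar-based ideas, a minimal ROS 2 (rclpy) node that subscribes to a laser scan and warns about the nearest return; the /scan topic name, queue depth, and 0.5 m threshold are assumptions to adapt to whichever robot you use.

<pre>
# Hedged starter sketch for lidar obstacle monitoring -- not a full avoidance
# or person-following solution. Assumes ROS 2 with rclpy and a /scan topic.
import math

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan


class ObstacleWatcher(Node):
    def __init__(self):
        super().__init__("obstacle_watcher")
        # Subscribe to the lidar scan; topic name and queue depth are guesses.
        self.create_subscription(LaserScan, "/scan", self.on_scan, 10)

    def on_scan(self, msg: LaserScan):
        # Keep only finite ranges within the sensor's valid interval.
        valid = [r for r in msg.ranges
                 if math.isfinite(r) and msg.range_min < r < msg.range_max]
        if not valid:
            return
        closest = min(valid)
        if closest < 0.5:   # arbitrary 0.5 m threshold -- tune for your robot
            self.get_logger().warn(f"Obstacle at {closest:.2f} m")


def main():
    rclpy.init()
    node = ObstacleWatcher()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == "__main__":
    main()
</pre>

Either sketch is only a starting point; the project itself should add substantial functionality on top of it (pose estimation for mapping, servo control for tracking, planning for avoidance, and so on).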
You may work alone or as part of a pair. Previously-formed teams do NOT automatically continue for the project -- you and your partner need to affirm that you want to keep working together. Please send me an e-mail with your proposal (and your partner's name, if applicable) as soon as possible, but no later than Friday, November 17, so that I can give you some feedback. Don't just copy-paste a topic from the list above: include at least a paragraph on the specific libraries/software/datasets you might use and the deliverables you are targeting.

Time slots for 30-minute in-person demos (not in front of the class) on Monday, December 11 and Tuesday, December 12 (reading day) will be posted after Thanksgiving break. Demos will take place in the lab, and your code and write-up must be submitted on the day of your demo.