Autonomous Suitcase

September 2016 [Completed]

Overview

My second experience at PennApps was PennApps XIV, where my team of Nikunj Khetan, Megan Lau, Owen Li, and myself built a drivetrain for a suitcase that could autonomously follow someone around. We had a single longboard truck with dual motors mounted to it and a spare Microsoft Kinect (3D camera) lying around, so we decided to build a device that could track a person for a practical purpose: a self-following suitcase! We spent the next 36 hours working on this project under the mentorship of Raphael Chang, Anurag Makineni, and Brent Yi, and we were able to scrape together a semi-functional machine. It could track a person, and the motors would move it in the generally correct direction, but in the end we ran out of time to finish testing and fine-tuning the controls.

Development

The drivetrain frame was laser cut out of quarter-inch plywood, with glue and tight finger joints holding all the pieces together. Two brushless outrunners were mounted to a longboard truck to form a single-reduction drivetrain, which was then mounted at an angle to the frame. Each motor had its own VESC (Vedder's Electronic Speed Controller), and both were controlled from an ODROID-XU4. The code was written in Python, built on top of ROS.
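For a sense of what driving the motors from ROS looks like, here is a minimal sketch of a Python node sending speed commands to the two VESCs. It assumes the open-source vesc_driver package, whose commands/motor/speed topic takes an electrical RPM value as a std_msgs/Float64; the driver, namespaces, and values in our actual setup may well have differed.

#!/usr/bin/env python
# Minimal sketch: command both VESCs from a ROS node.
# Topic layout follows the vesc_driver package convention; the
# /left_vesc and /right_vesc namespaces here are assumptions.
import rospy
from std_msgs.msg import Float64

def main():
    rospy.init_node("vesc_test")
    left_pub = rospy.Publisher("/left_vesc/commands/motor/speed",
                               Float64, queue_size=1)
    right_pub = rospy.Publisher("/right_vesc/commands/motor/speed",
                                Float64, queue_size=1)
    rate = rospy.Rate(20)  # 20 Hz command loop
    while not rospy.is_shutdown():
        # Spin both motors slowly forward (value is electrical RPM).
        left_pub.publish(Float64(2000.0))
        right_pub.publish(Float64(2000.0))
        rate.sleep()

if __name__ == "__main__":
    main()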

The Microsoft Kinect interfaced with ROS through a library we found online. Based on the point cloud it saw, the library published a linear and an angular velocity to our drivetrain controller, which interpreted those two values and spun the left and right motors independently so that the drivetrain moved at the commanded speed and direction. I worked on the code and electronics system for this project.
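The conversion from a linear and angular velocity to two independent wheel speeds is standard differential-drive kinematics: each wheel gets the forward speed plus or minus half the rotational contribution. The sketch below shows roughly how such a drivetrain controller could look in Python on ROS; the topic names, the geometry_msgs/Twist message type, the wheel separation, and the meters-per-second-to-ERPM factor are all assumptions, not our exact implementation.

#!/usr/bin/env python
# Sketch of a drivetrain controller: turn the tracker's linear and
# angular velocity into independent left/right wheel speeds.
import rospy
from geometry_msgs.msg import Twist
from std_msgs.msg import Float64

WHEEL_SEPARATION = 0.25   # meters between the two wheels (assumed)
MPS_TO_ERPM = 3000.0      # meters/sec -> electrical RPM (assumed gearing)

class DrivetrainController(object):
    def __init__(self):
        self.left_pub = rospy.Publisher("/left_vesc/commands/motor/speed",
                                        Float64, queue_size=1)
        self.right_pub = rospy.Publisher("/right_vesc/commands/motor/speed",
                                         Float64, queue_size=1)
        # Velocity command published by the Kinect person-tracking node
        # (topic name assumed here).
        rospy.Subscriber("/person_tracker/cmd_vel", Twist, self.on_cmd_vel)

    def on_cmd_vel(self, msg):
        v = msg.linear.x    # forward speed requested by the tracker
        w = msg.angular.z   # turn rate requested by the tracker
        # Differential drive: split the command into two wheel speeds.
        v_left = v - w * WHEEL_SEPARATION / 2.0
        v_right = v + w * WHEEL_SEPARATION / 2.0
        self.left_pub.publish(Float64(v_left * MPS_TO_ERPM))
        self.right_pub.publish(Float64(v_right * MPS_TO_ERPM))

if __name__ == "__main__":
    rospy.init_node("drivetrain_controller")
    DrivetrainController()
    rospy.spin()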