Duckie Town is a system of miniature, artificially intelligent self-driving cars. The Local Positioning System is a navigation system that locates the Duckie bots using an overhead view from a webcam and unique visual identifiers (AR tags) mounted on each bot.
The webcam image is processed with ArUco, an open-source tag-tracking library, which reads the AR tags and computes each bot's location. Those positions are then fed to a server where the bots can retrieve their own location. Learning computer vision and AI was challenging, and compiling open-source software together is much harder than it appears: every stack has to build, and it must not interfere with the other elements.
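Once a detector like ArUco finds a tag, it returns the tag's four corner points in pixel coordinates; a position and heading can be derived from those corners. This is a minimal sketch of that step in plain Python — the corner ordering (top-left, top-right, bottom-right, bottom-left) matches ArUco's convention, but the function name is illustrative, not from the studio's code:

```python
import math

def tag_pose(corners):
    """Estimate a tag's image-plane pose from its four corner pixels.

    `corners` is the list of (x, y) corner points an ArUco-style detector
    returns, ordered top-left, top-right, bottom-right, bottom-left.
    Returns (cx, cy, heading), where heading is the angle of the tag's
    top edge in radians.
    """
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = corners
    # Centre of the tag: average of the four corners.
    cx = (x0 + x1 + x2 + x3) / 4.0
    cy = (y0 + y1 + y2 + y3) / 4.0
    # Heading: direction of the top edge (top-left -> top-right).
    heading = math.atan2(y1 - y0, x1 - x0)
    return cx, cy, heading

# An axis-aligned square tag centred at (110, 60) has heading 0.
print(tag_pose([(100, 50), (120, 50), (120, 70), (100, 70)]))
# -> (110.0, 60.0, 0.0)
```

Converting the pixel-space pose into floor coordinates is then a matter of applying the camera's calibration (e.g. a homography from the image plane to the floor plane).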
At the beginning of the studio, we looked at the original Duckie Town, whose bots attempted to navigate by reading signs and street lines. I realized it would be beneficial to supplement that navigation and recognition with a larger sense of location, something like GPS. With this sense of position, a bot could know the local speed limit, know where stop signs and intersections are, and actually navigate from point A to point B.

Throughout the studio, I worked primarily on implementing software to identify the tags. I looked at multiple open-source packages and stacks for implementing them; creating a functional stack proved challenging and became the main focus. I knew it wasn't feasible to implement all of the other navigation information associated with location in the two weeks, so I focused on tag recognition and transmitting the location to the Duckie bot. Many of the stacks I tried failed to build. I finally settled on a virtual machine running Ubuntu with OpenCV and the ArUco library. To transmit the data, I used ROS, the Robot Operating System's communication suite, to create a topic and post the locations to it.
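The publishing side described above can be sketched as a small ROS 1 (rospy) node. This assumes a standard ROS install; the topic name, message type, publish rate, and the `locate_tag` helper are illustrative assumptions, not the studio's exact configuration:

```python
import rospy
from geometry_msgs.msg import Pose2D

def locate_tag():
    # Hypothetical helper: would wrap the ArUco detection described
    # above and return the bot's (x, y, theta) on the floor plane.
    return 0.0, 0.0, 0.0

rospy.init_node("lps_server")
# One pose topic per bot; subscribers on the Duckie bot read from it.
pub = rospy.Publisher("/duckiebot01/lps_pose", Pose2D, queue_size=1)

rate = rospy.Rate(10)  # publish at 10 Hz
while not rospy.is_shutdown():
    x, y, theta = locate_tag()
    pub.publish(Pose2D(x=x, y=y, theta=theta))
    rate.sleep()
```

Running this node requires a ROS master (`roscore`); each bot then subscribes to its own pose topic to retrieve its location.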