Daria and Sina's Duckie Bot

Daria Plotz and Sina Ball

Daria's Brief: The Duckiebot is a self-driving robot that navigates Duckie Town, a miniature city. It was inspired by self-driving car technology, including lane, light, and color detection. The Duckiebot is made up of a simple plastic chassis, a Pi camera, and a Raspberry Pi computer that runs all the programs controlling the bot. The Pi camera records a live video stream, which the Raspberry Pi processes with pre-written Python programs. To detect traffic lights, it crops the image and adjusts the image's color values to find the contours of a traffic light; it then uses color recognition to determine what color the light is. To detect lanes, the computer uses color recognition to find the dashed yellow center lines, white right lines, and red stop lines. It then applies Canny edge detection and the Hough transform to find the edges of these lines. After combining all of the endpoints of the detected segments on each side of a lane marker, it fits a best-fit line to each set of points and averages the two lines to find the center of the lane marker, which the robot then follows.
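The final step above — fitting a best-fit line to each edge's endpoints and averaging the two fits — can be sketched in plain Python. This is an illustrative stand-in, not the robot's actual code: the `fit_line` and `lane_center` names are hypothetical, coordinates are image pixels, and the lines are fit as x = m*y + b because lane edges are close to vertical in the camera frame.

```python
def fit_line(points):
    """Least-squares fit of x = m*y + b to a list of (x, y) pixel points.

    Fitting x as a function of y avoids infinite slopes for the
    near-vertical lane edges seen by a forward-facing camera.
    """
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    syy = sum(y * y for _, y in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * syy - sy * sy)
    b = (sx - m * sy) / n
    return m, b

def lane_center(left_points, right_points, y):
    """Average the two fitted edge lines at image row y to get the
    x-coordinate of the lane marker's center."""
    ml, bl = fit_line(left_points)
    mr, br = fit_line(right_points)
    return ((ml * y + bl) + (mr * y + br)) / 2
```

With the endpoints of a marker's left edge near x = 2 and its right edge near x = 4, `lane_center` returns a steering target of x = 3 for the chosen image row — the value a controller would compare against the image's center column.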

The Nuvie Town Studio teaches students a lot about coding and computer vision. Students come away knowing how to write color detection, lane detection, and other algorithms that remain accessible to people without a strong background in computer science. These algorithms are still powerful and rewarding, even at a basic level.

Sina's Brief:

The Duckie-Bot is a small, car-like robot that drives autonomously to navigate a miniature town. It is inspired by self-driving cars and the computer vision behind them, including image, light, and color detection. An onboard Raspberry Pi controls the Duckie-Bot's two wheels and filters a camera feed from the front of the robot to detect key features of the road. To detect a feature, the feed is first cropped to the region of the image where the feature is located. Then it's converted to HSV (hue, saturation, and value), a different way of representing colors that makes it easier to detect lights and contrast. The final set of filters varies with the feature being detected, but in each case they look for contrasting regions of a specific color in the image. For streets, that might be a yellow line against the black street; for lights, it could be a bright green light against the background. Once the feature is detected, the robot decides what to do with the information — for example, turning to follow the line or stopping at a red light — and then powers the motors that turn or stop the corresponding wheels.
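The crop-then-HSV-then-threshold pipeline described above can be sketched with Python's standard `colorsys` module. This is a simplified stand-in for whatever image library the robot actually uses; the `color_mask` name, the toy pixel grid, and the yellow threshold values are all illustrative assumptions.

```python
import colorsys

def color_mask(image, lo, hi):
    """Flag pixels whose HSV values fall inside the [lo, hi] box.

    image: rows of (r, g, b) tuples with channels in 0-255 (assumed to be
           the already-cropped region of interest).
    lo, hi: (hue, saturation, value) bounds, each component in 0-1.
    Returns a grid of 1s (feature color) and 0s (background).
    """
    mask = []
    for row in image:
        mask_row = []
        for r, g, b in row:
            h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            hit = all(lo[i] <= c <= hi[i] for i, c in enumerate((h, s, v)))
            mask_row.append(1 if hit else 0)
        mask.append(mask_row)
    return mask

# Illustrative threshold for a bright yellow lane line (hue near 1/6):
YELLOW_LO, YELLOW_HI = (0.12, 0.5, 0.5), (0.20, 1.0, 1.0)
```

Thresholding in HSV rather than RGB is what makes this robust to lighting: a yellow line in shade and in sun keeps roughly the same hue, differing mainly in value, so one hue band catches both.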

The Duckie-Bot provides students with a basic model of a self-driving vehicle without the expense. Operating the robot requires knowledge of computer vision: how to filter images to detect key features, decide what to do with the information gathered, and program navigation. The Duckie-Bot shows that computer vision and self-driving technology are easily accessible, and makes the artificial intelligence behind them seem less like a mystery and more like a puzzle whose pieces just need to be put in the right places.