Color-Based Line Detection

Daria Plotz and Sina Ball

This is a lane detection test on a pre-recorded video. The blue line shows where the Duckie Bot thinks the white line is. The green line shows where the Duckie Bot thinks the yellow line is. The red circle shows the intersection point between the two lines. 
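A rough sketch of how that intersection point might be computed and drawn, assuming each detected line has already been reduced to a slope and intercept (the fits, colors, and frame size below are illustrative stand-ins, not values from the project code):

```python
import cv2
import numpy as np

def line_intersection(m1, b1, m2, b2):
    """Intersection of y = m1*x + b1 and y = m2*x + b2, or None if parallel."""
    if abs(m1 - m2) < 1e-6:
        return None
    x = (b2 - b1) / (m1 - m2)
    return int(x), int(m1 * x + b1)

frame = np.zeros((480, 640, 3), dtype=np.uint8)       # stand-in video frame
white_fit = (0.5, 100.0)                              # illustrative (slope, intercept)
yellow_fit = (-0.4, 400.0)
for (m, b), color in [(white_fit, (255, 0, 0)),       # blue = detected white line
                      (yellow_fit, (0, 255, 0))]:     # green = detected yellow line
    cv2.line(frame, (0, int(b)), (640, int(m * 640 + b)), color, 2)
pt = line_intersection(*white_fit, *yellow_fit)
if pt is not None:
    cv2.circle(frame, pt, 8, (0, 0, 255), -1)         # red circle at the intersection
```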

Presentation

Maximus Reisner

Modeled after MIT's Duckietown, NuVie Town provides a miniature environment in which small autonomous vehicles can use computer vision to follow roads and obey traffic signals. A ceiling-mounted camera provides local positioning, much as GPS gives a vehicle its current location and surroundings.

This project required learning how to code in Python (and a bit in C++) and how to use the OpenCV library, along with a few other smaller libraries, to make the robot run. Additionally, telling a robot exactly what to do is very different from directing a person: humans can interpret and deduce what is needed, while robots cannot. The coolest part, though, is that this project is a small-scale version of technologies that are emerging now and will be a prevalent part of human life in the near future.

The process for this project was fairly straightforward, and it felt like progress was made every day. That does not mean there were no challenges, though. For example, there were many issues with multiprocessing: it kept using all of the computer's memory and crashing the program. Even this was eventually solved through a different method of memory allocation. In the end, the final result was a stable program that could navigate the streets of Duckietown.
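The brief doesn't say exactly which allocation method fixed the crash, but one common approach to this kind of memory blow-up, sketched here only as an illustration (Python 3.8+), is to share one fixed frame buffer between processes instead of queueing a fresh copy of every frame:

```python
import numpy as np
from multiprocessing import Process, shared_memory

FRAME_SHAPE = (480, 640, 3)  # assumed camera resolution

def worker(buf_name):
    # Attach to the single shared buffer rather than receiving per-frame copies.
    shm = shared_memory.SharedMemory(name=buf_name)
    frame = np.ndarray(FRAME_SHAPE, dtype=np.uint8, buffer=shm.buf)
    print("frame mean:", frame.mean())  # stand-in for the real processing
    shm.close()

if __name__ == "__main__":
    shm = shared_memory.SharedMemory(create=True, size=int(np.prod(FRAME_SHAPE)))
    p = Process(target=worker, args=(shm.name,))
    p.start()
    p.join()
    shm.close()
    shm.unlink()
```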

Max's Duckiebot Video

Maximus Reisner

video

Aveen Nagpal

Final

Louie Adamian

Duckie Town is a system of miniature, artificially intelligent self-driving cars. The Local Positioning System is a navigation system that uses an overhead view from a webcam and unique visual identifiers (AR tags) to locate the Duckie bots.

The webcam reads the AR tags and computes their locations using ArUco, an open-source tag-tracking library. It then feeds their positions to a server where the bots can retrieve their locations. Learning computer vision and AI was challenging, and compiling open-source software together is much harder than it appears: every stack has to build and must not interfere with the other elements.
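A minimal sketch of that detection step using OpenCV's older aruco module; the dictionary choice and camera index are assumptions, since the brief doesn't record which tag family was used:

```python
import cv2

# Assumed 4x4 tag dictionary; the actual family used in the studio isn't stated.
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
params = cv2.aruco.DetectorParameters_create()

cap = cv2.VideoCapture(0)                      # the ceiling-mounted webcam
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict, parameters=params)
    if ids is not None:
        for tag_id, quad in zip(ids.flatten(), corners):
            cx, cy = quad[0].mean(axis=0)      # centroid of the tag's four corners
            print(f"tag {tag_id}: ({cx:.0f}, {cy:.0f}) px")
cap.release()
```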

At the beginning of the studio, we looked at the original Duckietown, which attempted to navigate by reading signs and street lines. I realized it would be beneficial to assist that navigation and recognition with some larger sense of location, something like GPS. With this sense of position, you could know the local speed limit, know where stop signs and intersections are, and actually navigate from point A to point B. Throughout the studio, I worked primarily on implementing software to identify the tags. I looked at multiple open-source packages and stacks for implementing them; creating a functional stack was challenging and became the main focus. I knew it wasn't feasible to implement all of the other navigation information associated with the location in the two weeks, so I focused on tag recognition and transmitting the location to the Duckie bot. Many of the stacks I tried failed to build. I finally settled on a virtual machine running Ubuntu with OpenCV and the ArUco library. To transmit that data, I used ROS, the Robot Operating System's communication suite, to create a topic and post the locations to it.
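A sketch of what posting those locations to a ROS topic can look like with rospy; the node name, topic name, and the get_tag_pose helper are all hypothetical stand-ins for the studio's actual code:

```python
import rospy
from geometry_msgs.msg import Pose2D

def get_tag_pose():
    """Hypothetical stand-in for the ArUco detection step above."""
    return 0.0, 0.0, 0.0  # x (m), y (m), heading (rad)

rospy.init_node("lps_server")
# Assumed topic name; the real one isn't given in the brief.
pub = rospy.Publisher("/duckiebot/lps_pose", Pose2D, queue_size=10)
rate = rospy.Rate(10)  # publish at 10 Hz
while not rospy.is_shutdown():
    x, y, theta = get_tag_pose()
    pub.publish(Pose2D(x=x, y=y, theta=theta))
    rate.sleep()
```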

Daria and Sina's Duckie Bot

Daria Plotz and Sina Ball


Daria's Brief: The DuckieBot is a self-driving robot that navigates Duckie Town, a miniature city. It was inspired by self-driving car technology, including lane, light, and color detection. The DuckieBot is made up of a simple plastic chassis, a Pi camera, and a Raspberry Pi computer that runs all the programs that control the bot. To control the robot, the Pi camera records a live video stream, which the Raspberry Pi processes according to pre-written Python programs. To detect traffic lights, it crops the image and adjusts the image's color values to find the contours of a traffic light; it then uses color recognition to see what color the light is. To detect lanes, the computer uses color recognition to find the center dashed yellow lines, right white lines, and red stop lines. It then uses Canny Edge Detection and the Hough Transform to find the edges of these lines. After combining all of the endpoints of the detected segments on each side of the lane marker, it finds the best-fit lines for both sets of points and averages them to determine the center of the lane marker, which the robot then follows.
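A simplified sketch of that lane-marker pipeline (color mask, Canny, Hough, then a best-fit line). For brevity it pools all segment endpoints into a single fit rather than fitting each edge of the marker and averaging, as the brief describes, and the HSV bounds and thresholds are illustrative rather than the project's tuned values:

```python
import cv2
import numpy as np

def fit_lane_marker(frame, lower_hsv, upper_hsv):
    """Return a best-fit [slope, intercept] through one colored lane marker."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)         # color recognition
    edges = cv2.Canny(mask, 50, 150)                      # Canny edge detection
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180,     # probabilistic Hough transform
                               threshold=20, minLineLength=10, maxLineGap=5)
    if segments is None:
        return None
    pts = segments.reshape(-1, 4)
    xs = np.concatenate([pts[:, 0], pts[:, 2]])           # pool all segment endpoints
    ys = np.concatenate([pts[:, 1], pts[:, 3]])
    return np.polyfit(xs, ys, 1)                          # best-fit line through them

frame = np.zeros((480, 640, 3), np.uint8)
cv2.line(frame, (300, 479), (360, 240), (0, 220, 255), 8)   # synthetic yellow line
print(fit_lane_marker(frame, np.array([20, 80, 80]), np.array([35, 255, 255])))
```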

The NuVie Town Studio teaches students a lot about coding and computer vision. Students end up knowing how to write complex color-detection, lane-detection, and other algorithms, while the studio remains accessible to people without a strong background in computer science. These algorithms are still powerful and rewarding, even at a basic level.

Sina's Brief:

The Duckie-Bot is a small, car-like robot that can drive autonomously to navigate a miniature town. It is inspired by self-driving cars and the computer vision behind them, including image, light, and color detection. An onboard Raspberry Pi controls the Duckie-Bot's two wheels and filters a camera feed from the front of the robot to detect key features of the road. To detect a feature, the feed is first cropped to the region of the image where the feature is located. Then it's converted to HSV (hue, saturation, and value), a different way of representing colors that makes it easier to detect lights and contrast. The third set of filters varies with the feature being detected, but in each case it looks for contrasting regions of a specific color in the image. For streets, that might be a yellow line that contrasts against the black street. For lights, it could be a bright green light against the background. Once the feature is detected, the robot determines what to do with the information, for example, turning to follow the line or stopping at the red light, and then powers the motors that drive or stop the corresponding wheels.
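A sketch of that crop-then-HSV filtering for the traffic-light case; the crop region, HSV thresholds, and pixel-count cutoff are illustrative guesses rather than the project's tuned values:

```python
import cv2
import numpy as np

def light_color(frame):
    """Classify a traffic light as 'red', 'green', or 'none' by masked pixel count."""
    roi = frame[0:160, 200:440]                           # crop to assumed light region
    hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    # Illustrative HSV bounds; real thresholds need tuning to the room's lighting.
    green = cv2.inRange(hsv, (45, 80, 80), (75, 255, 255))
    red = cv2.inRange(hsv, (0, 80, 80), (10, 255, 255))
    if cv2.countNonZero(green) > 50:
        return "green"
    if cv2.countNonZero(red) > 50:
        return "red"
    return "none"

frame = np.zeros((480, 640, 3), np.uint8)
cv2.circle(frame, (320, 80), 20, (0, 255, 0), -1)         # synthetic green light
print(light_color(frame))                                  # -> "green"
```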

The Duckie-Bot provides students with a basic model of a self-driving vehicle without the expense. Operating the robot requires knowledge of computer vision: how to filter images to detect key features, decide what to do with the information gathered, and program the navigation. The Duckie-Bot shows that computer vision and self-driving technology are easily accessible, and it makes the artificial intelligence behind them seem less like a mystery and more like a puzzle whose pieces just need to be put in the right place.

made from pi to export photos

Maximus Reisner

Project board

Maximus Reisner

Ducky-bot

Richard Lourie

The DuckieBot is a self-driving robot that can navigate through a small-scale town. Inspired by self-driving cars that are already on the market, the DuckieBot is a mechanically simple robot that drives around while recording video, which the onboard Raspberry Pi processes to steer it. A camera sends information to the computer, which identifies lanes and directs the robot to drive within them according to the rules of the road.

While making this project, I gained a lot of insight into how self-driving cars work and how difficult they are to engineer. It's actually relatively easy to make a robot that can drive around in a perfect world. The hard part of making self-driving cars is handling all the low-probability edge cases in which something unexpected happens. For example, what if something that looks like a lane to a robot falls onto the track? The robot would begin tracking it instead of the actual lanes. To make a safe self-driving car, all of these low-probability events need to be accounted for.



for the streets

Aveen Nagpal

The Duckiebot is a vehicle that can navigate the streets of Duckie Town either under remote control or under the control of a Raspberry Pi. The ability to control itself allows the Duckiebot to be mostly self-sufficient, needing to stop to charge only every 12 to 18 hours. The Duckiebot uses its indicator lights to communicate its next action in a way that human drivers can understand. It is powered by two individual motors mounted parallel to each other, which allows it to turn; a ball caster provides stability by giving the Duckiebot a third point of contact with the ground. The Duckiebot was made to teach NuVu students how self-driving cars work and how to build and use a Raspberry Pi, in preparation for more complex future projects.
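A minimal sketch of the differential-drive idea the brief describes, where steering comes entirely from the speed difference between the two parallel motors; the function and value ranges here are hypothetical, not taken from the project:

```python
def wheel_speeds(forward, turn):
    """Mix a forward command and a turn command into left/right motor speeds.

    forward and turn are in [-1, 1]; a positive turn steers right by
    speeding up the left wheel and slowing the right one.
    """
    left = max(-1.0, min(1.0, forward + turn))
    right = max(-1.0, min(1.0, forward - turn))
    return left, right

print(wheel_speeds(0.5, 0.0))   # straight ahead: (0.5, 0.5)
print(wheel_speeds(0.5, 0.3))   # gentle right: (0.8, 0.2)
print(wheel_speeds(0.0, 1.0))   # pivot in place: (1.0, -1.0)
```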