Session 3: Aug 3-14, 2020
The robot future is near. Artificial intelligence is becoming ubiquitous in today's society, from the factory floor to our homes. Sometimes learned and sometimes hardcoded, the intelligence that informs the behavior of devices such as self-driving cars, toys, or nursebots functions in a narrow, pre-defined domain. But as the complexity of our systems grows, as self-driving cars become truly autonomous, or as our passive toys start to become playmates, our interactions with robots will become ever more important and intimate. What if robots could understand human nuance and react to it? How might this affect the way we treat the robot, and vice versa? This studio is an exploration of what makes something conscious, and of how our belief that something is conscious shapes the way we interact with it.
In this studio, students will work with experts in human-robot interaction to create a robot that uses sensors and effectors to emulate a more “human” interaction. Students will use microcontroller electronics, 3D modeling software, and prototyping techniques to design, build, and activate their social robots. They will learn how to integrate external sensors (from simple switches and buttons to heat/temperature, light, gas, and touch sensors) and actuators (such as motors, lights, speakers, solenoids, valves, and fans) into their designs to create responsive social robots.
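The core pattern behind such responsive behaviors is a sense-decide-act loop: read sensors, choose a behavior, drive actuators. A minimal sketch of that idea is below; the sensor names, thresholds, and behavior labels are illustrative assumptions, not part of the curriculum, and the simulated readings stand in for real microcontroller calls like Arduino's digitalRead()/analogRead().

```python
# Minimal sense-decide-act loop for a social robot, simulated in software.
# On real hardware, the readings would come from microcontroller pins and
# each behavior would drive an actuator (motor, LED, speaker, etc.).

def choose_behavior(touched: bool, light_level: int) -> str:
    """Map sensor readings to a behavior (thresholds are illustrative)."""
    if touched:
        return "purr"          # e.g. pulse a vibration motor
    if light_level < 200:      # dark room -> comfort behavior
        return "glow"          # e.g. fade an LED up and down
    return "idle"              # no stimulus -> rest

# Simulated (touch, light) sensor readings standing in for pin reads
readings = [(True, 512), (False, 100), (False, 800)]
behaviors = [choose_behavior(t, l) for t, l in readings]
print(behaviors)  # ['purr', 'glow', 'idle']
```

The same structure scales up: richer sensors feed the decision step, and the chosen behavior fans out to multiple actuators in sequence.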
Physics (Electricity, Magnetism)
Robotics (Arduino, Sensors, Actuators)
- Enrolling students must be one of the following:
- High School Student
- Post-High School Gap Year Student