We spent the first two weeks prototyping and brainstorming, mostly on the first iteration of the robotic arm, using an iPhone as a stand-in for the monitor. In the first week we spent a lot of time learning Unity so we could create scenes for the device, and exploring precedents to base them on. In the second week we worked on the mechanics.
The arm is composed of four servos: one for the base joint, one for the mid joint, and two for the top joint that connects to the iPhone. We originally planned to use only three, but after the first attempt we realized the project was greatly improved by another axis of motion for the monitor, letting it follow the head at all angles. We then faced the challenge of supporting the weight of all the servos at the base while still holding the phone steady at the top of the arm. We settled on a series of servos that get smaller toward the top, with 3D-printed parts connected to the smaller servos to stabilize them and attach them to the iPhone.
Once this was fully assembled and wired up, we decided to simulate the movement with an Xbox controller, since we knew the head-tracking technology wouldn't be done for quite some time. We spent about a day troubleshooting and tuning the speed, acceleration, and sensitivity of the controls. This proved important: early on, parts were being torn apart by the motors moving too far and too fast. In the end we had slow, steady controls that suited our somewhat flimsy first prototype.
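The tuning described above can be sketched in code. This is a minimal, hypothetical illustration (not our actual firmware): it assumes each servo channel gets a joystick axis value in [-1, 1] every control tick, scales it by a sensitivity factor, and then slew-limits the commanded angle so the servo can never jump far enough in one tick to tear the linkage apart. The class and parameter names here are invented for the example.

```python
def slew_limit(current, target, max_step):
    """Move current toward target by at most max_step degrees per tick."""
    delta = target - current
    if delta > max_step:
        delta = max_step
    elif delta < -max_step:
        delta = -max_step
    return current + delta


class ServoChannel:
    """Hypothetical rate-limited servo channel driven by one stick axis."""

    def __init__(self, angle=90.0, max_step=2.0, sensitivity=10.0,
                 lo=0.0, hi=180.0):
        self.angle = angle            # current commanded angle, degrees
        self.max_step = max_step      # max change per control tick (speed cap)
        self.sensitivity = sensitivity  # degrees offset at full stick deflection
        self.lo, self.hi = lo, hi     # mechanical limits of the joint

    def update(self, stick):
        """stick in [-1, 1]; returns the new, rate-limited servo angle."""
        target = self.angle + stick * self.sensitivity
        target = max(self.lo, min(self.hi, target))   # respect joint limits
        self.angle = slew_limit(self.angle, target, self.max_step)
        return self.angle
```

With `max_step=2.0`, holding the stick fully over moves the joint only two degrees per tick no matter how hard it is pushed, which is the "slow and steady" behavior we converged on; lowering `sensitivity` was the equivalent of our sensitivity tuning.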