https://cambridge.nuvustudio.com/studios/augmented-reality-games-s3/ar-chess#tab-portfolio
Our Language Learning app is meant to assist language students and educators by providing a way to immediately translate the names of the most common objects around them.
We designed the app to recognize objects in real time and give the user a translation of each object in the language they are learning. To accomplish this, we used augmented reality to recognize different image targets and, for each one, display a three-dimensional model of the object (created in Maya) along with text showing its Spanish translation. This gives a broad audience of users who want to become more comfortable with a language an easy way to identify the objects that matter most in their daily lives.
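The per-object behavior can be sketched as a small Unity script. This is only a minimal illustration, not our exact project code: the class and field names (TranslationDisplay, objectModel, translationLabel) and the example word are hypothetical, and the ShowTranslation/HideTranslation methods would be wired to whatever "target found" and "target lost" events the AR image-tracking plugin provides.

```csharp
using UnityEngine;

// Hypothetical sketch of the per-object behavior described above.
// It assumes the AR plugin calls ShowTranslation() when the image target
// is recognized and HideTranslation() when tracking is lost.
public class TranslationDisplay : MonoBehaviour
{
    public GameObject objectModel;      // 3D model of the object, made in Maya
    public TextMesh translationLabel;   // floating text for the Spanish word
    public string spanishTranslation = "la silla";  // example translation only

    void Start()
    {
        // Keep everything hidden until the image target is actually seen.
        SetVisible(false);
    }

    public void ShowTranslation()
    {
        translationLabel.text = spanishTranslation;
        SetVisible(true);
    }

    public void HideTranslation()
    {
        SetVisible(false);
    }

    void SetVisible(bool visible)
    {
        objectModel.SetActive(visible);
        translationLabel.gameObject.SetActive(visible);
    }
}
```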
People with autism often have a hard time interacting with others because it is difficult for them to read another person's emotions. This can lead to awkward conversations and interactions, such as a person with autism acting cheerful around someone who is sad, not realizing how that person feels. The Emotion Interpreter is an app that lets its user "scan" a person's face and immediately get a description of that person's current emotions, along with the facial cues that indicate them, so the user can learn to detect emotions on their own. The app also provides interaction hints, so the user knows what to do when someone is in a certain emotional state.
As we were brainstorming ideas, we saw examples of emotion detectors that scanned a person's face and identified an emotion from their facial expression. We were interested in this and thought about ways the technology could be adapted to help others. Since people with autism often have a hard time knowing what emotions other people are feeling, which can lead to awkward conversations, we wanted to use the idea of a facial-expression scanner to help them.

We quickly realized that the tools we were using, Maya and Unity, made it very hard to build something as complicated as a true emotion detector, so we decided to stick with those tools and make something that worked in a similar way. We gathered pictures from online of people looking happy, sad, and angry, took our own pictures of Zach making happy, sad, and angry faces, and put them in Unity so that when one of the pictures was scanned, a popup would appear with the emotion, its indications, and interaction hints. We tried matching objects, and we tried having Zach make the same face as a picture to trick the app into thinking he was the picture; this turned out to be almost impossible, so we shifted our attention to the problem that the application didn't look as good as we wanted.

We played around with animations and eventually developed a cube with the emotion and its indications on one side and the emotion and interaction hints on the other, rotating slowly so the text could be read. Even though we didn't achieve everything we set out to do, we hope that applications similar to the Emotion Interpreter can help people with autism in the future.
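For anyone curious how the rotating cube behaves in Unity, a minimal sketch is below. It is an illustration rather than our actual project code: the class name EmotionInfoCube and the default speed are made up, and the emotion text itself is assumed to be placed on the cube's faces in the Unity editor.

```csharp
using UnityEngine;

// Hypothetical sketch of the slowly rotating info cube: one face shows the
// emotion and its indications, the opposite face the interaction hints.
// Attach this to the cube; the text on its faces is set up in the editor.
public class EmotionInfoCube : MonoBehaviour
{
    public float degreesPerSecond = 20f;  // slow enough that the text stays readable

    void Update()
    {
        // Spin a little each frame around the vertical axis.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```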