Five teammates and I developed a native iOS application designed to help children with autism practice emotion recognition. The app was built in a four-week time frame.
As a Swift developer on the project, I contributed to general development of the app, but my main focus was the integration with the Microsoft Azure Face API. Using the face detection and emotion analysis this API provides, I overlaid emojis onto the photos we sent to the service. The process was involved, but the solution I built could take a picture containing any number of faces and place the emotion Azure reported onto each person in that photo.
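A minimal sketch of that flow is below, assuming the standard Azure Face `detect` REST endpoint with `returnFaceAttributes=emotion`; the endpoint URL, subscription key, and the emoji mapping are placeholders, not the project's actual values.

```swift
import UIKit

// Hypothetical values; the real project supplies its own Azure resource endpoint and key.
let detectURL = URL(string: "https://YOUR-RESOURCE.cognitiveservices.azure.com/face/v1.0/detect?returnFaceAttributes=emotion")!
let subscriptionKey = "YOUR-SUBSCRIPTION-KEY"

// Only the parts of the Face API response this sketch needs.
struct DetectedFace: Decodable {
    struct Rect: Decodable { let top, left, width, height: Int }
    struct Attributes: Decodable { let emotion: [String: Double] }
    let faceRectangle: Rect
    let faceAttributes: Attributes
}

// Example mapping from Azure's emotion names to emojis.
let emojiForEmotion: [String: String] = [
    "happiness": "😀", "sadness": "😢", "anger": "😠", "surprise": "😮",
    "fear": "😨", "disgust": "🤢", "contempt": "😒", "neutral": "😐"
]

// Send the photo to Azure, then draw an emoji over each detected face rectangle.
func annotate(_ image: UIImage, completion: @escaping (UIImage?) -> Void) {
    guard let jpeg = image.jpegData(compressionQuality: 0.8) else { return completion(nil) }

    var request = URLRequest(url: detectURL)
    request.httpMethod = "POST"
    request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")
    request.setValue(subscriptionKey, forHTTPHeaderField: "Ocp-Apim-Subscription-Key")
    request.httpBody = jpeg

    URLSession.shared.dataTask(with: request) { data, _, _ in
        guard let data = data,
              let faces = try? JSONDecoder().decode([DetectedFace].self, from: data) else {
            return completion(nil)
        }

        // Redraw the original photo, then one emoji per face at that face's rectangle.
        let renderer = UIGraphicsImageRenderer(size: image.size)
        let annotated = renderer.image { _ in
            image.draw(at: .zero)
            for face in faces {
                // Pick the emotion Azure scored highest for this face.
                let topEmotion = face.faceAttributes.emotion.max { $0.value < $1.value }?.key ?? "neutral"
                let emoji = emojiForEmotion[topEmotion] ?? "😐"
                let rect = CGRect(x: CGFloat(face.faceRectangle.left),
                                  y: CGFloat(face.faceRectangle.top),
                                  width: CGFloat(face.faceRectangle.width),
                                  height: CGFloat(face.faceRectangle.height))
                (emoji as NSString).draw(in: rect,
                                         withAttributes: [.font: UIFont.systemFont(ofSize: rect.height)])
            }
        }
        completion(annotated)
    }.resume()
}
```

Because the response is just an array of face rectangles with emotion scores, the same loop handles one face or many, which is what let the feature scale to group photos.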