Category Archives: Project 2

Project 2: Judgmental Robot

Group members: Matthew Conlen, Michael Gisi, Lauren Korany. Context: For project 2, our goal was to comment on the concept of judging people based on physical characteristics. Starting from the cultural norm of ‘attractiveness’, we attempted to explore the nature of ratings through performance/interactive art. By using machine-learning classification code, we can create [...]

Project 2 – Global Perspective

Our project started as an attempt to compare the similarities and differences of RSS feeds from news sources around the world. Originally we thought to sort the stories from each RSS feed into general categories of business, sports, etc. and display the importance and number of stories in each category for the country of origin. However, upon finding a [...]
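The post is cut off above, but as a rough illustration of the kind of pipeline described, here is a minimal Python sketch that pulls headlines from a few RSS feeds and buckets them into broad categories by keyword matching. The feed URLs, the category keywords, and the `feedparser` dependency are assumptions for the example, not details from the project.

```python
# Illustrative sketch only: sort RSS headlines into rough topic buckets
# by keyword matching. Feed URLs and keyword lists are placeholders.
import feedparser

FEEDS = {
    "US": "http://example.com/us/rss",   # placeholder URLs
    "UK": "http://example.com/uk/rss",
}

CATEGORIES = {
    "business": ["market", "economy", "stocks", "trade"],
    "sports":   ["match", "game", "league", "championship"],
    "politics": ["election", "parliament", "senate", "minister"],
}

def categorize(title):
    """Return the first category whose keyword appears in the headline."""
    lowered = title.lower()
    for category, keywords in CATEGORIES.items():
        if any(word in lowered for word in keywords):
            return category
    return "other"

for country, url in FEEDS.items():
    counts = {}
    for entry in feedparser.parse(url).entries:
        category = categorize(entry.title)
        counts[category] = counts.get(category, 0) + 1
    print(country, counts)
```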

Project 2 :: Emotion Music

Andrew Hainen, Nikhil Mangla, Syed Wajahat Karim. Our project is a machine learning application that learns which emotion to associate with a song based on the song's characteristics. Conceptual Breakdown: Our idea was to have a computer react to music realistically, the way a human would. [...]
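As a hedged illustration of the idea in this excerpt, the sketch below trains a k-nearest-neighbours classifier (via scikit-learn) to map a few hand-picked song features to an emotion label. The features, numeric values, and labels are invented for the example; the post does not say how the group actually represented songs.

```python
# Illustrative sketch only: map made-up song features
# (tempo in BPM, loudness 0-1, major/minor key) to an emotion label.
from sklearn.neighbors import KNeighborsClassifier

# [tempo, loudness, is_major_key]
training_songs = [
    [140, 0.9, 1],   # fast, loud, major
    [150, 0.8, 1],
    [ 70, 0.3, 0],   # slow, quiet, minor
    [ 60, 0.2, 0],
    [100, 0.5, 1],
    [ 95, 0.4, 0],
]
training_emotions = ["happy", "happy", "sad", "sad", "calm", "calm"]

model = KNeighborsClassifier(n_neighbors=3)
model.fit(training_songs, training_emotions)

# Predict an emotion for a new, unheard song.
new_song = [[130, 0.7, 1]]
print(model.predict(new_song)[0])
```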

Project 2 – Twitter Haiku

Adam Kidder, Kevin Shih, Cassandra Yaple. I. Conceptual Overview: This project aims to create art by challenging one of the deepest forms of poetic expression with what is considered one of the most pointed, shallow forms of communication on the web. By creating haikus computationally with machine learning algorithms, we [...]
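The excerpt is truncated, but the following Python sketch shows one naive way a 5-7-5 haiku could be assembled from short tweet-like phrases: a crude vowel-group heuristic estimates syllables, then a phrase matching each line length is picked greedily. The heuristic, the helper names, and the sample phrases are all assumptions, not the group's actual algorithm.

```python
# Illustrative sketch only: estimate syllables with a rough vowel-group
# heuristic and assemble phrases into the 5-7-5 haiku form.
import re

def count_syllables(word):
    """Rough syllable estimate: count groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def line_syllables(line):
    return sum(count_syllables(w) for w in re.findall(r"[a-zA-Z']+", line))

def build_haiku(phrases):
    """Greedily pick one 5-, one 7-, and one 5-syllable phrase."""
    haiku = []
    for target in (5, 7, 5):
        match = next((p for p in phrases if line_syllables(p) == target), None)
        if match is None:
            return None          # no suitable phrase found
        haiku.append(match)
        phrases = [p for p in phrases if p != match]
    return "\n".join(haiku)

tweets = ["rain on my window", "a quiet morning bus ride", "cold wind in the park"]
print(build_haiku(tweets))
```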

Project 2: “Possessive Programming”

Dylan Box, Nick Peters, Josh Winters. Concept: The development of emotional machines is an important step in bridging the gap between human and robotic interaction. Emotional machines allow us to feel empathy for the robot, producing interactions that work toward the interests of both the user and the object, preserving each party’s integrity of purpose. [...]

Project 2- Human beat

Alyssa and Andy. Human Beat (video: MVI_0824). Originally, we were very interested in ideas of confidence and what a confident or not-confident body looks like in terms of posture. We wanted to use machine learning to classify when a body was in a confident stance versus a not-confident stance. We discussed whether we would teach the [...]
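As a minimal sketch of the classification idea mentioned above, the code below hand-rolls a nearest-centroid classifier over two invented posture features (shoulder openness and how upright the torso is). The features, numbers, and labels are placeholders; the post does not describe the group's actual approach.

```python
# Illustrative sketch only: classify posture as "confident" or
# "not confident" by distance to the centroid of labeled examples.
def centroid(points):
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def distance_sq(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# [shoulder_openness, upright_ratio] -- invented example values
confident_examples     = [[0.90, 0.95], [0.85, 0.90]]
not_confident_examples = [[0.40, 0.60], [0.50, 0.65]]

centroids = {
    "confident": centroid(confident_examples),
    "not confident": centroid(not_confident_examples),
}

def classify(pose):
    return min(centroids, key=lambda label: distance_sq(pose, centroids[label]))

print(classify([0.8, 0.88]))   # -> confident
```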

Project 2 – Can you make Bart smile?

Playing with a computer – Can you make Bart smile? video: http://adaptiveart.eecs.umich.edu/2010/blog/wp-content/uploads/2010/11/proj2-bart.swf Overview: Our project reads people’s gestures, using a webcam as input, and outputs emotions displayed on screen as an animated face. The face on the display is not mirroring the individual using the program, but is acting in response to the [...]
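The excerpt is truncated, but as a rough sketch of the webcam-input idea, the code below uses OpenCV to read frames and maps the amount of frame-to-frame motion to a coarse emotion label, the sort of signal an animated face could respond to. The motion thresholds and emotion names are arbitrary assumptions, not details from the project.

```python
# Illustrative sketch only: read webcam frames and turn average
# frame-to-frame pixel change into a coarse "emotion" label.
import cv2

capture = cv2.VideoCapture(0)                      # default webcam
ok, previous = capture.read()
previous = cv2.cvtColor(previous, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    motion = cv2.absdiff(gray, previous).mean()    # average pixel change
    previous = gray

    if motion > 20:
        emotion = "excited"
    elif motion > 5:
        emotion = "curious"
    else:
        emotion = "bored"
    print(emotion)

    cv2.imshow("camera", frame)
    if cv2.waitKey(30) & 0xFF == ord('q'):         # press q to quit
        break

capture.release()
cv2.destroyAllWindows()
```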