Team: Drew Pompa
Originally, I set out to create a type of “digital pet.” While this idea brings to mind images of Tamagotchi and Pokemon, I was aiming for something more real. The plan was to project a digital pet into a very realistic pet environment: an aquarium, fully equipped with wood shavings and other details to complete the scene. For interaction, I had a stereo mic set up (hidden) against the front wall of the aquarium, which could detect when someone was tapping on the glass, estimate roughly where the tap occurred, and trigger action in the pet. The system worked pretty well when I was testing it in a quiet room, but it was very sensitive to background noise, and I assumed Tishman Hall would be full of it. Despite my efforts at thresholding, I couldn’t seem to produce a “tight” response from the pet. Therefore, I had to change how the viewer interacts with the piece. I updated the concept, adding a mouse-based level of interactivity. However, this seemed to destroy the “analog” feel of the project. I finally decided to go back to my original, sound-based design (although slightly altered and with less equipment), and one all-nighter later, it was complete.
In this project, I aimed to explore the boundaries between the digital and the real through the notion of pets. Presented with a box containing a “pet,” how would users interact with it? Would their decisions be based on experience with real pets? I wanted to find out, and challenge people to interact with a small and simple piece.
The major technical component of the project deals with sound. Taps are detected using audio thresholding: three thresholds are defined, and exceeding each one triggers a different emotional effect in the pet. A rough estimate of tap direction was also included, but it only activates when there is a significant discrepancy between the left and right audio channels. In practice this rarely happened, since the mics sit so close together that both channels usually record about the same level. When direction detection is activated, the pet favors that side (left or right) of the screen while exploring the source of the tap.
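The tap-detection logic described above might be sketched roughly as follows. The threshold values, margin, and names here are my own illustrative assumptions, not the project’s actual code:

```java
// Hypothetical sketch of three-level tap thresholding plus channel-difference
// direction detection. All constants and names are assumptions for illustration.
public class TapDetector {
    // Three ascending amplitude thresholds (normalized 0..1); exceeding each
    // one maps to a progressively stronger emotional reaction in the pet.
    static final float MILD = 0.2f;
    static final float STRONG = 0.5f;
    static final float STARTLE = 0.8f;
    // Direction is only reported when the channels differ noticeably.
    static final float DIRECTION_MARGIN = 0.15f;

    /** Returns 0 (no tap), or 1..3 for the strongest threshold exceeded. */
    public static int tapLevel(float left, float right) {
        float peak = Math.max(Math.abs(left), Math.abs(right));
        if (peak >= STARTLE) return 3;
        if (peak >= STRONG) return 2;
        if (peak >= MILD) return 1;
        return 0;
    }

    /** Returns -1 (left), +1 (right), or 0 when the channels are too similar. */
    public static int tapDirection(float left, float right) {
        float diff = Math.abs(left) - Math.abs(right);
        if (diff > DIRECTION_MARGIN) return -1;  // louder on the left mic
        if (diff < -DIRECTION_MARGIN) return 1;  // louder on the right mic
        return 0;  // mics too close together: no usable direction cue
    }

    public static void main(String[] args) {
        System.out.println(tapLevel(0.6f, 0.55f));     // 2: "strong" tap
        System.out.println(tapDirection(0.6f, 0.55f)); // 0: channels too close
        System.out.println(tapDirection(0.9f, 0.3f));  // -1: tap from the left
    }
}
```

The zero return from `tapDirection` matches the behavior described above: with the mics nearly co-located, most taps land inside the margin and produce no direction cue at all.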
The rest of the code animates and updates the pet. Routine procedures include moving it to a new location, changing its color and expression, making it blink, and other actions. None of these methods was especially difficult to write, but it was a challenge to call them sparingly enough to keep the program running smoothly while still presenting the “realistic” actions of a creature.
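One way to call those routines sparingly is to gate each action behind a cooldown plus a small per-frame probability, so behavior reads as spontaneous rather than clockwork. This is a minimal sketch of that idea; the intervals, probabilities, and method names are assumptions, not the project’s actual code:

```java
import java.util.Random;

// Hypothetical pacing loop for the pet's routine actions. The counters are
// exposed only so the pacing is observable from outside.
public class PetUpdater {
    private final Random rng = new Random();
    private long lastMoveMs = 0;
    private long lastBlinkMs = 0;
    public int moveCount = 0;
    public int blinkCount = 0;

    /** Called once per frame with the current time in milliseconds. */
    public void update(long nowMs) {
        // Move only after a cooldown, and then only with a small chance
        // per frame, so the pet wanders at irregular intervals.
        if (nowMs - lastMoveMs > 4000 && rng.nextFloat() < 0.02f) {
            moveToNewLocation();
            lastMoveMs = nowMs;
            moveCount++;
        }
        // Blinks are cheaper, so they can fire more often.
        if (nowMs - lastBlinkMs > 1500 && rng.nextFloat() < 0.05f) {
            blink();
            lastBlinkMs = nowMs;
            blinkCount++;
        }
    }

    private void moveToNewLocation() { /* pick and walk to a random spot */ }
    private void blink() { /* close and reopen the eyes over a few frames */ }
}
```

Because expensive actions like moving are both rate-limited and randomized, the draw loop stays cheap on most frames, which is what keeps the program running smoothly.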
The hardware involved with this project is quite conventional. The “black box” itself is a cardboard box with black felt glued onto it. Inside is a computer monitor, which receives the video from the Processing code on my laptop via VGA. All sound is recorded using the laptop’s internal stereo mic, since the laptop is also inside the box.
I’m very happy with the project, and I think viewers enjoyed it too. It was simple and easy to use, yet autonomous: it essentially challenged you to interact with it. I’m especially pleased with the overall look of the project. By stripping away unnecessary details and reducing the visuals to the pixel level, I feel I’ve actually strengthened the aesthetic considerably. Framing it in a simple black box and presenting it in a “no-nonsense” manner on a table emphasized the piece’s simplicity. I feel the linked video below displays these results quite well.