Smart Trash – Physical Computing Final

Katie and I worked together on this project as our Physical Computing Final.

This project started as an attempt to apply what we had learned in our physical computing class to street furniture. Both Katie and I are fairly urban-oriented (natives of Detroit and NYC, respectively) and were eager to connect our new ITP knowledge with our prior interests. The trash can emerged as our point of focus since it's probably the most ubiquitous form of street furniture and one of the least interactive. Over the course of this project I think we learned valuable lessons about our approach to interactive work and about completing physical projects in general.


Thinking:

Once we had selected the trash can as our object of interest, we spent some time looking at different types of cans, reviewing recent ideas in this area, and doing some field research.


We reached a number of important conclusions about trash behaviors:

  • Many people do not want to hold onto “garbage” for an extended period of time
  • Many people do not want to litter
  • Many people do not want to touch a public trash can and will avoid doing so even if it means littering
  • Many people would like to recycle but are frustrated by the lack of trash differentiation

These user behaviors work together to produce some strange results:

One of the most technological trash cans in widespread use is the Big Belly solar compactor. The advantage of these cans is that they are self-contained (self-powered), aware of how full they are, relatively easy to clean out, and, most importantly, compact the garbage as the can fills. This decreases both the number of trash receptacles required for a given area and the frequency at which they need to be cleaned out, which saves a lot of time and money for any agency responsible for keeping streets clean. The glaring failure of the Big Belly trash cans, in our opinion (and others'), is that they require the user to grab a handle to open and close a little door to deposit their trash. As a result, this has become a fairly common scene in NYC neighborhoods where these have been deployed:

(Photo: thanks to Jordan Frand)

In addition to user behavior we also identified a number of other key points about street trash:

  • Geography is important – e.g. trash receptacles near a fast-food restaurant will overflow with trash related to that business while an empty can sits just down the street
  • Collection is the primary concern – Getting people to recycle or put their trash in the trash can doesn't matter if there's no one to pick it up and deal with it.
  • Cost and durability – Street furniture needs to be durable and affordable on a large scale

Based on these findings we came up with a few directions for improvement. We wanted the can to open and close as a way to better contain the garbage inside it, to protect it from wind and rain, and to indicate if the can was full. We knew we wanted this opening and closing to be hands-free and to be interesting and entertaining for people, so that they thought more about their interaction with the garbage can. We wanted to build a system where additional components could be attached easily depending on the needs of a specific geography. One example of this kind of attachment in action is a tube for used coffee cups made by Sandra Hoj.

We also wanted this “system” to sit on top of existing garbage cans in NYC. In reviewing the types of trash cans suitable for street use, we discovered that the common light metal-mesh cans cost around $100-250, while heavier cans that don't fall over as easily can cost up to $850. We thought this gave us an interesting opportunity: a product that attached to the lighter, more easily moved and cleaned models could keep the total cost below that of a heavier model. Lastly, in follow-up conversations with our professor, Tom Igoe, we decided that data collection would be an important goal as well. Measuring the weight and volume of garbage would give us a diagnostic tool for determining which attachments or design changes worked and which didn't.

 

Making:

In developing our product we had to repeatedly revise, scale back, and pivot in many areas. Our final product ended up being very different from what we set out to do, but going from concept to creation was definitely educational. We initially focused on developing the lid and opening mechanism that we thought would serve as the base for our motor, sensors, and other attachments. In thinking about opening and closing mechanisms, we wanted something that was visually interesting but also quick enough to prevent garbage from spilling out if the can was knocked over. Ultimately we landed on trying to produce an iris diaphragm. Our first attempts at building an iris were promising, but as we scaled up we ran into a constant stream of problems.

 

(Photo: our first iris)

 

We made more than four different irises, with a ton of help from our classmate Franklin, using different materials and designs, but each of them failed for one reason or another (sources we used for iris design: 1, 2, 3). With certain designs the issue seemed to be materials, but then it would be hardware or problems with the gears for the motor. Honestly, I don't really know why we had so much trouble with the iris, but I suspect the designs we were using didn't scale up to the size we needed at our level of precision. Lacking a base on which to build the rest of our trash system, and with each iris taking a long time to build, we were left scrambling. While we had most of the electronics laid out and programmed, having nothing to attach them to left us at a dead end. (Video of electronics coming)

Using what we had available, we built a different type of closure and constructed a box around our prototype trash can to house the electronics.


Due to our time crunch we focused primarily on the opening and closing interaction. We had the lid slide open using a stepper motor, stop, determine whether there was still motion in the view of a motion sensor, and, if not, close again. In addition, we used a trio of FSR sensors on the bottom of the can to detect changes in weight. If the can detected that something had been placed inside, it would let out a short three-tone sound to let the user know that their contribution was welcome. Lastly, we had a tilt ball sensor that would disable the opening and closing response if the can was not right side up. Here is a video of the final product in action (SmarTrash).
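For anyone curious about the logic, here is a minimal sketch of that behavior in Arduino-style C++. The pin assignments, step counts, and thresholds below are illustrative assumptions rather than the exact values from our build.

```cpp
// Minimal sketch of the SmarTrash lid logic. Pins, step counts, and
// thresholds are assumptions for illustration, not the shipped values.
#include <Stepper.h>

const int STEPS_PER_REV = 200;                 // assumed stepper resolution
Stepper lidStepper(STEPS_PER_REV, 8, 9, 10, 11);

const int PIR_PIN    = 2;                      // motion sensor watching the front of the can
const int TILT_PIN   = 3;                      // ball tilt switch: LOW when upright (assumed wiring)
const int BUZZER_PIN = 4;
const int FSR_PINS[3] = {A0, A1, A2};          // three FSRs under the can floor

const int WEIGHT_DELTA = 50;                   // change in summed FSR readings that counts as a deposit
int  baselineWeight = 0;
bool lidOpen = false;

int readWeight() {
  int total = 0;
  for (int i = 0; i < 3; i++) total += analogRead(FSR_PINS[i]);
  return total;
}

void chirp() {                                 // short three-tone "thank you"
  int notes[3] = {523, 659, 784};
  for (int i = 0; i < 3; i++) { tone(BUZZER_PIN, notes[i], 120); delay(150); }
}

void setup() {
  pinMode(PIR_PIN, INPUT);
  pinMode(TILT_PIN, INPUT_PULLUP);
  lidStepper.setSpeed(60);
  baselineWeight = readWeight();
}

void loop() {
  if (digitalRead(TILT_PIN) != LOW) return;    // can is knocked over: do nothing

  if (!lidOpen && digitalRead(PIR_PIN) == HIGH) {
    lidStepper.step(STEPS_PER_REV);            // slide the lid open
    lidOpen = true;
  } else if (lidOpen && digitalRead(PIR_PIN) == LOW) {
    lidStepper.step(-STEPS_PER_REV);           // no more motion: close again
    lidOpen = false;
  }

  int w = readWeight();
  if (w - baselineWeight > WEIGHT_DELTA) {     // something new landed in the can
    chirp();
    baselineWeight = w;
  }
  delay(100);
}
```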

While I certainly would have preferred to build something closer to our original concept, I am grateful for the experience I gained in this attempt and am happy with the final product as a demonstration of the skills I've acquired in physical computing and elsewhere at ITP this semester.

Astor Place Documentary – Final Project

Working with Jordan, Osama, and Yiting from my sound and video class, we created a short documentary on the history of the Astor Place subway station.

A Brief History of Astor Place from Yiting Liu on Vimeo.

Creating this video used each of the skills we learned in class. We started out by storyboarding our concept, then shooting the various frames. Once we had a good deal of footage we began editing it into a cohesive film. At this point we recorded much of the sound for the project in the form of narration and shot a few extra clips that we felt were needed or that served as replacements for bad shots. After a couple of days of staring at a computer screen and scrubbing back and forth, we finally added some background music, transitions, location bubbles, and titles. Ultimately, I think this was a great exercise in the skills we learned over the first half of the semester. I am happy with the final project and look forward to producing more media like this during my time at ITP.

Honk Box – ICM Final

Here is a demo version of the Honk Box with pre-recorded street sounds and honking (Honk Box) (video).

Overview and origin:
Honk Box detects, records, and reacts to car horns. As Honk Box hears more and more honking, it updates its expression to reflect approval, disappointment, annoyance, or sadness. In the demo version the expression updates as the total honk count for that session climbs, as well as during individual periods of intense honking.
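As a rough illustration of how that expression logic could work (the thresholds, names, and the choice of C++ here are invented for the example; the real app is a web sketch):

```cpp
#include <iostream>
#include <string>

// Map the session's honk activity to one of the four faces.
// Cutoffs are hypothetical; the demo tunes this differently.
std::string pickExpression(int totalHonks, int honksInLastMinute) {
  if (honksInLastMinute >= 5) return "annoyance";       // a burst of honking right now
  if (totalHonks == 0)        return "approval";        // a quiet session so far
  if (totalHonks < 20)        return "disappointment";
  return "sadness";                                     // a long, loud session
}

int main() {
  std::cout << pickExpression(42, 1) << "\n";           // prints "sadness"
}
```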

I developed Honk Box as a way to document and hopefully reduce noise pollution around major intersections.

The bridge I live next to is loud all day and night (DNAinfo). The goal of Honk Box is to be displayed in the view of drivers, reacting in real time to honking. Part of the inspiration for this project is the “Your speed is…” signs that are meant to encourage drivers to slow down. The idea is that if people know their behavior is observable, pollutive, and potentially shameful, they may be less likely to engage in that activity. Noise pollution, especially from cars, has become a more pressing issue as transit infrastructure becomes more congested and urban density increases.

Your speed is...

Honk Box detects honks by measuring the number of continuous loud peaks in the ambient sound. By default, if more than 3 peaks of sufficient loudness persist for more than 1/8th of a second, a honk is counted. The settings panel can be used to adjust sensitivity.
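That rule boils down to counting how long the ambient level stays loud. Here is a language-neutral sketch of it in C++ (the real Honk Box runs in the browser; the amplitude scale, sampling rate, and default values below are assumptions for illustration):

```cpp
#include <iostream>

// One amplitude reading is fed in per frame; a honk is reported when a loud
// stretch ends, if it contained more than minPeaks readings and lasted at
// least minDuration seconds. All defaults here are placeholder values.
struct HonkDetector {
  double loudThreshold = 0.4;    // amplitude (0..1) that counts as "loud"
  double sampleRate    = 60.0;   // readings per second (e.g. one per draw frame)
  int    minPeaks      = 3;      // knobs a settings panel would expose
  double minDuration   = 0.125;  // 1/8th of a second

  int loudRun = 0;               // consecutive loud readings in the current stretch

  bool update(double level) {
    if (level >= loudThreshold) { loudRun++; return false; }
    bool honk = loudRun > minPeaks && loudRun / sampleRate >= minDuration;
    loudRun = 0;
    return honk;
  }
};

int main() {
  HonkDetector detector;
  double fakeLevels[] = {0.1, 0.6, 0.7, 0.8, 0.7, 0.9, 0.6, 0.7, 0.8, 0.2};
  int honks = 0;
  for (double level : fakeLevels)
    if (detector.update(level)) honks++;
  std::cout << "honks counted: " << honks << "\n";      // prints 1
}
```

Raising `loudThreshold` or `minPeaks` makes detection less sensitive, which is essentially what the settings panel adjusts.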

 

Process:

Honk Box originally started out as an idea for my physical computing final. I envisioned a physical box that would look like a street sign, with an LED light board to display the faces and the total honk count. However, considering the amount of programming involved in the project and my lack of fabrication skills at this point, I decided to make a web-based version instead. While considering this transition from a physical device to a web app, I decided that as long as it lives on the web it should be usable and deployable in a number of different contexts and on a wide range of devices. As such, I spent a great deal of time making the design responsive to screen sizes so that it could run on phones, tablets, or computers. In addition, I decided I wanted some form of control panel so users could adjust the honk detection variables to suit their needs. Currently Honk Box adapts well to most devices I've tested it on, but I have run into some browser issues that I hope to iron out.

Another feature I really wanted to incorporate was data logging. I feel that the problem with honking is that it is very disruptive but at the same time ephemeral and semi-anonymous, which makes policing and assessing the problem difficult. So, in addition to the emotional feedback the face provides to drivers, I thought it would be valuable to have a record of just how “bad” honking really is at any given location over time. Unfortunately this data feature is not fully implemented, but it is something I'd like to return to when I have a better idea of how to store and organize the data. It did, however, lead me to create the data screen, which shows the FFT and spectrogram analysis of the live sound.

Developing and learning about spectrographs was a major turning point in how I was attempting to detect the honk sounds. Prior to building the spectrograph I was relying on a mixture of loudness detection and a sum-of-least-squares comparison between previously analyzed “honk sounds” and the average ambient sound. This method kind of worked but could easily be fooled by other loud sounds such as motorcycle engines and kneeling buses. Seeing how the spectrograph represented the honk sounds, I realized that honks are relatively distinct in the pattern they create. Whereas most street sounds tend to either flood all frequencies or move through a number of frequency ranges, honking creates nice flat lines across the spectrograph. Ultimately, the honk detection I used seeks out these “lines” in the sound, which allowed me to filter out most false positives. So, while the current data tab was originally a calibration and development tool, I left it in as part of the final program so that end users can adjust the settings in response to the same sound behaviors I saw, but in their local environment.
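To make the “flat line” idea concrete, here is a rough C++ sketch of what that search might look like over a buffer of FFT frames; the bin counts, thresholds, and frame lengths are assumptions, and the actual implementation in the app differs in its details:

```cpp
#include <iostream>
#include <vector>
#include <cstddef>

// A honk shows up as one narrow frequency band staying hot across many
// consecutive FFT frames, while broadband street noise lights up most bins
// at once. All thresholds below are placeholder values.
bool looksLikeHonk(const std::vector<std::vector<double>>& frames, // frames[t][bin], energies in 0..1
                   double hotThreshold = 0.6,
                   int    minFrames    = 8,     // how many frames a band must persist
                   double broadbandCap = 0.5) { // reject frames where most bins are loud
  if (frames.empty()) return false;
  const std::size_t bins = frames[0].size();

  for (std::size_t b = 0; b < bins; ++b) {
    int run = 0;
    for (const auto& frame : frames) {
      int hotBins = 0;
      for (double e : frame) if (e >= hotThreshold) ++hotBins;
      bool broadband = hotBins > static_cast<int>(bins * broadbandCap);

      if (!broadband && frame[b] >= hotThreshold) {
        if (++run >= minFrames) return true;   // a sustained narrow line: likely a honk
      } else {
        run = 0;
      }
    }
  }
  return false;
}

int main() {
  // Synthetic buffer: bin 5 stays hot for 10 frames against a quiet background.
  std::vector<std::vector<double>> frames(10, std::vector<double>(32, 0.1));
  for (auto& frame : frames) frame[5] = 0.9;
  std::cout << (looksLikeHonk(frames) ? "honk" : "no honk") << "\n";  // prints "honk"
}
```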

Going forward I will be thinking about ways to improve this system. A couple of directions I’m already interested in include using machine learning techniques to better identify honking, and using some database technology to store honk data from multiple locations simultaneously.

Please contact me with advice, comments, or questions.

 

Acknowledgements:

Reaction faces come from the Noun Project and were created by Dani Rolli

I received a ton of help from Josh Kramer on sound analysis, as well as Peter Glennon and Alec Horwitz on programming.