Honk Box – ICM Final

Here is a demo version of the Honk Box with pre-recorded street sounds and honking (Honk Box) (video).

Overview and origin:
Honk Box detects, records, and reacts to car horns. As Honk Box hears more and more honking, it updates its expression to reflect approval, disappointment, annoyance, or sadness. In the demo version, the expression updates as the total honk count for that session climbs, as well as during individual periods of intense honking.
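As a rough illustration of that logic, the mapping from honk counts to faces might look something like the p5 sketch below; the thresholds and counts here are made up for the example and are not the exact values the demo uses.

```javascript
// Hypothetical p5.js sketch of the expression logic.
// The thresholds below are illustrative guesses, not the values Honk Box uses.
let sessionHonks = 0; // total honks counted this session
let recentHonks = 0;  // honks during the current burst of activity

function setup() {
  createCanvas(400, 400);
  textAlign(CENTER, CENTER);
  textSize(32);
}

function currentExpression() {
  if (recentHonks > 10) return "annoyed";       // intense honking right now
  if (sessionHonks > 100) return "sad";         // a very loud session overall
  if (sessionHonks > 25) return "disappointed";
  return "approving";                           // quiet so far
}

function draw() {
  background(255);
  // The real app draws a face icon here; text stands in for it.
  text(currentExpression(), width / 2, height / 2);
}

function mousePressed() {
  // Simulate hearing a honk.
  sessionHonks++;
  recentHonks++;
}
```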

I developed Honk Box as a way to document and hopefully reduce noise pollution around major intersections.

The bridge I live next to is loud all day and night (DNAinfo). The goal of Honk Box is for it to be displayed in view of drivers, reacting in real time to honking. Part of the inspiration for this project is the “Your speed is…” signs that are meant to encourage drivers to slow down. The idea is that if people know their behavior is observable, pollutive, and potentially shameful, they may be less likely to engage in that activity. Noise pollution, especially from cars, has become a more pressing issue as transit infrastructure has grown more congested and urban density has increased.

Your speed is...

Honk Box detects honks by measuring the number of continuous loud peaks in the ambient sound. By default, if more than 3 peaks of sufficient loudness persist for more than 1/8th of a second, a honk will be counted. The settings panel can be used to adjust sensitivity.
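In p5.js terms, that boils down to counting how long the mic level stays above a loudness threshold. The sketch below is a simplified version of the idea using p5.sound's AudioIn; the threshold and frame counts are illustrative guesses, and in the real Honk Box they are exposed in the settings panel.

```javascript
// Simplified honk detector: count a honk when the mic level stays loud
// for roughly 1/8 of a second. The threshold and frame counts are illustrative.
let mic;
let loudFrames = 0;        // consecutive frames above the loudness threshold
let honkCount = 0;
const THRESHOLD = 0.3;     // mic level treated as "loud" (0.0–1.0), an assumption
const MIN_FRAMES = 8;      // about 1/8 s at 60 fps

function setup() {
  createCanvas(400, 200);
  mic = new p5.AudioIn();  // requires the p5.sound library and mic permission
  mic.start();
}

function draw() {
  background(255);
  const level = mic.getLevel(); // smoothed amplitude of the mic input

  if (level > THRESHOLD) {
    loudFrames++;
    if (loudFrames === MIN_FRAMES) {
      honkCount++;         // loudness persisted long enough: count one honk
    }
  } else {
    loudFrames = 0;        // reset when the street goes quiet again
  }

  text("honks: " + honkCount, 20, 40);
  text("level: " + nf(level, 1, 3), 20, 70);
}
```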

 

Process:

Honk Box originally started out as an idea for my physical computing final. I envisioned a physical box that would look like a street sign, with an LED light board to display the faces and the total honk count. However, considering the amount of programming involved in the project and my lack of fabrication skills at this point, I decided to make a web-based version instead. In making the transition from a physical device to a web app, I decided that as long as it lives on the web it should be usable and deployable in a number of different contexts and on a wide range of devices. As such, I spent a great deal of time making the design responsive to screen size so that it could run on phones, tablets, or computers. In addition, I decided I wanted some form of control panel so users could adjust the honk detection variables to suit their needs. Currently Honk Box adapts well to most devices I’ve tested it on, but I have run into some browser issues that I hope to iron out.

Another feature I really wanted to incorporate was data logging. The problem with honking is that it is very disruptive but at the same time ephemeral and semi-anonymous, which makes policing and assessing the problem difficult. So in addition to the emotional feedback the face provides to drivers, I thought it would be valuable to have a record of just how “bad” honking really is at any given location over time. Unfortunately this data feature is not fully implemented, but it is something I’d like to return to when I have a better idea of how to store and organize the data. It did, however, lead me to create the data screen, which shows the FFT and spectrogram analysis of the live sound.

Developing and learning about spectrographs was a major turning point in how I was attempting to detect the honk sounds. Prior to building the spectrograph, I was relying on a mixture of loudness detection and a sum-of-least-squares comparison between previously analyzed honk sounds and the average ambient sound. This method sort of worked but could easily be fooled by other loud sounds such as motorcycle engines and kneeling buses. Seeing how the spectrograph represented the honk sounds, I realized that honks are relatively distinct in the pattern they create. Whereas most street sounds tend to either flood all frequencies or move through a number of frequency ranges, honking creates nice flat lines across the spectrograph. Ultimately, the honk detection I used seeks out these “lines” in the sound, which allowed me to filter out most false positives. So while the current data tab was originally a calibration and development tool, I left it in as part of the final program so that end users could adjust the settings in response to the same sound behaviors I saw, but in their local environment.
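A rough sketch of that “line-seeking” idea using p5.FFT is shown below: track how long each frequency bin stays energized, and only treat it as a honk when a narrow band persists while the rest of the spectrum does not. The bin count and thresholds are assumptions for illustration, not the values Honk Box actually uses.

```javascript
// Sketch of spectrogram-style detection: a honk shows up as a narrow band of
// frequencies that stays strong over time (a flat "line"), unlike broadband
// street noise. All thresholds here are illustrative.
let mic, fft;
let binStreaks;                 // per-bin count of consecutive strong frames
const ENERGY_THRESHOLD = 180;   // fft.analyze() values run 0–255
const LINE_FRAMES = 10;         // how long a band must persist before it counts
const MAX_LOUD_BINS = 30;       // more than this and it's probably broadband noise

function setup() {
  createCanvas(512, 200);
  mic = new p5.AudioIn();
  mic.start();
  fft = new p5.FFT(0.8, 512);   // smoothing, number of frequency bins
  fft.setInput(mic);
  binStreaks = new Array(512).fill(0);
}

function draw() {
  background(255);
  const spectrum = fft.analyze();  // amplitude per frequency bin, 0–255

  let loudBins = 0;
  let longestStreak = 0;
  for (let i = 0; i < spectrum.length; i++) {
    if (spectrum[i] > ENERGY_THRESHOLD) {
      loudBins++;
      binStreaks[i]++;
    } else {
      binStreaks[i] = 0;
    }
    longestStreak = max(longestStreak, binStreaks[i]);

    // draw the live spectrum so the flat "lines" are visible
    line(i, height, i, height - map(spectrum[i], 0, 255, 0, height));
  }

  // A honk-like line: a few bins that persist, rather than everything at once.
  if (longestStreak >= LINE_FRAMES && loudBins < MAX_LOUD_BINS) {
    text("honk-like line detected", 10, 20);
  }
}
```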

Going forward I will be thinking about ways to improve this system. A couple of directions I’m already interested in include using machine learning techniques to better identify honking, and using some database technology to store honk data from multiple locations simultaneously.

Please contact me with advice, comments, or questions.

 

Acknowledgements:

Reaction faces come from the Noun Project and were created by Dani Rolli.

I received a ton of help from Josh Kramer on sound analysis, as well as Peter Glennon and Alec Horwitz on programming.

ICM Final Idea | ICM # 7

Honk Board

Honk Board collects data on car honks and displays medium- and short-term trends to the user.

This was originally an idea for a Pcomp project, however I realized that the coding aspects of this project are more interesting to me (Pcomp Project Description). I came to this idea from living next to the Queensboro Bridge in NYC and hearing a constant parade of car horns. Initially my goal was to build a physical sign that would ultimately dissuade people from honking. However, as I worked on a p5 sketch to illustrate my point, I realized that it could work as a piece of software for individual users. At this point I’m interested in representing this form of sound pollution as a kind of weather report. I envision placing an Android tablet in my window, running this program, and having it automatically calibrate and document all of the honking it hears. The visual component lets the user know it’s working and gives a suggestion as to whether this particular day or week is better or worse than usual. Hopefully this program could be used by anyone who wants to document the honking taking place in their neighborhood.

My questions for the class:

How do I best represent positive or negative change in the rate of honking?

What factors besides number of honks might be useful?

What data should I collect on the honks, and how should I store it?

 

Early Version: 

Here is an early version I was working on to demonstrate how a physical sign might work (https://11bsouth.com/HonkSignMockUp/).

 

Inspiration: 

I was inspired to work on this by the “Your speed is…” signs and some recent media attention to the sound levels in my neighborhood.

Article about honking on the Queensboro Bridge


Data week | ICM # 7

This week we were asked to look at incorporating data sets into our sketches. I felt a little shaky on some of the topics covered in the past two weeks, so I followed some examples from Dan Shiffman’s p5 videos to re-familiarize myself with constructor functions as well as some other concepts, like using different tabs. After completing Dan’s example I customized things a bit to have a series of circles bouncing off of one another and the edges of the window.
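For reference, the constructor pattern I was following looks roughly like this (a paraphrase of the bouncing-ball idea from Dan’s videos, not my exact code, and without the circle-to-circle collisions for brevity):

```javascript
// Constructor-function version of the bouncing circles.
function Bubble(x, y, r) {
  this.x = x;
  this.y = y;
  this.r = r;
  this.xSpeed = random(-3, 3);
  this.ySpeed = random(-3, 3);

  this.move = function () {
    this.x += this.xSpeed;
    this.y += this.ySpeed;
    // reverse direction at the edges of the window
    if (this.x < this.r || this.x > width - this.r) this.xSpeed *= -1;
    if (this.y < this.r || this.y > height - this.r) this.ySpeed *= -1;
  };

  this.display = function () {
    ellipse(this.x, this.y, this.r * 2);
  };
}

let bubbles = [];

function setup() {
  createCanvas(600, 400);
  for (let i = 0; i < 20; i++) {
    bubbles.push(new Bubble(random(15, width - 15), random(15, height - 15), 15));
  }
}

function draw() {
  background(255);
  for (let i = 0; i < bubbles.length; i++) {
    bubbles[i].move();
    bubbles[i].display();
  }
}
```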

At this point I tried to incorporate some external data into my sketch. My first attempt was to use data from (https://collegescorecard.ed.gov) which actually provided a lot of helpful documentation and examples.


I think I was pretty close to getting this data set to work, but after spending a while frustrated by a blank loading screen, I decided to look at the openWeather examples I had seen in the p5 reference.

With openWeather I used a callback function to take individual numbers out of the JSON file and set them as global variables in my sketch. I used the temperature to set the number of circles on the screen and the wind speed to give them a range of starting speeds.
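The shape of that callback looked roughly like the sketch below. The URL parameters, the mapping ranges, and the plain-object circles are simplifications for illustration (you would need your own OpenWeatherMap API key in place of YOUR_API_KEY):

```javascript
// Pull temperature and wind speed from OpenWeatherMap and use them to seed
// the bouncing circles. Replace YOUR_API_KEY with a real key; units=imperial
// gives °F and mph.
let circles = [];

function setup() {
  createCanvas(600, 400);
  const url =
    'https://api.openweathermap.org/data/2.5/weather' +
    '?q=New+York&units=imperial&appid=YOUR_API_KEY';
  loadJSON(url, gotWeather);  // the callback runs once the JSON arrives
}

function gotWeather(weather) {
  const temperature = weather.main.temp;  // °F
  const windSpeed = weather.wind.speed;   // mph

  // temperature -> how many circles; wind speed -> range of starting speeds
  const count = constrain(int(temperature), 1, 100);
  for (let i = 0; i < count; i++) {
    circles.push({
      x: random(width),
      y: random(height),
      dx: random(-windSpeed, windSpeed),
      dy: random(-windSpeed, windSpeed)
    });
  }
}

function draw() {
  background(255);
  for (const c of circles) {
    c.x += c.dx;
    c.y += c.dy;
    if (c.x < 0 || c.x > width) c.dx *= -1;   // bounce off the edges
    if (c.y < 0 || c.y > height) c.dy *= -1;
    ellipse(c.x, c.y, 20);
  }
}
```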


I ended up with my sketch reliably producing the desired results.


I hope to do a lot more with data while at ITP so it was really interesting to start looking at APIs and the differing file formats. I’m going to review DOM material this week and I expect using some DOM elements will really expand what I can do in terms of utilizing data.

 

sketch: https://11bsouth.com/DataBounce/

p5 and DOM | ICM #6

This week in ICM we were asked to do something using the DOM library in p5. I had a hard time with this but was eventually able to get some of it working. Initially I worked through the DOM tutorial and produced the styled text and canvas:

[Screenshot: styled text and canvas]

However, when I tried working with the DOM on my own, I ran into a number of problems that I didn’t really understand. The biggest issue for me was understanding where DOM values or references can be created and used outside of the setup function. Here is an example of some of the errors I was getting:

 

[Screenshot: console errors]

 

I decided to try something pretty easy, so I used the bouncing circles constructor from the previous class and tried to vary elements of it using sliders. I tried a bunch of different things and most of them didn’t work. Ultimately I just used two sliders and entered some text outside of the canvas.
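The pattern that eventually worked was declaring the slider variables globally, creating the sliders once in setup, and only reading their values inside draw. Something like the sketch below, though the ranges and the text are placeholders rather than my exact sketch:

```javascript
// Two sliders controlling the bouncing circles. The sliders are declared
// globally, created once in setup(), and only read inside draw().
let sizeSlider, speedSlider;
let circles = [];

function setup() {
  createCanvas(600, 400);
  createP('Circle size');                 // plain text placed outside the canvas
  sizeSlider = createSlider(5, 50, 20);   // min, max, starting value
  createP('Speed');
  speedSlider = createSlider(1, 10, 3);

  for (let i = 0; i < 15; i++) {
    circles.push({
      x: random(width),
      y: random(height),
      dx: random([-1, 1]),
      dy: random([-1, 1])
    });
  }
}

function draw() {
  background(255);
  const size = sizeSlider.value();   // reading .value() here is fine;
  const speed = speedSlider.value(); // calling createSlider() here would not be

  for (const c of circles) {
    c.x += c.dx * speed;
    c.y += c.dy * speed;
    if (c.x < 0 || c.x > width) c.dx *= -1;
    if (c.y < 0 || c.y > height) c.dy *= -1;
    ellipse(c.x, c.y, size);
  }
}
```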


It would have been nice to do more this week but I spent a lot of my time going through the tutorial and troubleshooting. At this point I think I understand the basic idea of the DOM elements and how they can be useful but so far implementing them has been difficult for me. I’m looking forward to talking more about this in class.

sketch: https://11bsouth.com/SLidersDOM/

Objects + Arrays | ICM #4

The assignment this week was to start working with Objects and Arrays. In addition we were asked to share some code with a classmate and try to integrate that into a sketch (more on this later). I initially started to “objectify” my peeps from the Evil peeps game I’ve been working on.


 

At first I had some issues with where the peeps were being placed and had a difficult time replicating the functionality I had before. However, after some playing around, I eventually got the peeps to recognize when another peep had been placed within 5 pixels or so, which allowed me to finally make a fail screen. The key was using a for loop limited by peeps.length, as opposed to the other values I had tried. I think this had to do with the timer I was using to generate new peeps.
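The proximity check ended up looking roughly like the function below; the peep properties and the 5-pixel threshold are approximate, so treat this as a sketch rather than the exact game code:

```javascript
// Check a newly placed peep against every peep already on screen.
// peeps is an array of objects with x and y properties (details approximate).
function tooClose(newPeep, peeps) {
  // Looping to peeps.length (not a fixed number) means peeps added later
  // by the timer are still checked.
  for (let i = 0; i < peeps.length; i++) {
    if (dist(newPeep.x, newPeep.y, peeps[i].x, peeps[i].y) < 5) {
      return true;  // within ~5 pixels of an existing peep: trigger the fail screen
    }
  }
  return false;
}
```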


Then I started working with my partner Jess’s code. She had a series of circles bouncing around the screen, and I thought it would be cool to include that as part of the fail screen.


In preparation I made the fail square fit into the existing menu bar and hoped that the circles could be evil peep heads that bounce around after you lose.


Unfortunately, I could only get one static circle to form before p5 would throw “Uncaught TypeError: Cannot read property ‘move’ of undefined.” I’m not sure why, but I couldn’t get the circles/ball function to be recognized by the editor no matter how I tried to configure it. So I think this week I became familiar with constructors, but there’s still a lot I need to learn before I can build and implement these things easily.
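For what it’s worth, that kind of “Cannot read property of undefined” error usually means the loop reaches an array index that was never filled, for example looping to a fixed count instead of the array’s length, or calling the method before the objects have been pushed. A minimal illustration, where Ball is just a stand-in for Jess’s constructor:

```javascript
// Stand-in Ball constructor to illustrate the error I was hitting.
function Ball(x, y) {
  this.x = x;
  this.y = y;
  this.move = function () {
    this.x += random(-2, 2);
    this.y += random(-2, 2);
  };
}

let balls = [];

function setup() {
  createCanvas(400, 400);
  balls.push(new Ball(200, 200));   // only one ball has been created so far
}

function draw() {
  background(255);
  // Broken: assumes 10 balls exist, so balls[1].move() throws
  // "Cannot read property 'move' of undefined".
  // for (let i = 0; i < 10; i++) { balls[i].move(); }

  // Working: loop only over what is actually in the array.
  for (let i = 0; i < balls.length; i++) {
    balls[i].move();
    ellipse(balls[i].x, balls[i].y, 20);
  }
}
```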

 

Game:

https://11bsouth.com/EvilPeep0.5/