Final Idea: Don’t Honk Street Sign | Pcomp #8

In thinking about my PComp final I’ve come up with an idea for a honk-detecting street sign. The goal of this sign would be twofold: to collect data on honking at various intersections, and to discourage unnecessary honking by providing a visual reaction to individual honks and to longer-term trends. I came to this idea while listening to the amazing amount of honking that takes place outside my window on the Queensboro Bridge. The prototypical bystander reaction to the frustration of incessant honking is to scream obscenities out of your window. I totally get this urge. However, as a means of changing the behavior of drivers it’s almost completely ineffective. I take this as a sign of an unmet need: people want to communicate the frustration of living around a honk-ridden intersection to the individual drivers, who are presumably just passing through with little awareness of the plight of nearby residents. My hope is that this sign can communicate the collective feeling of neighborhood residents to the individual driver who isn’t aware of the compounding effect of their honking. I believe that such a sign will be effective because, in my time spent listening to honking cars (against my will), I have learned that the decision to honk is highly informed by social dynamics.

Often honking happens in clusters, as one particularly hurried or irritable driver starts the chain reaction by removing the slight stigma that exists. Eventually others follow suit, as if to signal agreement in their displeasure with sitting in traffic. In extreme cases, when certain drivers feel their honks are being ignored (something I find insane), they will honk out a tune or simply hold down the horn until their non-verbal demands are met. Since honking operates within some discernible social system, I think this system can be altered by injecting a sign of displeasure at honking. Hopefully the number of “first honkers” can be reduced as they see statistics on honking or a visual representation of how their honks make residents in the area feel. As for the “ignored honkers,” I hope they will feel some slight shame or guilt when they see how honky a day it’s already been; ideally they’ll realize that NYC traffic does not operate according to the whims of the individual and that honking their way out of traffic is a tried and failed method.

[Screenshot: the digital mock-up of the sign]

Here is my digital mock-up of how the physical sign could look and function.

http://www.11bsouth.com/HonkSignMockUp/


Midterm – Sword Fight! | Pcomp #7

Ian (Yuan Gao) and I worked together to produce a sword-fighting game with foam swords, digital sound effects, and a scoring mechanism for our PComp midterm. We had recently learned how to read data from an accelerometer and how to communicate serially between the Arduino and our laptops, so that code running on the Arduino can affect a p5.js sketch and vice versa. We thought a sword-fighting game would be a good application of these skills.

When we consulted with Tom about our project, he suggested we start out by visualizing the output of our accelerometers and using any patterns we noticed as a jumping-off point for detecting sword motions like swings and hits. After a few attempts I put together a program in p5 that would graph the accelerometer data in a readable way, similar to what Tom had shown us. At the time we were hoping to use Bluetooth communication, so I set up Bluetooth and attached the Arduino to a battery as I tested my accelerometer graph.
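For reference, the core of a graph like that is just pushing each new reading into an array and drawing it as a line. A minimal sketch of the idea (variable names are mine, and it assumes ax, ay, and az get updated elsewhere, e.g. by a serial callback, with raw 0 to 1023 analog readings):

// Minimal scrolling graph of raw accelerometer readings (sketch of the idea).
// Assumes ax, ay, az are updated elsewhere (e.g. in a serial event callback).
let ax = 0, ay = 0, az = 0;
let samples = [];

function setup() {
  createCanvas(600, 200);
}

function draw() {
  background(255);
  samples.push({ x: ax, y: ay, z: az });
  if (samples.length > width) samples.shift(); // scroll once the graph is full
  drawTrace('x', color(200, 0, 0));
  drawTrace('y', color(0, 150, 0));
  drawTrace('z', color(0, 0, 200));
}

function drawTrace(axis, c) {
  stroke(c);
  noFill();
  beginShape();
  for (let i = 0; i < samples.length; i++) {
    // Raw analog readings sit roughly in 0 to 1023, so map them onto the canvas height.
    vertex(i, map(samples[i][axis], 0, 1023, height, 0));
  }
  endShape();
}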

Video: AccelerometerGraphing

As we studied (mostly played around with) the data output, it became clear that acceleration isn’t always what it seems. Conceptually I understand acceleration as the rate of change of velocity. However, in practice I think I was expecting acceleration to act more like velocity, or maybe “energy.” In addition, the constant acceleration due to gravity complicates the “movement” view of acceleration further. We looked into how to manage this data input and get it into a more usable state, and we realized that in doing so we were entering the world of complicated math and physics lessons. One option was to calibrate the accelerometer to sense g-force accurately. This would have made our measurements more accurate and legible but wouldn’t necessarily have helped us identify swings and hits.

The other option we identified, and ultimately used, was to dive deeper into the meaning of acceleration and its derivatives. While doing research we discovered that a lot of people who work with accelerometers also work with the derivatives of acceleration: jerk and jounce. Combining current accelerometer readings with the jerk and jounce allows you to determine points of inflection that could indicate the beginning of a swing, the end of a swing, a hit, and other events. We ended up using a threshold value for jerk as the primary component of our swing detection, but it became very clear that it is possible to get accurate and distinct readings for a huge variety of motions depending on how much math and pattern recognition you are willing to do with acceleration, jerk, and jounce. As a result, we spent a lot of time tweaking and experimenting with different values for each. In the end we included a lot of smoothing algorithms and this kind of threshold test for playing a sound:

// Play a swoosh only when both the jerk and the jounce cross a fraction of
// their observed maximums on at least one axis.
function playbackChopSoundEffect(_jerk_vals, _jounce_vals, sample_ind) {
  if ((abs(_jerk_vals.x) >= _jerk_vals.max_x * 0.25 ||
       abs(_jerk_vals.y) >= _jerk_vals.max_y * 0.25 ||
       abs(_jerk_vals.z) >= _jerk_vals.max_z * 0.25) &&
      (abs(_jounce_vals.x) >= _jounce_vals.max_x * 0.33 ||
       abs(_jounce_vals.y) >= _jounce_vals.max_y * 0.33 ||
       abs(_jounce_vals.z) >= _jounce_vals.max_z * 0.33)) {
    // Alternate randomly between two swoosh samples so repeated swings vary.
    if (random(1) > 0.5) swoosh_sounds[sample_ind].play();
    else swoosh_sounds[sample_ind + 2].play();
  }
}
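For context, the jerk and jounce values in that test are just successive differences of the (smoothed) accelerometer readings from frame to frame. A rough sketch of that finite-difference step, with illustrative names rather than our exact code:

// Estimate jerk (change in acceleration) and jounce (change in jerk)
// from the current and previous samples.
function updateDerivatives(accel, prevAccel, prevJerk) {
  let jerk = {
    x: accel.x - prevAccel.x,
    y: accel.y - prevAccel.y,
    z: accel.z - prevAccel.z
  };
  let jounce = {
    x: jerk.x - prevJerk.x,
    y: jerk.y - prevJerk.y,
    z: jerk.z - prevJerk.z
  };
  return { jerk: jerk, jounce: jounce };
}

Differencing amplifies noise, which is part of why all the smoothing mattered so much.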

If we had more time for research we might have been able to rely completely on the accelerometer to detect all of the possible sword events. Instead, we settled on having the accelerometer control swing sounds and on using homemade switches for hit detection (scoring) and for playing a sound when two swords strike each other.

Attaching the accelerometer to the foam swords, building the p5.js sketch to detect and respond to the action, and wiring it all up was tricky, but it took much less time than the calibration of the accelerometer. We wrapped each sword in tin foil and built companion targets (also wrapped in tin foil) to form switches that close when the opponent’s sword makes contact with your sword or your target. The accelerometer was attached to a mini breadboard, which we zip-tied to each sword. Unfortunately we didn’t have time to make this wireless, so each sword had a long trail of wires connecting it to the Arduino. The sketch incorporated some fencing images and sounds that we found online (citation coming) as well as the score count for each player.

[Photo: the foil-wrapped swords, targets, and wiring]

[Screenshot: the p5.js scoring screen]

I was happy with how this project turned out. The main problems were the lack of wireless communication, which would have made the game more fun by allowing more freedom of movement, and the relatively fragile nature of the swords and electronics we fabricated. I’m confident that with more time, resources, and guidance we could make this game fully wireless, self-contained (no PC connection required for the sounds and scoring), and more durable.

Here’s a video of it in action:

SwordFight!


Evil Peeps w/ External Game Controller (Serial Communication) | Pcomp #6

The task for week 6 was to “make a serial application that controls one of the animation projects you’ve done in intro to computational media with analog sensor data from an arduino, sent to the browser serially,” so I decided to give the Evil Peeps game I’ve been working on in ICM an external controller.

The Evil Peeps game was relatively easy to adapt for an external controller. Originally the evil peeps would appear at the pointer of the mouse on screen at an increasing rate. The challenge was to continue placing peeps without overlaps even as the time between new peeps and the available space on the screen were shrinking. To adapt it for an external controller I first copied in all of the serial communication setup (listing available ports, error checking, etc.) that was made available to us in the serial communication lab. Once I got the Arduino and the sketch to communicate, I added an if statement so that if the game received a button-press signal from the Arduino, it would take all controls from that device and ignore the mouse. This allowed the game to be played either the standard way or with the game controller. The next step was to add the controls.

At this point, measuring the output of a sensor and mapping its values is something I’ve gotten comfortable with, so having the Arduino read two potentiometers was straightforward. I assembled my Arduino with one button input and two potentiometer inputs and stuffed it into a used milk carton.

[Photo: the controller, an Arduino with a button and two potentiometers stuffed into a milk carton]

I programmed the Arduino to serially write the value of each potentiometer and the button as a series of values, like this: “125,125,0”. This would indicate that both potentiometers are close to half their maximum resistance and that the button is not pressed. In p5 I read these values, split them up, mapped their range to the size of the screen, and stored each in an array. Once I had these values coming in reliably, having them control the game was just a matter of replacing mouseX and mouseY as the basis for the peep locations with the mapped potentiometer values stored in the array. Controlling the game with the potentiometers made it significantly more difficult, so I spent a lot of time adjusting the score and peep timers until I got it into a playable state. In the end I was really happy with how it turned out, and I think I’ve developed a much deeper appreciation for classic arcade games!
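A condensed sketch of the p5 side of this (assuming the p5.serialport library from the labs; the port name, scaling, and variable names are placeholders rather than my exact code):

// Parse "pot1,pot2,button" lines coming from the Arduino.
let serial;                    // p5.SerialPort instance
let controls = [0, 0, 0];      // [mapped x, mapped y, button]
let useController = false;     // switch away from the mouse once the button is seen

function setup() {
  createCanvas(600, 400);
  serial = new p5.SerialPort();
  serial.open('/dev/cu.usbmodem1411'); // port name differs per machine
  serial.on('data', gotData);
}

function gotData() {
  let line = serial.readLine();
  if (!line) return;
  let values = split(trim(line), ',');
  if (values.length === 3) {
    // Pots assumed to be scaled to 0-255 on the Arduino side (so 125 is roughly the middle).
    controls[0] = map(Number(values[0]), 0, 255, 0, width);
    controls[1] = map(Number(values[1]), 0, 255, 0, height);
    controls[2] = Number(values[2]);
    if (controls[2] !== 0) useController = true; // any button press: ignore the mouse
  }
}

function draw() {
  background(220);
  let x = useController ? controls[0] : mouseX;
  let y = useController ? controls[1] : mouseY;
  ellipse(x, y, 20, 20); // stand-in for placing a peep
}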

Video:

Evil Peeps With Controller

  • One issue I had was that after stuffing my Arduino in the box the button became unreliable; the solution would have been to get a better button.


Color Composition | Visual Language #6

This week we were asked to look into color composition. Being a very colorblind person, I wasn’t sure how successful I’d be at this, but I ultimately realized that I can leave some color decisions to the color wheel and have a reasonable expectation that they would “work” as a composition even if I couldn’t see them properly. Taking the color perception test was interesting too, because I actually felt like I was doing a decent job until I got a score at the bottom of the perception range.

[Screenshot: my score on the color perception test]

The funny thing about being colorblind is that it’s hard to notice until someone tells you. If this challenge lacked a score I would have felt pretty confident that I did well. I think that this is the primary frustration with this minor disability: its existence is largely social.

When trying to create a composition I tried to rely on some of the color relationships we learned about on the color wheel. I made a weather-displaying sketch that changes color (and some other properties) as the weather and time of day change. The range of hues is determined by the temperature: I mapped a range of 0 to 100 degrees to 0-330 degrees on the color wheel. The background is then set as an adjacent color by adding 30 degrees, and the border is set as the complement of the background. Here are some of the ranges with a fixed saturation and brightness (shown after the sketch below).
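In p5’s HSB mode those relationships are just arithmetic on the hue angle. A small sketch of the idea, with a made-up temperature value standing in for the live reading:

// Hue relationships used in the sketch, in HSB color mode.
function setup() {
  createCanvas(400, 400);
  colorMode(HSB, 360, 100, 100);
  noStroke();
}

function draw() {
  let temperature = 72;                          // stand-in temperature reading
  let dotHue = map(temperature, 0, 100, 0, 330); // temperature drives the hue
  let bgHue = (dotHue + 30) % 360;               // adjacent color for the background
  let borderHue = (bgHue + 180) % 360;           // complement of the background for the border

  background(borderHue, 100, 100);               // whole canvas; only the outer frame stays visible
  fill(bgHue, 100, 100);
  rect(20, 20, width - 40, height - 40);         // inner rect is the "background"
  fill(dotHue, 100, 100);
  ellipse(width / 2, height / 2, 100, 100);      // temperature-colored dot
}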

[Screenshots: hue ranges at fixed saturation and brightness]

Next I varied some of the brightness levels of the background to correspond with the day/night cycle:

[Screenshots: background brightness varied with the day/night cycle]

I didn’t want it to ever get completely black, so I set the brightness of the background and border to vary between 50 and 100 depending on how far from midday it currently is. Lastly, I adjusted the saturation of the background and border. I wanted the circles to be the primary temperature indication, so I set the saturation of the background to 50% and the border to 25% while leaving the dots at 100%.
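Schematically, the brightness and saturation rules look something like this (a sketch, assuming p5’s hour() for the time of day and the hue variables from the sketch above):

// Background/border brightness tracks distance from midday (50-100),
// while saturation stays fixed: 100% dots, 50% background, 25% border.
function brightnessForTime() {
  let hoursFromNoon = abs(hour() - 12);      // 0 at midday, up to 12 at midnight
  return map(hoursFromNoon, 0, 12, 100, 50); // never dimmer than 50
}

// Example fills, given the hues computed above:
// fill(dotHue, 100, 100);                    // dots: full saturation and brightness
// fill(bgHue, 50, brightnessForTime());      // background: half saturation
// fill(borderHue, 25, brightnessForTime());  // border: quarter saturation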

I hope that this isn’t a totally weird collection of colors 🙂

[Screenshots: the finished compositions at different temperatures and times of day]

Sketch: http://www.11bsouth.com/DataBouceTemp/

Data Week | ICM #7

This week we were asked to look at incorporating data sets into our sketches. I felt a little shaky on some of the topics covered in the past two weeks, so I followed some examples from Dan Shiffman’s p5 videos to re-familiarize myself with constructor functions as well as some other concepts, like using different tabs. After completing Dan’s example I customized things a bit to have a series of circles bouncing off of one another and off the edge of the window.
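As a reminder of the pattern (a generic sketch in the style of those videos, not my exact code), a constructor function bundles each circle’s position, speed, and behavior; bouncing off other circles would need an extra distance check on top of this:

// A constructor function for bouncing circles, kept in a plain array.
function Bubble(x, y) {
  this.x = x;
  this.y = y;
  this.xSpeed = random(-3, 3);
  this.ySpeed = random(-3, 3);

  this.move = function() {
    this.x += this.xSpeed;
    this.y += this.ySpeed;
    // Reverse direction at the edges of the window.
    if (this.x < 0 || this.x > width) this.xSpeed *= -1;
    if (this.y < 0 || this.y > height) this.ySpeed *= -1;
  };

  this.display = function() {
    ellipse(this.x, this.y, 24, 24);
  };
}

let bubbles = [];

function setup() {
  createCanvas(400, 400);
  for (let i = 0; i < 20; i++) {
    bubbles.push(new Bubble(random(width), random(height)));
  }
}

function draw() {
  background(220);
  for (let i = 0; i < bubbles.length; i++) {
    bubbles[i].move();
    bubbles[i].display();
  }
}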

At this point I tried to incorporate some external data into my sketch. My first attempt was to use data from the College Scorecard (https://collegescorecard.ed.gov), which actually provided a lot of helpful documentation and examples.

[Screenshot: exploring the College Scorecard data]

I think I was pretty close to getting this data set to work, but after spending a while frustrated by a blank loading screen I decided to look at the openWeather examples I had seen in the p5 reference.

With openWeather I used a callback function to take individual numbers out of the JSON file and set them as global variables in my sketch. I used the temperature to set the number of circles on the screen and the wind speed to give them a range of starting speeds.
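The shape of that callback approach looks roughly like this; the URL uses OpenWeatherMap’s current-weather endpoint with a placeholder API key, and the exact fields may differ from what my sketch used:

// Load current weather once, then use it to seed the sketch.
let temperature = 0;
let windSpeed = 0;

function setup() {
  createCanvas(400, 400);
  // YOUR_API_KEY is a placeholder; units=imperial returns Fahrenheit and mph.
  let url = 'https://api.openweathermap.org/data/2.5/weather?q=New+York&units=imperial&appid=YOUR_API_KEY';
  loadJSON(url, gotWeather);
}

function gotWeather(weather) {
  // Pull individual numbers out of the JSON and store them globally.
  temperature = weather.main.temp;
  windSpeed = weather.wind.speed;
  // e.g. spawn `temperature` circles with starting speeds up to `windSpeed` here.
}

function draw() {
  background(220);
  text('temp: ' + temperature + '  wind: ' + windSpeed, 10, 20);
}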

[Screenshot: my actual callback code pulling values from the weather JSON]

I ended up with my sketch reliably producing the desired results.

[Screenshot: the sketch running with weather-driven circles]

I hope to do a lot more with data while at ITP so it was really interesting to start looking at APIs and the differing file formats. I’m going to review DOM material this week and I expect using some DOM elements will really expand what I can do in terms of utilizing data.

Sketch: http://www.11bsouth.com/DataBounce/

p5 and DOM | ICM #6

This week in ICM we were asked to do something using the DOM library in p5. I had a hard time with this but was eventually able to get some of it working. Initially I worked through the DOM tutorial and produced the styled text and canvas:

[Screenshot: the styled text and canvas from the DOM tutorial]

However, when I tried working with DOM on my own I ran into a number of problems that I didn’t really understand. The biggest issue for me was understanding where DOM elements can be created or referenced outside of the setup function. Here is an example of some of the errors I was getting:

[Screenshot: console errors from my DOM experiments]

I decided to try something pretty easy so I used the bouncing circles constructor from the previous class and tried to vary elements of it using sliders. I tried a bunch of different things and most of them didn’t work. Ultimately I just used two sliders and entered some text outside of the canvas.
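The part that worked looked roughly like this (a simplified sketch): the sliders are created inside setup(), and only their values get read in draw():

// Two sliders controlling the sketch; created in setup, read in draw.
let sizeSlider, speedSlider;
let x = 0;

function setup() {
  createCanvas(400, 200);
  createP('Adjust the circle with the sliders below:'); // text outside the canvas
  sizeSlider = createSlider(5, 100, 30);  // min, max, starting value
  speedSlider = createSlider(0, 10, 2);
}

function draw() {
  background(220);
  x = (x + speedSlider.value()) % width;  // read the slider values each frame
  ellipse(x, height / 2, sizeSlider.value(), sizeSlider.value());
}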

[Screenshot: the bouncing circles sketch with two sliders and text outside the canvas]

It would have been nice to do more this week but I spent a lot of my time going through the tutorial and troubleshooting. At this point I think I understand the basic idea of the DOM elements and how they can be useful but so far implementing them has been difficult for me. I’m looking forward to talking more about this in class.

Sketch: http://11bsouth.com/SLidersDOM/

Logos | Visual Language #4

Part One: We were asked to look into a logo that we liked, so I chose Bucky Badger, the mascot of the University of Wisconsin.

[Image: BuckyBadger.svg, the Bucky Badger logo]


I know a lot of people from Madison, Wisconsin, and they all have strong feelings one way or the other about this character, so I thought it would be a good logo to read up on. While Madison had a mascot as early as the 1930s, the badger named “Bucky” (or “Buckingham U. Badger”) was developed in 1940 by a commercial artist named Art Evans from California. Initially it was used primarily by semi-affiliated bookstores and institutions.


The University is a major employer and source of recreation and entertainment for Madison, so even today Bucky can be found selling everything from popcorn to used cars.


Bucky also has many permutations depending on the department or sports team that is using it.

The “modern” logo came into existence in 1988, created by the Anson W. Thompson Company of Los Angeles.

This logo was the first that the university itself attempted to trademark, which led to University Book Store v. University of Wisconsin–Madison Board of Regents, where a trademark was awarded. Having been to Madison many times, I can say the badger is used as frequently as the Statue of Liberty is in NYC. It’s interesting that this particular logo actually started as a community development and became an official emblem of the college. In 2003 the logo was updated to give it more scalability and to reinforce the university’s ownership of the image.

Part Two: The second part of this week’s assignment was to develop a personal logo. This was the first assignment where I really tried to learn Illustrator, so I didn’t get too ambitious with custom graphics. Here are some of my ideas and attempts:

[Screenshot: initial logo ideas and attempts]

I tried to make the “J” in Jesse serve as the primary element of the logo. However, I had a hard time getting things to look right with the font I selected. I ended up with two decent versions in this font (Jockey One):

[Screenshots: two “J” logo versions in Jockey One]


I tried this kind of logo again in a different font (Fauna One) and added my last name:

[Screenshot: the logo in Fauna One with my last name added]

Eventually I got kind of frustrated with the “J” look, so I wrote my name in Quicksand vertically with some varying stroke weights and used the spatial imbalance between my first and last name as a place for interchangeable icons from the Noun Project.

[Screenshots: the vertical Quicksand logo with interchangeable Noun Project icons]

Credits for the Noun Project images: misirlou, Minna Ninova.

Objects + Arrays | ICM #4

The assignment this week was to start working with objects and arrays. In addition, we were asked to share some code with a classmate and try to integrate that into a sketch (more on this later). I initially started to “objectify” my peeps from the Evil Peeps game I’ve been working on.

[Screenshots: rewriting the peeps as objects]

At first I had some issues with where the peeps were being placed and had a difficult time replicating the functionality I had before. However, after some playing around I eventually got the peeps to recognize when another peep had been placed within 5 pixels or so, which allowed me to finally make a fail screen. The key was using a for loop with a peeps.length limit, as opposed to other values I had included. I think this had to do with the timer that I was using to generate new peeps.
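The overlap check itself ended up being a loop over the peeps array using its length, roughly like this (simplified, with a generic helper name rather than my exact code):

// Returns true if (x, y) is too close to any peep already on screen.
function overlapsExistingPeep(x, y, peeps, minDistance) {
  for (let i = 0; i < peeps.length; i++) {
    if (dist(x, y, peeps[i].x, peeps[i].y) < minDistance) {
      return true; // trigger the fail screen
    }
  }
  return false;
}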

[Screenshot: the working fail screen]

Then I started working with my partner Jess’s code. She had a series of circles bouncing around the screen, and I thought it would be cool to include that as part of the fail screen.

[Screenshot: Jess’s bouncing circles sketch]

In preparation I made the fail square fit into the existing menu bar and hoped that the circles could be evil peep heads that bounce around after you lose.

[Screenshot: the fail square integrated into the menu bar]

Unfortunately, I could only get one static circle to form before p5 would throw “Uncaught TypeError: Cannot read property ‘move’ of undefined.” I’m not sure why, but I couldn’t get the circles/ball function to be recognized by the editor no matter how I tried to configure it. So I think this week I became familiar with constructors, but there’s still a lot I need to learn before I can build and implement these things easily.

Game: http://www.11bsouth.com/EvilPeep0.5/

Reading Response | Pcomp #5

The readings this week emphasized sketches and the design process. I was happy to see the repeated suggestion that you don’t have to be “good at drawing” to sketch, because I am definitely not good at drawing and that gave me some confidence to do it anyway. It also occurred to me that a non-verbal way of communicating ideas would be helpful in general, but particularly at ITP given the number of international students and group projects. In addition, I really appreciated the concept of local hill climbing at the expense of the “global” maximum. I feel like I’ve already experienced this in some of my ICM work, where I wasted a lot of time on a particular problem only to change course and produce similar functionality in a fraction of the time.

The second reading, by Tom Igoe, also dealt with the design process and essentially asked aspiring interactive artists to realize that a truly interactive concept can’t be completely defined or contained by its creator. Too explicit an interaction deprives the experience of discovery and of a truly two-way interaction. This lesson is taught to ITP students in many cases when they walk up to the screens displaying projects in the hallways. Outside of the end-of-semester shows, most of these screens are unlabeled, and it is up to the casual viewer to determine the “purpose” and decide for themselves what they are supposed to “get” from it. It’s an interesting challenge to see what can be left unsaid and still be heard.

Buxton, Sketching User Experiences: The Workbook

Igoe, Making Interactive Art: Set the Stage, Then Shut Up and Listen