Pcomp final: Handshake and Assembly

We all switched roles many times when we were stuck or bored with the same kind of work. Still, we had the work divided among ourselves, which helped us track overall project progress and make sure every aspect was moving ahead as planned: Viniyata worked on the MAX/MSP code, Jen worked on fabrication, and I worked on the Arduino and the arrangement of components inside the box.

To me the most interesting part of the process was the handshake between the different components of the project. We worked separately on MAX and Arduino for a few days and merged the two a couple of days before the final. Putting comments in the Arduino code really helped: we could quickly identify the lines that needed modification to match the signals exchanged between MAX and the Arduino. Jen and I had to work closely on the fabrication and assembly, anticipating a few things in advance: the overall size of the components combined with the Arduino, managing the huge bundle of wires connecting everything, positioning the spray bottles exactly against the outlets on the box, positioning the physical buttons right below the flexible birch plywood surface, and leaving room for the cables running from the box to the laptop (where MAX did its part). These were all important and enjoyable bits. Following are a few pictures of the assembly:

IMG_5068 IMG_5070

architecture-01
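The handshake between MAX and the Arduino boiled down to simple serial messages in each direction. Here is a host-side C++ sketch of the kind of logic involved; the specific command characters, the outlet count, and the function names are illustrative assumptions, not our exact protocol:

```cpp
// Host-side sketch of a single-byte handshake between MAX/MSP and the
// Arduino. The command characters below are illustrative, not the real
// protocol we used.

// MAX -> Arduino: '1'..'4' selects a spray outlet; anything else is ignored.
// Returns the zero-based outlet index to trigger, or -1 for "no action".
int outletForCommand(char cmd) {
    if (cmd >= '1' && cmd <= '4') {
        return cmd - '1';
    }
    return -1; // unknown byte: ignore rather than mis-trigger a sprayer
}

// Arduino -> MAX: a pressed button is reported as a single letter,
// so the MAX patch can branch on one byte per event.
char reportForButton(int buttonIndex) {
    return static_cast<char>('a' + buttonIndex);
}
```

Keeping each message to one byte makes both ends easy to debug: a stray or garbled byte simply falls through to "no action" instead of firing a sprayer.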

And here’s how the final product looks!

doc1-4

doc3

Pcomp final: hacking the air freshener (part 2)

Jen researched smell dispensers and found two products that could be hacked with an Arduino. We had studied controlling DC and stepper motors in class the day before, and we were able to apply that to the dispenser's 3V DC motor. We hacked the air freshener and used a modified LED blink program to trigger a spritz every three seconds:
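The timing logic was essentially the classic blink pattern with the LED pin driving the motor circuit instead. Below is a host-side sketch of that logic, with millis() replaced by a plain parameter so it can be checked off-hardware; the 200 ms pulse width is an assumption, not a value we measured:

```cpp
// Host-side version of the "modified blink" logic we ran on the Arduino:
// instead of toggling an LED, the output pin drives the air freshener's
// 3V DC motor, pulsing it once every three seconds. millis() and
// digitalWrite() are replaced with plain parameters/return values so the
// timing logic can be tested off-hardware.

const unsigned long SPRITZ_INTERVAL_MS = 3000; // time between spritzes
const unsigned long SPRITZ_PULSE_MS    = 200;  // assumed motor-on time

// Given the current time and the time of the last spritz, should the
// motor pin be HIGH right now?
bool motorShouldRun(unsigned long nowMs, unsigned long lastSpritzMs) {
    unsigned long elapsed = nowMs - lastSpritzMs;
    return elapsed >= SPRITZ_INTERVAL_MS &&
           elapsed <  SPRITZ_INTERVAL_MS + SPRITZ_PULSE_MS;
}
```

The non-blocking structure matters later: because nothing sits in a delay(), the same loop can also watch the serial port for commands from MAX.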

Ultimately the trigger will be initiated based on the accuracy of the user's singing. There are still some problems visible in this one:

  1. It makes a sharp sound when operated. This might disturb the user.
  2. The smell offered by this product is too strong and synthetic to be used in our project.

User Testing

User testing was a great start toward understanding the practicalities of the project. We are targeting a very specific set of users: people who regularly practice Indian classical singing. The most time-consuming part was explaining the concepts of this art form to our fellow ITP students, who acted as the users. The bustling, energetic class activity not only threw light on unforeseen problems (and unforeseen merits alike), but also made us aware of a couple of features we had unconsciously taken for granted. It even turned out that, in spite of working together, our own ideas about a few minute features were still quite dissimilar. Overall the testing process was a success, precisely because it let us anticipate failures.

We tried different approaches to user testing. Each session involved two testers and ran for around 20 minutes. In some sessions we explained the project to the two testers together; in others we talked to them separately and then listened to their reactions collectively as a group. In one session we explained the whole interaction verbally; in most of the others we used essential oils as responses to the testers' vocals. We documented most of the tests by video recording and by transcribing the discussions.

utest1

utest2

Broadly there were two categories of inputs from the users.

1. The Awesome Users!

In my opinion these testers understood the purpose of the exercise, and their role in it, really well. They acted purely as potential users of the system rather than as ITP students, until at some point they were asked to critique the experience and interactions the system offered. Following are some great inputs:

“Is there a noticeable mapping between what I sing and the resultant smell?”

“Will the smell become more intense with time? Will it linger around for a while?”

“When I switch from singing a note to the next one, would there be a gradual change in the fragrance?”

“I might be immune to the smell in some time…”

“I breathe out slowly while I sing, and I might breathe in really quickly. So how much would the fragrance really affect me?”

“Where does the smell actually come from? Is it a device that I can see and touch and interact with?”


 

2. Designer Mode On!

I think getting rid of designer bias is virtually impossible. When I acted as a tester for others' projects, I had to constantly remind myself of the rules of play. While testing our project, a few testers proceeded directly to proposing new features, interactions and enhancements. These users and their inputs were just as important:

“I would prefer candles and vapors.”

“There should be a visual feedback, like a live waveform that shows my performance against an ideal one.”

“I would design a few buttons to control volume of electronic tanpura and intensity of smell.”

“This is so cool! Make an installation with eight helmet-like spheres hanging from the ceiling, one for each note in the octave. User wears the first one and sings the first note. It releases a smell in that enclosure only.”

“Should the users know beforehand that they will smell a fragrance?”

“Make a system for a concert. That would create olfactory vibe in the whole room for the audience as well.”


 

As a result, first of all we understood that our own ideas need to be extremely clear and that all team members need to be on the same page. We will need to let go of a few ideas in the process, and that is OK. For example, controls that are useful but a little out of scope, such as the full range of pitch, timbre and scale selections available on an electronic tanpura, can be discarded; we shall focus on essential inputs such as an on-off switch and maybe volume levels. We further decided that it needs to be a learning tool for beginners, not an experience enhancer for seasoned artists. Next, a user might need an explicit way to tell the system what note they are planning to sing. It is certainly an overhead from the user's perspective, but it keeps the system much simpler. In the future we would like to add intelligence that determines the note the user is trying to sing and adjusts itself accordingly. Finally, as of now we are a little uncertain about the impact of the smell: will it linger too long? Will the user become desensitized to a particular smell? Will the user's breathing pattern affect the impact? These can be fine-tuned through iterative design and development. We plan to build a reasonably functional system and then test it with new users.

Final Project Proposal

Viniyata and I are going to work on a PComp final project that attempts to combine a singer's audio input with olfactory feedback. Like Western classical music, Indian classical music is based on compositions of eight notes from the ascending tonal octave. The final product/system will create an environment that subtly syncs an olfactory response with the singer's notes. It would work as a feedback system as well as a refreshing vibe that enhances the experience.

IMG_6210

Riyaz is a serious and intense component of the Indian classical music tradition: it is a rigorously exercised practice that demands commitment for years. Beginners, aspiring artists and even established singers regularly do Riyaz for a few hours a day. We envision our project to be a useful and sublime tool for such practice sessions.

We have the following questions/uncertainties about the concept, and we hope to resolve them in the next class when we play-test the project with our classmates:

  1. When does the user anticipate a feedback? Does the system give a feedback on successful completion of full octave or for every successful note?
  2. What is the form factor? Is it a product that sits in front of the singer or is it an invisible, ubiquitous system in that room?
  3. Should it be combined with a tanpura (an electronic device, similar to a metronome, that creates an aural canvas for reference)?
  4. How intense and how quick should the generation of smells be?

Thoughts on Pcomp final project

I thought of a few directions for the PComp final project. These directions are more like dos and don'ts I wish to sincerely adhere to than full-fledged concepts:

1. The final appearance needs to be super-refined. As Jonathan Ive describes good design in the documentary film Objectified, “…of course it’s that way, why would it be any other way?” Well, I wish to achieve this level for all my further projects 🙂
2. I want to make something that is thought-provoking and conceptually very strong, rather than a 'technologically complex and therefore appealing' project. For example, I would like to put in a substantial amount of research, user testing and interface revisions, and take the cigarette health damages project forward.
Next, I thought of a few ideas that I find compelling:
1. An interactive data visualization with tangible media.
It was quite fascinating to look at the on-screen interactive data visualizations on flowingdata. Data viz is an increasingly popular trend on the web these days, as it presents a more efficient way to convey information. It could be implemented in the form of an interactive installation that carries a strong message.
2. Deceptive everyday devices
How would users react to something that looks extremely ordinary by virtue of its appearance (so much so that it is always overlooked) but renders an entirely surprising yet delightful experience when used? For example, a computer mouse that looks like a computer mouse but, when used, works as a musical instrument instead of a pointing device. Caveat: this device might succeed as a work of art rather than as a design project!

Riddles in Sound

For the midterm project, Viniyata and I decided to design a sonic experience where the user controls objects within a given soundscape. As the first step we created a product that allows the user to control 'pan' and 'volume' with specific gestures and movements of their head.

Midterm-1024x640

 

The Product :

Midterm2-1024x640

Midterm3-1024x640

The Concept : Currently in its prototype stage, this product is designed to foster a series of ‘audio puzzles’ – games that the users can play using only auditory cues. These audio puzzles would augment reality in order for the user to have an immersive sonic experience.

Process :

Schematic diagram

Midterm5-1024x640

Testing the circuit

Midterm6-1024x640

Design and fabrication

Midterm7-1024x640

Midterm8-1024x640

User testing

Serial Input: Arduino says hello to P5

I created this application using analog serial input from an Arduino to P5:

A knob (potentiometer) acts as the analog input. On the laptop screen, a cigarette burns down as the knob is rotated clockwise. Turning the knob anticlockwise does not change the animation. As the cigarette smoulders, it gradually reveals a line of text beneath its length: “Go on & you’ll realize that the damages are-”
The knob has a limited sweep of rotation and cannot be turned clockwise past its end stop. The incomplete sentence suggests that there is still some pending bit of interaction for the user to complete. Turning the knob anticlockwise is the only action the system affords at this point; on doing so, the sentence is completed: “Go on & you’ll realize that the damages are IRREVERSIBLE.” From here on, no animations respond to the knob.
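The one-way behaviour of the animation reduces to a few lines: the drawn burn length tracks the maximum knob reading seen so far, so anticlockwise rotation (a lower reading) changes nothing. A minimal sketch of that logic (the function and variable names are mine, and the 0–1023 range assumes Arduino's 10-bit analogRead()):

```cpp
#include <algorithm>

// Sketch of the irreversibility logic behind the cigarette animation:
// the burn amount only ever tracks the highest knob reading seen so far,
// so turning the knob back (a lower analog value) leaves the animation
// untouched. The 0-1023 range matches Arduino's 10-bit analogRead().

int burnSoFar = 0; // highest analog reading seen so far

// Feed each new analog reading in; returns the value the sketch should
// draw from, which is monotonically non-decreasing.
int updateBurn(int knobReading) {
    burnSoFar = std::max(burnSoFar, knobReading);
    return burnSoFar;
}
```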
Metaphors
1. The curious revealing of the text represents how one gets addicted to cigarettes by exploring the sensations associated with smoking. The processes (actual smoking as well as this application) are unidirectional: one can choose to quit midway (which is difficult, because the process is exciting) or go all the way to the end.
2. Just like the damage caused by smoking, the animation itself is irreversible. The user first has to finish the cigarette to experience this.
I would like to take this project further by putting in more research to support the interactions and by binding the interface more tightly to the theme. For example, the knob could be replaced by something that has a direct association with smoking.

Observations: Automatic Hand Sanitizer Dispenser at Tisch

Prompt: Pick a piece of interactive technology in public, used by multiple people. Write down your assumptions as to how it’s used, and describe the context in which it’s being used. Watch people use it, preferably without them knowing they’re being observed. Take notes on how they use it, what they do differently, what appear to be the difficulties, what appear to be the easiest parts. Record what takes the longest, what takes the least amount of time, and how long the whole transaction takes. Consider how the readings from Norman and Crawford reflect on what you see.

For this assignment I studied the two automatic hand sanitizer dispensers placed near the elevators on the ground floor of 721 Broadway. Each dispenser is a small unit on top of a stand. When a user places their hand(s) below it, it dispenses sanitizer right onto the hand(s).

IMG_5965

IMG_5967

I suppose I am quite comfortable now with the dos and don'ts of New York City! The interactive technologies in public here are different from what I have seen in other places. Although it was not an altogether new concept, I found it interesting to see people interact with machines almost everywhere in public. A fascinating observation about cultural differences: in India I do not remember seeing interactive technologies like vending machines in public, where people pay money and buy food or tickets. There is always a 'rational human being' on the other end, one who can resolve queries and, most importantly, guard the system while it is in public. (It makes me think of Jon Kolko's "Thoughts on Interaction Design", where he describes the need for a sentient entity, preferably human, to respond to confused users.)

Another interesting facet of differentiation is the heavy use of electricity to power interactive systems. This is where the idea of observing the automatic hand sanitizer came to mind. The ones I had used before were all mechanical: the user needs to press a bulky mechanical button, which dispenses sanitizer through springs and notches and perhaps gears working in the background. When I used the dispenser near the elevators on day one at Tisch, it took me hardly a second to work out the overall interaction; but it was all routed through looking at the dispenser and forming a mental model from the visuals I saw and the memories I had of interacting with similar systems. Without consciously noticing it, I understood that there was no need to press a button. This became clearer when I read Norman's take on how we form models of systems:

“Everyone forms theories (mental models) to explain what they have observed. In the absence of external information, people are free to let their imaginations run free as long as the mental models they develop account for the facts as they perceive them.”

It would be interesting in this context to see someone use this system for the first time, especially when they have no knowledge of what a dispenser is. Unfortunately I was not able to observe such a user, since the placement of the dispenser is not technically in a ‘public’ area. All the users entering the space seemed aware of at least the basic idea of what hand sanitizer is.

Observations

People who used the dispenser were generally sure whether they wanted to use it or not. As described above, the space is frequented by regular students, faculty and staff members, and for these users it is really part of a daily habit. Based on whether and why they used it, I could see three broad categories:

  1. Users who interacted with it, no matter what. Most of these users were female. They even occasionally made others wait for them while they quickly finished the interaction.
  2. Users who did not use it at all. The majority of these users were male. It is not that they were unaware of the dispenser's existence: they sometimes read the text on it while they waited for the elevator, but they never interacted with it.
  3. Users who interacted with the system because they were standing idle nearby, waiting for the elevator anyway.

I found the third group extremely interesting, and I think this group contributes impressively to the success of installing the dispensers right next to the elevators.

Irrespective of the group they belonged to, people took around 5 to 7 seconds to interact when they used the dispenser. Afterwards very few users stood right in front of the unit while applying the sanitizer; most of them left for the stairs or elevators, or just moved aside a bit, while rubbing it onto their hands.

Another interesting part of the unit is a flat black plastic piece that projects from the back of the unit and runs a few inches below it, forming a platform to catch any sanitizer that the user fails to collect. The other purpose this projection serves is to define a constraint: it directly communicates the range of operation. As Norman describes it:

“The world restricts the allowed behavior. The physical properties of objects constrain possible operations: the order in which parts can go together and the ways in which an object can be moved, picked up, or otherwise manipulated. Each object has physical features—projections, depressions, screw-threads, appendages—that limit its relationships to other objects, operations that can be performed to it, what can be attached to it, and so on.”

There is a tiny green LED on the left of the unit. It blinks when the unit detects palm(s) and dispenses the sanitizer; but I noticed it only because I was paying close attention for this assignment. Otherwise, I am sure I would not have spotted this tiny piece of feedback.

Switches and LED circuits

The assignment for Week 2 was to work on a creative application of switches and LED circuits. We saw a number of fundamental components and a variety of switches, and I found some of them quite interesting: the magnetic switch, copper tape used as a switch, and the one triggered by vibration/acceleration. An application I could immediately think of was a circuit embedded in a skipping rope, in which LED(s) would glow when a user starts skipping. The battery would sit somewhere near the grip, and the vibration switch somewhere near the midpoint of the rope, where it reaches the highest speed in operation. I soon realized this would involve some complex issues needing sophisticated solutions; for example, when the rope gets twisted it might damage the battery housing and the connectors running through it.

The next idea I thought of (and eventually implemented): a piggy bank that glows when a coin is inserted. I found this nice baseball-shaped money bank in a store and built an LED circuit inside it, so that a coin inserted in the slit acts as the switch.

IMG_5928
A baseball-shaped money bank

The circuit diagram is really a basic one: I decided to use two pencil cells (1.5 V each) to power two little green LEDs arranged in parallel.

New-Project
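A quick back-of-the-envelope check shows the circuit's headroom. Assuming a typical green LED forward drop of about 2.0 V and a comfortable 10 mA per LED (both assumed values, not measurements from my actual parts), each LED in the parallel pair would want roughly a 100 Ω series resistor on the 3 V supply:

```cpp
// Back-of-the-envelope resistor sizing for the LED circuit above.
// Assumed values, not measured from my parts: a typical green LED drops
// about 2.0 V and runs comfortably at about 10 mA.

// Series resistance in ohms: R = (Vsupply - Vforward) / I
double seriesResistorOhms(double supplyV, double ledForwardV,
                          double ledCurrentA) {
    return (supplyV - ledForwardV) / ledCurrentA;
}

// Two 1.5 V cells -> 3.0 V supply:
// (3.0 - 2.0) / 0.010 = 100 ohms per LED.
```

Giving each LED its own resistor (rather than one shared resistor for the parallel pair) keeps small differences in forward voltage from letting one LED hog the current.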

The next step was to implement a mechanism that

  1. Acts as a closed switch when a coin passes through the slit, and
  2. Acts as an open switch otherwise.

It looked simple in theory, but I had to think through various alternatives before I reached the best choice. First I thought of running multi-stranded copper wire along both sides of the slit, so that both faces of the coin would brush the strands and current would flow through the coin. Next I thought of using copper tape instead of strands. And finally I found a handy product that is just perfect for this application: ballpoint pen springs.

So I started working on it with the baseball money bank, the tools and components I could find in my PComp collection, two ballpoint pens and some coffee.

IMG_5929
The material to start with
IMG_5930
Circuit soldered in parts
IMG_5942
Inside the shell: Switch mechanism using springs
Interaction: A coin is inserted
IMG_5938
LEDs glow!

IMG_5939

Lessons learnt:

  1. A breadboard is a must. I assumed that since I had built and soldered circuits before, I would be able to just 'figure things out'. However, there was a fair bit of trial and error because I had neither a breadboard nor a multimeter.
  2. Assembling the actual circuit in final housing (baseball shell in this case) needs to be done carefully and thoughtfully to make it stable and long lasting.
  3. When the interaction happens through the switch itself, the switch mechanism should be extremely reliable: it should work for any legitimate interaction and should not complete the circuit otherwise. In this application a little fine-tuning was required to adjust the springs so that any coin would work. The gap between the two springs needed to be narrow enough that any coin touches both springs as it passes through the slit, yet wide enough that the springs stay normally open and never make contact with each other accidentally.

What is interaction?

Bret Victor’s “A Brief Rant on the Future of Interaction Design” sets a great starting point for the Physical Computing class, as it proposes the need-tool-capability framework for looking at interactivity as an extension of human faculties such as vision and motor skills. Many references cross my mind as I write this post: the apes and the monolith in 2001: A Space Odyssey, a range of interfaces and machines from The Matrix, and even the Ikea assembly instruction manual I recently received along with a DIY furniture kit.

Roughly, interaction would mean any set of steps an entity follows in order to communicate with another entity. It is a dialog that essentially involves a series of responses from both parties.

In the context of Physical Computing I can think of several ways to describe interaction by proposing a few constraints:

First, one of the two entities is a human! (Or maybe a sentient being like a cat.) Let's not count machine-to-machine dialog, as it is either a dull information exchange or a delightful installation that goes on without the active participation of humans.
Second, the knowledge of performing these actions is something a person discovers accidentally, learns deliberately by trial and error or by following instructions, copies from others, or develops over time as it becomes routine. In short, this knowledge isn't directly genetically coded. Humans are simply curious about the environment that surrounds them, and therefore there is a huge and wonderful scope for designing entities that humans can communicate with.

A good example of delightful interaction is the Nest thermostat, and I strongly feel its interactions are useful and meaningful even if Timo Arnall discusses how pathetic an attempt it is to market Nest as an invisible, smart entity. It presents a more legible and human way to communicate the user's needs to the HVAC system than the standards set by its predecessors. This divergence from set practices makes the user curious about the interface and hence initiates a dialog.

Examples of non-interactive systems: automated systems that work on their own, such as a coffee machine that runs out of ingredients and automatically draws coffee beans and milk from its containers. The process involves sensing a need, using tools and providing feedback to the user; however, it lacks the initiation of a dialog with a human.

This leads my contemplation to a series of thoughts I would like to discuss in the next class. Human-machine communications certainly involve the need-tool-capability model, may involve pleasant micro-interactions, and may lead to a series of signals from humans to machines and vice versa; hence they can be termed interactive. Over time, however, such dialogs may become commonplace. A laptop turning on or off at the flick of a switch leaves no ripple of delight, because it does not surprise an experienced user. It feels amazing to use a Nest thermostat while it is an altogether new way to communicate your needs, but not so much after using it daily. Communications that once seemed interactive might feel merely reactive once they become part of the routine. Does this limit the scope of designing interactive entities?