Friday, February 28, 2014

Brazil Instrument and Installation

The user's body will be the instrument.  I will use computer vision and a Kinect for this.  Your body will control the visuals that play along with the emotional type of music the user selected.  For the angry emotion, the user can punch the air forward, and that will affect the visuals.  For the pump-up music, the user can control the visuals by moving their hand upward in the air as if they were dancing at a show.  Calming and peaceful music will be controlled mostly by the placement of the user's head.  It would not make sense to have the user making large movements for this emotion; it should be relaxing, and I want the user to enjoy the music and relax rather than be energetic like with the pump-up emotion.  The last emotion is good feeling.  This will also be controlled by the user's hands, but in a different way: the movements your hand makes will change the visuals, and maybe how high or low your hands are will have an effect.  I want the movements the user makes to be similar to the kind of movements they would make listening to that type of music. 
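The gesture-to-visual mapping described above could be sketched roughly as follows. This is only a hypothetical illustration: the mode names, velocity scaling, and thresholds are all assumptions, and a real patch would read these values from Kinect skeleton tracking.

```python
# Hypothetical sketch: map simple hand/head motion to a visual parameter
# for each emotion mode. Joint data would come from Kinect skeleton
# tracking in a real patch; here velocities are plain (x, y, z) tuples.

def visual_intensity(mode, hand_velocity, head_y):
    """Return a 0..1 intensity for the visuals based on the selected mode.

    mode          -- one of 'angry', 'pump_up', 'calm', 'good_feeling'
    hand_velocity -- (vx, vy, vz) of the tracked hand, meters/second
    head_y        -- vertical head position, normalized 0 (low) to 1 (high)
    """
    vx, vy, vz = hand_velocity
    if mode == 'angry':
        # forward punch: strong motion along z drives the effect
        return min(1.0, max(0.0, vz) / 2.0)
    if mode == 'pump_up':
        # upward "dancing" motion: vertical hand speed drives the effect
        return min(1.0, max(0.0, vy) / 2.0)
    if mode == 'calm':
        # mostly head placement, little movement required
        return head_y
    # 'good_feeling': overall hand motion, gentler scaling
    speed = (vx * vx + vy * vy + vz * vz) ** 0.5
    return min(1.0, speed / 4.0)
```

Each mode reads a different feature of the same skeleton data, so switching emotions only swaps the mapping, not the tracking.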


For the installation I want to bring the interface and the instrument together into one installation.  My plan is to get the DMX lights connected to my patch.  The DMX lights will be able to fully change the feeling and environment that the user is in, creating a mood to go along with the music.  It would not make sense to choose the angry emotion and have the room be bright with light colors; the room should be dark, with black and red lights, and should not feel like a happy place.  The good-feeling music should have a fully different feeling from the angry music and the other emotional types: the lights should have a little brightness to them and cooler colors, like light blue.  For pump up, the lights should be the brightest, very ambient, with cool colors.  The lights should also change more frequently to give the full feeling of being at a show; this will get the user up and walking around, and it will also have them connecting to that emotion.  For calming, the lights should not change as much, and the colors should give off a relaxing feeling.  Because I want to use the DMX lights, I would need to set my installation up in the Hypercube.  Everything should be a quick plug-in-and-go setup, so it should work more easily alongside the other groups. 
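One way to sketch the emotion-to-lighting idea is a simple preset table. The channel layout (dimmer, R, G, B) and the exact values below are assumptions for illustration, not measured settings for any real fixture.

```python
# Hypothetical DMX color presets per emotion (channel values 0-255).
# A 4-channel (dimmer, R, G, B) fixture layout is assumed; real
# fixtures vary, so the channel map would need adjusting.

PRESETS = {
    'angry':        {'dimmer': 60,  'rgb': (255, 0, 0)},     # dark room, red
    'good_feeling': {'dimmer': 140, 'rgb': (120, 200, 255)}, # a little bright, light blue
    'pump_up':      {'dimmer': 255, 'rgb': (0, 180, 255)},   # brightest, cool colors
    'calm':         {'dimmer': 100, 'rgb': (80, 160, 120)},  # steady, relaxing tones
}

def dmx_frame(mode):
    """Pack a 4-channel DMX frame (dimmer, r, g, b) for the given mode."""
    p = PRESETS[mode]
    r, g, b = p['rgb']
    return [p['dimmer'], r, g, b]
```

For pump up, the patch would resend frames with varied values on a fast clock to get the frequent changes of a live show; for calm, the frame would change rarely.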

Sensor and Interface Information

The sensors I'm using will be a flex sensor and a pressure sensor.  The user will interact with the project through simple finger motion and pressure between the thumb and the other finger(s).  The idea is that the videos will cycle based on the flex of the pointer finger, and flexing the fingers other than the pointer finger will change either the sound or the effects being applied to the video.  It will use a few preset videos and sound data.   
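The cycling behavior could be prototyped along these lines. The threshold value and the 10-bit reading range are assumptions; real values would come from calibrating the actual flex sensors.

```python
# Hypothetical thresholds: cycle videos on a pointer-finger flex, and switch
# the sound/effect manipulation on with the other fingers. Readings are
# assumed to be 10-bit analog values (0-1023) from the flex sensors.

FLEX_THRESHOLD = 600  # reading above this counts as "flexed" (assumed)

def update(state, pointer, others, num_videos=4):
    """Advance the video index on a new pointer flex; set the effects flag.

    state   -- dict with 'video' index and 'pointer_was_flexed' flag
    pointer -- pointer-finger flex reading
    others  -- list of readings for the remaining fingers
    """
    flexed = pointer > FLEX_THRESHOLD
    if flexed and not state['pointer_was_flexed']:
        # edge-triggered: only advance once per flex, not every frame
        state['video'] = (state['video'] + 1) % num_videos
    state['pointer_was_flexed'] = flexed
    # any other flexed finger turns on effect/sound manipulation
    state['effects_on'] = any(r > FLEX_THRESHOLD for r in others)
    return state
```

The edge-triggering matters: without it, holding a flex would race through every preset video in a few frames.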

Friday, February 7, 2014

Virtual Instrument System


Vincent John Vincent invented video gesture control virtual reality with Francis MacDougall in 1986. By 1991 they had a fully dynamic immersive video gesture control virtual reality system. In 1986 Vincent began performing around the world. He had created a whole new genre of performance technology and hundreds of virtual instruments.
Vincent John Vincent is also the President and Co-Founder of GestureTek, the world's leading video gesture control company. GestureTek also invented 3D video gesture control in 2000, which is the technology behind the Kinect.


Thursday, February 6, 2014

Bionic hand lets wearer feel what they're holding

A real-time nerve interface for a mechanical hand that relays detailed haptic feedback.

"...by the final week he was able to differentiate between three shapes with 88 percent accuracy and between the hardness of three objects with 78.7 percent accuracy. "It is very intuitive," Micera says.

Using the bionic hand required Sørensen to have electrodes implanted in his arm, just above where it had been amputated nine years prior. Even though the nerves hadn't been in use, the prosthetic was able to translate the bionic hand's input into electrical signals that the nerves could understand."

Full Story

Thursday, January 30, 2014

Rodich Project Proposal

For my first project I would like to focus on creating an interface for controlling video/audio/light in a room, focused on eye sensing, haptics, and proprioception. Ideally, this interface allows the user to control the environment without being hyper-aware of which actions control what. The idea of being unaware of your control over the environment and actually controlling it are of course at odds, but the conceptual goal is to pull off a sort of digital magic trick, where the responses corresponding to the actions of the individual user feel natural. Most importantly, I want to design an interface that is 'invisible' to the user, in that they do not understand, and don't need to understand, the action functionality of the interface setup. 

I suspect that eyesight tracking will involve a sophisticated dissection of data from a webcam feed, which I plan to accomplish with tracking in Max. I plan to use a combination of sonar sensors, accelerometers, and infrared sensors to track people's motion around the room. This will likely have to be an array set up in a specific pattern to generate X,Y coordinates in the room, in order to direct any audio/video/light the user experiences as a result of their interaction. 
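One rough way to turn readings from an array of ranging sensors into an X,Y room coordinate is a proximity-weighted centroid. This is only a sketch under assumed sensor placements and a simple weighting scheme, not a worked-out design.

```python
# Sketch: estimate an X,Y room position from an array of ranging sensors.
# Each sensor has a known (x, y) mount position, and its distance reading
# is turned into a proximity weight (closer target -> larger weight).
# The layout and the 1/distance weighting are assumptions.

def estimate_position(sensors):
    """sensors: list of ((x, y), distance_cm); returns a weighted (x, y)."""
    total = 0.0
    cx = cy = 0.0
    for (x, y), dist in sensors:
        w = 1.0 / max(dist, 1.0)   # nearer readings count more
        cx += w * x
        cy += w * y
        total += w
    return (cx / total, cy / total)
```

With equal readings from two sensors the estimate falls midway between them, which is about the resolution a coarse array like this could promise.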

Whiteside project proposal

  I believe I want to do a project that involves making an interface that can be mounted on your arm and tracks the motion of your hand through small pegs that surround the hand without quite touching it.  The sensors I would use would probably be optical sensors on four pegs that go around the hand, facing each other to track the three-dimensional movement the user makes with their hand.  To avoid an overly complex setup I would probably just track the top and palm of the hand.  The idea is to allow a user to feel like an interaction with something is completely natural, not like they have to do something extra in order to interact.  What I mean by this is: say you are interacting with a computer using a mouse; you have to keep track of where the mouse is so it doesn't fall off the table, or you run out of movement space with your arm.  And this interface is not as clunky as, say, using a glove of some sort that interferes with your natural interactions.

Brazil Project proposal


I will create an interface for controlling sounds with one hand. I will use a sensor to check the distance from the sensor to the hand, and this will control the music: moving your hand farther or closer will have a different effect on the music. Then I will have another sensor for your other hand that will control a visual set, done in the same fashion as the first sensor.  I want the sensors to be able to recognize hand gestures, so your hand does not have to hover in the same spot the whole time.  I want to keep open the possibility of adding more than two sensors; I want the user to have more freedom than just hovering over one sensor, so it does not get boring as easily.  If possible I would like to add a sensor for the DMX lights. The visual set will also be partially connected with the music. I'm not fully set on what I want to put the sensors in, but I am thinking of putting them each in a square or a disk that can resemble a DJ work area.  
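The distance-to-sound mapping might start as a simple normalized curve like this; the 5-50 cm active range is an assumed sensor span, not a measured one.

```python
# Hypothetical mapping from hand distance (cm) to a 0..1 sound parameter.
# The 5-50 cm active range is an assumption about the sensor's useful span.

def distance_to_param(distance_cm, low=5.0, high=50.0):
    """Map a hand distance to a 0..1 control value (closer hand -> higher)."""
    clamped = min(max(distance_cm, low), high)
    return 1.0 - (clamped - low) / (high - low)
```

The same function could feed the second sensor's visual set, just scaled to whatever parameter the visuals expose.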

Diamond Proposal ONE


Create a rainforest environment and have the sonic elements triggered by sensors. I would like to use proximity sensors to detect where viewers are located in the room, which will trigger different sounds. There would also be a continuous loop playing as a sort of background noise. This loop might get louder or quieter depending on the different readings from the proximity sensors.

I would also like to have one other type of sensor. This might be an accelerometer or IMU to detect movement. An example of the way I would use this type of sensing: if no one was moving, then a gust of wind or an animal might sound like it is flashing by. I would also like to have some sort of visual element associated with the installation. Maybe this could be something in the center of the room that looks different depending on where you are viewing it from, so that it encourages people to move about the space.

Braden - Proposal One

My proposal is to create an interactive sonic environment where the presence of the audience manipulates audio effects and sounds. Similar to the blocks on a Reactable, the people within the space would trigger different effects and sounds depending on their positioning. Part of the experience for the audience would be trying to figure out exactly what they are manipulating within the audio, and then being able to have more control over the audio experience after they learn how the environment works.

Using a Kinect's blob tracking abilities would be one way to create this type of environment; however, due to the limited field of vision of the Kinect, more pieces of hardware may have to be utilized. In addition to a Kinect, an array of proximity or IR sensors could be placed around the space to detect the presence of people and their distances in the room.

Ross - Proposal Project Uno

I'd like to move forward with the 3D capacitive sensing controller for the multiplexed LED cube. Based on the attached sketch, I expect to use three sides of the cube to create a free-standing 3D capacitive sensor. I'll use either an Arduino Uno or Arduino Mega (depending on the exact number of pins needed) as the input for the capacitive sensors and the output for the LED cube. A 4x4x4 LED cube would provide some reasonably nuanced positional information and, at the same time, some interesting aesthetics. With 64 LEDs to connect, I'll look into multiplexing and shift register chips for individually addressing the LEDs. I think the control of the LEDs will be limited to binary on/off, but I expect them to be fairly responsive to the position of a hand within the sensing field.
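The capacitive-position-to-LED step could be sketched like this. Normalizing the raw plate readings to 0..1 is assumed calibration, and the layer-major wiring order is just one possible choice for a shift-register chain.

```python
# Sketch: quantize three capacitive-plate readings into a 4x4x4 cell, then
# compute the flat LED index for a shift-register chain. Normalizing raw
# capacitance to 0..1 per axis is assumed to happen during calibration.

def cell_for_hand(nx, ny, nz, size=4):
    """nx, ny, nz are normalized 0..1 readings from the three plates."""
    def bucket(v):
        return min(size - 1, int(v * size))
    return bucket(nx), bucket(ny), bucket(nz)

def led_index(x, y, z, size=4):
    """Flat index into a size^3 LED chain, layer-major ordering."""
    return z * size * size + y * size + x
```

Since the LEDs are binary, the patch would just light the cell the hand occupies (or a small neighborhood around it) each refresh.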

Depending on the complexities discovered with the LED cube, I might also look into addressable strings of LEDs.

Wednesday, January 29, 2014

Syd: Proposal for Project 1

Project 1 Proposal:

I would like to create a headband-type device (it could be attached to a sweatband for prototyping purposes, or a generic headband, or a hat) that has the capability to read the user's brainwaves. I believe it would be important to start simple here, so I would like the user's input to change either an audio or video output, and then if I can accomplish that, jump off from there. I would like the user to have conscious control over what he or she is doing, so that he or she can press next (and as many other variations as possible). It would also be interesting to give the user the ability to pause, play, and go back a song. I want to use music because it has many variables that are trackable and can have an on/off quality, so that it is more easily defined by the interface and the user. If I get the opportunity, I would like the user to be able to change the volume, as that is a gradient and would take more calibration to record and more time to interpret.

Project 1 Proposal - Brandon

hands on: love connection

Proposal: I would like to create an interactive, wearable experience using resistance sensing, conductive thread, and/or conductive paint. A wearable item will provide for a positive interaction between two or more people, completing a conductive circuit. The interaction will create a love connection.

Using conductive paint, an anatomical human heart will be painted on a textile. As the interacting person approaches and massages or resuscitates the conductive heart, he/she will play the heart's love song, completing the connection to make the heart function. 

This project is dually significant, as it will create an implied romantic connection between two people and accentuate the need for human-computer mediation in modern medical practices. This project will raise awareness of the value of human connectedness and touch. One person can play the melody of another's heart.


Research and Inspiration
Key terms:
Ubiquitous computing: computing is made to appear everywhere and anywhere.
Wearable computers: body-borne computers or wearables are miniature electronic devices that are worn by the bearer under, with or on top of clothing.
Sousveillance: the recording of an activity by a participant in the activity typically by way of small wearable or portable personal technologies.
Wearable technology is related to both the field of ubiquitous computing and the history and development of wearable computers. With ubiquitous computing, wearable technology shares the vision of interweaving technology into everyday life, of making technology pervasive and interaction frictionless. Through the history and development of wearable computing, this vision has been both contrasted and affirmed: affirmed through the multiple projects directed at enhancing or extending the functionality of clothing, and contrasted most notably through Steve Mann's concept of sousveillance. The history of wearable technology is influenced by both of these responses to the vision of ubiquitous computing. 

Early examples:
The calculator watch, introduced in the 1980s, was one of the first pieces of widespread worn electronics. Ilya Fridman designed a Bluetooth headset into a pair of earrings with a hidden microphone. The Spy TIE includes a color video camera, and USB Heating Gloves keep hands warm when plugged in.

 Circuit Stickers






Tuesday, January 28, 2014

Project Proposals

Proposal 1

I would like to manipulate a visual environment based on the viewer's eye and where his/her focus is directed, which I would hope to do through blob tracking in Max. Ideally, I would like to set up some sort of installation that psychologically isolates the viewer in some way. This might come in the form of making an art piece that is only viewable by other people when an individual is performing a certain action or looking in a specific place. In this way, the engaged individual is not allowed to see what his/her eye and gaze placement is doing to the environment. I will have to find a way to engage the individual without revealing the whole piece, while simultaneously providing a motivation for being the 'catalyst' of the work even though they receive no immediate reward.


Proposal 2

I am fascinated by installation art pieces which do not reveal any sort of 'cutting edge' technology, and instead spend their energy manipulating the human processing of information and derailing the normal interaction humans have with their natural, earthy environments. These concepts are a bit hard to grasp at the delivery level, but I admire them for their simplicity of presentation, regardless of the amount of technology that goes into making the installations act the way they do. I have yet to really develop a concept of what kind of interaction I would like to manipulate or alter, but I find my greatest inspiration in the rods & cones pieces, rAndom international's Rain Room at MoMA PS1, as well as Zee by Kurt Hentschlager.

Touchscreen for Paper interface

Technology company Fujitsu has prototyped a touchscreen interface which allows users to interact with physical paper documents as if they were on a touchscreen. Using a simple combination of a webcam and a consumer-level projector, the project boasts an impressive list of capabilities, including scanning, replication of imagery, three-dimensional manipulation of objects, magnification, and the addition of supplemental information when appropriate. The webcam feed is run through a program which detects the height of the hand, the position of the fingers, and the rotation of the arm/wrist, among other metrics, and converts them into commands for the projector, or uses them to capture imagery/text from the document present on the table.

Though the technology could seem redundant if the user already has his or her document digitally present, it does have interesting applications for older or foreign texts, more readily allowing the user to interact with these documents.

http://www.youtube.com/watch?v=I2l0qklSzks#t=46

Project Proposals

Project 1

I wanted to do an interactive sculpture-type piece controlled by a glove. I wanted to have multiple sensors in the glove and leave the viewer a bit of mystery about the functionality of the glove. I mainly wanted lighting to play a big role in my piece. I'm not exactly sure of the specs of the glove, but I would want accelerometers and bend sensors to play a role in its design.


Project 2

As with the first proposal, I wanted to create an interactive sculpture, this time controlled by a combination of a glove and a Kinect. I wanted to use a Kinect camera with blob tracking and rising gestures to control the height of what I want to be tiny air balloons. I want the sculpture to resemble the work of Alex McLeod in a magical, juvenile sense. I would want sensors in the glove to control lights within the balloons.

Haptix



Darren Lim and Lai Xue, the creators of Haptix, wanted to create an interface that steered away from touch technology and fixed surfaces. Haptix allows you to turn any surface into a multitouch interface using blob tracking technology. Haptix ultimately allows the user a more personalized interface experience, as you are not tied to any particular interface device.



http://makezine.com/2013/05/04/pitches-with-prototypes-haptix/

project proposals

Project 1
I will create an interface for controlling sounds with one hand. I will use an ultrasonic sensor to check the distance from the sensor to the hand, and this will control the music: moving your hand farther or closer will have a different effect on the music.  If I can do different hand gestures for another sensor, I might add this to have more variety.  Your other hand will have the same style of distance sensor, controlling the DMX lights. 

Project 2.
I would create a visual story a user can be a part of and make their own choices in.  I would use the Kinect for tracking; your hand would probably be the controls.  It will either be a story that the user has the chance to shape and make progress, or, rather than a story, it will have a visual on the screen the user can interact with.  I had inspiration for this from the façade projection on KOP at Breda Culture Night.

http://vimeo.com/75019733

Siftables


Siftables is a new technology by David Merrill that is basically a little computer in a toy block.  These blocks are intended to take away the computer's mouse and to give more interaction and more choices to the computer; they give the user a new way to interact with it.  Each block can sense the other blocks and can sense their movement.  These are tools created to interact with information.  When portrait blocks come together, they can tell that there is another portrait next to them and might look at the other portrait, bringing the portraits to life.  There is a color application that lets the user mix colors to try to come up with a new color, or the one that works best for what they are doing; they can add colors to one block, or take color away if they added too much of one color.  The blocks can also do math: a user can make different equations, and it can become a game to learn math.  The blocks are also connected with a dictionary, so you can make words and it will tell you when a word is spelled correctly or incorrectly; this is made to be a game similar to Scrabble.  There is also an interactive cartoon application for language learning: when a block is lifted off the table, the scene changes on the TV, making a new scene and creating a new story.  Merrill even made a music application, with blocks for bass, tempo, volume, reverse, and others, so people can make their own music.  There are endless things these blocks can and will be able to do.  All of these applications can be used as games, but also as teaching software that lets users have fun and have more control, so they pay more attention.

project proposal

http://cycling74.com/project/alpha/ the link is for a pretty interesting generative art and visual example.

For my project I am thinking of doing a glove that works by following the movements of your hand inside the glove's 3D space. It would use some kind of motion sensor inside that tracks movement by looking for the blank spot created by the hand.  So the two interfaces might be infrared and maybe a light sensor similar to the one used on the bugs we created last week.

Project Proposals


  1. It would be interesting to try to create a new musical interface based on hand proximity. Similar to a theremin, the hand's location in relation to the sensors could create changes in the audio. In this case I can see volume, pitch, and audio processing being affected based on hand location, with the ability to manipulate them in real time. 
  2. Create a mouse replacement interface that uses location tracking on the fingers. It would be like turning any surface into a trackpad and using gestures/taps on the surface to replicate standard mouse functions. 
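For the theremin-like idea in item 1, the two hand distances might map to pitch and volume along these lines; the 220-880 Hz range over a 10-60 cm span is an assumed design choice, not a measured calibration.

```python
# Theremin-style sketch: one hand's distance sets pitch, the other sets
# volume. The 220-880 Hz range over 10-60 cm is a hypothetical mapping.

def hand_to_audio(pitch_cm, volume_cm, near=10.0, far=60.0):
    """Return (frequency_hz, gain) from two hand distances in cm."""
    def norm(d):
        d = min(max(d, near), far)
        return (d - near) / (far - near)
    # exponential pitch mapping spans two octaves, 220 Hz to 880 Hz
    freq = 220.0 * (2.0 ** (2.0 * (1.0 - norm(pitch_cm))))
    gain = 1.0 - norm(volume_cm)  # closer hand = louder
    return freq, gain
```

The exponential pitch curve matters: a linear frequency map would cramp the low notes and stretch the high ones relative to how pitch is actually heard.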

Project proposals - Brandon

Proposal 1: Create a piece of interactive wearable gear, using capacitive, proximity, and/or resistance sensing. Create a surprising result from the interaction of two people. This project would accentuate the value of human connectedness and touch. One person could play the melody of another's heart.


Proposal 2: Inspired by last week's post, I would like to create a user-friendly piezoelectric interface that controls energy transmission. Through the conduction of energy, users will generate or control their own experience. The hope is for this project to ultimately become a usable resource for a community. This interface would use the interaction of some to power a response for many.

Open Circuit, or Offener Schaltkreis



Open Circuit, or Offener Schaltkreis in German, uses a combination of old and new technology by covering the walls and floors of a gallery with flat copper tacks that carry multi-channel audio.  Participants explore the gallery with small wireless receivers to listen in on what is being transmitted. The audio intensifies as more people join in; the soundtrack is described as a cross between industrial noise and Jimi Hendrix tuning up.

Video

Nota Bene's In Order to Control



In this piece the participants walk on the ground where images of phrases are projected; silhouettes of their bodies, with the same phrases but in opposite contrast, are projected onto the wall at the same time. The words in the images are meant to emphasize the slipperiness of morality and ethics, with phrases like "Everything that's fair is not always legal" and "Everything that is legal is not always fair."

The piece uses an Xbox 360 Kinect, two MacBook Pro laptops, and two projectors to create the installation.

Monday, January 27, 2014

Alternative Interface - Leia Display

The Leia Display System is a display system that utilizes laser projections and a cloud of vapor to create images. The Leia Display XL allows whole rooms to be projected, giving users the ability to walk in and out of the projections. Additionally, it is possible to interact with the display through gestures.

The company is rumored to currently be working on a cell phone technology that will allow people to be projected in 3D during calls using this technology.

Syd Project Proposals

Proposal 1:

To create a neural interface that interacts with brain waves and allows the user to have some sort of control over the outcome. Possible options could include having a sonic element respond to brain wave activity, or a visual element as well.

Proposal 2:

To create an interface that responds to facial expressions, and then to have that apply to different cultures so that it is applicable outside of an American setting. The interface could respond by changing the environment. I think it would be interesting for the response to be some variation on the user itself, so it becomes a piece that has the audience responding to their own facial features.
Wearable Tech:

Beauty technologist Katia Vega has created a new type of wearable that fully crosses over into wearable technology. So far, she has focused mainly on nail tech and eyelash tech, and the results are pretty cool. She can attach "RFID tags, small magnets or conductive polish" to false nails so that the user can use hand motions to interact with electronics in their environment. She can also do a similar thing with false eyelashes, where she uses conductive makeup so that when a user blinks at an object, he or she can interact with it. She has examples where she uses the nails to play music, which is extremely cool and could be applied in many different ways, both as interactive art and as an everyday wearable device. The tech can also communicate through different materials, such as wood or water.

Project Proposals

Project Proposals:
1)  Have a simulation environment with some sort of nature theme. An example might be to try to recreate a rainforest environment and have the sonic pieces associated with the audience's interaction. The number of individuals standing or walking in a certain location would trigger some sort of environmental change or sound disturbance. The idea for the piece would be to have the audience semi-knowledgeable about how to affect the environment.

2)  Use limb tracking or blob tracking of people to change the display on a screen. Maybe the display can be a sort of game, or it can simply change the way that the elements work together in a sort of digital painting.

Augmented Reality Sandbox

The Augmented Reality Sandbox is amazing! 
  A lab at UC Davis created a 3D visualization application to teach earth sciences as early education programming. Inspired by Czech scientists, this group built an exhibit combining a real sandbox with virtual topography and water generated via a Microsoft Kinect IR system. This project allows students to alter topography by playing in an actual sandbox, which is then augmented via a projector system. 
  Researchers say, "the system teaches geographic, geologic, and hydrologic concepts such as how to read a topography map, the meaning of contour lines, watersheds, catchment areas, levees, etc."

http://idav.ucdavis.edu/~okreylos/ResDev/SARndbox/

Chirp Microsystem's chip for gestural interfaces

Chirp Microsystems is in the process of producing a gestural interface for smartwatches. They have created a small chip that is able to detect three-dimensional gestures using ultrasonic waves. The plan is for the chips to be used in wearables that could control virtually any electronic device you would like. This would eliminate the need for a touchscreen interface.

The use of ultrasonic information should allow for more accurate gestures and a wide range of device sizes.  The echoes produced by the ultrasonic waves are measured by the chip. These chips can also be hooked up to computers, where they can be used to control things like games and flight simulators. The use of sound also lowers the amount of power consumed.

They are trying to create universal gestures that can be programmed and used on any type of device.