Friday, February 28, 2014

Brazil Instrument and Installation

The user's body will be the instrument.  I will use computer vision and a Kinect for this.  Your body will control the visuals that play along with the emotional type of music the user selected.  For the angry emotion, the user can punch the air forward, and that will affect the visuals.  For the pump-up music, the user can control the visuals by moving a hand upward through the air, as if dancing at a show.  Calming and peaceful will be controlled mostly by the placement of the user's head.  It would not make sense to have the user making large movements for this emotion; it should be relaxing, and I want the user to enjoy the music and unwind rather than be energetic like the pump-up emotion.  The last emotion is good feeling.  This will also be controlled by the user's hands, but in a different way: the movements your hands make will change the visuals, and perhaps how high or low your hands are will have an effect.  I want the movements the user makes to be similar to the movements they would make while listening to that type of music.
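Roughly, the per-emotion gesture logic might look something like the Python sketch below. This is only a sketch: the joint names, frame format, and thresholds are placeholders I made up, not the actual Kinect API.

```python
# Sketch of per-emotion gesture detection from Kinect skeleton data.
# Joints are assumed to arrive as (x, y, z) tuples per frame; the joint
# names and thresholds here are hypothetical placeholders to be tuned.

def detect_gesture(emotion, prev_frame, curr_frame):
    """Turn body movement into a visual-control event for one emotion."""
    if emotion == "angry":
        # A punch: the hand moves sharply forward, toward the screen (-z).
        dz = curr_frame["right_hand"][2] - prev_frame["right_hand"][2]
        return "punch" if dz < -0.15 else None
    if emotion == "pump_up":
        # An upward sweep of the hand, like dancing at a show.
        dy = curr_frame["right_hand"][1] - prev_frame["right_hand"][1]
        return "hand_up" if dy > 0.10 else None
    if emotion == "calming":
        # Small, relaxed control: just follow where the head sits.
        head = curr_frame["head"]
        return ("head_position", head[0], head[1])
    if emotion == "good_feeling":
        # Hand height drives the visuals.
        return ("hand_height", curr_frame["right_hand"][1])
    return None
```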


For the installation I want to bring the interface and the instrument together into one installation.  My plan is to get the DMX lights connected to my patch.  The DMX lights will be able to fully change the feeling of the environment the user is in, creating a mood that goes along with the music.  It would not make sense to choose the angry emotion and have the room be bright with light colors; the room should be dark, with black and red lights, and it should not feel like a happy place.  The good feeling music should have a completely different feeling than the angry music and the other emotional types: the lights should have a little brightness to them and cooler colors, like light blue.  For pump up, the lights should be the brightest, very ambient, with cool colors, and they should change more frequently to give the full feeling of being at a show.  This will get the user up and walking around, and it will have them connecting to that emotion.  For calming, the lights should not change as much, and the colors should give off a relaxing feeling.  Because I want to use the DMX lights, I would need to set my installation up in the Hyper cube.  Everything should be a quick plug-in-and-go setup, so it can work more easily alongside the other groups.
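The light presets could start as simple as an emotion-to-color table like the sketch below. The channel layout and all the values are placeholder guesses, not the actual fixture setup in the Hyper cube.

```python
# Sketch of an emotion -> DMX preset mapping for a simple RGB fixture.
# Channel order (dimmer, R, G, B) and all values are placeholder guesses.

EMOTION_LIGHTS = {
    "angry":        {"rgb": (255, 0, 0),     "dimmer": 60,  "changes_per_sec": 0.5},
    "good_feeling": {"rgb": (120, 180, 255), "dimmer": 120, "changes_per_sec": 0.2},
    "pump_up":      {"rgb": (0, 200, 255),   "dimmer": 255, "changes_per_sec": 2.0},
    "calming":      {"rgb": (80, 140, 100),  "dimmer": 90,  "changes_per_sec": 0.05},
}

def dmx_frame(emotion):
    """Channel values (dimmer, R, G, B) for one fixture."""
    preset = EMOTION_LIGHTS[emotion]
    return [preset["dimmer"], *preset["rgb"]]
```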

Sensor and Interface Information

The sensors I'm using will be a flex sensor and a pressure sensor.  The user will interact with the project through simple finger motion and pressure between the thumb and the other finger(s).  The idea is that the videos will cycle based on the flex of the fingers, and flexing fingers other than the pointer finger will change either the sound or the effects applied to the video.  It will use a few preset videos and sound clips.
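As a rough sketch, the readings might drive the patch something like this. It assumes the flex and pressure sensors arrive as 0-1023 analog values (say, over serial from an Arduino); the thresholds are guesses to be tuned by hand.

```python
# Sketch: map flex/pressure readings (0-1023 analog values) to controls.
# Threshold values are placeholders to be tuned against the real sensors.

FLEX_THRESHOLD = 600      # pointer-finger flex past this cycles the video
PRESSURE_THRESHOLD = 400  # thumb pressure past this toggles an effect

def handle_readings(flex, pressure, state):
    """Update playback state from one pair of sensor readings."""
    flexed_now = flex > FLEX_THRESHOLD
    if flexed_now and not state["flexed"]:
        # Rising edge: cycle to the next preset video.
        state["video"] = (state["video"] + 1) % state["num_videos"]
    state["flexed"] = flexed_now
    state["effect_on"] = pressure > PRESSURE_THRESHOLD
    return state
```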

Friday, February 7, 2014

Virtual Instrument System


Vincent John Vincent invented video gesture control virtual reality with Francis MacDougall in 1986. By 1991 they had a fully dynamic, immersive video gesture control virtual reality system. In 1986 Vincent began performing around the world; he created a whole new genre of performance technology and hundreds of virtual instruments.
Vincent John Vincent is also the President and Co-Founder of GestureTek, the world's leading video gesture control company. GestureTek also invented 3D video gesture control in 2000, which is the technology behind the Kinect.


Thursday, February 6, 2014

Bionic hand lets wearer feel what they're holding

A real-time nerve interface for a mechanical hand that relays detailed haptic feedback.

"...by the final week he was able to differentiate between three shapes with 88 percent accuracy and between the hardness of three objects with 78.7 percent accuracy. "It is very intuitive," Micera says.

Using the bionic hand required Sørensen to have electrodes implanted in his arm, just above where it had been amputated nine years prior. Even though the nerves hadn't been in use, the prosthetic was able to translate the bionic hand's input into electrical signals that the nerves could understand."

Full Story

Thursday, January 30, 2014

Rodich Project Proposal

For my first project I would like to focus on creating an interface for controlling video/audio/light in a room, focused on eye sensing, haptics, and proprioception. Ideally, this interface allows the user to control the environment without being hyper-aware of which actions control what. The idea of being unaware of your control over the environment while actually controlling it is of course a contradiction, but the conceptual goal is to pull off a sort of digital magic trick, where the responses to the individual user's actions feel natural. Most importantly, I want to design an interface that is 'invisible' to the user, in that they do not understand, and don't need to understand, the action functionality of the interface setup.

I suspect that eyesight tracking will involve a sophisticated dissection of data from a webcam feed, which I plan to accomplish with tracking in Max. I plan to use a combination of sonar sensors, accelerometers, and infrared sensors to track people's motion around the room. This will likely have to be an array set up in a specific pattern to generate X,Y coordinates in the room, in order to direct any audio/video/light the user experiences as a result of their interaction.
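For the sonar part, two sensors at known spots along one wall would be enough for a first X,Y estimate, by intersecting their range circles. The sketch below assumes that layout; the spacing value is a placeholder for the real room.

```python
import math

# Sketch: estimate an X,Y room position from two sonar distance readings.
# Assumes sensors mounted at (0, 0) and (SENSOR_SPACING, 0) on one wall;
# the spacing is a placeholder for the actual room layout.

SENSOR_SPACING = 4.0  # meters between the two sonar sensors

def locate(r1, r2):
    """Intersect the two range circles to get an (x, y) estimate."""
    x = (r1**2 - r2**2 + SENSOR_SPACING**2) / (2 * SENSOR_SPACING)
    y_squared = r1**2 - x**2
    if y_squared < 0:
        return None  # inconsistent readings (noise); skip this frame
    return (x, math.sqrt(y_squared))
```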

Whiteside Project Proposal

  I believe I want to do a project that involves making an interface that mounts on your arm and tracks the motion of your hand through small pegs that surround the hand without quite touching it.  The sensors I think I would use would probably be optical sensors on four pegs that go around the hand, facing each other, to track the three-dimensional movement the user makes with their hand.  To avoid an overly complex setup, I would probably just track the top and palm of the hand.  The idea is to let a user feel like an interaction with something is completely natural, not like they have to do something extra in order to interact.  What I mean by this is, say you are interacting with a computer using a mouse: you have to keep track of where the mouse is so it doesn't fall off the table, or you run out of movement space with your arm.  And it is not as clunky as using a glove of some sort, which interferes with your natural interactions.
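One loose sketch of how the four peg readings could combine into a hand position: if the pegs sit in opposing pairs, the difference between opposite readings gives the hand's offset from the center of the frame. The peg layout and spans here are assumptions, not a worked-out design.

```python
# Sketch: combine four peg-mounted optical distance readings into a rough
# hand offset. Assumes pegs in opposing pairs (left/right, top/bottom),
# each reporting distance to the nearest hand surface; spans are guesses.

PEG_SPAN_X = 0.20  # meters between the left and right pegs
PEG_SPAN_Y = 0.15  # meters between the top and bottom pegs

def hand_offset(d_left, d_right, d_top, d_bottom):
    """Hand offset from the frame center, normalized to roughly -1..1."""
    x = (d_left - d_right) / PEG_SPAN_X
    y = (d_bottom - d_top) / PEG_SPAN_Y
    return (x, y)
```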

Brazil Project Proposal


I will create an interface for controlling sounds with one hand. I will use a sensor to measure the distance from the sensor to the hand; moving your hand farther away or closer will have a different effect on the music. Then I will have another sensor for your other hand that controls a visual set, working in the same fashion as the first sensor.  I want the sensors to be able to recognize hand gestures, so your hand does not have to hover in the same spot the whole time.  I also want to leave open the possibility of adding more than two sensors; I want the user to have more freedom than just hovering over one sensor, so it does not get boring as easily.  If possible I would like to add a sensor for the DMX lights. The visual set will also be partially connected to the music. I'm not fully set on what I want to put the sensors in, but I am thinking of putting each one in a square or a disk, so they can resemble a DJ work area.
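A minimal sketch of the distance-to-control mapping, assuming each sensor reports a distance in centimeters (an ultrasonic rangefinder, say); the usable hover range is a placeholder to be tuned.

```python
# Sketch: one hand's distance drives a sound parameter, the other hand's
# drives the visual set. Assumes sensors reporting distance in cm; the
# usable hover range is a placeholder.

MIN_CM, MAX_CM = 5.0, 60.0  # usable hover range above each sensor

def normalize(distance_cm):
    """Clamp a raw reading into the usable range and scale to 0..1."""
    d = max(MIN_CM, min(MAX_CM, distance_cm))
    return (d - MIN_CM) / (MAX_CM - MIN_CM)

def control_values(music_dist_cm, visual_dist_cm):
    """One 0..1 value each for the sound patch and the visual set."""
    return {
        "music": normalize(music_dist_cm),    # e.g. volume or filter cutoff
        "visual": normalize(visual_dist_cm),  # e.g. effect intensity
    }
```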