
In Bud

 

Project Team:

Xuhui Xu

Wave Effect, Communication, Kinect (primary), Installation, Schedule Management, Lead Developer

Muqing Niu

Concept, Audio (primary), Installation, Material Management, User Experience Designer


Project Introduction:

In Bud is a project that generates implicit understanding through a shared music-making experience, without any direct interaction between the participants.

Your relationship is in bud. Music arouses it, and its beauty will extend into the next spring. We follow, we understand, we cooperate, we compromise, and we create a collaborative melody. The changes in the music in turn change our own rhythm, awakening the implicit understanding between us.

 

Project Video:

 

Main Components:

Visual effect: The visual effect is the core of the user experience; like the door to a room, it motivates users to keep exploring.

Audio: Music is the most universal and beautiful language in the world. People communicate through music, but may understand it differently according to their experience, emotion, cultural background, and so on.

To combine these two main components of our project and pursue implicit understanding, we took three things into consideration:

1. Implicit understanding: collaboration should be dominant.

How do we arrive at implicit understanding? Babies get to know the world, and become part of it, by imitating. In the natural world, following another's gestures is a safe way for a newcomer to show goodwill toward a group. When two creatures share no language, it is their nature to reach a first understanding by following each other's gestures and doing something together. In our project, we wanted something more substantial, emerging from the nature of two beings: the two strangers cannot see each other during the experience. Standing on opposite sides of the two screens, they build implicit understanding from the same starting point by doing something together rather than competing with each other. That is to say, collaboration should be dominant in this experience. When we assess the quality of the implicit understanding in the collaboration, we think about how much the two users wait for each other, how much they compromise, how much they follow, and how much they consider the other's point of view.

2. The effect should motivate the user's first try and continuous exploration.

If one thing is to make a user feel special and want to explore more, it should not be something common in the real world. It would be amazing to make something happen on the screens that could never appear in reality. And to motivate users to keep trying, the experience itself should have a quality of continuity. How about standing water upright on a wall? How about touching flame? How about making flame grow in water, and water reflect the shadow of the flame?

3. Everything should be synchronised in the user experience.

We use our energy to control water and flame. Music expands when the ripples expand, and fades out when the flame, the light, and everything else fade out. It is like the harmony and connectedness of the natural world that could never quite happen: something whose existence can be sensed but never held in a human hand; it surrounds you, yet flows away the moment you touch it. So we thought of using soft spandex: it is smooth, flexible, and fluid. You can touch it, press it, and move your hand across it, creating music by controlling the flame and water on it. You can never hold it, as everything vanishes in a flash, but your memory and the implicit understanding with another human being will last. Do you want to know who is behind that canvas? Who are you following, cooperating with, and waiting for? We hope you can evoke that implicit understanding, and that the bud between you blossoms into something beautiful that extends into the next spring…

 

Behind the Story

1. Visual

When a wave moves, it displaces nearby image pixels to new positions according to the fluctuation of a sine wave. When the mouse, or the averaged Kinect touch point, is sensed, the visual elements are triggered. How long the visual elements last depends on how many music tracks are triggered.
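As a rough illustration of the idea (this is an assumed sketch, not the project's actual code), each pixel column can be offset by a sine wave whose amplitude decays with distance from the touch point; `amplitude`, `wavelength`, and `falloff` are hypothetical tuning parameters:

```java
// Assumed sketch of the sine-wave pixel displacement described above.
public class WaveEffect {
    // Offset (in pixels) for column x, given a wave centered at column cx.
    public static double offset(int x, int cx, double amplitude,
                                double wavelength, double falloff) {
        double d = Math.abs(x - cx);             // distance from the wave center
        double decay = Math.exp(-d / falloff);   // amplitude fades with distance
        return amplitude * decay * Math.sin(2 * Math.PI * d / wavelength);
    }
}
```

In a Processing sketch this offset would be applied per column when copying the source pixels into the displayed frame, so the image appears to ripple outward from the touch point.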


2. Music

The flame effect is synchronised with the main melodies, which are grand, continuous, and strong, while the water effect is tied to music that is light, transient, and active. The screen is divided into 16 music cells on each side, and each cell holds a different music track.
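A minimal sketch of the cell lookup, assuming a 4×4 layout (the post only says 16 cells per side, so the exact arrangement is an assumption) and a touch position normalised to the 0–1 range on each axis:

```java
// Hypothetical mapping from a normalized touch position to one of
// the 16 music cells, assuming a 4x4 row-major grid.
public class MusicGrid {
    public static int cellIndex(double nx, double ny) {
        int col = Math.min(3, (int) (nx * 4));  // clamp so nx == 1.0 stays in range
        int row = Math.min(3, (int) (ny * 4));
        return row * 4 + col;                   // 0..15
    }
}
```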


The music tracks were produced in Logic Pro X. Each track is exactly 8 seconds long and has the same tempo. To keep the rhythms aligned, each time a track is triggered it jumps to the exact position in its loop implied by the shared clock, rather than starting from the beginning. A clock is sent from the server to the client to synchronise the timing.
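The beat-alignment rule reduces to a simple modulo on the shared clock (a sketch under the stated 8-second loop assumption; the project's actual code may differ):

```java
// Since every loop is exactly 8 seconds, a newly triggered track starts
// from whatever position the shared clock implies, keeping all tracks in phase.
public class BeatSync {
    static final long LOOP_MS = 8000;  // each track is exactly 8 seconds

    // Position (ms into the loop) at which a track triggered at
    // shared-clock time clockMs should begin playing.
    public static long startOffset(long clockMs) {
        return clockMs % LOOP_MS;
    }
}
```

Because both sides derive the offset from the same server clock, a track triggered on either screen lands on the same beat grid.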


When the sensed point moves from one music cell to another, the previously playing tracks start to decay and the current one fades in. The decay time of a track also depends on how many tracks are playing.
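One plausible version of that rule (an assumption — the post does not say in which direction the decay scales) is to lengthen the fade-out as more tracks play, so a dense texture dissolves gradually; `baseMs` and `perTrackMs` are hypothetical tuning constants:

```java
// Assumed sketch: decay time grows with the number of concurrent tracks.
public class TrackDecay {
    public static long decayMs(int tracksPlaying, long baseMs, long perTrackMs) {
        return baseMs + perTrackMs * Math.max(0, tracksPlaying - 1);
    }
}
```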


3. Communication

The server and the client send each other the sensed location (from the Kinect or from mouse movement). In addition, the server sends out a clock to synchronise the time on both sides, ensuring the music coming from both sides is always on the beat.
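The wire format is not specified in the post; a minimal assumed sketch is a message that carries the sensed x/y location plus, on messages from the server, the clock value:

```java
// Hypothetical message format for the server/client exchange:
// "x,y,clockMs" as a comma-separated string.
public class SyncMessage {
    public static String encode(double x, double y, long clockMs) {
        return x + "," + y + "," + clockMs;
    }

    public static long clockFrom(String msg) {
        String[] parts = msg.split(",");
        return Long.parseLong(parts[2]);  // third field is the shared clock
    }
}
```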


Looking ahead, we plan to build an app in which the server pairs people up. After generating collaborative implicit understanding, they can choose whom they want to get to know. One envisioned scenario is a test before people's first date.


4. Kinect

At first we tracked the single nearest depth point, which was far from accurate. After consulting Shiffman and Moon, we found that taking the average point over the depth pixels of the user's hand is more practical.
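A sketch of that averaging approach (assumed — the actual sketch used the Open Kinect for Processing library's depth image): rather than the single nearest pixel, average the positions of all pixels within a depth band behind the nearest one, which smooths out sensor noise:

```java
// Assumed sketch of average-point hand tracking from Kinect depth data.
public class HandTracker {
    // xs/ys/depths are parallel arrays of candidate depth pixels;
    // bandMm is a hypothetical depth band (e.g. 50 mm) behind the closest pixel.
    public static double[] averagePoint(int[] xs, int[] ys, int[] depths, int bandMm) {
        int nearest = Integer.MAX_VALUE;
        for (int d : depths) nearest = Math.min(nearest, d);
        double sx = 0, sy = 0;
        int n = 0;
        for (int i = 0; i < xs.length; i++) {
            if (depths[i] <= nearest + bandMm) {  // keep pixels near the closest depth
                sx += xs[i];
                sy += ys[i];
                n++;
            }
        }
        return new double[] { sx / n, sy / n };
    }
}
```

Averaging many pixels of the hand, instead of trusting one noisy minimum, is what made the tracked point stable enough to drive the visuals.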


5. Installation

We built our own frames from spandex and wood. The wood was almost the only thing we got on Black Friday, haha!


6. User tests

We went through two rounds of user testing and received much constructive advice from our beloved professor Tom Igoe and our dear classmates. We never stop chasing a better product by improving every detail. After the user tests, we revised the concept of the project and put all our effort into generating implicit understanding, and we keep enhancing the visual effects and making the music more tightly synchronised with them.

Thanks for all your help

Thanks to Tom Igoe, Daniel Shiffman, J.H. Moon, and every classmate who gave us constructive advice that helped us improve and become better! We really appreciate it!

 

 

 

