We will have two new projects starting in January 2018: Vibrating Instruments in Virtual Reality (VIVR): Creative Content Production in VR with Cohesive Sense of Reality, and Laser String Kantele. More info in the project section.
We will attend the NIME’16 conference in Brisbane between 11 and 15 July, presenting our paper “Non-intrusive Counter-actions: Maintaining Progressively Engaging Interactions for Music Performance”, a music performance of the composition “KET Conversations”, and a poster on “Materiality for Musical Expressions: an Approach to Interdisciplinary Syllabus Development for NIME”. We have also been contributing to the NIME book, which will be launched at the NIME’16 conference. Looking forward to all that!!
—– Looking for Test Users next week, 20–23 January —–
Dear Media Labbers,
Since the last data-gathering session, we have worked hard to develop and implement the NOISA instruments and the related interactive system. Now they are ready for your feedback, evaluation, comments and suggestions. We are currently looking for volunteers willing to give 40 minutes of their time next week, between 20 and 23 January. The time will be spent interacting with our new NOISA musical instrument (see the previous version at http://sopi.aalto.fi/research/noisa/), followed by filling in a questionnaire.
There is a Foodle (a Doodle-like scheduling tool) link for this user study: http://foodl.org/foodle/NOISA-v2-User-Test-Study-54b65
Please choose the time and day that best suit you to participate in our study. If you have any questions, please let us know (firstname.lastname@example.org).
To say thank you for your time, we would like to give you a chocolate bar and a free hug!!
SOPI research group
This year, we are celebrating the 10th anniversary of the Composing with Pure Data (Data Flow Programming Language) course!! The course has been part of the compulsory studies for the major in Sound in New Media, and it is currently one of the most popular courses in the curriculum.
We are very happy to have Andy Farnell teaching the 10th-anniversary course here at the Media Lab Helsinki!!
I am very happy to announce the open source release of the PESI extended system developed at the SOPI research group. The system is designed for co-located collaboration, providing spatial opportunities for musical exploration.
The software system includes two main components:
The PESI_OnBody component, released as an Xcode project, creates a mobile instrument as an iOS application with integrated audio synthesis. It is custom software built with Objective-C and Libpd. The release includes a Pure Data patch template with three sound modules for creating and developing musical interfaces for iOS mobile phones. PESI_OnBody uses the mobile phone’s sensor input data to create and process sound; all audio synthesis is done on the device with Libpd. It also streams the sensor data to the _masterController patch in the PESI_InSpace component.
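To illustrate the idea of streaming sensor frames from a phone to a master controller, here is a minimal Python sketch. It is not the project's actual code (the real app is Objective-C with Libpd), and the JSON message format, port number, and field names are assumptions made for illustration only.

```python
# Sketch only: mimics PESI_OnBody streaming one sensor frame over UDP
# to a stand-in "_masterController" listener. Message format, port and
# field names are hypothetical, not the project's real protocol.
import json
import socket

MASTER_HOST, MASTER_PORT = "127.0.0.1", 9000  # hypothetical endpoint

def make_frame(player_id, accel):
    """Pack one accelerometer reading as a JSON datagram payload."""
    return json.dumps({"player": player_id, "accel": accel}).encode()

# Stand-in master controller: a local UDP listener.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind((MASTER_HOST, MASTER_PORT))

# The "phone" sends one sensor frame.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(make_frame(1, [0.0, -0.2, 9.8]), (MASTER_HOST, MASTER_PORT))

# The controller receives and decodes it.
data, _ = receiver.recvfrom(1024)
frame = json.loads(data)
print(frame["player"], frame["accel"])

sender.close()
receiver.close()
```

In the released system the same role is played by the _masterController Pure Data patch, which receives the phones' sensor streams alongside the Kinect tracking data.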
The PESI_InSpace component (the repository includes a link to a video tutorial) is a modular multi-user motion-tracking system (repositories kinectTrackerRealTime, kinectPlySaver and kinectDataServer) that also provides the data needed to extract features such as relative distances, velocity, acceleration and alignment. The modular structure of the tracking module makes it possible to add several Kinect v1 sensor bars to the system for reliable tracking, and the module avoids losing tracked players due to occlusions or abrupt movements. In the PESI_InSpace repository, the _masterController patch receives tracking data from kinectTrackerRealTime and sensor data from the mobile phones, and uses that data to create and manipulate sound and to distribute it to multi-channel speakers.
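The features mentioned above can be sketched in a few lines. This is a minimal illustration of the kind of computation the tracking data supports, not the released patch: relative distance between two players, and velocity and acceleration estimated from successive position samples (the sample positions and frame rate below are invented).

```python
# Minimal sketch of features derivable from multi-user tracking data:
# relative distance, velocity, and acceleration. Positions and frame
# rate are made-up example values, not real Kinect output.
import math

def distance(p, q):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def velocity(p_prev, p_curr, dt):
    """Average speed between two samples taken dt seconds apart."""
    return distance(p_prev, p_curr) / dt

def acceleration(v_prev, v_curr, dt):
    """Change in speed per second between two velocity estimates."""
    return (v_curr - v_prev) / dt

dt = 1 / 30  # Kinect v1 tracks at roughly 30 fps
p1_samples = [(0.0, 1.0, 2.0), (0.1, 1.0, 2.0), (0.3, 1.0, 2.0)]
p2 = (1.0, 1.0, 2.0)

rel = distance(p1_samples[-1], p2)          # distance between players
v0 = velocity(p1_samples[0], p1_samples[1], dt)
v1 = velocity(p1_samples[1], p1_samples[2], dt)
acc = acceleration(v0, v1, dt)
print(rel, v0, v1, acc)
```

In the real system these quantities would feed the _masterController patch to drive sound creation and spatialisation.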
This release is one of the research outcomes of The Notion of Participative and Sonic Interaction, Academy of Finland project 137646. The project was presented at NIME’13 with “PESI Extended System: In Space, On Body, with 3 Musicians”, and at SMC’13 with “Composing Social Interactions for an Interactive-Spatial Performance System” and “Situating the Performer and the Instrument in a Rich Social Context with PESI Extended System”. The system components have also been used in various music performances.
more about the project
Dear friends and colleagues,
The Sound and Physical Interaction (SOPI) research group will be conducting a user-test study between 16–19 June and 23–27 June. We are currently looking for volunteers willing to give half an hour of their time between the dates mentioned. The time will be spent interacting with our NOISA musical interface, followed by a short questionnaire and an interview.
Anyone can participate; no prerequisite knowledge or experience is required. However, we strongly encourage people with a music background at any level to take part in this study.
To say thank you for your time, we would like to give you a Finnkino movie ticket.
We would like to kindly ask you to register through the Doodle link below.
The address for the User-Testing experiment is:
Media Lab 5th Floor, Sound Studio
SOPI Research Group
We shared the stage with Thomas Bjelkeborn and Philippe Moenne-Loccoz, playing live electronics at Fylkingen as part of the Lamour event. The musical instruments and the interactive system used in this performance are research outcomes of the PESI research project.
On Thursday and Friday, 28–29 November, we visited the Design Lab at Koc University (http://designlab.ku.edu.tr/) in Istanbul for a seminar talk and tutoring sessions together with Morten Fjeld from the t2i interaction lab.
Open Position – Doctoral Student and/or Research Assistant.
The deadline for applications is December 2nd, 2013.