Embedding novel-material sensors into garments for real-time music and sound generation in performance practice, using gesture recognition, touch, and haptic mechanisms
This research is concerned with sensors embedded directly into fabric and programmed to route their signals into on-body sound synthesis. Such garments can be used in performance settings such as opera, dance, theatre, and musical performance to enhance and augment both performer and audience experience. Using body movement and gesture to create and control sound reframes the body itself as an instrument. Wearable body instruments can also serve therapeutic and educational purposes. Combining technology with clothing design and sound generation is a relatively new direction in performing-arts practice: it allows the performer to manipulate sound while performing, giving greater control when accompanying other live or electronic instruments. The goal of this project is to integrate technology seamlessly with fabric, creating a sound experience that is both felt and generated through movement and touch.
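As a minimal illustration of the sensor-to-sound pipeline described above, the sketch below maps a raw sensor reading (e.g. from a fabric stretch or pressure sensor) to a pitch and renders a short sine tone. All names, ranges, and the exponential pitch mapping are illustrative assumptions, not part of the project's actual implementation.

```python
import math

def sensor_to_frequency(reading, lo=200.0, hi=2000.0, adc_max=1023):
    """Map a raw ADC reading (hypothetical fabric sensor, 0..adc_max)
    to a pitch in Hz. The mapping is exponential so that equal changes
    in the sensor value give equal musical intervals."""
    t = max(0, min(reading, adc_max)) / adc_max  # normalize to 0..1
    return lo * (hi / lo) ** t

def synthesize(freq, duration=0.1, sample_rate=44100):
    """Render a sine tone as a list of float samples in [-1, 1].
    In a live system these samples would be streamed to an audio output."""
    n = int(duration * sample_rate)
    return [math.sin(2 * math.pi * freq * i / sample_rate) for i in range(n)]

# A mid-range sensor reading produces a mid-range pitch:
samples = synthesize(sensor_to_frequency(512))
```

In a wearable system, the reading would arrive continuously from the embedded sensor and the tone would be regenerated (or a synthesis parameter updated) on every update, so that gesture and touch steer the sound in real time.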