
Detailed Record

Author (Chinese): 趙偉博
Author (English): Chao, Wilber
Title (Chinese): 在智慧型手機上與音樂互動
Title (English): Music Interaction on Mobile Phones
Advisor (Chinese): 陳宜欣
Advisor (English): Chen, Yi-Shin
Committee members (Chinese): 王浩全, 蔡振家
Committee members (English): Wang, Hao-Chuan
Degree: Master's
Institution: National Tsing Hua University
Department: Department of Computer Science
Student ID: 100062603
Year of publication (ROC calendar): 102 (2013)
Graduating academic year: 101
Language: English
Pages: 24
Keywords (Chinese): 音樂互動, 人機互動, 智慧型手機, 指揮
Keywords (English): Music interaction, HCI, Mobile, Android, conducting
Record statistics:
  • Recommendations: 0
  • Views: 695
  • Rating: *****
  • Downloads: 0
  • Bookmarks: 0
Music comes in many different genres, and maintaining this diversity supports the development and creation of music. Some genres, however, are disappearing as their audiences age and shrink. We therefore aim to offer people a different way to enjoy music and to attract them to music they would not normally listen to. Research shows that interacting with music improves people's appreciation of and preference for it, and that interaction between gestures and musical tempo is the most effective form; at the same time, for everyone to be able to interact with music, the chosen method must be widely accessible. In this work we use the accelerometer built into a smartphone to sense the user's hand movements and adjust the phone's music playback tempo to match the user's intent. For example, when the user beats time while holding the smartphone, the music speeds up as the hand accelerates, and vice versa.
An overly sensitive accelerometer introduces so much noise into gesture recognition that the music easily becomes erratic, alternately rushing and dragging. To provide a good interactive experience, the music must remain steady and smooth while the user interacts with it. Because conducting gestures mark beats in a regular pattern, we developed a recognition algorithm, "Repeated Pattern", whose goal is to extract regular, meaningful beat points from the user's hand movement and use them to change the playback tempo. For our experiments we also conducted a user study in which participants interacted with music through our system. The results show that interaction has a positive effect on people's preference for the music, and we further identify the factors that change users' degree of preference.
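The thesis does not spell out the "Repeated Pattern" algorithm in this record, so as an illustrative sketch only: beat candidates from the accelerometer could be picked as thresholded local maxima of the acceleration magnitude, with a minimum gap to suppress sensor jitter. The function name, threshold, and sampling assumptions below are mine, not the thesis's.

```python
import math

def detect_beats(samples, rate_hz, threshold=1.5, min_gap_s=0.25):
    """Pick beat candidates as local maxima of the acceleration
    magnitude that exceed `threshold`.

    `samples` is a list of (ax, ay, az) tuples sampled at `rate_hz`;
    returns beat times in seconds.  `min_gap_s` suppresses duplicate
    detections caused by sensor noise around a single beat.
    """
    mags = [math.sqrt(ax * ax + ay * ay + az * az)
            for ax, ay, az in samples]
    beats, last = [], -min_gap_s
    for i in range(1, len(mags) - 1):
        t = i / rate_hz
        if (mags[i] > threshold
                and mags[i] >= mags[i - 1]   # local maximum
                and mags[i] > mags[i + 1]
                and t - last >= min_gap_s):  # enforce minimum beat gap
            beats.append(t)
            last = t
    return beats
```

With a synthetic 50 Hz stream containing spikes every half second, the function reports beats at 0.5 s, 1.0 s, and 1.5 s; a real implementation would also have to remove gravity and handle hand tremor, which this sketch ignores.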
A diverse environment helps the development of music composition, but a declining audience can cause some kinds of music to disappear. If we can provide a different way for audiences to enjoy music and improve their preference for it, they will be more willing to listen to that music. Enjoying music through interaction can change preference, and the effect of interaction between gestures and tempo is especially significant. In this paper, we propose using gestures to interact with music on a mobile phone: the acceleration sensor inside the phone detects the hand gesture, and the music tempo is adapted to the user's intention. For smooth interaction, we developed repeated-pattern recognition to detect the beat during interaction. A user study demonstrates the effect of interaction on preference and reveals the causes that change users' music preference.
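The "tempo smoothness" goal named in the abstract and table of contents could, for illustration, be met by smoothing observed beat intervals before mapping them to a playback rate. The class below is a hedged sketch: the names and the exponential-moving-average choice are assumptions of mine, not the thesis's published method.

```python
class TempoFollower:
    """Map inter-beat intervals from the user's hand to a playback-rate
    factor, smoothing with an exponential moving average so one early
    or late beat does not make the music lurch."""

    def __init__(self, reference_bpm=100.0, alpha=0.3):
        self.reference_bpm = reference_bpm  # tempo of the recording
        self.alpha = alpha                  # weight given to each new beat
        self.smoothed_bpm = reference_bpm   # start at the original tempo

    def on_beat_interval(self, interval_s):
        """Feed the time between two detected beats; return the new
        playback-rate factor (1.0 = original speed)."""
        observed_bpm = 60.0 / interval_s
        # Exponential moving average: drift toward the observed tempo.
        self.smoothed_bpm += self.alpha * (observed_bpm - self.smoothed_bpm)
        return self.smoothed_bpm / self.reference_bpm
```

Feeding steady 0.5 s intervals (120 bpm) to a follower referenced at 100 bpm makes the rate climb gradually toward 1.2 rather than jumping there, which is the smoothness property the thesis evaluates; on Android the returned factor would drive the playback-speed API rather than be used directly.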
Chinese Abstract ii
Abstract iii
Acknowledgement iv
List of Tables vii
List of Figures viii
1 INTRODUCTION 1
2 RELATED WORK 3
3 Interaction Scenario 6
4 Methodology 7
4.1 Data Preprocessing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
4.2 Gesture Recognition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
4.2.1 Tempo smoothness . . . . . . . . . . . . . . . . . . . . . . . . . . 9
4.2.2 Repeated Pattern . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
4.2.3 Tempo Calculation . . . . . . . . . . . . . . . . . . . . . . . . . . 11
4.2.4 Tempo Validation . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
5 EXPERIMENTAL EVALUATION 13
5.1 Performance of system function . . . . . . . . . . . . . . . . . . . . . . . 13
5.1.1 Performance of tempo smoothness . . . . . . . . . . . . . . . . . . 14
5.1.2 Performance of detecting latency . . . . . . . . . . . . . . . . . . . 15
5.1.3 Performance of tempo accuracy . . . . . . . . . . . . . . . . . . . 15
5.1.4 Usability of music interaction . . . . . . . . . . . . . . . . . . . . 16
5.2 User study of music interaction . . . . . . . . . . . . . . . . . . . . . . . . 17
5.2.1 Pre-test result . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
5.2.2 Post-test result . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
6 CONCLUSION 21
References 22
(Full text not authorized for public access)