
Thesis Detail

Author (Chinese): 何文暘
Author (English): He, Wen-Yang
Title (Chinese): 誰會為線上開放式課程公司付費?基於理論的預測與純數據分析預測的比較
Title (English): Who Is Willing to Pay for MOOCs? Theory-Based Prediction vs. Pure-Data Prediction
Advisor (Chinese): 許裴舫
Advisor (English): Hsu, Pei-Fang
Committee members (Chinese): 林福仁; 嚴秀茹
Committee members (English): Lin, Fu-Ren; Yen, Hsiu-Ju
Degree: Master's
University: National Tsing Hua University
Department: Institute of Service Science
Student ID: 107078513
Publication year (ROC era): 109
Graduating academic year: 108
Language: English
Pages: 69
Keywords (Chinese): 大規模開放線上課堂; 技術接受模型; 支付意願; 純數據; 基於理論
Keywords (English): MOOC; TAM; Willingness to Pay; Pure-data; Theory-based
Massive open online courses (MOOCs) have gradually matured in recent years, and people have begun to ask whether this new mode of learning is successful. Existing literature has applied data-mining methods to define MOOC success, for example by predicting when students drop out, predicting academic performance, and predicting user satisfaction. However, research on predicting who will pay for MOOCs is scarce. Our study establishes a research procedure for building predictive models to identify customers willing to pay under different MOOC payment schemes (i.e., one-time adoption, monthly subscription, certificate of completion, and certificate of achievement).
In addition, a typical pure-data predictive model requires a company to collect as much data as possible to build a powerful model, which is time-consuming and lacks theoretical grounding. Without such grounding, these models can find it hard to justify their results clearly. We therefore propose a theory-based prediction method that predicts whether a user will pay on the basis of theory. In this study, we use the Technology Acceptance Model (TAM) to build the theory-based predictive model. The results show that, for MOOCs, the theory-based model achieves the same predictive power as pure-data analysis for two payment schemes (monthly and certificate of achievement). Our study also discusses how the TAM theory can be extended based on the predictive results, and advises MOOC companies that different platforms should adopt different payment strategies to maximize revenue.
Massive open online courses (MOOCs) have gradually matured in recent years. Extant literature has devoted effort to predicting the success of MOOCs, such as dropout, academic performance, and satisfaction, using data-mining methods. However, attention to predicting who will pay for MOOCs is scarce. Our study demonstrates a research routine for building predictive models to identify customers who are willing to pay for different MOOC packages (i.e., adoption, monthly, achievement, certificate). In addition, pure-data modeling requires a company to collect as much data as possible to build a powerful prediction model, which is time-consuming. Without theoretical guidance, such predictive modeling is sometimes difficult to justify clearly. We therefore propose a theory-based prediction method that builds on theory to predict future observations. Our findings show that, in the MOOC setting, theory-based models can attain the same predictive power as pure-data models for two payment schemes, Monthly and Achievement. Our study also discusses how the TAM theory can be extended based on predictive results, and signals to MOOC practitioners that different MOOC platforms should pursue different payment strategies to maximize revenue.
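The predictive-power comparison described in the abstract (and detailed in Chapter 4) rests on three standard classification metrics computed against a naïve majority-class baseline: accuracy, sensitivity, and specificity. As a minimal sketch with toy data (the labels and predictions below are hypothetical, not drawn from the thesis), these metrics for a binary "will pay / will not pay" prediction follow directly from the confusion-matrix counts:

```python
# Metrics used to compare theory-based vs. pure-data models (Chapter 4).
# Labels: 1 = willing to pay, 0 = not willing. Toy data only.

def confusion_counts(y_true, y_pred):
    """Return (TP, TN, FP, FN) for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn)

def sensitivity(tp, fn):
    # True-positive rate: share of actual payers the model identifies.
    return tp / (tp + fn)

def specificity(tn, fp):
    # True-negative rate: share of non-payers the model identifies.
    return tn / (tn + fp)

def naive_accuracy(y_true):
    # Baseline: always predict the majority class.
    return max(sum(y_true), len(y_true) - sum(y_true)) / len(y_true)

# Hypothetical example: 10 users, true payment behaviour vs. one model's output.
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 0, 1, 0, 0, 1, 0, 0, 0]
tp, tn, fp, fn = confusion_counts(y_true, y_pred)
print(accuracy(tp, tn, fp, fn))   # 0.8
print(sensitivity(tp, fn))        # 0.75
print(specificity(tn, fp))        # ~0.833
print(naive_accuracy(y_true))     # 0.6 — the bar any model must beat
```

A model is only informative if its accuracy exceeds the naïve baseline; sensitivity and specificity then show whether it earns that accuracy by finding payers or merely by rejecting non-payers.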
Chapter 1 – Introduction 8
Chapter 2 – Literature Review 12
2.1 Explanatory modeling and Predictive modeling 12
2.1.1 Explanatory modeling 12
2.1.2 Predictive modeling 13
2.2 Theoretical foundation 15
2.2.1 Research Gap of Model 15
2.2.2 Technology acceptance model (TAM) 16
2.2.3 Model description 16
2.2.4. TAM in Explanatory studies 17
2.3 Prior Prediction model in MOOCs 18
2.3.1 Dropout 19
2.3.2 Satisfaction 20
2.3.3 Academic Performance 21
2.3.4 Willingness to Buy 22
Chapter 3 – Methodology 23
3.1 Data 23
3.1.1 Data Collection 23
3.2 Research design 23
3.2.1 Independent variables (IV) 24
3.2.1.1 Personal information 24
3.2.1.2 Personal Motivation 24
3.2.1.3 MOOCs Platform 25
3.2.1.4 MOOCs Course 25
3.2.2 Dependent variables (DV) 30
3.2.2.1 Adoption payment 31
3.2.2.2 Monthly payment 31
3.2.2.3 Certificate of Completion 31
3.2.2.4 Certificate of Achievement 31
3.3 Comparison of data mining methods 31
Chapter 4 – Results 33
4.1 Predictive power comparison 33
4.1.1 Naïve Classifier 33
4.1.2 Accuracy 34
4.1.3 Sensitivity 35
4.1.4 Specificity 36
4.1.5 Summary 37
4.2 Predictive Power of Theory-based model 38
4.3 Extending TAM theory 39
4.3.1 Combined with Pure-data model 40
4.3.2 Combined with Self-Regulated Learning 43
4.4 Prediction in a context of a MOOC company 44
4.4.1 Udemy and Coursera characteristics 45
4.4.2 Predictor selection 45
4.4.3 Prediction results 46
4.4.3.1 Adoption 47
4.4.3.2 Monthly 48
4.4.3.3 Certificate 49
4.4.3.4 Achievement 50
Chapter 5 – Conclusion 51
5.1 Theoretical Implications 51
5.2 Managerial Implications 53
5.3 Limitation and future research 54
Reference 56
Appendix 62
1.Astin, A. W. (1993). What matters in college? Four critical years revisited. San Fran.

2.Bayer, J., Bydzovská, H., Géryk, J., Obsivac, T., & Popelinsky, L. (2012). Predicting Drop-Out from Social Behaviour of Students. International Educational Data Mining Society.

3.Berinsky, A. J., et al. (2011). "Using Mechanical Turk as a subject recruitment tool for experimental research."

4.Broadbent, J. (2017). Comparing online and blended learner’s self-regulated learning strategies and academic performance. The Internet and Higher Education, 33, 24-32.

5.Bolliger, D. U. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-learning, 3(1), 61-67.

6.Chang, R. I., Hung, Y. H., & Lin, C. F. (2015). Survey of learning experiences and influence of learning style preferences on user intentions regarding MOOCs. British Journal of Educational Technology, 46(3), 528-541.

7.Chu, K. (1999). An introduction to sensitivity, specificity, predictive values and likelihood ratios. Emergency Medicine, 11(3), 175-181.

8.Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS quarterly, 319-340.

9.Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: a comparison of two theoretical models. Management science, 35(8), 982-1003.

10.Deng, R., Benckendorff, P., & Gannaway, D. (2019). Progress and new directions for teaching and learning in MOOCs. Computers & Education, 129, 48-60.

11.Doty, D. H., & Glick, W. H. (1994). Typologies as a unique form of theory building: Toward improved understanding and modeling. Academy of management review, 19(2), 230-251.

12.Elia, G., Solazzo, G., Lorenzo, G., & Passiante, G. (2019). Assessing learners’ satisfaction in collaborative online courses through a big data approach. Computers in Human Behavior, 92, 589-599.

13.Fincher, S., Robins, A., Baker, B., Box, I., Cutts, Q., de Raadt, M., ... & Petre, M. (2006). Predictors of success in a first programming course. In Proceedings of the 8th Australasian Computing Education Conference (ACE 2006) (Vol. 52, pp. 189-196). Australian Computer Society Inc.

14.Gardner, J., & Brooks, C. (2018). Student success prediction in MOOCs. User Modeling and User-Adapted Interaction, 28(2), 127-203.

15.Gefen, D., Karahanna, E., & Straub, D. W. (2003). Trust and TAM in online shopping: an integrated model. MIS quarterly, 27(1), 51-90.

16.Gigerenzer, G., & Gaissmaier, W. (2015). Decision making: Nonrational theories. In International encyclopedia of the social & behavioral sciences (pp. 911-916). Elsevier.

17.Glanz, K., Rimer, B. K., & Viswanath, K. (Eds.). (2015). Health behavior: Theory, research, and practice. John Wiley & Sons.

18.Gregor, S. (2006). The nature of theory in information systems. MIS quarterly, 611-642.

19.Gupta, S., & Sabitha, A. S. (2019). Deciphering the attributes of student retention in massive open online courses using data mining techniques. Education and Information Technologies, 24(3), 1973-1994.

20.Hew, K. F., Hu, X., Qiao, C., & Tang, Y. (2020). What predicts student satisfaction with MOOCs: a gradient boosting trees supervised machine learning and sentiment analysis approach. Computers & Education, 145, 103724.

21.Hollands, F. M., & Tirthali, D. (2014). MOOCs: Expectations and reality. Center for Benefit-Cost Studies of Education, Teachers College, Columbia University, 138.

22.Hollands, F. M., & Tirthali, D. (2014). Why Do Institutions Offer MOOCs? Online Learning, 18(3), n3.

23.Daniel, J. (2012). Making sense of MOOCs: Musings in a maze of myth, paradox and possibility. Journal of interactive Media in education, 2012(3).

24.Kember, D., & Ginns, P. (2012). Evaluating teaching and learning: A practical handbook for colleges, universities and the scholarship of teaching. Routledge.

25.Lazer, D. M., Kennedy, R., King, G., & Vespignani, A. (2014). The parable of Google Flu: traps in big data analysis. Science, 343, 1203-1205.

26.Márquez-Vera, C., Morales, C. R., & Soto, S. V. (2013). Predicting school failure and dropout by using data mining techniques. IEEE Revista Iberoamericana de Tecnologias del Aprendizaje, 8(1), 7-14.

27.Martinho, V. R., Nunes, C., & Minussi, C. R. (2013, September). Prediction of school dropout risk group using neural network. In 2013 Federated Conference on Computer Science and Information Systems (pp. 111-114). IEEE.

28.Manhães, L. M. B., da Cruz, S. M. S., & Zimbrão, G. (2014, March). WAVE: an architecture for predicting dropout in undergraduate courses using EDM. In Proceedings of the 29th annual acm symposium on applied computing (pp. 243-247).

29.Moreno-Marcos, P. M., Muñoz-Merino, P. J., Alario-Hoyos, C., Estévez-Ayres, I., & Delgado Kloos, C. (2018). Analysing the predictive power for anticipating assignment grades in a massive open online course. Behaviour & Information Technology, 37(10-11), 1021-1036.

30.Moreno-Marcos, P. M., Muñoz-Merino, P. J., Alario-Hoyos, C., & Delgado Kloos, C. (2019). Analyzing students’ persistence using an event-based model. Proceedings of the Learning Analytics Summer Institute Spain.

31.Moreno-Marcos, P. M., Muñoz-Merino, P. J., Maldonado-Mahauad, J., Pérez-Sanagustín, M., Alario- Hoyos, C., & Kloos, C. D. (2020). Temporal analysis for dropout prediction using self-regulated learning strategies in self-paced MOOCs. Computers & Education, 145, 103728.

32.Putka, D. J., Beatty, A. S., & Reeder, M. C. (2016). Modern prediction methods: New perspectives on a common problem. Organizational Research Methods, 1094428117697041.

33.Shah, D. (2018). By the numbers: MOOCs in 2018 [Web log post].

34.Richardson, M., Abraham, C., & Bond, R. (2012). Psychological correlates of university students' academic performance: a systematic review and meta-analysis. Psychological bulletin, 138(2), 353.

35.Shmueli, G. (2010). To explain or to predict? Statistical science, 25(3), 289-310.

36.Shmueli, G., & Koppius, O. R. (2011). Predictive analytics in information systems research. MIS Quarterly, 553-572.

37.Thomas, E. H., & Galambos, N. (2004). What satisfies students? Mining student-opinion data with regression and decision tree analysis. Research in Higher Education, 45(3), 251-269.

38.Umer, R., Susnjak, T., Mathrani, A., & Suriadi, S. (2017). On predicting academic performance with process mining in learning analytics. Journal of Research in Innovative Teaching & Learning.

39.Watson, C., Li, F. W., & Godwin, J. L. (2013, July). Predicting performance in an introductory programming course by logging and analyzing student programming behavior. In 2013 IEEE 13th international conference on advanced learning technologies (pp. 319-323). IEEE.

40.Wong, J., Baars, M., Davis, D., Van Der Zee, T., Houben, G. J., & Paas, F. (2019). Supporting self-regulated learning in online learning environments and MOOCs: A systematic review. International Journal of Human–Computer Interaction, 35(4-5), 356-373.

41.Ye, C., & Biswas, G. (2014). Early prediction of student dropout and performance in MOOCs using higher granularity temporal information. Journal of Learning Analytics, 1(3), 169-172.