
Detailed Record

Author (Chinese): 蘇拉曼
Author (English): Surahman, Ence
Title (Chinese): 以混合研究法探究線上學習過程之不誠實行為
Title (English): Studies on academic dishonesty behaviours (ADB) in online learning: A mixed methods
Advisor (Chinese): 王子華
Advisor (English): Wang, Tzu-Hua
Committee members (Chinese): 周金城、蔣佳玲、黃能富、邱富源
Committee members (English): Chou, Chin-Cheng; Chiang, Chia-Ling; Huang, Nen-Fu; Chiu, Fu-Yuan
Degree: Doctoral
Institution: National Tsing Hua University (國立清華大學)
Department: International Intercollegiate PhD Program
Student ID: 109003895
Publication year (ROC calendar): 112
Graduation academic year: 111
Language: English
Pages: 238
Keywords (Chinese): 學術不誠實行為、學術誠信、可信評估、線上學習、評估修改設計
Keywords (English): academic dishonesty behaviours; academic integrity; trustworthy assessment; online learning; assessment modification design
學術不誠實行為(ADB)是線上學習環境中一個新的且日益嚴重的問題。本混合研究法研究的目的包括:1) 系統性回顧ADB的形式、相關因素、減少ADB的解決方案,以及線上學習中學術不誠實(AD)與可信評估(TA)的研究缺口;2) 使用全國相關性調查,調查印度尼西亞學生ADB的形式、動機、傾向,以及解決學術不誠實問題的方案;3) 使用每週知識分享(weekly knowledge sharing, WKS)與開放式AI,探索修改評估設計對提高學生學習成果的影響。本研究採用質性與量化相結合的混合研究方法以及案例研究來實現這些目標。研究一使用系統性文獻綜述進行分析;研究二採用量化方法進行相關性調查,以探究印度尼西亞高等教育中的ADB現象;研究三則採用案例研究。
根據研究一的分析結果,線上學習中被廣泛報導的ADB形式通常包括作弊、串通和抄襲。其成因可歸納為兩類:個人因素與情境因素。針對這些問題,文獻中提出的解決方案包括運用科技、開設紮實的學術倫理入門課程,以及改變評估設計。研究二的調查結果顯示,約54.15%的受訪者在線上學習期間至少參與過1-3次ADB行為,儘管多數受訪者並不認同此類行為:45.02%認為完全不能接受,23.95%認為不能接受。獲得高度認可的解決方案之一,是每學期初對學生進行學術倫理教育。根據多元迴歸檢定的結果可以得出結論:動機、傾向、解決方案以及使用ICT支持線上學習的能力對ADB有顯著影響;而年齡、性別、大學類型、受教育程度、教育領域和線上學習經驗對ADB無顯著影響。這項研究最重要的貢獻在於發現ADB現象普遍發生在所有層次和類型的教育中,且不分男女;驅動ADB的動機可能與學生對它的理解相反。加強學術倫理是一個重要的選擇,教師也需要考慮改變評估設計,避免依賴單一類型的測驗來決定學生能否畢業。根據研究三的發現,剽竊是寫作作業中報告最多的ADB。對10份WKS與1份mini paper進行相似度檢查的結果表明,一些學生在未改寫且未註明來源的情況下從其他來源抄襲;一定比例的學生有低、中、高程度的抄襲行為。用於檢查學生寫作成果的Turnitin軟體已被證明有助於顯示學生的錯誤,學生並會嘗試在下一次寫作作業中進行改寫。使用WKS有助於學生增進對課程教材的認識,這可以從前測與後測分數的比較結果中看出。
本研究的貢獻涉及理論、實踐和方法論。從理論上講,這些發現支持並強化了先前關於ADB的形式、影響因素和應對措施的理論。在實際貢獻中,本研究鼓勵每所大學就學術誠信問題和對違反者的處罰制定詳細的指導方針,除此之外,教師必須在每堂課中教育學生,並使用多種技術來減少ADB。同時,該研究最重要的貢獻是在方法論上,因為它試圖從系統性文獻分析(研究一)、相關調查(研究二)和案例研究(研究三)進行綜合研究。此外,教師可以學習使用改進的評估形式,以進一步減少學生中出現ADB的可能性。就開放式人工智慧的廣泛發展而言,學生可以輕鬆地訪問和利用它來支持他們的學習。然而,教師們需要制定指導方針,關於如何使用開放式人工智慧作為學習資源之一,同時又不過度依賴該技術。
關於未來的研究,關鍵是要進行額外的研究,以覆蓋更多未包括在本研究中的其他高等教育機構的受訪者。此外,對初中和高中學生情境的研究也有必要。在剽竊問題的背景下,進一步的研究,如研究三所討論的,可以調查在更多樣化的作業形式中,使用開放式人工智慧的有效性差異,以及對學生在各種科學領域中認知獲得的長期影響。未來研究的另一個重要議題是開發針對各級教育學生的學術道德線上課程,以促進學術誠信並盡量減少學術不端行為的現象。
Academic dishonesty behaviours (ADB) are reported to be a new and growing problem in the online learning environment. The purposes of this mixed-methods research are: 1) to systematically review the forms of ADB, the factors associated with it, solutions for reducing it, and research gaps concerning academic dishonesty (AD) and trustworthy assessment (TA) in online learning; 2) to investigate the forms of, motivations for, and propensity toward ADB, along with solutions to academic dishonesty, among Indonesian students using a national cross-correlational survey; and 3) to explore the effect of an assessment modification design using Weekly Knowledge Sharing (WKS) and open AI on student learning outcomes. To achieve these objectives, the research combined qualitative and quantitative approaches with a case study. Study 1 took a qualitative approach, using a systematic literature review (SLR). Study 2 used quantitative methods in a correlational survey to investigate the ADB phenomenon empirically in Indonesian higher education. Study 3 employed a case study.
The analysis in Study 1 found that the forms of ADB most widely reported in online learning practices are cheating, collusion, and plagiarism. The causal factors fall into two groups: individual and situational. The solutions offered in the literature are the use of technology, strong introductory academic ethics courses, and changes to assessment design. The survey in Study 2 revealed that about 54.15% of respondents had engaged in ADB at least 1-3 times during their online learning experience, even though most judged such behaviour negatively: 45.02% rated it totally unacceptable and 23.95% unacceptable. Among the solutions receiving the highest approval was academic ethics education for students at the beginning of each semester. Multiple regression tests showed that motivation, propensity, solutions, and the capability to use ICT to support online learning significantly affect ADB, with a total influence of 45.8%, whereas age, gender, type of university, level of education, field of education, and online learning experience have no significant effect. The most important contribution of this research is the finding that the ADB phenomenon occurs at all levels and types of education, among both male and female students, and that the motivations driving ADB can run contrary to students' own understanding of it. Strengthening academic ethics is therefore an important option, and teachers should consider changes to assessment design so that they do not rely on a single type of test to determine student graduation. Study 3 revealed that plagiarism was the most frequently observed ADB in writing assignments. Similarity checks on 10 WKS assignments and 1 mini paper showed that some students copied from other sources without paraphrasing or citing them.
A certain percentage of students committed plagiarism at low, medium, and high levels. Turnitin, used to check students' written work, proved helpful in exposing these mistakes, and students attempted to paraphrase in subsequent writing assignments. The use of WKS appears to help students deepen their knowledge of the lecture material, as indicated by the comparison of pre-test and post-test scores.
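The pre-test/post-test comparison above is the kind of analysis the thesis quantifies with effect sizes (it reports both a Wilcoxon Z-based measure and Cohen's d). As an illustrative sketch only, with hypothetical scores rather than the thesis data, Cohen's d can be computed with the pooled-standard-deviation convention:

```python
import math
from statistics import mean, stdev

def cohens_d(pre, post):
    """Cohen's d comparing post-test to pre-test scores.

    Uses the pooled sample standard deviation of the two score sets,
    one common convention; other variants (e.g. d based on the SD of
    paired differences) exist and give different values.
    """
    n1, n2 = len(pre), len(post)
    pooled_sd = math.sqrt(((n1 - 1) * stdev(pre) ** 2 +
                           (n2 - 1) * stdev(post) ** 2) / (n1 + n2 - 2))
    return (mean(post) - mean(pre)) / pooled_sd

# Hypothetical scores for one group of students (not the thesis data).
pre = [55, 60, 48, 70, 62, 58, 65, 50]
post = [72, 78, 60, 85, 75, 70, 80, 66]

# Thresholds often attributed to Cohen: 0.2 small, 0.5 medium, 0.8 large.
print(f"Cohen's d = {cohens_d(pre, post):.2f}")
```

A positive d indicates higher post-test scores; values above 0.8 are conventionally read as a large learning gain.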
The contributions of this research are theoretical, practical, and methodological. Theoretically, the findings support and reinforce previous theories about the forms of ADB, the factors that influence it, and the solutions for dealing with it. Practically, this research encourages every university to issue detailed guidelines on academic integrity and on penalties for those who violate it; in addition, lecturers should educate their students in every lecture and use several technologies to reduce ADB. The most important contribution, however, is methodological: the research combines an SLR (Study 1), a cross-correlational survey (Study 2), and a case study (Study 3) into one comprehensive investigation. Teachers can also adopt modified forms of assessment to further reduce the potential for ADB among students. Given the rapid spread of open AI, which students can easily access to support their studies, lecturers should create guidelines on how to use open AI as one learning resource among many, without over-relying on the technology.
Regarding future work, it is important to conduct additional research reaching more respondents from tertiary institutions not included in this study, and research in junior and senior high school contexts would also be of interest. On the plagiarism issue discussed in Study 3, further research could investigate differences in the effectiveness of using open AI across more diverse forms of assignments, as well as its long-term impact on students' cognitive gains in various scientific fields. Another important direction is developing online courses on academic ethics for students at all levels of education, to promote academic integrity and minimize the ADB phenomenon.
摘要 iii
Abstract v
Acknowledgements vii
Table of Contents viii
List of Figures xii
List of Tables xiv
List of Appendices xvii
Chapter 1. Introduction 1
1.1 Motivation and Background 1
1.2 Research Objectives, Research Questions and Hypotheses 8
1.2.1 Research objectives 8
1.2.2 Research questions 9
1.2.3 Research hypotheses 11
1.2.4 Structure of writing 12
Chapter 2. Literature Review 14
2.1 Online Assessment 14
2.1.1 Concept and definition 14
2.1.2 Online assessment tools 15
2.1.3 Issues related to online assessment 16
2.2 Academic Dishonesty 17
2.2.1 Concept and definition 17
2.2.2 Kind of academic dishonesty 17
2.2.3 Associated factor toward ADB 22
2.2.4 Some reported efforts to reduce academic dishonesty 24
2.3 Theoretical framework 25
Chapter 3. Methods 27
3.1 Introduction 27
3.2 Study 1 Academic Dishonesty and Trustworthy Assessment on Online Learning 27
3.2.1 Planning of the review 28
a. Paper selection 29
3.2.2 Conducting the review 34
a. Coding and categorization 34
b. Synthesis and data analysis 35
3.2.3 Reporting the review 36
3.3 Study 2 Academic Dishonesty Phenomenon in Higher University Students in Indonesia 36
3.3.1 Research approach and design 36
3.3.2 Population and sample 38
3.3.3 Sampling technique 39
3.3.4 Research instruments 39
3.3.5 Data collection procedure 43
3.3.6 Data analysis techniques 45
3.4 Study 3 Case study: an exploration of ADB and Effect of Assessment Modification on Learning Outcomes 47
3.4.1 Research approach and design 47
3.4.2 Research context 50
3.4.3 Research instrument 51
3.4.4 Data collection procedure 52
3.4.5 Data analysis technique 52
Chapter 4. Results 55
4.1 Introduction 55
4.2 Study 1 Result of Systematic Literature Review Study 56
4.2.1 Forms of AD in online assessment 57
4.2.2 Factors associated with ADB 61
4.2.3 Solutions to Reduce ADB and Increase TA 62
a. Technology-based approach 63
b. Pedagogical approach 65
4.2.4 Future research direction 67
4.3 Study 2 Result of Quantitative Correlational Survey Study 68
4.3.1 Description of the collected data 69
4.3.2 The phenomenon of ADB among Indonesian students 71
a. Results from semi-structured interviews with instructors and students 74
b. Results of semi-structured interviews about ADB with instructors 75
c. Results of semi-structured interviews about ADB with students 76
4.3.2 The driving motivation for ADB among Indonesian students 77
a. Motivation to do ADB from the instructor's perspective 79
b. Motivation to do ADB from the student's perspective 80
4.3.3 The propensity of Indonesian students toward ADB 81
a. Students’ Propensity toward ADB based on instructors’ perspective 83
b. Students’ Propensity toward ADB based on students’ perspective 84
4.3.4 Solutions to reduce ADB based on Students’ and Instructors’ Perspective 85
a. Solutions to reduce ADB based on instructors’ perspective 87
b. Solutions to reduce ADB based on students’ perspective 87
4.3.5 Normality, Multicollinearity, Heteroscedasticity, and Autocorrelation Test 89
a. Normality test 89
b. Multicollinearity test 90
c. Heteroscedasticity test 91
d. Autocorrelation test 92
4.3.6 Compare Means of Each Variable Based on Controlled Variable 93
a. Compare means some variables based on educational degree (undergraduate and graduate) 93
b. Compare means some variables based on university location (west and east) 96
c. Compare means some variables based on type of university (public and private) 98
d. Compare means some variables based on gender (male and Female) 100
e. Compare means some variables based on age (under 21 and 21 and upper) 102
4.3.7 Multiple Linear Regression Test 104
4.3.8 T-test (partial) 104
4.3.9 F Test (simultaneous) 109
4.3.10 The coefficient of determination 110
4.4 Study 3 Case study: An Exploration of the ADB and Effect of Modification Assessments Design on Learning Outcomes 111
4.4.1 Weekly Knowledge Sharing (WKS) Platform 111
4.4.2 Recapitulation of student engagement/contribution in WKS in each group 112
a. WKS Contribution of Group A 112
b. WKS Contribution of Group B 114
c. WKS Contribution of Group C 115
d. Comparison of WKS contributions from each group 116
4.4.3 Correlation between contributions in WKS to learning outcomes 117
4.4.4 WKS level similarity recapitulation of each group 118
a. WKS Similarity Level of Group A 118
b. WKS Similarity Level of Group B 120
c. WKS Similarity Level of Group C 121
d. Comparison of WKS similarity between group 123
4.4.5 Recapitulation of similarity level of student’s mini paper score 126
a. Similarity Level of Mini Paper Group A 126
b. Similarity Level of Mini Paper Group B 128
c. Similarity Level of Mini Paper Group C 129
d. Comparison of similarity level of mini paper each group 130
4.4.6 Improved learning outcomes through pre-mid and post tests 131
a. Learning Outcomes Report of Group A 132
b. Learning Outcomes Report of Group B 134
c. Learning Outcomes Report of Group C 136
d. Comparison of mean scores and STDEV data from group A, B, and C 138
e. Wilcoxon Signed-Rank Test Analysis Result 140
f. Measuring the Effect Size Using Z score of Wilcoxon Signed-Rank Test 143
g. Measuring the Effect Size Using Cohen’s d 144
4.4.7 Students’ Feedback toward WKS and Open AI Application 149
a. Student feedbacks toward the use of WKS and Open AI in course assignments 151
b. Visualization of student feedback recapitulation based on variables 151
Chapter 5. Discussions 157
5.1 Introduction 157
5.2 Discussion of Study 1: Academic Dishonesty and TA in Online Learning 158
5.2.1 Forms of ADB in online assessment 159
5.1.2 Factors associated with ADB 161
5.1.3 Solutions to reduce ADB and increase TA 163
5.1.4 Future research direction and implication for practice 169
5.3 Discussion of Study 2: Correlational Survey Study about ADB in Indonesia 172
5.3.1 Forms of ADB in online assessment 173
5.3.2 Motivation to Do ADB 174
5.3.3 Propensity of students toward ADB 175
5.3.4 Solutions to Reduce ADB 176
5.3.5 Cultural difference on ADB based on research experience study in Taiwan and survey data from Indonesia 178
5.4 Discussion of Study 3: Case study: An Exploration of the ADB and Effect of Modification Assessments Design on Learning Outcomes 181
5.4.1 Forms of ADB in an Empirical Online Assignments 181
5.4.2 Students’ Contribution toward WKS Assignments 184
5.4.3 Correlation between WKS Contributions toward Learning Outcomes 185
5.4.4 The effect of WKS combined with (Chat GPT, Chat Sonic and Non-Open AI) toward Learning Outcomes 186
5.4.5 Students’ Feedbacks toward WKS and Open AI usage in Online Assignments 188
5.5 Theoretical, Practical and Methodological Contributions 189
Chapter 6. Conclusion 192
6.1 Conclusion 192
6.2 Future Research Directions 197
6.3 Recommendation for Practices 198
References 200
Appendices 217
Furchan, A. (1982). Pengantar penelitian dalam pendidikan [Introduction to research in education]. Surabaya: Usaha Nasional.
Galikyan, I., Admiraal, W., & Kester, L. (2021). MOOC discussion forums: The interplay of the cognitive and the social. Computers & Education, 165, 104133. https://doi.org/10.1016/j.compedu.2021.104133
Gamage, K. A. A., Silva, E. K. de, & Gunawardhana, N. (2020). Online delivery and assessment during COVID-19: Safeguarding academic integrity. Education Sciences, 10(11), 301. https://doi.org/10.3390/educsci10110301
Goff, D., Johnston, J., & Bouboulis, B. S. (2020). Maintaining academic standards and integrity in online business courses. International Journal of Higher Education, 9(2), 248–257. https://doi.org/10.1111/bjet.12914
Goldberg, D. M. (2021). Programming in a pandemic: Attaining academic integrity in online coding courses. Communications of the Association for Information Systems, 48, 47–54. https://doi.org/10.17705/1CAIS.04807
Golden, J., & Kohlbeck, M. (2020a). Addressing cheating when using test bank questions in online classes. Journal of Accounting Education, 52, 100671. https://doi.org/10.1016/j.jaccedu.2020.100671
Guangul, F. M., Suhail, A. H., Khalit, M. I., & Khidhir, B. A. (2020). Challenges of remote assessment in higher education in the context of COVID-19: a case study of Middle East College. Educational Assessment, Evaluation and Accountability, 32(4), 519–535. https://doi.org/10.1007/s11092-020-09340-w
Guba, E. G., & Lincoln, Y. S. (1994). Competing paradigms in qualitative research. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 105–117). Sage.
Gudiño Paredes, S., Jasso Peña, F. de J., & de La Fuente Alcazar, J. M. (2021). Remote proctored exams: Integrity assurance in online education? Distance Education, 42(2), 200–218. https://doi.org/10.1080/01587919.2021.1910495
Guraya, S. Y. (2018). Comparing recommended sanctions for lapses of academic integrity as measured by Dundee Polyprofessionalism Inventory I: Academic integrity from a Saudi and a UK medical school. Journal of the Chinese Medical Association, 81(9), 787–795. https://doi.org/10.1016/j.jcma.2018.04.001
Haines, V. J., Diekhoff, G. M., LaBeff, E. E., & Clark, R. E. (1986). College cheating: Immaturity, lack of commitment, and the neutralizing attitude. Research in Higher Education, 25(4), 342–354. https://doi.org/10.1007/BF00992130
Hardy, R. J. (1982). Preventing academic dishonesty: Some important tips for political science professors. Teaching Political Science, 9(2), 68–77. https://doi.org/10.1080/00922013.1982.9942904
Harmon, O. R., & Lambrinos, J. (2008). Are online exams an invitation to cheat? The Journal of Economic Education, 39(2), 116–125. https://doi.org/10.3200/JECE.39.2.116-125
Harris, L., Harrison, D., McNally, D., & Ford, C. (2020). Academic Integrity in an Online Culture: Do McCabe’s Findings Hold True for Online, Adult Learners? Journal of Academic Ethics, 18(4), 419–434. https://doi.org/10.1007/s10805-019-09335-3
Harton, H. C., Aladia, S., & Gordon, A. (2019). Faculty and student perceptions of cheating in online vs. traditional classes. Online Journal of Distance Learning Administration, 22(4), 4. https://ojdla.com/archive/winter224/hartonaladiagordon224.pdf
Hew, K. F., & Cheung, W. S. (2014). Students’ and instructors’ use of massive open online courses (MOOCs): Motivations and challenges. Educational Research Review, 12, 45–58. https://doi.org/10.1016/j.edurev.2014.05.001
Hilliger, I., Ruipérez‐Valiente, J. A., Alexandron, G., & Gašević, D. (2022). Trustworthy remote assessments: A typology of pedagogical and technological strategies. Journal of Computer Assisted Learning, in press, 1–14. https://doi.org/10.1111/jcal.12755
Huber, E., Harris, L., Wright, S., White, A., Raduescu, C., Zeivots, S., Cram, A., & Brodzeli, A. (2023). Towards a framework for designing and evaluating online assessments in business education. Assessment & Evaluation in Higher Education, 1–15. https://doi.org/10.1080/02602938.2023.2183487
Hylton, K., Levy, Y., & Dringus, L. P. (2016). Utilizing webcam-based proctoring to deter misconduct in online exams. Computers & Education, 92, 53–63. https://doi.org/10.1016/j.compedu.2015.10.002
Ives, B., Alama, M., Mosora, L. C., Mosora, M., Grosu-Radulescu, L., Clinciu, A. I., Cazan, A.-M., Badescu, G., Tufis, C., Diaconu, M., & Dutu, A. (2017). Patterns and predictors of academic dishonesty in Romanian university students. Higher Education, 74(5), 815–831. https://doi.org/10.1007/s10734-016-0079-8
Jaap, A., Dewar, A., Duncan, C., Fairhurst, K., Hope, D., & Kluth, D. (2021). Effect of remote online exam delivery on student experience and performance in applied knowledge tests. BMC Medical Education, 21(1), 1–7. https://doi.org/10.1186/s12909-021-02521-1
Jamieson, M. V. (2020a). Keeping a Learning Community and Academic Integrity Intact after a Mid-Term Shift to Online Learning in Chemical Engineering Design During the COVID-19 Pandemic. Journal of Chemical Education, 97(9), 2768–2772. https://doi.org/10.1021/acs.jchemed.0c00785
Jaramillo‐Morillo, D., Ruipérez‐Valiente, J. A., Burbano Astaiza, C. P., Solarte, M., Ramirez‐Gonzalez, G., & Alexandron, G. (2022). Evaluating a learning analytics dashboard to detect dishonest behaviours: A case study in small private online courses with academic recognition. Journal of Computer Assisted Learning, 38(6), 1574–1588. https://doi.org/10.1111/jcal.12734
Jaramillo-Morillo, D., Ruipérez-Valiente, J., Sarasty, M. F., & Ramírez-Gonzalez, G. (2020). Identifying and characterizing students suspected of academic dishonesty in SPOCs for credit through learning analytics. International Journal of Educational Technology in Higher Education, 17(1), 1–18. https://doi.org/10.1186/s41239-020-00221-2
Jerrim, J., Micklewright, J., Heine, J.-H., Salzer, C., & McKeown, C. (2018). PISA 2015: How big is the ‘mode effect’ and what has been done about it? Oxford Review of Education, 44(4), 476–493. https://doi.org/10.1080/03054985.2018.1430025
Joshi, O., Chapagain, B., Kharel, G., Poudyal, N. C., Murray, B. D., & Mehmood, S. R. (2022). Benefits and challenges of online instruction in agriculture and natural resource education. Interactive Learning Environments, 30(8), 1402–1413. https://doi.org/10.1080/10494820.2020.1725896
Kakkonen, T., & Mozgovoy, M. (2010). Hermetic and Web Plagiarism Detection Systems for Student Essays—An Evaluation of the State-of-the-Art. Journal of Educational Computing Research, 42(2), 135–159. https://doi.org/10.2190/EC.42.2.a
Kamalov, F., Sulieman, H., & Santandreu Calonge, D. (2021). Machine learning based approach to exam cheating detection. PLOS ONE, 16(8), e0254340. https://doi.org/10.1371/journal.pone.0254340
Karnalim, O., Simon, & Chivers, W. (2022). Layered similarity detection for programming plagiarism and collusion on weekly assessments. Computer Applications in Engineering Education, 30(6), 1739–1752. https://doi.org/10.1002/cae.22553
Kay, A., Polin, B. A., & Sadeh, S. (2022). Integrity of nursing students in Israel: An exploratory study. Nurse Education in Practice, 64, 103446. https://doi.org/10.1016/j.nepr.2022.103446
Kennedy, K., Nowak, S., Raghuraman, R., Thomas, J., & Davis, S. F. (2000). Academic dishonesty and distance learning: Student and faculty views. College Student Journal, 34(2), 309–314.
Kerlinger, F. N. (1966). Foundations of behavioral research.
Khalil, M., Prinsloo, P., & Slade, S. (2022). In the nexus of integrity and surveillance: Proctoring (re)considered. Journal of Computer Assisted Learning, 38(6), 1589–1602. https://doi.org/10.1111/jcal.12713
Khan, S., Kambris, M. E. K., & Alfalahi, H. (2022). Perspectives of University Students and Faculty on remote education experiences during COVID-19- a qualitative study. Education and Information Technologies, 27(3), 4141–4169. https://doi.org/10.1007/s10639-021-10784-w
Klocko, M. N. (2014). Academic dishonesty in schools of nursing: A literature review. Journal of Nursing Education, 53(3), 121–125. https://doi.org/10.3928/01484834-20140205-01
Kolack, K., Hemraj-Benny, T., & Chauhan, M. (2020). Community College Chemistry Instruction and Research in the Time of COVID-19. Journal of Chemical Education, 97(9), 2889–2894. https://doi.org/10.1021/acs.jchemed.0c00700
Kolski, T., & Weible, J. L. (2019). Do Community College Students Demonstrate Different Behaviors from Four-Year University Students on Virtual Proctored Exams? Community College Journal of Research and Practice, 43(10–11), 690–701. https://doi.org/10.1080/10668926.2019.1600615
Kuleto, V., Bucea-Manea-Țoniş, R., Bucea-Manea-Țoniş, R., Ilić, M. P., Martins, O. M. D., Ranković, M., & Coelho, A. S. (2022). The Potential of Blockchain Technology in Higher Education as Perceived by Students in Serbia, Romania, and Portugal. Sustainability, 14(2), 749. https://doi.org/10.3390/su14020749
Lacko, D., Šašinka, Č., Čeněk, J., Stachoň, Z., & Lu, W. (2020). Cross-cultural differences in cognitive style, individualism/collectivism and map reading between Central European and East Asian University students. Studia Psychologica, 62(1), 23–43. https://doi.org/10.31577/sp.2020.01.789
Lai, J. W. M., & Bower, M. (2020). Evaluation of technology use in education: Findings from a critical analysis of systematic literature reviews. Journal of Computer Assisted Learning, 36(3), 241–259. https://doi.org/10.1111/jcal.12412
Lancaster, M., Seibert, G. S., Cooper, A. N., & ... (2020). Relationship quality in the context of cyber dating abuse: The role of attachment. Journal of Family Issues, 41(6), 739–758. https://doi.org/10.1177/0192513X19881674
Leamy, M., Bird, V., le Boutillier, C., Williams, J., & Slade, M. (2011). Conceptual framework for personal recovery in mental health: systematic review and narrative synthesis. The British Journal of Psychiatry, 199(6), 445–452. https://doi.org/10.1192/bjp.bp.110.083733
Lee, K., & Fanguy, M. (2022). Online exam proctoring technologies: Educational innovation or deterioration? British Journal of Educational Technology, 53(3), 475–490. https://doi.org/10.1111/bjet.13182
Leeson, H. V. (2006). The mode effect: A literature review of human and technological issues in computerized testing. International Journal of Testing, 6(1), 1–24. https://doi.org/10.1207/s15327574ijt0601_1
Liberman-Martin, A. L., & Ogba, O. M. (2020). Midsemester Transition to Remote Instruction in a Flipped College-Level Organic Chemistry Course. Journal of Chemical Education, 97(9), 3188–3193. https://doi.org/10.1021/acs.jchemed.0c00632
Linden, K., & Gonzalez, P. (2021). Zoom invigilated exams: A protocol for rapid adoption to remote examinations. British Journal of Educational Technology, 52(4), 1323–1337. https://doi.org/10.1111/bjet.13109
Liu, C., Zowghi, D., Kearney, M., & Bano, M. (2021). Inquiry‐based mobile learning in secondary school science education: A systematic review. Journal of Computer Assisted Learning, 37(1), 1–23. https://doi.org/10.1111/jcal.12505
Lomness, A., Lacey, S., Brobbel, A., & Freeman, T. (2021). Seizing the opportunity: Collaborative creation of academic integrity and information literacy LMS modules for undergraduate Chemistry. The Journal of Academic Librarianship, 47(3), 102328. https://doi.org/10.1016/j.acalib.2021.102328
Lowe, M. S., Londino-Smolar, G., Wendeln, K. E. A., & Sturek, D. L. (2018). Promoting academic integrity through a stand-alone course in the learning management system. International Journal for Educational Integrity, 14(1), 1–11. https://doi.org/10.1007/s40979-018-0035-8
Maertens, R., Van Petegem, C., Strijbol, N., Baeyens, T., Jacobs, A. C., Dawyndt, P., & Mesuere, B. (2022). Dolos: Language‐agnostic plagiarism detection in source code. Journal of Computer Assisted Learning, 38(4), 1046–1061. https://doi.org/10.1111/jcal.12662
Mahabeer, P., & Pirtheepal, T. (2019). Assessment, plagiarism and its effect on academic integrity: Experiences of academics at a university in South Africa. South African Journal of Science, 115(11/12). https://doi.org/10.17159/sajs.2019/6323
Mahmud, K., Usman, M., Sindhu, M. A., Jolfaei, A., & Srivastava, G. (2021). Closing the Loop in Feedback Driven Learning Environments Using Trust Decision Making and Utility Theory. IEEE Transactions on Emerging Topics in Computational Intelligence, 5(1), 6–18. https://doi.org/10.1109/TETCI.2020.2991452
Malesky Jr, L. A., Baley, J., & Crow, R. (2016). Academic dishonesty: Assessing the threat of cheating companies to online education. College Teaching, 64(4), 178–183. https://doi.org/10.1080/87567555.2015.1133558
Malik, A. A., Hassan, M., Rizwan, M., Mushtaque, I., Lak, T. A., & Hussain, M. (2023). Impact of academic cheating and perceived online learning effectiveness on academic performance during the COVID-19 pandemic among Pakistani students. Frontiers in Psychology, 14. https://doi.org/10.3389/fpsyg.2023.1124095
Malik, M. A., Mahroof, A., & Ashraf, M. A. (2021). Online University Students’ Perceptions on the Awareness of, Reasons for, and Solutions to Plagiarism in Higher Education: The Development of the AS&P Model to Combat Plagiarism. Applied Sciences, 11(24), 12055. https://doi.org/10.3390/app112412055
Mâță, L., Lazăr, I. M., & Ghiațău, R. (2020). Exploring academic dishonesty practices among science education university students. Journal of Baltic Science Education, 19(1), 91–107. https://doi.org/10.33225/jbse/20.19.91
McCabe, D. L., & Trevino, L. K. (1993). Academic dishonesty: Honor codes and other contextual influences. The Journal of Higher Education, 64(5), 522–538. https://doi.org/10.1080/00221546.1993.11778446
McClung, E. L., & Schneider, J. K. (2018). Dishonest Behavior in the Classroom and Clinical Setting: Perceptions and Engagement. Journal of Nursing Education, 57(2), 79–87. https://doi.org/10.3928/01484834-20180123-04
Mehri Kamrood, A., Davoudi, M., Ghaniabadi, S., & Amirian, S. M. R. (2021). Diagnosing L2 learners’ development through online computerized dynamic assessment. Computer Assisted Language Learning, 34(7), 868–897. https://doi.org/10.1080/09588221.2019.1645181
Meo, S., & Talha, M. (2019). Turnitin: Is it a text matching or plagiarism detection tool? Saudi Journal of Anaesthesia, 13(5), 48. https://doi.org/10.4103/sja.SJA_772_18
Milone, A. S., Cortese, A. M., Balestrieri, R. L., & Pittenger, A. L. (2017). The impact of proctored online exams on the educational experience. Currents in Pharmacy Teaching and Learning, 9(1), 108–114. https://doi.org/10.1016/j.cptl.2016.08.037
Moten Jr, J., Fitterer, A., Brazier, E., Leonard, J., & Brown, A. (2013). Examining online college cyber cheating methods and prevention measures. Electronic Journal of E-Learning, 11(2), 139–146.
Mukhtar, K., Javed, K., Arooj, M., & Sethi, A. (2020). Advantages, limitations and recommendations for online learning during COVID-19 pandemic era. Pakistan Journal of Medical Sciences, 36(COVID19-S4), 27–31. https://doi.org/10.12669/pjms.36.COVID19-S4.2785
Ndibalema, P. (2021). Online Assessment in the Era of Digital Natives in Higher Education Institutions. International Journal of Technology in Education, 4(3), 443–463. https://doi.org/10.46328/ijte.89
Ng’ambi, D., & Bozalek, V. (2015). Massive open online courses (MOOCs): Disrupting teaching and learning practices in higher education. British Journal of Educational Technology, 46(3), 451–454. https://doi.org/10.1111/bjet.12281
Nguyen, J. G., Keuseman, K. J., & Humston, J. J. (2020). Minimize online cheating for online assessments during COVID-19 pandemic. Journal of Chemical Education, 97(9), 3429–3435. https://doi.org/10.1021/acs.jchemed.0c00790
Nigam, A., Pasricha, R., Singh, T., & Churi, P. (2021). A Systematic Review on AI-based Proctoring Systems: Past, Present and Future. Education and Information Technologies, 26(5), 6421–6445. https://doi.org/10.1007/s10639-021-10597-x
O’Carroll, I. P., Buck, M. R., Durkin, D. P., & Farrell, W. S. (2020). With Anchors Aweigh, Synchronous Instruction Preferred by Naval Academy Instructors in Small Undergraduate Chemistry Classes. Journal of Chemical Education, 97(9), 2383–2388. https://doi.org/10.1021/acs.jchemed.0c00710
O’Donnell, R., Maloney, K., Masters, K., & Liu, D. (2020). Library-Faculty Referencing and Plagiarism Pilot Using Technology-Mediated Feedback for Change. Journal of the Australian Library and Information Association, 69(4), 523–539. https://doi.org/10.1080/24750158.2020.1813406
Omari, E. B., Salifu Yendork, J., & Ankrah, E. (2022). University students’ perspectives on the benefits and challenges of emergency remote teaching during the Covid-19 pandemic in Ghana. Education and Information Technologies. https://doi.org/10.1007/s10639-022-11401-0
Onah, D. F. O., Sinclair, J. E., & Boyatt, R. (2014). Exploring the use of MOOC discussion forums. Proceedings of London International Conference on Education, 1–4.
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., & Moher, D. (2021). Updating guidance for reporting systematic reviews: development of the PRISMA 2020 statement. Journal of Clinical Epidemiology, 134, 103–112. https://doi.org/10.1016/j.jclinepi.2021.02.003
Page, M. J., Moher, D., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., & Brennan, S. E. (2021). PRISMA 2020 explanation and elaboration: Updated guidance and exemplars for reporting systematic reviews. BMJ, 372(160), 1–36. https://doi.org/10.1136/bmj.n160
Palamenghi, L., & Bonfiglioli, C. (2019). Cognitive enhancement vs. plagiarism: A quantitative study on the attitudes of an Italian sample. Neuroethics, 12(3), 279–292. https://doi.org/10.1007/s12152-019-09397-5
Papageorgiou, S., & Manna, V. F. (2021). Maintaining access to a large-scale test of academic language proficiency during the pandemic: The launch of TOEFL iBT Home Edition. Language Assessment Quarterly, 18(1), 36–41. https://doi.org/10.1080/15434303.2020.1864376
Parks, R. F., Lowry, P. B., Wigand, R. T., Agarwal, N., & Williams, T. L. (2018). Why students engage in cyber-cheating through a collective movement: A case of deviance and collusion. Computers & Education, 125, 308–326. https://doi.org/10.1016/j.compedu.2018.04.003
Patael, S., Shamir, J., Soffer, T., Livne, E., Fogel‐Grinvald, H., & Kishon‐Rabin, L. (2022). Remote proctoring: Lessons learned from the COVID ‐19 pandemic effect on the large scale on‐line assessment at Tel Aviv University. Journal of Computer Assisted Learning, 38(6), 1554–1573. https://doi.org/10.1111/jcal.12746
Patak, A. A., Wirawan, H., Abduh, A., Hidayat, R., Iskandar, I., & Dirawan, G. D. (2020). Teaching English as a Foreign Language in Indonesia: University lecturers’ views on plagiarism. Journal of Academic Ethics, 1–17. https://doi.org/10.1007/s10805-020-09385-y
Peled, Y., Eshet, Y., Barczyk, C., & Grinautski, K. (2019). Predictors of Academic Dishonesty among undergraduate students in online and face-to-face courses. Computers & Education, 131, 49–59. https://doi.org/10.1016/j.compedu.2018.05.012
Peytcheva-Forsyth, R., Mellar, H., & Aleksieva, L. (2019). Using a student authentication and authorship checking system as a catalyst for developing an academic integrity culture: A Bulgarian case study. Journal of Academic Ethics, 17(3), 245–269. https://doi.org/10.1007/s10805-019-09332
Pham, M., Singh, K., & Jahnke, I. (2021). Socio-technical-pedagogical usability of online courses for older adult learners. Interactive Learning Environments, 1–17. https://doi.org/10.1080/10494820.2021.1912784
Polit, D. F., & Beck, C. T. (2009). Essentials of nursing research: Appraising evidence for nursing practice. Lippincott Williams & Wilkins.
Popay, J., Roberts, H., Sowden, A., Petticrew, M., Arai, L., Rodgers, M., Britten, N., Roen, K., & Duffy, S. (2006). Guidance on the conduct of narrative synthesis in systematic reviews: A product from the ESRC Methods Programme (Version 1, b92).
Razı, S. (2023). Emergency remote teaching adaptation of the anonymous multi–mediated writing model. System, 113, 102981. https://doi.org/10.1016/j.system.2023.102981
Reyneke, Y., Shuttleworth, C. C., & Visagie, R. G. (2021). Pivot to online in a post-COVID-19 world: critically applying BSCS 5E to enhance plagiarism awareness of accounting students. Accounting Education, 30(1), 1–21. https://doi.org/10.1080/09639284.2020.1867875
Rogerson, A. M., & McCarthy, G. (2017). Using internet based paraphrasing tools: Original work, patchwriting or facilitated plagiarism? International Journal for Educational Integrity, 13(1), 1–15. https://doi.org/10.1007/s40979-016-0013-y
Roshid, M. M., Sultana, S., Kabir, M. Md. N., Jahan, A., Khan, R., & Haider, Md. Z. (2022). Equity, fairness, and social justice in teaching and learning in higher education during the COVID-19 pandemic. Asia Pacific Journal of Education, 1–22. https://doi.org/10.1080/02188791.2022.2122024
Rossano, V., Pesare, E., & Roselli, T. (2017). Are computer adaptive tests suitable for assessment in MOOCs. Journal of E-Learning and Knowledge Society, 13(3), 71–81. https://doi.org/10.20368/1971-8829/1393
Rowe, N. C. (2004). Cheating in online student assessment: Beyond plagiarism. Online Journal of Distance Learning Administration, 7(2), 1–10.
Rowland, S., Slade, C., Wong, K.-S., & Whiting, B. (2018). ‘Just turn to us’: the persuasive features of contract cheating websites. Assessment & Evaluation in Higher Education, 43(4), 652–665. https://doi.org/10.1080/02602938.2017.1391948
Ruiperez-Valiente, J. A., Munoz-Merino, P. J., Alexandron, G., & Pritchard, D. E. (2017). Using machine learning to detect ‘multiple-account’ cheating and analyze the influence of student and problem features. IEEE Transactions on Learning Technologies, 12(1), 112–122. https://doi.org/10.1109/TLT.2017.2784420
Salas-Rueda, R. A., De-La-Cruz-Martínez, G., Eslava-Cervantes, A. L., Castañeda-Martínez, R., & Ramírez-Ortega, J. (2022). Teachers’ Opinion About Collaborative Virtual Walls and Massive Open Online Course During the COVID-19 Pandemic. Online Journal of Communication and Media Technologies, 12(1), e202202. https://doi.org/10.30935/ojcmt/11305
Samsudin, M. A., Chut, T. S., Ismail, M. E., & Ahmad, N. J. (2020). A Calibrated Item Bank for Computerized Adaptive Testing in Measuring Science TIMSS Performance. Eurasia Journal of Mathematics, Science and Technology Education, 16(7), em1863. https://doi.org/10.29333/ejmste/8259
Santoso, S. (2010). Statistik parametrik [Parametric statistics]. Elex Media Komputindo.
Scheuermann, F., & Pereira, A. G. (2008). Towards a research agenda on computer-based assessment: Challenges and needs for European educational measurement. Luxembourg.
Sharif, A., & Magrill, B. (2015). Discussion forums in MOOCs. International Journal of Learning, Teaching and Educational Research, 12(1), 119–132.
Slade, C., Lawrie, G., Taptamat, N., Browne, E., Sheppard, K., & Matthews, K. E. (2022). Insights into how academics reframed their assessment during a pandemic: disciplinary variation and assessment as afterthought. Assessment & Evaluation in Higher Education, 47(4), 588–605. https://doi.org/10.1080/02602938.2021.1933379
Smith, K., Emerson, D., Haight, T., & Wood, B. (2022). An examination of online cheating among business students through the lens of the Dark Triad and Fraud Diamond. Ethics & Behavior, 1–28. https://doi.org/10.1080/10508422.2022.2104281
Sorea, D., & Repanovici, A. (2020). Project-based learning and its contribution to avoid plagiarism of university students. Investigación Bibliotecológica: Archivonomía, Bibliotecología e Información, 34(85), 155. https://doi.org/10.22201/iibi.24488321xe.2020.85.58241
Sorea, D., Roșculeț, G., & Bolborici, A.-M. (2021). Readymade Solutions and Students’ Appetite for Plagiarism as Challenges for Online Learning. Sustainability, 13(7), 3861. https://doi.org/10.3390/su13073861
Steinberger, P., Eshet, Y., & Grinautsky, K. (2021). No Anxious Student Is Left Behind: Statistics Anxiety, Personality Traits, and Academic Dishonesty—Lessons from COVID-19. Sustainability, 13(9), 4762. https://doi.org/10.3390/su13094762
Stephens, J. M., Watson, P. W. S. J., Alansari, M., Lee, G., & Turnbull, S. M. (2021). Can Online Academic Integrity Instruction Affect University Students’ Perceptions of and Engagement in Academic Dishonesty? Results From a Natural Experiment in New Zealand. Frontiers in Psychology, 12(569133), 1–16. https://doi.org/10.3389/fpsyg.2021.569133
Stöckelová, T., & Virtová, T. (2015). A tool for learning or a tool for cheating? The many‐sided effects of a participatory student website in mass higher education. British Journal of Educational Technology, 46(3), 597–607. https://doi.org/10.1111/bjet.12155
Sun, Y., Wang, T.-H., & Wang, L.-F. (2021). Implementation of Web-Based Dynamic Assessments as Sustainable Educational Technique for Enhancing Reading Strategies in English Class during the COVID-19 Pandemic. Sustainability, 13(11), 5842. https://doi.org/10.3390/su13115842
Surahman, E., & Wang, T. H. (2022). Academic dishonesty and trustworthy assessment in online learning: a systematic literature review. Journal of Computer Assisted Learning, 38(6), 1535–1553. https://doi.org/10.1111/jcal.12708
Symaco, L. P., & Marcelo, E. (2003). Faculty perception on student academic honesty. College Student Journal, 37(3), 327–334.
Terras, M. M., & Ramsay, J. (2015). Massive open online courses (MOOCs): Insights and challenges from a psychological perspective. British Journal of Educational Technology, 46(3), 472–487. https://doi.org/10.1111/bjet.12274
Tsaliki, L. (2022). Constructing young selves in a digital media ecology: youth cultures, practices and identity. Information, Communication & Society, 25(4), 477–484. https://doi.org/10.1080/1369118X.2022.2039747
Turnbull, D., Chugh, R., & Luck, J. (2021). Transitioning to E-Learning during the COVID-19 pandemic: How have Higher Education Institutions responded to the challenge? Education and Information Technologies, 26(5), 6401–6419. https://doi.org/10.1007/s10639-021-10633-w
Turner, K. L., Adams, J. D., & Eaton, S. E. (2022). Academic integrity, STEM education, and COVID-19: a call to action. Cultural Studies of Science Education, 17(2), 331–339. https://doi.org/10.1007/s11422-021-10090-4
Ulfa, S., Fattawi, I., Surahman, E., & Yusuke, H. (2019). Investigating learners’ perception of learning analytics dashboard to improve learning interaction in online learning system. 2019 5th International Conference on Education and Technology (ICET), 49–54. https://doi.org/10.1109/ICET48172.2019.8987229
Ullah, A., Xiao, H., & Barker, T. (2019a). A study into the usability and security implications of text and image based challenge questions in the context of online examination. Education and Information Technologies, 24(1), 13–39. https://doi.org/10.1007/s10639-018-9758-7
Valizadeh, M. (2022). Cheating in online learning programs: learners’ perceptions and solutions. Turkish Online Journal of Distance Education, 23(1), 195–209. https://doi.org/10.17718/tojde.1050394
Vazquez, J. J., Chiang, E. P., & Sarmiento-Barbieri, I. (2021a). Can we stay one step ahead of cheaters? A field experiment in proctoring online open book exams. Journal of Behavioral and Experimental Economics, 90, 101653. https://doi.org/10.1016/j.socec.2020.101653
Verenikina, I., Delahunty, J., & Jones, P. (2015). Scaffolding online discussion in an asynchronous forum in a postgraduate blended delivery course to enhance university students’ construction of knowledge and communication competencies. EDULEARN15 Proceedings, 3922–3923.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
Wang, T.-H. (2010). Web-based dynamic assessment: Taking assessment as teaching and learning strategy for improving students’ e-Learning effectiveness. Computers & Education, 54(4), 1157–1166. https://doi.org/10.1016/j.compedu.2009.11.001
Wang, T.-H. (2014). Developing an assessment-centered e-Learning system for improving student learning effectiveness. Computers & Education, 73, 189–203. https://doi.org/10.1016/j.compedu.2013.12.002
Wang, T.-H., Kao, C.-H., & Chen, H.-C. (2021). Factors associated with the equivalence of the scores of computer-based test and paper-and-pencil test: Presentation type, item difficulty and administration order. Sustainability, 13(17), 9548. https://doi.org/10.3390/su13179548
Wang, T.-H., & Kubincová, Z. (2017). E-assessment and its role and possibility in facilitating future teaching and learning. EURASIA Journal of Mathematics, Science and Technology Education, 13(4), 1041–1043. https://doi.org/10.12973/eurasia.2017.00664a
Wang, T.-H., Sun, Y., & Huang, N.-W. (2021). Implementation of web-based dynamic assessment in improving low English achievers’ learning effectiveness. Computer Assisted Language Learning, 1–27. https://doi.org/10.1080/09588221.2021.1998129
Wang, Y., & Xu, Z. (2021). Statistical Analysis for Contract Cheating in Chinese Universities. Mathematics, 9(14), 1684. https://doi.org/10.3390/math9141684
Webb, C. (1999). Analysing qualitative data: computerized and other approaches. Journal of Advanced Nursing, 29(2), 323–330. https://doi.org/10.1046/j.1365-2648.1999.00892.x
Wei, X., Saab, N., & Admiraal, W. (2021). Assessment of cognitive, behavioral, and affective learning outcomes in massive open online courses: A systematic literature review. Computers and Education, 163, 104097. https://doi.org/10.1016/j.compedu.2020.104097
Wenzel, K., & Reinhard, M.-A. (2020). Tests and academic cheating: do learning tasks influence cheating by way of negative evaluations? Social Psychology of Education, 23(3), 721–753. https://doi.org/10.1007/s11218-020-09556-0
White, A. (2020). May you live in interesting times: a reflection on academic integrity and accounting assessment during COVID19 and online learning. Accounting Research Journal, 34(3), 304–312. https://doi.org/10.1108/ARJ-09-2020-0317
Whitley, B. E., Nelson, A. B., & Jones, C. J. (1999). Gender Differences in Cheating Attitudes and Classroom Cheating Behavior: A Meta-Analysis. Sex Roles, 41(9/10), 657–680. https://doi.org/10.1023/A:1018863909149
Wise, A. F., Cui, Y., & Vytasek, J. (2016). Bringing order to chaos in MOOC discussion forums with content-related thread identification. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge, 188–197.
Wuthisatian, R. (2020). Student exam performance in different proctored environments: Evidence from an online economics course. International Review of Economics Education, 35, 100196. https://doi.org/10.1016/j.iree.2020.100196
Yazici, S., Yildiz Durak, H., Aksu Dünya, B., & Şentürk, B. (2022). Online versus face-to-face cheating: The prevalence of cheating behaviours during the pandemic compared to the pre-pandemic among Turkish university students. Journal of Computer Assisted Learning. Advance online publication. https://doi.org/10.1111/jcal.12743
Zoto, E., Kowalski, S. J., Lopez Rojas, E. A., & Kianpour, M. (2018). Using a socio-technical systems approach to design and support systems thinking in cyber security education. CEUR Workshop Proceedings, 123–128.
 
 
 
 