Horizon of Medical Education Development

Electronic Evaluation Methods: A Scoping Review

    Authors

    • Fatemeh Keshmiri 1
    • Atefeh Sadat Heydari 2

    1 Medical Education, Department of Medical Education, School of Health, Shahid Sadoughi University of Medical Sciences, Yazd, Iran.

    2 Education Development Center, Shahid Sadoughi University of Medical Sciences, Yazd, Iran.

Document Type: Review Article

DOI: 10.22038/hmed.2021.58687.1176

Abstract

Introduction: There is a need for greater focus on planning the implementation of electronic assessment in the educational system. This study provided an overview of various electronic assessment methods, including structured, non-structured, and reasoning tests, and explained the characteristics of each.
Materials & Methods: This scoping review was conducted by searching for the keywords distance education, assessment, e-learning, e-assessment, virtual learning, electronic test, and formative assessment in the Magiran, ISC, SID, Scopus, Science Direct, and PubMed databases. The review covered Persian- and English-language publications from 2000 to September 2021. A total of 105 studies were retrieved in the first step. The titles of these records were then screened, and 52 documents, including articles, reports, and guidelines, proceeded to abstract review in the next step. Finally, the full texts of 28 articles that met the inclusion criteria were reviewed and used in the present study.
Results: Various structured and non-structured tests are introduced for learner assessment in medical education systems. The most important structured tests include multiple-choice tests, fill-in-the-blank and labeling questions, selection of essential points in an image, true/false questions, and electronically implemented process simulations. Essay questions, short-answer questions, fill-in questions, scenario-based questions, oral examinations, and reflection-based questions are classified as non-structured assessment methods. Project-based tests can also be used to assess learners' higher levels of cognition and to develop an "assessment for learning" approach. The e-portfolio is recognized as a tool that helps learners learn more deeply and increase their understanding and knowledge over the learning period. Reasoning tests can likewise be administered electronically to assess decision-making and reasoning skills.
Conclusion: Electronic assessment provides a good platform for measuring different levels of cognition (from low to high), which is very important in medical education. The critical point is to choose the test in line with the educational goals, as this can markedly improve the effectiveness of the assessment. Therefore, the use of a variety of methods appropriate to the educational purposes is recommended.

Keywords

  • "Electronic test"
  • "E-assessment"
  • "Formative Assessment". "E-Learning"
Horizon of Medical Education Development
Volume 13, Issue 4
December 2022
Pages 85-98
Statistics
  • Article View: 4,138
  • PDF Download: 2,119

APA

Keshmiri, F. and Heydari, A. S. (2022). Electronic Evaluation Methods: A Scoping Review. Horizon of Medical Education Development, 13(4), 85-98. doi: 10.22038/hmed.2021.58687.1176


Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
