Evaluating the Impact of Open Book Exams on Secondary School Student Performance in Panipat District

Authors

  • Parul Vats, Research Scholar (Education)
  • Dr. Neeru Verma, Research Guide, Bhagwant University, Ajmer, Rajasthan, India
  • Dr. S.P. Tripathi, Research Guide, Bhagwant University, Ajmer, Rajasthan, India

DOI:

https://doi.org/10.31305/rrijm.2024.v09.n11.004

Keywords:

Alternative, Pedagogical, Examinations, Circumstances, Instructors, Effectively, Implemented and Established

Abstract

This paper examines the topic “Evaluating the Impact of Open Book Exams on Secondary School Student Performance in Panipat District.” The purpose of the research is to determine how open book examinations affect the academic performance of secondary school students in Panipat District. The study evaluates the usefulness of this assessment approach by comparing average results on conventional tests with those on open book exams and by examining subject-specific improvements. Data gathered from 200 students show that average scores rose from 65.4% on conventional examinations to 72.8% on open book exams, a gain of 7.4 percentage points, with particular improvements in Science and Mathematics. Qualitative evidence from both students and instructors points to reduced test anxiety and greater engagement, lending further credence to these results. The findings suggest that open book examinations provide a more conducive learning environment by allowing students to better grasp and apply what they have learned. The study concludes that open book examinations could be a valuable addition to secondary school assessment, although further research is needed to establish their long-term effects and the challenges of implementing them.
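As a point of reference, a comparison of mean scores such as the one reported above is commonly checked with a paired-samples test. The abstract does not state which statistical procedure the study used, so the sketch below is purely illustrative: the score arrays are hypothetical placeholders generated around the reported means of 65.4% and 72.8%, not the study's actual data.

```python
# Illustrative sketch only: NOT the authors' analysis or data.
# Shows one common way paired exam scores for 200 students could be compared.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical placeholder scores (percentages) for 200 students,
# generated independently around the means reported in the abstract.
conventional = rng.normal(loc=65.4, scale=10.0, size=200).clip(0, 100)
open_book = rng.normal(loc=72.8, scale=10.0, size=200).clip(0, 100)

# Paired t-test: did the same students score higher on the open book exam?
result = stats.ttest_rel(open_book, conventional)

print(f"Mean (conventional): {conventional.mean():.1f}%")
print(f"Mean (open book):    {open_book.mean():.1f}%")
print(f"Mean gain:           {(open_book - conventional).mean():.1f} percentage points")
print(f"Paired t-test: t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```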


Published

20-11-2024

How to Cite

Vats, P., Verma, N., & Tripathi, S. (2024). Evaluating the Impact of Open Book Exams on Secondary School Student Performance in Panipat District. RESEARCH REVIEW International Journal of Multidisciplinary, 9(11), 21–25. https://doi.org/10.31305/rrijm.2024.v09.n11.004