International Association of Educators   |  ISSN: 1308-951X

Original article | International Journal of Research in Teacher Education 2019, Vol. 10(4) 19-34

Appraisal of May/June West African Senior School Certificate Examination Questions in Business Management

Edmond Kwesi Agormedah

pp. 19 - 34   |  Manu. Number: MANU-1908-13-0001

Published online: December 31, 2019  |   Number of Views: 165  |  Number of Downloads: 731


Assessment is a key component of teaching and learning, and poorly constructed questions can affect students' performance and distort examination results. The main purpose of this study was to evaluate May/June West African Senior School Certificate Examination (WASSCE) questions in Business Management. The study was guided by the cognitive levels of Bloom’s Taxonomy. The data source consisted of multiple-choice questions (MCQs) and essay test items drawn from the May/June WASSCE in Business Management conducted by the West African Examinations Council (WAEC) over a period of eight (8) years (2011-2018). Descriptive content analysis was used to classify the examination questions according to the cognitive levels of Bloom’s Taxonomy. The study found that most of the MCQs were standard, in that they followed the principles of multiple-choice item construction. However, a few of the MCQs had item-writing flaws (IWFs), such as negative stems, options of unequal length, and options not arranged in alphabetical or chronological order. In addition, most of the examination questions measured students' lower-order cognitive processing; only a few measured higher-order cognitive levels. The study concluded that the assessment principles for constructing multiple-choice items and the profile dimensions of the syllabus are not strictly adhered to in crafting May/June WASSCE questions in Business Management. The dominance of questions targeting lower-order cognitive skills could adversely affect classroom instruction, particularly where teachers and students depend heavily on such questions for practice and assessment. The study recommended that WAEC ensure that examiners follow the assessment principles for constructing multiple-choice items in order to avoid item-writing flaws (IWFs), and that examination questions be carefully designed with the profile dimensions of the syllabus in mind so as to develop students' higher-order cognitive processing skills.
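The two analyses the abstract describes — classifying questions by Bloom's cognitive level and screening MCQs for item-writing flaws — can be illustrated with a minimal sketch. This is not the study's actual instrument: the verb lists are abridged examples, and the two flaw checks (negative stem, markedly unequal option length) are simplified heuristics for flaws the study reports.

```python
# Illustrative sketch only: verb-based Bloom classification and two simple
# item-writing-flaw (IWF) checks. Verb lists and thresholds are assumptions.

BLOOM_VERBS = {
    "knowledge": {"define", "list", "state", "name", "identify"},
    "comprehension": {"explain", "describe", "summarise", "outline"},
    "application": {"apply", "calculate", "demonstrate", "use"},
    "analysis": {"analyse", "compare", "differentiate", "distinguish"},
    "synthesis": {"design", "formulate", "propose", "construct"},
    "evaluation": {"evaluate", "justify", "assess", "appraise"},
}

def bloom_level(stem: str) -> str:
    """Return the first Bloom level whose indicative verb appears in the stem."""
    words = set(stem.lower().replace("?", "").replace(".", "").split())
    for level, verbs in BLOOM_VERBS.items():
        if verbs & words:
            return level
    return "unclassified"

def flag_item_flaws(stem: str, options: list[str]) -> list[str]:
    """Flag two common IWFs: a negative stem and options of unequal length."""
    flaws = []
    if any(neg in stem.lower().split() for neg in ("not", "except", "never")):
        flaws.append("negative stem")
    if max(map(len, options)) > 2 * min(map(len, options)):
        flaws.append("options of markedly unequal length")
    return flaws

print(bloom_level("Define management."))  # knowledge (lower-order)
print(flag_item_flaws(
    "Which of the following is NOT a function of management?",
    ["Planning", "Organising", "A detailed long-run strategic budgeting process"],
))
```

Counting the classified levels across an eight-year question set would then yield the lower-order versus higher-order distribution the study reports.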

Keywords: Assessment, Business Management, Bloom’s Taxonomy, Item Writing Flaw, WAEC, WASSCE

How to Cite this Article?

APA 6th edition
Agormedah, E. K. (2019). Appraisal of May/June West African Senior School Certificate Examination Questions in Business Management. International Journal of Research in Teacher Education, 10(4), 19-34.

Harvard
Agormedah, E. (2019). Appraisal of May/June West African Senior School Certificate Examination Questions in Business Management. International Journal of Research in Teacher Education, 10(4), pp. 19-34.

Chicago 16th edition
Agormedah, Edmond Kwesi (2019). "Appraisal of May/June West African Senior School Certificate Examination Questions in Business Management". International Journal of Research in Teacher Education 10 (4): 19-34.

  1. Abosalem, Y. (2016).  Assessment techniques and students’ higher-order thinking skills. International Journal of Secondary Education, 4(1), 1-11. [Google Scholar]
  2. Abouelkheir, H. M. (2018). The criteria and analysis of multiple‑choice questions in undergraduate dental examinations. J Dent Res Rev, 5, 59-64 [Google Scholar]
  3. Alfaki, I. M. (2014). Sudan English language syllabus: Evaluating reading comprehension questions using Bloom’s taxonomy. International Journal of English Language Teaching, 2(3), 53-74. [Google Scholar]
  4. Almuhaidib, N. (2010). Types of item‑writing flaws in multiple choice question pattern: A comparative study. Umm Al Qura Univ JEduc Psychol Sci, 2, 10‑45. [Google Scholar]
  5. Amedahe, F. K. (1989). Testing practices in secondary schools in the Central Region of Ghana. Unpublished master’s thesis, University of Cape Coast, Cape Coast. [Google Scholar]
  6. Assaly, I. R., & Smadi, O. M. (2015). Using Bloom’s taxonomy to evaluate the cognitive levels of master class textbook’s questions. English Language Teaching, 8(5), 100-110. [Google Scholar]
  7. Baig, M., Ali, S. K., Ali, S., & Huda, N. (2014). Evaluation of multiple choice and short essay question items in basic medical sciences. Pak J Med Sci, 30, 3-6.  [Google Scholar]
  8. Bloom, B. S. (1956). Taxonomy of educational objectives: Handbook 1 cognitive domain. London: Longmans [Google Scholar]
  9. Boyd, B. (2008). Effects of state tests on classroom test items in mathematics. School Science and Mathematics, 108(6), 251-261. [Google Scholar]
  10. Clay, B., & Root, E. (2001). Is this a trick question? A short guide to writing effective test questions. Kansas: Kansas Curriculum Center [Google Scholar]
  11. Cobbinah, A. (2016). Items’ sequencing on difficulty level and students’ achievement in mathematics test in Central Region of Ghana. African Journal of Interdisciplinary Studies, 9, 55-62  [Google Scholar]
  12. Cobbinah, A., Daramola, D. S., Owolabi, H. O., & Olutola, A. T. (2017). Analysis of levels of thinking required in West African senior secondary school certificate in core mathematics multiple choice items. African Journal of Interdisciplinary Studies, 10, 1-7.  [Google Scholar]
  13. Costello, E., Holland, J. C., & Kirwan, C. (2018). Evaluation of MCQs from MOOCs for common item writing flaws. BMC Research Notes, 11(849), 1-3  [Google Scholar]
  14. Davidson, R. A., & Baldwin, B. A. (2005). Cognitive skills objectives in intermediate accounting textbooks: Evidence from end-of-chapter material. Journal of Accounting Education, 23(2), 79-95. [Google Scholar]
  15. DiSantis, D. J., Ayoob, A. R., & Williams, L. E. (2015). Prevalence of flawed multiple-choice questions in continuing medical education activities of major radiology journals. American Journal of Roentgenology, 204, 698-702 [Google Scholar]
  16. Downing, S. M. (2002). Construct-irrelevant variance and flawed test questions: do multiple-choice item-writing principles make any difference? Acad Med, 77(10), S103-S104. [Google Scholar]
  17. Downing, S. M. (2005). The effects of violating standard item writing principles on tests and students: The consequences of using flawed test items on achievement examinations in medical education. Advances in Health Sciences Education, 10(2), 133-143. [Google Scholar]
  18. Ebadi, S., & Mozafari, V. (2016). Exploring Bloom’s revised taxonomy of educational objectives in TPSOL textbooks. Journal of Teaching Persian to Speakers of Other Languages, 5(1), 65-93 [Google Scholar]
  20. Etsey, Y. K. (2012). Assessment in education. Cape Coast: University of Cape Coast Press [Google Scholar]
  21. Haladyna, T. M., & Rodriguez, M. C. (2013). Developing and validating test items. New York, NY: Routledge [Google Scholar]
  22. Hopper, C. (2009). Practicing college learning strategies. (5th Ed). New York, NY: Houghton Mifflin [Google Scholar]
  23. Ijeoma, J. A., Eme, U. J., & Nsisong, A. U. (2013). Content validity of May/June West African senior school certificate examination (WASSCE) questions in chemistry. Journal of Education and Practice, 4(7), 15-21 [Google Scholar]
  24. Kasim, Z. U., & Zulfikar, T. (2017). Analysis of instructional questions in an English textbook for senior high schools. English Education Journal (EEJ), 8(4), 536-552 [Google Scholar]
  25. Kenneth, D. R., & Mari-Wells, H. (2017). The prevalence of item construction flaws in medical school examinations and innovative recommendations for improvement. European Medical Journal, 1(1), 61-66 [Google Scholar]
  26. Köksal, D., & Ulum, O. G. (2018). Language assessment through Bloom’s Taxonomy. Journal of Language and Linguistic Studies, 14(2), 76-88 [Google Scholar]
  27. Masters, J. C., Hulsmeyer, B. S., Pike, M. E., Leichty, K., Miller, M. T., Verst, A. L (2001). Assessment of multiple‑choice questions in selected test banks accompanying text books used in nursing education. Journal of Nursing Education, 40(1), 25‑32. [Google Scholar]
  28. Ministry of Education (2010). Teaching syllabus for business management (senior high school 1-3). Accra: Curriculum Research and Development Division (CRDD), Ghana Education Service [Google Scholar]
  29. Mizbani, M., & Chalak, A. (2017). Analyzing listening and speaking activities of Iranian EFL textbook prospect 3 through Bloom's revised taxonomy. Advances in Language and Literary Studies (ALLS), 8(3), 38-4 [Google Scholar]
  30. Nedeau-Cayo, R., Laughlin, D., Rus, L., & Hall, J. (2013). Assessment of item-writing flaws in multiple-choice questions. J Nurses Prof Dev, 29(2):52-7. [Google Scholar]
  31. Okanlawon, A. E., & Adeoti, Y. F. (2014). Content analysis of West African senior school certificate chemistry examination questions according to cognitive complexity. IFE PsychologIA, 22(2), 14-26 [Google Scholar]
  32. Omer, A. A., Abdulrahim, M. E., & Albalawi, I. A. (2016). Flawed multiple-choice questions put on the scale: What is their impact on students’ achievement in a final undergraduate surgical examination? Journal of Health Specialties, 4:270-275. [Google Scholar]
  33. Orey, M. (2010). Emerging perspectives on learning, teaching, and technology. Zurich, Switzerland: Jacobs Foundation [Google Scholar]
  34. Pais, J., Silva, A., Guimaraes, B., Povo, A, Coelho, E. et al. (2016). Do item‑writing flaws reduce examinations psychometric quality? BMC Res Notes, 9, 2-7 [Google Scholar]
  35. Palmer, E. J., & Devitt, P. G. (2007). Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple-choice questions? Research paper. BMC Medical Education, 7(49), 1-7 [Google Scholar]
  36. Quansah, F., & Amoako, I. (2018). Attitude of senior high school teachers toward test construction: Developing and validating a standardised instrument. Research on Humanities and Social Sciences, 8(1), 25-30 [Google Scholar]
  37. Quansah, F., Amoako, I., & Ankomah, F. (2019). Teachers’ test construction skills in senior high schools in Ghana: Document analysis. International Journal of Assessment Tools in Education, 6(1), 1–8 [Google Scholar]
  38. Rahpeyma, A., & Khoshnood, A. (2015). The analysis of learning objectives in Iranian junior high school English text books based on Bloom’s revised taxonomy. International Journal of Education & Literacy Studies (IJELS), 3(2), 44-55 [Google Scholar]
  39. Rawadieh, S. (1998). An analysis of the cognitive levels of questions in Jordanian secondary social studies textbooks according to Bloom’s taxonomy. Unpublished doctoral dissertation, Ohio University.  [Google Scholar]
  40. Rezaee, M., & Golshan, M. (2016). Investigating the cognitive levels of English final exams based on Bloom’s taxonomy. International Journal of Educational Investigations, 3(4), 57-68. [Google Scholar]
  41. Roohani, A. (2015). Analyzing cognitive processes and multiple intelligences in the top-notch textbooks. English Language Teaching, 2(3), 39-65 [Google Scholar]
  42. Rush, B. R., Rankin, D. C., White, B. J. (2016). The impact of item-writing flaws and item complexity on examination item difficulty and discrimination value. BMC Medical Education, 16(250), 3-10 [Google Scholar]
  43. Sadeghi. B., & Mahdipour, N. (2015). Evaluating ILI advanced series through Bloom‘s revised taxonomy. Science Journal (CSJ), 36(3), 2247-2260. [Google Scholar]
  44. Soleimani, H., & Kheiri, S. (2016). An evaluation of TEFL postgraduates’ testing classroom activities and assignments based on Bloom's revised taxonomy. Theory and Practice in Language Studies, 6(4), -869 [Google Scholar]
  45. Solihati, N., & Hikmat, A. (2018). Critical thinking tasks manifested in Indonesian language textbooks for senior secondary students. SAGE Open, 7(9), 1–8 [Google Scholar]
  46. Sood, R. S., Bendre, M. B, & Sood, A. (2016). Analysis and remedy of the item writing flaws rectified at pre-validation of multiple choice questions drafted for assessment of MBBS students. Indian Journal of Basic and Applied Medical Research, 5(2), 692-698 [Google Scholar]
  47. Taghipoor, H. (2015). Determining the emphasis on Bloom‘s cognitive domain in the contents of science textbook for the sixth grade. SAUSSUREA, 3(3), 162-175. [Google Scholar]
  48. Tangsakul, P., Kijpoonphol, W., Linh, N. D., & Kimura, L. N. (2017). Using bloom’s revised taxonomy to analyse reading comprehension questions in team up in English 1-3 and grade 9 English O-Net tests. International Journal of Research – GRANTHAALAYAH, 5(7), 31-41. [Google Scholar]
  49. Tariq, S., Tariq, S., Maqsood, S., Jawed, S., & Baig, M. (2017). Evaluation of cognitive levels and item writing flaws in medical pharmacology internal assessment examinations. Pak J Med Sci, 33(4), 866-870. [Google Scholar]
  50. Tarman, B., & Kuran, B. (2015). Examination of the cognitive level of questions in social studies textbooks and the views of teachers based on bloom taxonomy. Educational Sciences: Theory & Practice, 15(1), 213-222 [Google Scholar]
  51. Tarrant, M., & Ware J. (2008). Impact of item-writing flaws in multiple-choice questions on student achievement in high-stakes nursing assessments. Medical Education, 42(2), 198-206 [Google Scholar]
  52. Tarrant, M., Knierim, A., Hayes, S. K, & Ware J. (2006). The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments. Nurse Education Today, 26(8), 662-671 [Google Scholar]
  53. Tarrant, M., Ware, J., & Mohammed, A. M. (2009). An assessment of functioning and non-functioning distractors in multiple-choice questions: A descriptive analysis. BMC Medical Education, 9(40), 1-8 [Google Scholar]
  54. Tikkanen, G., & Aksela, M. (2012). Analysis of Finnish chemistry matriculation examinations questions according to cognitive complexity. Nordic Studies in Science Education, 8(3), 258 – 268 [Google Scholar]
  55. Ulum, Ö. G. (2016). A descriptive content analysis of the extent of Bloom’s taxonomy in the reading comprehension questions of the course book Q: Skills for success 4 reading and writing. The Qualitative Report, 21(9), 1674-1683 [Google Scholar]
  56. Upahi, J. E., & Jimoh, M. (2016). Classification of end-of-chapter questions in senior high school chemistry textbooks used in Nigeria. European Journal of Science and Mathematics Education, 4(1), 90-102 [Google Scholar]
  57. Upahi, J. E., Israel, D. O., & Olorundare, A. S. (2016). Analysis of the West African senior school certificate examination (WASSCE) chemistry questions according to bloom’s revised taxonomy. Eurasian Journal of Physics & Chemistry Education, 8(2), 59-70 [Google Scholar]
  58. Upahi, J. E., Issa, G. B., & Oyelekan, O. S. (2015). Analysis of senior school certificate examination chemistry questions for higher-order cognitive skills. Cypriot Journal of Educational Sciences, 10(3), 218-227. [Google Scholar]
  59. Zamani, G., & Rezvani, R. (2015). HOTS in Iran‘s official textbooks: Implications for material design and student learning. Journal of Applied Linguistics and Language Research, 2(5), 138-151. [Google Scholar]
  60. Zareian, G., Davoudi, M., Heshmatifar, Z., & Rahimi, J. (2015). An evaluation of questions in two ESP coursebooks based on Bloom’s new taxonomy of cognitive learning domain. International Journal of Education and Research, 3(8), 313-326. [Google Scholar]