Year: 2015 | Volume: 5 | Issue: 4 | Page: 76-79
Blueprinting in assessment: A tool to increase the validity of undergraduate written examinations in pathology
Sunita Y Patil, Manasi Gosavi, Hema B Bannur, Ashwini Ratnakar
Department of Pathology, J. N. Medical College, KLE University, Belagavi, Karnataka, India
Date of Submission: 19-May-2015
Date of Acceptance: 24-Jul-2015
Date of Web Publication: 05-Aug-2015
Dr. Sunita Y Patil
Department of Pathology, J. N. Medical College, KLE University, Belagavi, Karnataka
Source of Support: None, Conflict of Interest: None
Abstract
Context/Background: Written examinations are the most commonly employed method for assessing cognitive skills in medical education. Essay questions have a few disadvantages: a small number of questions, limited sampling, uneven distribution of questions across topics, and vague questions. Blueprinting overcomes these issues, increasing the validity of examinations.
Objectives: To describe the process of developing a blueprint for undergraduate written examinations in pathology, and to evaluate its effect as a tool to increase the content validity of assessment.
Methodology: A workshop was conducted in the Department of Pathology to sensitize the faculty to the importance of blueprinting. A blueprint was prepared for written examinations in pathology; question papers were set accordingly and administered in the preliminary examinations. Feedback was collected from students and faculty on their perceptions of the question papers with reference to blueprinting.
Results: The students and faculty felt that there was an appropriate distribution of questions across topics (77% and 89%, respectively), that appropriate weightage was given to topics of public health importance (65% and 100%), and that the examinations were fair (86% and 89%). All the faculty felt that a blueprint aligns assessment with objectives and serves as a guide to paper construction.
Conclusions: Students were satisfied, as blueprinting helped them attempt the examination better. The faculty who validated the blueprint felt that it helps distribute appropriate weightage and questions across the topics, and that blueprinting should be an integral part of assessment.
Keywords: Assessment, blueprinting, pathology
How to cite this article:
Patil SY, Gosavi M, Bannur HB, Ratnakar A. Blueprinting in assessment: A tool to increase the validity of undergraduate written examinations in pathology. Int J App Basic Med Res 2015;5, Suppl S1:76-9
How to cite this URL:
Patil SY, Gosavi M, Bannur HB, Ratnakar A. Blueprinting in assessment: A tool to increase the validity of undergraduate written examinations in pathology. Int J App Basic Med Res [serial online] 2015 [cited 2020 Apr 7];5, Suppl S1:76-9. Available from: http://www.ijabmr.org/text.asp?2015/5/4/76/162286
Introduction
"It is said that 'assessment is the tail that wags the curriculum dog.' While this statement amply underscores the importance of assessment in any system of education, it also cautions us about the pitfalls that can occur when assessment is improperly used. "
When we speak to undergraduate medical students after their examinations, we not infrequently hear complaints about theory examinations: the paper was too lengthy and there was not enough time to write; all the questions came from only a few topics, with none from many others; the questions were too vague ("What to write? What to cut?"); the long questions were "bouncers" on material that had not been taught. In practical examinations, we hear complaints such as "I had never seen this case before," or that most of the theory questions, the long case, the short case, and the viva questions all came from one or a few systems [Figure 1]. This happens because, in the traditional assessment system in most medical colleges in India, the question paper is set by one teacher/examiner and the practical examinations are conducted by another, without any coordination and, most of the time, without alignment to objectives. Often, the content of what to assess is left to the discretion of the examiners. Moreover, the examiner/teacher imparts instruction according to what "she/he thinks is appropriate or important." The intended learning outcomes are not stated clearly and are therefore overlooked. Assessment needs to be valid. Validity is a requirement of every assessment and implies that candidates who achieve the minimum performance level have acquired the level of competence set out in the learning objectives. The validity that relates to measurements of academic achievement is content validity. The content of an assessment is valid when it is congruent with the objectives and learning experiences, and congruence between these pillars of education can be facilitated by using blueprinting in assessment.
Figure 1: Current scenario of assessment: Students' responses after written examinations
In the present study, we describe the process of developing a blueprint for the undergraduate written examinations in pathology and evaluate its effect as a tool to increase the content validity of assessment.
Methodology
A faculty development program was conducted in the Department of Pathology to sensitize the faculty to the importance of blueprinting in assessment. Ethical Committee approval was obtained. A blueprint was prepared for the Phase II/III term (preliminary) written (theory) examinations in pathology with inputs from all the faculty (since this was the preliminary examination, the complete syllabus was included in preparing the blueprint and assessment). The blueprint was then validated with the help of subject experts/department faculty, and necessary changes were made accordingly [Table 1] and [Table 2].
Table 1: MBBS Phase II - preliminary examination: Blueprint for theory paper I (general pathology, hematology, and clinical pathology)
Table 2: MBBS Phase II - preliminary examination: Blueprint for theory paper II (systemic pathology)
The steps followed to prepare the blueprint were: (1) the scope and purpose of assessment were defined; (2) the weightage to be given to content areas, domains of learning, and methods of assessment was decided - two parameters were considered while calculating this weightage: (i) the perceived importance of a topic in terms of its impact on health, and (ii) the frequency of occurrence of the particular disease or health problem; (3) the total weightage and number of items to be included were decided; (4) the table of test specifications was decided, and accordingly a blueprint was prepared; and (5) the question papers (papers I and II) were set accordingly.
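The weightage calculation described above can be illustrated with a minimal sketch. The topic names, the 1-3 scoring scale, and the total marks below are hypothetical examples chosen for illustration, not the actual values used in the study's blueprint.

```python
# Hypothetical sketch of the two-parameter weightage calculation:
# each topic is scored for impact on health and frequency of occurrence,
# and marks are allocated in proportion to the combined weight.
topics = {
    "General pathology":  {"impact": 3, "frequency": 3},
    "Hematology":         {"impact": 2, "frequency": 3},
    "Clinical pathology": {"impact": 2, "frequency": 2},
}

TOTAL_MARKS = 40  # assumed total marks for one theory paper

# Raw weight for each topic = impact x frequency.
raw = {name: s["impact"] * s["frequency"] for name, s in topics.items()}
total_raw = sum(raw.values())

# Allocate marks in proportion to each topic's raw weight.
blueprint = {name: round(TOTAL_MARKS * w / total_raw) for name, w in raw.items()}

for topic, marks in blueprint.items():
    print(f"{topic}: {marks} marks")
```

The resulting marks per topic feed into the table of test specifications, from which individual questions are then drafted.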
A written examination was conducted for a batch of 163 students. A questionnaire for collecting feedback from faculty and students about blueprinting was prepared, with preset questions including a few open-ended questions. It was validated with the help of members of the Department of Medical Education. Informed consent was taken from the students, and a total of 139 students who voluntarily agreed to give feedback were included. All 11 faculty of the department who were involved in validating the blueprint provided their feedback. The feedback questionnaires were analyzed and presented as qualitative data.
Results
The majority of the students felt that there was a proper distribution of questions across the topics (77%), that appropriate weightage was given to topics of public health importance (65%), that there was synchrony between the multiple choice questions (MCQs) and the essay-type questions (68%), and that the questions tested in-depth knowledge (87%). They also felt that there were not many too-easy or too-difficult questions (84%) and that no question was out of syllabus (87%). Overall, most of the students were satisfied that the examination was fair (86%) [Figure 2].
Analysis of the feedback from the faculty involved in validating the blueprint revealed that there was an appropriate distribution of questions across the topics (89%), that the questions were aligned to objectives (100%), that questions were distributed adequately across the "must know," "desirable to know," and "nice to know" categories (100%), that questions testing in-depth knowledge were included (89%), that there was synchrony between MCQs and essay questions (100%), and that appropriate weightage was given to topics of public health importance (100%). The faculty also felt that a blueprint acts as a guide to test paper construction (100%), increases the validity of the assessment (100%), makes the assessment "fair" (89%), and should be an integral part of assessment (100%) [Figure 3].
Figure 3: Faculty feedback on blueprinting with reference to question papers
In response to the open-ended questions, most of the students and faculty suggested that a blueprint should be prepared for every examination in all phases, including summative assessment.
Discussion
A blueprint is a map and a specification for an assessment program, ensuring that all aspects of the curriculum and educational domains are covered by assessment programs over a specified period of time. The term "blueprint" is derived from the domain of architecture and means a "detailed plan of action." In simple terms, a blueprint links assessment to learning objectives. It also indicates the marks carried by each question. Preparing a blueprint is useful because the faculty member who sets the question paper knows which question will test which objective and which content unit, and how many marks it will carry.
Blueprinting helps to match the various competencies with the course content and the appropriate modality of assessment. In our study, all of the faculty (100%) felt that the questions were aligned to objectives, and most of the students felt that no questions were out of syllabus (87%). Blueprinting makes assessment "fair" to students, as they can have a clear idea of what is being examined and can direct their learning efforts in that direction. In this study, feedback indicated that the examinations were felt to be fair (86% of students and 89% of faculty). Blueprinting helps teachers design instructional strategies as per the guidelines expected in the curriculum. All of the faculty (100%) involved in validating the blueprint felt that it acts as a guide in the construction of the test paper. Blueprinting also ensures that the selected test items give appropriate emphasis to thinking skills and the assessment of in-depth knowledge. In our study, most of the students (87%) and faculty (89%) felt that questions testing in-depth knowledge were included. A blueprint deals with sampling the content, competencies, and tools of assessment in a rational and balanced manner. The feedback revealed that most of the students and faculty felt that there was synchrony between the MCQs and the essay questions (68% and 100%, respectively).
In general, the aim of blueprinting is to reduce the two major threats to validity: construct underrepresentation (CU) and construct-irrelevant variance (CIV). CU refers to undersampling or biased sampling of the content domain or the course contents; there may be too few items to sample the domain adequately. CIV is a systematic error introduced into assessment data by unrelated variables, such as flawed item formats, too-easy or too-difficult questions, or examiner bias, for example, the tendency to test favorite, "hot," or trivial topics. In our study, most of the students and faculty felt that there was an appropriate distribution of questions across the topics (77% and 89%, respectively) and that appropriate weightage was given to topics of public health importance (65% and 100%, respectively).
To conclude, blueprinting acts as a valid tool to align objectives with assessment, helps in distribution of appropriate weightage and questions across the topics. Blueprint should be an integral part of assessment.
Financial support and sponsorship
Nil.
Conflicts of interest
There are no conflicts of interest.
References
Adkoli B, Deepak KK. Blue printing in assessment. In: Anshu ST, editors. Principles of Assessment in Medical Education. New Delhi: Jaypee Publishers; 2012. p. 205-13.
Sunita YP, Nayana KH, Bhagyashri RH. Blueprinting in assessment: How much is imprinted in our practice? J Educ Res Med Teach 2014;2:4-6.
Coderre S, Woloschuk W, McLaughlin K. Twelve tips for blueprinting. Med Teach 2009;31:322-4.
Adkoli B. Attributes of a good question paper. In: Sood R, editor. Assessment in Medical Education: Trends and Tools. New Delhi: KL Wig Center for Medical Education and Technology, AIIMS; 1995.
Hamdy H. Blueprinting in medical education. N Engl J Med 2007;356:387-95.
Downing SM, Haladyna TM. Validity and its threats. In: Downing SM, Yudkowsky R, editors. Assessment in Health Professions Education. New York: Routledge; 2009. p. 21-56.