ORIGINAL ARTICLE
Year : 2017  |  Volume : 7  |  Issue : 4  |  Page : 239-242  

Piloting direct observation of procedural skills in dental education in India


1 Department of Periodontia, Christian Dental College, Ludhiana, Punjab, India
2 Department of Pedodontia, Christian Dental College, Ludhiana, Punjab, India
3 Department of Pediatrics and Medical Education, Christian Medical College, Ludhiana, Punjab, India

Date of Submission: 21-Feb-2017
Date of Acceptance: 06-May-2017
Date of Web Publication: 11-Dec-2017

Correspondence Address:
Dr. Tejinder Singh
Department of Pediatrics and Medical Education, Christian Medical College, Ludhiana - 141 008, Punjab
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/ijabmr.IJABMR_54_17

   Abstract 


Context: Direct observation of procedural skills (DOPS) with corrective feedback is one of the most important tools to promote skill learning. Authentic and pedagogically effective feedback can be provided only when it is based on direct observation of the learner's performance. Use of DOPS, particularly in dental education in India, is very uncommon.

Aims: To pilot DOPS on undergraduate dental students in the specialty of periodontia.

Materials and Methods: The faculty were oriented to the concept and use of this modality during a 1-h session, which included a video demonstration. The generic DOPS recording format, with modifications, was used for periodontal procedures. A total of 42 procedures (7, 30, and 5 of low, average, and high difficulty, respectively) performed by 15 students were observed by four faculty members. Students were given feedback on the procedure and on how to overcome shortcomings, if any.

Results: Faculty were comfortable observing and providing feedback (3.95/5.0) and found this modality feasible and nonintrusive in their clinical and teaching schedule. Students expressed satisfaction with and acceptance of this modality (4.19/5.0), felt that it would help them learn skills better (4.01/5.0), and 83% wanted it extended to other clinical areas, preferably from the beginning of their clinical postings.

Conclusion: DOPS can be incorporated into the in-training assessment of undergraduate dental students and appears to have good feasibility and acceptability. Faculty training in observation and provision of feedback will enhance its utility.

Keywords: Dental education, direct observation, feedback, skills learning


How to cite this article:
Singh G, Kaur R, Mahajan A, Thomas AM, Singh T. Piloting direct observation of procedural skills in dental education in India. Int J App Basic Med Res 2017;7:239-42





Introduction


Direct observation and corrective feedback are among the most important tools to promote skill learning. In his meta-analysis, Hattie[1] showed that feedback has one of the most profound influences on student achievement, with an effect size of 1.13. Authentic and pedagogically effective feedback can be provided only when it is based on direct observation of the learner's performance, rather than on historical facts or generalities.

Despite its importance and acceptance, direct observation of students' performance rarely occurs in clinical practice. Even where it does occur, it tends to be inadequate and based on arbitrary practices rather than on any scientific basis. Direct observation of procedural skills (DOPS) assessment was developed by the Royal College of Physicians[2] and now forms part of workplace-based assessment for doctors in the foundation year in many countries. DOPS assesses a student's performance over a single encounter, usually focused on a single procedural skill. In effect, this means that a single assessor–student pair can have multiple encounters involving multiple skills. DOPS serves the twin purposes of assessment and learning by observing the trainee in the workplace.[3]

A trainee's performance is scored on a 6-point rating scale, where 1–2 indicates below the expected level of competency, 3 a borderline level of competency, 4 the expected level of competency, and 5–6 above the expected level of competency. The assessment is generally expected to require 5–15 min of observation and about 5 min dedicated to feedback. Trainees are provided with a list of commonly performed procedures for which they are expected to demonstrate competence, and they are assessed by multiple clinicians on multiple occasions throughout the training period.
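To make the scoring scheme concrete, the following is a minimal illustrative sketch (in Python) of how a single DOPS encounter could be recorded and its global rating mapped to the competency bands described above; the record structure and field names are our own assumptions for illustration and are not part of the official DOPS form.

# Illustrative sketch only: record structure and field names are assumed,
# not taken from the official DOPS instrument.
from dataclasses import dataclass

def competency_band(score: int) -> str:
    """Map a 6-point DOPS global rating to its competency band."""
    if not 1 <= score <= 6:
        raise ValueError("DOPS ratings run from 1 to 6")
    if score <= 2:
        return "below expected level of competency"
    if score == 3:
        return "borderline"
    if score == 4:
        return "meets expected level of competency"
    return "above expected level of competency"

@dataclass
class DopsEncounter:
    student: str
    assessor: str
    procedure: str            # e.g., "hand scaling"
    difficulty: str           # "low", "average", or "high"
    rating: int               # 1-6 global rating
    observation_minutes: int  # typically 5-15 min of observation
    feedback_minutes: int     # typically about 5 min of feedback

    @property
    def band(self) -> str:
        return competency_band(self.rating)

# Hypothetical example of one encounter
encounter = DopsEncounter("student_01", "faculty_A", "ultrasonic scaling",
                          "average", rating=4, observation_minutes=12,
                          feedback_minutes=5)
print(encounter.band)  # -> "meets expected level of competency"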

This method of procedural skills assessment is not limited to postgraduate training programs: Paukert et al.[4] included basic surgical skills to be mastered by undergraduate students in their clinical encounter card system. Although DOPS is similar to procedural skills log books, the purpose and nature of the two methods differ significantly. Recording of procedures is common to both, but log books are usually designed only to ensure that students have performed the minimum number of procedures required to be considered competent; the procedures are not necessarily performed under direct observation, and structured feedback based on observed performance is not necessarily part of the process. In contrast, DOPS ensures that students are given specific feedback based on direct observation so as to improve their procedural skills.

Despite its educational effectiveness and simplicity, the method has not yet been formally implemented in medical or dental education in India; we came across only one report of its use in ophthalmology[5] and another in pediatrics.[6] Its use in dental education in India has not been reported so far.

During their posting in the final clinical year, students are required to learn and perform a number of procedures. This study was designed to evaluate the feasibility, acceptability, and utility of DOPS for undergraduate dental students.


Materials and Methods


The study was approved by the Institutional Research and Ethics Committee and was conducted on 15 Bachelor of Dental Surgery students during their final clinical rotation. The faculty of the department were oriented to the concept of DOPS and to the skills of giving good feedback. A brief orientation regarding the intervention and its purpose was also given to the students. None of the assessors or students had any prior exposure to DOPS. The generic DOPS form, modified for dental procedures, was used [Annexure 1]. The DOPS encounters were planned in an ad hoc way, depending on the availability of the patient and the student. The faculty observed the procedures being performed by the students and then immediately provided feedback based on the direct observation. Feedback was also collected from the faculty and the students regarding the feasibility, acceptability, and utility of the intervention on a 5-point scale [Annexure 2].

Observations

A total of 42 encounters involving 15 students were observed by four faculty members. The procedures included oral examination, data gathering, periodontal charting, demonstration of brushing technique, hand scaling, and ultrasonic scaling. No student performed the same procedure twice for this purpose.

Of the total encounters, 7 were of low difficulty, 30 of average difficulty, and 5 of high difficulty. The assessors rated 16 encounters as below expectation and another 3 as above expectation, indicating use of the entire range of the scale. Safe analgesia was marked as "unable to comment" in most cases, as it was not used during any encounter.

The duration of observation and the duration of feedback are shown in [Table 1]. Feedback was provided to the students within 5 min of the procedure in 9 (21.4%) cases, within 15 min in 12 (28.5%) cases, and after 15 min in the remaining 21 (50%) cases.
Table 1: Time taken for observation and feedback



Students felt that it is a useful tool (mean rating 4.19 ± 1.21 on a 5-point scale) and that it would help them learn skills better (mean rating 4.01 ± 0.92). A majority (83.5%) wanted it to be introduced in other areas as well and suggested that this modality be introduced from the beginning of the clinical postings. Faculty felt comfortable providing feedback (mean rating 3.95 ± 0.43) and felt that it is likely to help students learn skills. Open-ended comments are listed in [Table 2].
Table 2: Open-ended comments




Discussion


There is a growing body of evidence to suggest that assessment and learning can be combined into a single task, thus making learning focused and assessment authentic. Authentic feedback following observation of trainee performance has been shown to be among the most important influences on achievement.

In our study, faculty felt comfortable giving feedback to the students, and it is noteworthy that feedback was provided in all cases; in some cases, feedback extended to more than 10 min, double the time taken for observation. The entire range of the scale was used for rating the encounters, which suggests that the assessors were able to distinguish between various levels of performance. Use of the "unable to comment" option also indicates that ratings were based on observation rather than being a mere tick-mark exercise. Students received feedback within a span of 30 min, which can be considered fairly immediate; as we gain more experience, these timings can be fine-tuned. As students go through more of these encounters, their comfort level is likely to increase further.

The logistics of carrying out DOPS in a workplace setting are largely similar to those of the mini-clinical evaluation exercise, with multiple occasions of observation by multiple clinicians across multiple procedural skills.[6] Scoring is done on a 6-point rating scale on a standard form. A global rating, rather than a checklist-based rating, is used; it has been shown to produce valid results and to distinguish between levels of performance. The procedures chosen for observation are usually short, requiring about 10–15 min of observation time, and the observer can be a faculty member, a senior resident, a trained nurse, or even a peer. The observation and scoring are immediately followed by a feedback session between the assessor and the assessed.

Despite its importance, direct observation of performance has not been a popular modality in India. There has been a report from India of a mini-objective structured clinical examination for oral radiology, which is a form of direct observation,[7] but it served more as an assessment tool than as a learning tool. We piloted DOPS as a tool to assist in skill learning, and our results suggest that it is feasible and acceptable for dental skills in the current education scenario in India.

Assessment should be concerned not only with proving but also with improving learning. As discussed earlier, feedback has been shown to exert a very potent influence on learning. DOPS provides both opportunities in one encounter: the student is rated and is also provided with developmental feedback. There may be concerns about the subjectivity involved, but experience has shown that by increasing the number of encounters to 6–8 in a year, a reasonable level of reliability can be attained.[2]
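As a general illustration of why more encounters improve reliability (this is not a calculation from our data; the single-encounter reliability used below is an assumed figure), the Spearman–Brown prophecy formula relates the reliability of a composite of $k$ encounters, $r_k$, to the reliability of a single encounter, $r_1$:

% Spearman-Brown prophecy formula (illustrative; r_1 = 0.4 is an assumed value)
\[
  r_k = \frac{k\, r_1}{1 + (k-1)\, r_1}
\]
\[
  r_6 = \frac{6 \times 0.4}{1 + 5 \times 0.4} = \frac{2.4}{3.0} = 0.8
\]

With an assumed single-encounter reliability of 0.4, six encounters would thus yield a composite reliability of about 0.8, illustrating how repeated observations can bring a subjective rating to an acceptable level of reliability.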

The faculty and students had no prior exposure to DOPS but did not find it difficult. Our data suggest that DOPS had good acceptability among our study group. Students felt that it helps them learn skills better, is nonthreatening, and should be extended to other areas of the curriculum. The faculty, on the other hand, felt that it can be integrated into existing teaching practices and does not require additional preparation. Faculty also felt that training in observation and in providing feedback would improve their effectiveness as assessors. However, the small sample size limits the generalizability of the results.

DOPS has been found useful as an assessment and learning tool in many studies.[8] However, as with any other assessment tool, its utility depends on what is assessed rather than on how it is assessed.[9] In addition, DOPS involves providing educational feedback to the students, which adds another variable affecting its utility. Before making it a routine part of our teaching-learning process, orientation of the students and training of the faculty are vital. The importance of faculty training in improving quality cannot be overemphasized.[10]

We plan to continue this intervention as a longitudinal study, with students completing 8–10 DOPS encounters per year, and to track the progression of their clinical skills. This will not only help the students learn skills better but also help the faculty take timely remedial action if needed.[11]

Acknowledgments

The authors would like to acknowledge the students and faculty who participated in this study.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
References

1. Hattie JA. Influences on Student Learning. Inaugural Professorial Address, University of Auckland, New Zealand; 1999. Available from: http://www.arts.auckland.ac.nz/staff/index.cfm?P8650. [Last accessed on 2013 Apr 04].
2. Norcini JJ. Workplace-based assessment in clinical training. In: Swanwick T, editor. Understanding Medical Education Series. Edinburgh, UK: Association for the Study of Medical Education; 2010. p. 232-45.
3. Wragg A, Wade W, Fuller G, Cowan G, Mills P. Assessing the performance of specialist registrars. Clin Med (Lond) 2003;3:131-4.
4. Paukert JL, Richards ML, Olney C. An encounter card system for increasing feedback to students. Am J Surg 2002;183:300-4.
5. Kapoor H, Tekian A, Mennin S. Structuring an internship programme for enhanced learning. Med Educ 2010;44:501-2.
6. Kundra S, Singh T. Feasibility and acceptability of direct observation of procedural skills to improve procedural skills. Indian Pediatr 2014;51:59-60.
7. Lele SM. A mini-OSCE for formative assessment of diagnostic and radiographic skills at a dental college in India. J Dent Educ 2011;75:1583-9.
8. McLeod R, Mires G, Ker J. Direct observed procedural skills assessment in the undergraduate setting. Clin Teach 2012;9:228-32.
9. Singh T. Student assessment: Issues and dilemmas regarding objectivity. Natl Med J India 2012;25:287-90.
10. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach 2007;29:855-71.
11. Singh T, Kundra S, Gupta P. Direct observation and focused feedback for clinical skills training. Indian Pediatr 2014;51:713-7.



 
 





 
