ORIGINAL ARTICLE
Year : 2020  |  Volume : 11  |  Issue : 4  |  Page : 185-189

Assessment of suitability of direct observation of procedural skills among postgraduate students and faculty in periodontology and implantology


Department of Periodontology and Implantology, VSPM Dental College and Research Centre, Nagpur, Maharashtra, India

Date of Submission: 27-Jun-2020
Date of Acceptance: 19-Nov-2020
Date of Web Publication: 05-Feb-2021

Correspondence Address:
Dr. Surekha Rathod
Department of Periodontics and Implantology, VSPM Dental College and Research Centre, Digdoh Hills, Hingna Road, Nagpur - 440 019, Maharashtra
India


DOI: 10.4103/srmjrds.srmjrds_56_20

  Abstract 

Purpose: Direct observation of procedural skills (DOPS) is a widely used workplace-based assessment that combines direct observation with feedback. DOPS covers all related professional qualities, including knowledge, clinical perception, communication skills, ethics, the rights of patients, and the speed and accuracy of the task. However, the use of DOPS in dentistry, especially in the field of periodontics, is limited. Objective: The aim of the study was to evaluate the utility of DOPS as an assessment tool for periodontal procedures. Methodology: The departmental faculty and postgraduate (PG) students were sensitized to the DOPS concept. A total of 12 PG students performed periodontal procedures and were assessed with the DOPS rating scale over four encounters each. The students' clinical skills were assessed by faculty members, and both students and faculty members were then asked to give feedback on the procedure and the overall experience. Results: Both faculty and students were comfortable giving feedback. Knowledge of clinical skills such as relevant anatomy and procedural technique improved from 70% at the first encounter to 73.33% at the fourth, and taking informed consent improved from 53.33% to 90% over the same encounters. DOPS also helped the students develop and improve their clinical skills, patient examination, and decision-making in diagnosis and treatment. Conclusion: Both faculty and students were comfortable giving feedback, and the process helped the students develop and improve their clinical skills. Our experience indicates that DOPS is an appropriate and practical tool in the PG setting for enhancing clinical competencies.

Keywords: Clinical competency, direct observation of procedural skills, periodontics


How to cite this article:
Rathod S, Kolte R, Gonde N. Assessment of suitability of direct observation of procedural skills among postgraduate students and faculty in periodontology and implantology. SRM J Res Dent Sci 2020;11:185-9

How to cite this URL:
Rathod S, Kolte R, Gonde N. Assessment of suitability of direct observation of procedural skills among postgraduate students and faculty in periodontology and implantology. SRM J Res Dent Sci [serial online] 2020 [cited 2021 Mar 5];11:185-9. Available from: https://www.srmjrds.in/text.asp?2020/11/4/185/308785


Introduction


Assessment of clinical skills is a fundamental part of medical training. It helps determine how well a skill has been learned and whether the appropriate expectations have been met. Because today's curricula are skills based, competency-based assessment approaches need to be incorporated into medical education.[1]

Miller's pyramid (1990) was developed to portray the hierarchy of learning domains in the clinical field and to guide the assessment of skills related to the different learning domains in medical education. It also facilitates matching learning outcomes from clinical instruction (clinical competencies) with expectations of what the learner should be able to do at each stage of the pyramid. The first component, "Knows", occupies the base of the pyramid and forms the foundation for building clinical competence. The second component, "Knows How", involves using knowledge in the acquisition, analysis, and interpretation of data and in the development of a plan. The third component, "Shows How", requires the learner to demonstrate the integration of knowledge and skills into successful clinical performance. Finally, the tip, or fourth component, of the pyramid is "Does", which focuses on methods that assess routine clinical performance [Figure 1].[2] All the levels of Miller's pyramid are used in both undergraduate (UG) and postgraduate (PG) curricula. In our study, however, the evaluation of postgraduate students took into account approaches associated with the "Shows How" and "Does" levels of Miller's pyramid.
Figure 1: Miller's pyramid


Clinical assessment is one of the most critical elements of medical training; it motivates students and enables teachers to recognize educational strengths and weaknesses.[3] Assessment of trainees' performance at their place of work relies on workplace-based assessment (WPBA), which has the advantage of giving trainees feedback on their results. Various methods are used to address important areas of clinical performance, covering clinical skills, practical competencies, and generic competencies; among these, the direct observation of procedural skills (DOPS) method is now gaining importance.[4]

DOPS is an evaluation method for clinical skill assessment designed by the Royal College of Physicians in the United Kingdom and has been in use since 2003.[4],[5] It combines direct observation with feedback and is widely used as a workplace-based assessment. In DOPS, the focus is on procedural skills. Students are evaluated with a skill-rating instrument that covers understanding of the relevant anatomy, preprocedural preparation, technical ability, aseptic technique, postprocedural management, and communication skills, and their performance of the standardized task is then rated.

DOPS is both a formative and a summative assessment method: the trainee receives immediate feedback on each procedural skill, and the assessment can be repeated several times to gauge improvement in clinical skills.[6] In routine clinical practice, however necessary and appropriate, direct observation of student performance should itself be critically appraised on an evidential basis.[7] The use of DOPS in dentistry, especially in the field of periodontics, remains limited. Hence, this study was carried out to sensitize PG students and faculty to DOPS and to develop DOPS checklists for different procedures in periodontology.

DOPS can be used to assess progressive improvement in procedural skills by evaluating PG students periodically. Keeping all this in mind, the aim of the study was to evaluate the utility of DOPS as an assessment tool for periodontal surgical procedures performed by postgraduate students. The main objective was to introduce DOPS to the students and the faculty as a clinical assessment and feedback process and to demonstrate how it can be used to enhance their skills.


Methodology


The present interventional cross-sectional study was carried out in the Department of Periodontology and Implantology. The research was approved by the institutional ethics committee. The departmental faculty and PG students were sensitized to the DOPS concept and oriented to offer constructive feedback on the intervention. In an orientation workshop, faculty and students were shown a video demonstrating how to evaluate students using the DOPS rating form and how to give feedback. No faculty member or PG student had previous exposure to DOPS.

The steps required to perform each procedure were agreed upon among the faculty assessors before evaluating the technical section of the DOPS assessment form.[6] In the present study, the DOPS skill-rating form was used with modifications depending on the periodontal procedure, such as crown lengthening or flap surgery [Table 1].
Table 1: Direct observation of procedural skills evaluation skill format



For each PG student, a total of four encounters took place over a 6-month period. The observation of each procedure lasted 10–15 min, followed by a feedback period of 5–10 min. The trainee's performance was rated on a 6-point scale, where 1–2 indicates competence below expectation, 3 indicates borderline competence, 4 indicates that expectations were met, and 5–6 indicates performance above expectation. The faculty observed the students' procedures and then gave them immediate feedback based on the direct observation. Feedback was also collected from the students about their overall experience.
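The study reports encounter-wise performance as percentages but does not state exactly how these were derived from the 6-point ratings. Below is a minimal sketch in Python, assuming each percentage is the mean rating for an item across students divided by the maximum rating of 6; the item, the example ratings, and the function name are hypothetical and included only for illustration.

# Minimal sketch (assumption): convert 6-point DOPS ratings into a percentage
# score by expressing the mean rating for an item as a fraction of the maximum.
# The ratings below are hypothetical, not the study's data.

MAX_RATING = 6

def percentage_score(ratings):
    """Mean rating across students for one item, as a percentage of the maximum."""
    return round(sum(ratings) / (len(ratings) * MAX_RATING) * 100, 2)

# Hypothetical ratings for one item (e.g., taking informed consent) from
# 12 students at the first and fourth encounters.
encounter_1 = [3, 3, 4, 3, 3, 4, 3, 3, 3, 4, 3, 3]
encounter_4 = [5, 6, 5, 5, 6, 5, 6, 5, 5, 6, 5, 5]

print(percentage_score(encounter_1))  # 54.17
print(percentage_score(encounter_4))  # 88.89

Under this assumed conversion, a rise in the mean rating from about 3.3 to 5.3 out of 6 corresponds to the kind of first-to-fourth encounter improvement reported in the Results.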


Results


All twelve PG students were included in the evaluation, and the faculty observed a total of four encounters for each. Knowledge of relevant anatomy and procedural technique improved from 70% at the first encounter to 73.33% at the fourth. Taking informed consent, an integral part of any surgical procedure, showed the largest improvement, from 53.33% at the first encounter to 90% at the fourth. Preprocedural preparation was 80% at both the first and fourth encounters, with a decrease at the second and third encounters. Technical ability improved from 53.33% to 80% between the first and fourth encounters. Aseptic technique improved from 70% at the first encounter to 86.66% at the fourth. Postprocedural management and overall ability also improved by the fourth encounter, to 80% and 83.33%, respectively, compared with the first encounter [Chart 1].



Open-ended immediate feedback was obtained from students and faculty. It indicated that DOPS helped the students develop and improve their clinical skills, patient examination, and decision-making in diagnosis and treatment [Table 2].
Table 2: Open-ended feedback from student and faculty




Discussion


Clinical assessment testing is an area of confusion in the health sciences. Examinations are a constant source of problems for many faculty members, curriculum designers, and educationalists. The evaluation of student achievement is continuously debated at educational meetings, conferences, and workshops.[8]

In the present study, students' DOPS scores from the first to the fourth encounter of clinical practice showed improvement in all skills evaluated, supporting the validity of this assessment method. The greatest improvement was found in taking informed consent (from 53.33% to 90%), followed by aseptic technique, appropriate analgesia, and overall ability. In the medical profession, taking informed consent from patients is an integral step before any surgical procedure. This shows that DOPS covers all related professional attributes, including knowledge, clinical perception, communication skills, medical ethics, the rights of patients, and the speed and accuracy of the task.

Our findings are in accordance with a 2009 study by Shahgheibi et al., who evaluated the effect of DOPS on the clinical skills of externship students in an obstetrics ward and reported that DOPS is a valid and reliable method of assessment.[9] Similarly, Kapoor et al. (2010) conducted a study among interns in the ophthalmology department to refine their clinical skills and found that both students and faculty were satisfied with the test, although faculty were more satisfied than students.[10]

In our study, the faculty felt comfortable giving the students feedback, and it is worth mentioning that feedback was given in all cases. Likewise, as students go through more such encounters, their satisfaction with the improvement in their clinical skills is likely to rise further. Students also felt satisfied giving feedback to the faculty. Ali et al. (2019)[11] found DOPS to be an effective, secure, and reliable workplace-based evaluation method for enhancing urology residents' surgical skills in real-time operating room scenarios; it also provided the residents with useful feedback within a short period of time. Kumar et al. (2017) assessed the role of DOPS in the teaching and assessment of PG students and observed that repeated DOPS strengthened students' skills and confidence in managing real-life obstetric emergencies, irrespective of the teaching modality.[12]

Singh et al. (2017) reported that DOPS may be used in dental undergraduate training and appears to be highly feasible and appropriate for the evaluation of clinical skills.[13] Periodic observation and feedback by faculty also increased its utility. Farajpour et al. found DOPS very useful in ensuring the adequacy of undergraduate medical students' education and in assessing their ability to take on professional tasks.[14]

Profanter and Perathoner (2015) found DOPS to be a relevant method that improved the performance of clinical skills and worked well for fourth-year undergraduate students.[15] Neeralagi et al. (2019) compared DOPS with the traditional assessment method and found DOPS to be a very effective method for assessing all domains of learning, including the cognitive, affective, and psychomotor skills of interns while performing a procedure.[16]

Despite its significance, DOPS has not been a common modality in India. Bindal et al. (2013) conducted a questionnaire study in a regional anaesthetic training programme and observed that training in the use of this WPBA tool is essential and needs to be planned, with sufficient time allocated, in order to address prevailing negative attitudes.[17] Because the assessments in that study were not planned, the authors found that DOPS was not valued as an educational tool and that training in the use of workplace-based assessments was lacking. The mini clinical evaluation exercise (Mini-CEX), another form of direct observation used to assess diagnosis and treatment in dentistry, has been reported in the Indian scenario. Lele (2011)[18] and Rathod et al. (2017)[19] used the Mini-CEX to assess postgraduate students in oral diagnosis, medicine, and radiology and in periodontics, respectively, and found it to be an acceptable and practical tool in the PG setting. However, the meta-analyses by Suhoyo et al. (2014)[20] and Kim et al. (2016)[21] provide evidence that the educational effect of DOPS was about ten times higher than that of Mini-CEX; this could be due to differences in study design, since the DOPS studies examined specific procedures and aligned the intervention with the measurement of outcomes, whereas the Mini-CEX studies took a more general approach and did not match the intervention and outcome measurement as closely. A recent systematic review (2018)[22] summarized the available evidence on the educational impact of Mini-CEX and DOPS from 1995 to 2016 and analyzed associations between educational impact and characteristics of the setting and of the implementation.

Both the examiners and the trainees were pleased with this new approach to clinical skills assessment, and their satisfaction was associated with their assessment, the time spent evaluating the trainee, and the difficulty of the case.

Limitations

  1. Patients were not asked to provide feedback
  2. Closed-ended feedback was not collected from students and faculty
  3. The sample size was small, as the study was conducted in a single department only
  4. The number of encounters was limited.



Conclusion


To the best of our knowledge, this is the first study to introduce DOPS in the field of periodontology in PG dental education. Dental practice combines sound knowledge, diagnosis, procedural and communication skills, correct decision-making, and more; it is therefore essential that all dental aspirants are trained to improve their knowledge and clinical skills. The present results indicate that DOPS is an appropriate and practical tool in the PG setting. Hence, we can conclude that:

  1. Students showed considerable improvement in their clinical abilities over successive encounters
  2. Immediate faculty feedback helps students to enhance their skills
  3. DOPS strengthened the interaction between student and teacher. Our research shows that DOPS is a successful and promising method for the evaluation of clinical competencies.


Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
References

1. Newble DI, Jaeger K. The effect of assessments and examinations on the learning of medical students. Med Educ 1983;17:165-71.
2. Swanwick T, Chana N. Workplace-based assessment. Br J Hosp Med (Lond) 2009;70:290-3.
3. Habibi H, Khaghanizade M, Mahmoodi H, Ebadi A. Comparison of the effects of modern assessment methods (DOPS and Mini-CEX) with traditional method on nursing students' clinical skills: A randomized trial. Iran J Med Educ 2013;13:364-72.
4. Wragg A, Wade W, Fuller G, Cowan G, Mills P. Assessing the performance of specialist registrars. Clin Med (Lond) 2003;3:131-4.
5. Barton JR, Corbett S, van der Vleuten CP; English Bowel Cancer Screening Programme, UK Joint Advisory Group for Gastrointestinal Endoscopy. The validity and reliability of a direct observation of procedural skills assessment tool: Assessing colonoscopic skills of senior endoscopists. Gastrointest Endosc 2012;75:591-7.
6. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach 2007;29:855-71.
7. Shute VJ. Focus on formative feedback. Rev Educ Res 2008;78:153-89.
8. Van der Vleuten CP. The assessment of professional competence: Developments, research and practical implications. Adv Health Sci Educ Theory Pract 1996;1:41-67.
9. Shahgheibi SH, Pooladi A, Bahram Rezaie M, Farhadifar F, Khatibi R. Evaluation of the effects of direct observation of procedural skills (DOPS) on clinical externship students' learning level in obstetrics ward of Kurdistan University of Medical Sciences. J Med Educ 2009;13:29-33.
10. Kapoor H, Tekian A, Mennin S. Structuring an internship programme for enhanced learning. Med Educ 2010;44:501-2.
11. Ali L, Ali S, Orakzai N, Ali N. Effectiveness of direct observation of procedural skills (DOPS) in postgraduate training in urology at Institute of Kidney Diseases, Peshawar. J Coll Physicians Surg Pak 2019;29:516-9.
12. Kumar N, Singh NK, Rudra S, Pathak S. Effect of formative evaluation using direct observation of procedural skills in assessment of postgraduate students of obstetrics and gynecology: Prospective study. J Adv Med Educ Prof 2017;5:1-5.
13. Singh G, Kaur R, Mahajan A, Thomas AM, Singh T. Piloting direct observation of procedural skills in dental education in India. Int J Appl Basic Med Res 2017;7:239-42.
14. Farajpour A, Amini M, Pishbin E, Mostafavian Z, Akbari Farmad S. Using modified direct observation of procedural skills (DOPS) to assess undergraduate medical students. J Adv Med Educ Prof 2018;6:130-6.
15. Profanter C, Perathoner A. DOPS (Direct Observation of Procedural Skills) in undergraduate skills-lab: Does it work? Analysis of skills-performance and curricular side effects. GMS Z Med Ausbild 2015;32:Doc45.
16. Neeralagi S, Sudhindra GS, Lokesh G. Direct observation of procedural skills (DOPS) versus traditional assessment method for nasogastric tube insertion skill. J Evid Based Med Healthc 2019;6:765-9.
17. Bindal N, Goodyear H, Bindal T, Wall D. DOPS assessment: A study to evaluate the experience and opinions of trainees and assessors. Med Teach 2013;35:e1230-4.
18. Lele SM. A mini-OSCE for formative assessment of diagnostic and radiographic skills at a dental college in India. J Dent Educ 2011;75:1583-9.
19. Rathod SR, Kolte A, Shori T, Kher V. Assessment of postgraduate dental students using mini-clinical examination tool in periodontology and implantology. J Indian Soc Periodontol 2017;21:366-70.
20. Suhoyo Y, Schönrock-Adema J, Rahayu GR, Kuks JB, Cohen-Schotanus J. Meeting international standards: A cultural approach in implementing the mini-CEX effectively in Indonesian clerkships. Med Teach 2014;36:894-902.
21. Kim S, Willett LR, Noveck H, Patel MS, Walker JA, Terregino CA. Implementation of a mini-CEX requirement across all third-year clerkships. Teach Learn Med 2016;28:424-31.
22. Lörwald AC, Lahner FM, Nouns ZM, Berendonk C, Norcini J, Greif R, et al. The educational impact of mini-clinical evaluation exercise (Mini-CEX) and direct observation of procedural skills (DOPS) and its association with implementation: A systematic review and meta-analysis. PLoS One 2018;13:e0198009.

