ORIGINAL ARTICLE  
Year : 2020  |  Volume : 45  |  Issue : 4  |  Page : 526-530
 

Effect of interactive lectures and formative assessment on learning of epidemiology by medical undergraduates – A mixed-methods evaluation


Department of Community Medicine, Sri Manakula Vinayagar Medical College and Hospital, Puducherry, India

Date of Submission20-Jan-2020
Date of Acceptance03-Jul-2020
Date of Web Publication28-Oct-2020

Correspondence Address:
Dr. Amol R Dongre
Department of Community Medicine, Sri Manakula Vinayagar Medical College and Hospital, Puducherry - 605 001
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/ijcm.IJCM_46_20


 

   Abstract 


Background: Our earlier course in epidemiology for medical undergraduates was based on traditional lectures with no formal formative assessment (FA), and its uptake, in terms of both learning and attendance, was poor. Objective: The objective was to assess the effect of an improved epidemiology course (interactive lectures and formal FA) on students' learning and attendance. Materials and Methods: This was a triangulation type of mixed-methods program evaluation, combining quantitative (quasi-experimental design) and qualitative (open-ended responses) analyses. The study was carried out in the department of community medicine of a tertiary care teaching hospital, Puducherry. We improved the quality of the course material, built interaction into the lectures, and included formal structured FA in the most recent course. Kirkpatrick's framework was used for the course evaluation, and the performance of three batches was compared to assess the effect of our revisions on students' learning and attendance. Results: Students' learning was measured using end-of-course assessment scores (Level-2). The percentage of students successfully completing the course improved from 39% to 81%, and the proportion with ≥90% attendance improved from 50% to 57%. Learners' immediate reactions (Level-1) were captured using open-ended questions and subjected to content analysis; students appreciated the course material, the FAs, and the in-class activities. Conclusions: Modest improvements to a traditional undergraduate epidemiology course, in the form of interactive lectures, formative feedback, and provision of course material, led to significant gains in students' knowledge and attendance.


Keywords: Epidemiology, feedback, formative assessment, interactive lecture, Kirkpatrick's evaluation


How to cite this article:
Venugopal V, Dongre AR. Effect of interactive lectures and formative assessment on learning of epidemiology by medical undergraduates – A mixed-methods evaluation. Indian J Community Med 2020;45:526-30

How to cite this URL:
Venugopal V, Dongre AR. Effect of interactive lectures and formative assessment on learning of epidemiology by medical undergraduates – A mixed-methods evaluation. Indian J Community Med [serial online] 2020 [cited 2020 Dec 1];45:526-30. Available from: https://www.ijcm.org.in/text.asp?2020/45/4/526/299429





   Introduction


Learning epidemiology improves the critical thinking and problem-solving ability of students. However, medical undergraduates pay less attention to the subject, assuming that its skills are unrelated to their future clinical practice.[1],[2] Though there are various methods of teaching epidemiology, it is predominantly taught by the traditional lecture method in India.[3] Lectures make students passive recipients of information, leading to poor engagement in the learning process.[4] Together, these factors make epidemiology a genuinely challenging subject to teach.

Considering the large intake of students in medical colleges, lectures continue to be the primary mode of teaching. Hence, it is important to look at ways in which traditional lectures can become a more effective environment for engaging students in gaining adequate knowledge and skills. The scientific literature suggests that lectures can be made more effective by modifying them in line with sound teaching–learning principles and assessment formats.[5] However, there is a paucity of evidence, especially in the local context, on the combined effect of interactive teaching and formative assessment (FA) on the learning of epidemiology by medical undergraduates.

In the department of community medicine (DCM), we have been running a course on epidemiology for medical students for the last 4 years. Only about 40% of students were successfully completing the course, which was far from satisfactory. The course was predominantly lecture based, and there was no mechanism to identify and assist students facing learning difficulties. It was therefore decided to improve the existing epidemiology course and to assess its effectiveness in terms of uptake of the course by students.


   Materials and Methods


Setting and participants

The study was carried out in a private medical college located at the eastern border of the Puducherry union territory over a period of 10 months, from August 2018 to May 2019. The college admits 150 undergraduate students each academic year and is affiliated to Pondicherry University. Epidemiology is part of the undergraduate medical curriculum and is taught during the posting in the DCM. The improved epidemiology course was delivered to 145 sixth-semester students of the 2016 academic batch over a period of 3 months, from January 2019 to March 2019. Students of the two previous academic batches, 2015 (n = 122) and 2014 (n = 129), who had been taught epidemiology using traditional lectures, formed the historical comparison cohorts.

Design and intervention

This was a triangulation type of mixed-methods evaluation, in which quantitative (quasi-experimental design) and qualitative (open-ended responses) methods were used simultaneously to evaluate the improved course on epidemiology.[6] The quasi-experimental design was adopted to evaluate the effectiveness of the intervention, ascertained by the uptake of the epidemiology course by medical undergraduates; the qualitative component comprised content analysis of written feedback given by the students about the course. Uptake of the course was measured in terms of the students' learning scores and their overall course attendance. The improved lecture system was a package of educational interventions aimed at improving the learning of epidemiology: development of course material, lectures with preplanned interaction in the form of self-check exercises (multiple-choice questions [MCQs] and epidemiological problems), and FA with feedback tailored to each student.

As there were modifications to the traditional teaching–learning format, the proposal was presented to the members of the Medical Education Unit and was initiated after obtaining clearance from the Research Committee and Ethics Committee (IEC code no. 20/2018). The flow of the study procedures for development, delivery, and evaluation is shown in [Figure 1].
Figure 1: Overview and timeline of the educational project



Development of course and course material

We ensured that the overall course objectives were in alignment with the Pondicherry University curriculum, and that the intended learning outcomes of each session were in alignment with the overall course objectives. The course content consisted of study design, measures of association, interpretation of findings, and causal models. All three batches were taught similar content; however, the most recent batch additionally received interactive lectures and feedback. The revised course included authentic examples from real work settings to help students understand the relevance of epidemiology to practice. The course material was written in simple English and was made available to all students at the beginning of the course. The course book included details of the course goals, session objectives, support strategy, and criteria for certification.

Planning and delivery of interactive lectures

Ten classes (2 h/week) were delivered over a period of 3 months. The interactive sessions were facilitated by a team of trained faculty and postgraduates in community medicine. The lesson plan of each session included set induction, recapitulation of the previous session to reactivate prior knowledge, and a plan for self-check exercises. At the end of each of the ten interactive lectures, self-check exercises were administered in the form of single-response MCQs testing the remembering and understanding levels of cognition. Students were encouraged to ask questions and were offered clarification of concepts when required.

Formative assessments and feedback

Students took two written FAs during the course. Each question paper consisted of essay-type questions, short-answer questions, and epidemiological exercises, for a total of 30 marks. Questions testing all domains of cognition were included. Postgraduates (n = 6) and faculty members (n = 5) were trained to give effective feedback. They provided written feedback to all students, stating their strengths and areas for further improvement, within a week of each FA. Students who scored <50% of marks were given feedback in a face-to-face meeting based on Pendleton's model.[7]

The Pendleton method is a simple way to give feedback to learners in a constructive manner.[7] The positive aspects are identified first by the learner, and the facilitator then reinforces these positives. Next, the learner self-assesses what could be done differently, and the facilitator suggests alternative skills for improvement. Recognizing the positives and avoiding a discussion of weaknesses at the outset prevents defensiveness and encourages reflective behavior in the learner.

End-of-course assessment

Students' learning of epidemiology was assessed by a written examination, similar in pattern to the FAs, conducted at the end of the course. Questions testing factual knowledge, comprehension, reasoning, critical analysis, and synthesis of the content delivered in the course were included. The answer sheets were checked against a standard answer key by faculty and postgraduates who were not part of this evaluation study. The scores of the current batch of students were compared with those of the previous batches. To ensure alignment between the course objectives and the assessment plan, a test blueprint was prepared.

Evaluation of the course

The evaluation of our improved epidemiology course was carried out to analyze its effectiveness. Because the course was evaluated in terms of learning and the evaluation was backward-looking in nature, we used Kirkpatrick's framework.[8] We used students' end-of-course scores (Level-2) and their feedback on the course (Level-1) to judge the effectiveness of the course.

Data analysis

SPSS statistical package version 24 (SPSS Inc., Chicago, IL, USA) was used for data analysis. The proportions of students by sex, grade of end-of-course assessment score, and level of attendance were compared between the present batch and the previous batches using the Chi-square test. An intention-to-treat analysis was carried out to check the effectiveness of the improved course: all students in the intervention group, not just those who attended every session, were included. The level of significance was set at 5%.
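As an illustration of the Chi-square comparison described above, the sketch below computes the test statistic for a hypothetical 2 × 2 table of course completion by batch. The counts are invented purely for illustration (the study's actual analyses, run in SPSS, compared three batches across sex, grade, and attendance categories):

```python
# Chi-square test of independence on a hypothetical 2 x 2 table:
# rows = batches, columns = completed / did not complete the course.
# Counts are illustrative only, not the study's data.

def chi_square_statistic(table):
    """Return the Chi-square statistic for a contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of row and column factors.
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

table = [[30, 70],   # earlier batch: 30 completed, 70 did not
         [50, 50]]   # revised course: 50 completed, 50 did not
stat = chi_square_statistic(table)
df = (len(table) - 1) * (len(table[0]) - 1)   # degrees of freedom = 1

# The 5% critical value for df = 1 is 3.841; stat ≈ 8.33 exceeds it,
# so this hypothetical difference would be declared significant.
print(round(stat, 2), df)
```

The same expected-count logic underlies the SPSS procedure; a statistics package additionally converts the statistic into a P value.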

Manual content analysis was carried out. After reading the learners' responses to the open-ended questions, similar responses across the study participants were coded. Codes relating to similar areas were then clustered into categories, and similar categories were grouped under the selected themes. The guidelines of the UCLA Center for Health Policy Research were used for the analysis.[9] The "Consolidated Criteria for Reporting Qualitative Research" guidelines were followed in reporting.[10]


   Results


There were 145 students in the present batch (2016) exposed to the improved course on epidemiology, of whom 44.1% were male and 55.9% female. In the previous batches, 53.3% of the 2015 batch and 58.1% of the 2014 batch were female. The gender distribution did not differ significantly across the three batches (P = 0.74). The results of the present study are described under two levels of the Kirkpatrick evaluation model.

Students' reaction (Kirkpatrick level-1)

The content analysis of students' feedback about the course yielded two themes: facilitating factors and suggestions for improvement. Four categories emerged under each theme, namely the FAs conducted, the course material provided, the additional academic support provided, and the quality of teaching methods and media [Table 1].
Table 1: Content analysis of students' reaction to the course



Students' learning (Kirkpatrick level-2)

In the end-of-course assessment, 49% of the present 2016 batch of students scored ≥70%, whereas only 3.1% and 4.9% of the previous 2014 and 2015 batches, respectively, did so. In the current batch, only 18.6% of students scored <50%, compared with 68.2% of the 2014 batch and 61.5% of the 2015 batch. The improvement in learning among the current batch of students was statistically significant (P < 0.001). The percentage of students with ≥90% attendance during the entire course improved in the current batch compared with the two previous batches; however, this difference was not statistically significant (P = 0.24).

The median (interquartile range) end-of-course assessment score of all students in the current batch was 70 (55–80.6), while those of the two previous batches, 2014 and 2015, were 31.4 (18.6–51.4) and 36.5 (23–51), respectively. There was a significant improvement in the median score of the current batch of students irrespective of gender [Table 2].
Table 2: Gender-wise comparison of median score of students between the previous batches (2014 and 2015) and the current batch (2016)

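Median (interquartile range) summaries of the kind reported above can be computed with the Python standard library; the score list below is hypothetical, purely to show the computation:

```python
import statistics

# Hypothetical end-of-course scores (not the study's data).
scores = [31, 55, 62, 70, 75, 80, 91]

median = statistics.median(scores)  # middle value of the sorted scores
# 'inclusive' interpolates between data points, a common IQR convention.
q1, q2, q3 = statistics.quantiles(scores, n=4, method="inclusive")

print(median)     # 70
print((q1, q3))   # endpoints of the interquartile range
```

Note that different quartile conventions (e.g., `method="exclusive"`, or SPSS's default) can give slightly different IQR endpoints on small samples.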



   Discussion


This basic course in epidemiology for medical undergraduates covered the complete cycle of course planning, design, implementation, and evaluation. The academic performance of the current batch of learners showed significant improvement over that of the previous two batches. Although overall course attendance improved, the difference was not statistically significant compared with the two historical groups. The revised course was well received by the students.

The current project demonstrated significant knowledge gains in epidemiology among medical undergraduates. Two aspects of this study are particularly noteworthy. First, nearly 50% of the students scored more than 70% of marks in the end-of-course exam. Second, the failure rate reduced significantly compared with the previous two batches of students. The traditional lecture is effective for transmitting information to a large group of students, but it has its demerits, namely teacher-centeredness, long lecture hours, and didactic, passive learning.[11] We found that even a little interaction in traditional lectures, together with ongoing assessment and feedback, improved course uptake among students. These findings are similar to the results of an educational model in which traditional lectures were supplemented with cognitively engaging tasks.[12] Interaction promotes students' engagement in the learning process, heightens attention and motivation, gives feedback to both teacher and student, and increases satisfaction for both.[13],[14],[15] Previous studies have shown that students exposed to formal FA with immediate feedback achieve better academic results.[16],[17],[18],[19] FA also helps to identify at-risk learners who require additional academic support.[20],[21]

Multiple assessment methods were used at multiple time points in our course. The use of multiple observations and different assessment methods over time can partially compensate for the flaws of any one method.[22] The test papers were in alignment with the course objectives; the assessment content and methods need to be aligned with the learning objectives and activities in order for test scores to be reliable and valid.[23],[24]

Strengths and limitations

Our educational project, being an improvement on the traditional lecture, involved no additional cost in designing, implementing, and evaluating the course, making this model feasible and cost-effective. As this was educational research in a real practice setting, it was not possible to carry out a randomized controlled trial on feasibility and ethical grounds. Hence, the effectiveness of the intervention was compared with that of the previous two cohorts of students taught by traditional methods. The previous two batches were taught the same course content by the same set of faculty over the same course duration and had a similar pattern of end-of-course assessment, ensuring comparability. For all three batches, we used a test blueprint, a standard answer key for assessment, and the same set of examiners.


   Conclusions and Recommendations


Thus, we conclude that modest improvements to traditional teaching, in the form of interaction with students and ongoing assessment, can help better achieve students' learning outcomes and satisfaction. This finding is particularly important for resource-poor developing countries, where most of the curriculum is delivered in the traditional format and technical capacity and resources for a major curricular change are limited.

Acknowledgment

This study was done as a part of the FAIMER fellowship in PSG-FAIMER Regional Centre Coimbatore. We acknowledge the support of faculty and fellows at PSG-FRI. We also thank the management of our college for providing permission to conduct this educational research.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
   References

1. Ernster VL. On the teaching of epidemiology to medical students. Am J Epidemiol 1979;109:617-8.
2. Novick LF, Greene C, Vogt RL. Teaching medical students epidemiology: Utilizing a state health department. Public Health Rep 1985;100:401-5.
3. Khapre M, Gupta M, Kishore S. Experiential learning in epidemiology for medical undergraduates: A mixed-method approach. J Clin Diagn Res 2019;13:7-12.
4. Kaur G. Study and analysis of lecture model of teaching. Int J Educ Plann Adm 2011;1:9-13.
5. Bransford J, Brown A, Cocking R. How People Learn: Brain, Mind, Experience, and School. Washington, DC: National Academies Press; 2000. Available from: https://www.nap.edu/read/10067/chapter/7. [Last accessed on 2019 Aug 01].
6. Creswell JW, Clark VLP. Designing and Conducting Mixed Methods Research. 3rd ed. Washington, DC: SAGE; 2011.
7. Pendleton D, Schofield T, Tate P, Havelock P. The Consultation: An Approach to Learning and Teaching. Oxford, New York: Oxford University Press; 1984.
8. Kirkpatrick D, Kirkpatrick J. Evaluating Training Programs: The Four Levels. 3rd ed. Oakland: Berrett-Koehler; 2006.
9. UCLA Center for Health Policy Research. Section 4: Key Informant Interviews. Available from: http://healthpolicy.ucla.edu/programs/health-data/trainings/Documents/tw_cba23.pdf. [Last accessed on 2017 May 03].
10. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups. Int J Qual Health Care 2007;19:349-57.
11. Chaudhury SR. The lecture. New Dir Teach Learn 2011;2011:13-20.
12. Beichner R. The Student-Centered Activities for Large Enrollment Undergraduate Programs (SCALE-UP) Project. In: Research-Based Reform of University Physics. American Association of Physics Teachers; 2007. p. 1-42. Available from: https://www.per-central.org/items/detail.cfm?ID=4517. [Last accessed on 2019 Aug 01].
13. Sharma N, Lau CS, Doherty I, Harbutt D. How we flipped the medical classroom. Med Teach 2015;37:327-30.
14. Moffett J. Twelve tips for "flipping" the classroom. Med Teach 2015;37:331-6.
15. Cantillon P, Wood DF, Yardley S, editors. ABC of Learning and Teaching in Medicine. 3rd ed. London: BMJ Books; 2017.
16. Carrillo-de-la-Peña MT, Baillès E, Caseras X, Martínez A, Ortet G, Pérez J. Formative assessment and academic achievement in pre-graduate students of health sciences. Adv Health Sci Educ Theory Pract 2009;14:61-7.
17. Velan GM, Jones P, McNeil HP, Kumar RK. Integrated online formative assessments in the biomedical sciences for medical students: Benefits for learning. BMC Med Educ 2008;8:52.
18. Jain V, Agrawal V, Biswas S. Use of formative assessment as an educational tool. J Ayub Med Coll Abbottabad 2012;24:68-70.
19. Luvira V, Bumrerraj S, Srisaenpang S. Formative evaluation and learning achievement in epidemiology for preclinical medical students. Indian J Community Med 2018;43:298-301.
20. Rushton A. Formative assessment: A key to deep learning? Med Teach 2005;27:509-13.
21. Hill DA, Guinea AI, McCarthy WH. Formative assessment: A student perspective. Med Educ 1994;28:394-9.
22. Epstein RM. Assessment in medical education. N Engl J Med 2007;356:387-96.
23. Bridge PD, Musial J, Frank R, Roe T, Sawilowsky S. Measurement practices: Methods for developing content-valid student examinations. Med Teach 2003;25:414-21.
24. Norcini J, Anderson B, Bollela V, Burch V, Costa MJ, Duvivier R, et al. Criteria for good assessment: Consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach 2011;33:206-14.





 
