Objectives: A method of recording and analyzing student performance during practical examinations needed to be developed as a component of a new course. The evaluation criteria included basic knowledge, skill, and clinical preparedness as demonstrated in a practical examination setting. The data collected from examinations are intended to be used to compare individual students with overall class performance and with expected minimal competencies. The system must also collect data so that trends between classes, between sections within classes, and between individual groups observed by specific examiners can be compared and analyzed for corrective action, if deemed necessary.
Methods: A system was developed to record student performance using a Scantron examination form. This allowed accurate recording of specific criteria that were consistent between examiners and easy to analyze. The examination format was a multiple-case, timed-station, OSCE-style examination. Each case had three stations: a reading station, a performance station, and a follow-up station. The answer form used by the examiners was organized to follow a logical order, to minimize scoring errors. Notes were used to provide feedback to the entire class on errors made by students during the examination process. The Scantron forms were read by a scanner that organized the responses into a comma-delimited text file, which was then opened in an Excel workbook created to analyze the data.
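The analysis step described above was performed in an Excel workbook; purely as an illustration, the same individual-versus-class comparison could be sketched in a few lines of Python. The column layout below (student ID, examiner ID, then one 0/1 mark per criterion) is an assumption, since the abstract does not specify the scanner's export format.

```python
import csv
from statistics import mean

# Hypothetical layout: student_id, examiner_id, then one 0/1 mark per
# criterion. The actual Scantron export format is not specified here.
def load_scores(path):
    """Read the comma-delimited scan file into {student_id: [marks...]}."""
    scores = {}
    with open(path, newline="") as f:
        for row in csv.reader(f):
            student_id, _examiner_id, *items = row
            scores[student_id] = [int(x) for x in items]
    return scores

def compare_to_class(scores):
    """Return each student's percent score paired with the class mean."""
    pct = {sid: 100 * mean(items) for sid, items in scores.items()}
    class_mean = mean(pct.values())
    return {sid: (p, class_mean) for sid, p in pct.items()}
```

Each student's percentage can then be checked against the class mean or against an expected minimal-competency threshold, mirroring the comparisons the workbook performed.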
Results: One hundred one students were graded on three examinations using this system of scoring practical exams. Examiners and students agreed that the process added no additional time to recording answers, that students were not distracted by the process, and that examiners were able to pay closer attention than with previous systems. Several students were identified as having weaknesses in one or more areas and were given the opportunity to work with tutors on specific assignments outside of class. A significant trend was identified in the area of neurologic evaluation. Action steps included adding the performance of neurological system evaluation, including evaluation of particular dermatomes, to case studies using the DPO-CEX format (Doctor, Patient, Observer - Clinical Exercise). Subsequent examination results revealed significant improvement in neurologic examination procedures for all students.
Discussion: Using computer cards to score and record practical exams has provided accurate, analyzable data on the performance of individuals and entire classes. Scoring in this manner was fast and accurate, and it increased the examiner's ability to concentrate on what the student was doing during an examination. The data revealed trends and areas needing work for both individual students and entire classes. Corrective actions were taken in the form of alterations to the teaching material for the class, and the outcomes of corrective action were easily and efficiently measured. This method is best suited to advanced and capstone-type courses, because it limits the individualized feedback that is often useful as a teaching aid in earlier classes. This method of evaluation has been highly successful and is recommended as part of a comprehensive evaluative strategy for students, classes, and elements within chiropractic programs.
This abstract is reproduced with the permission of the publisher.