Search the Glossary of Terms to find definitions that pertain to the SIRC site.
Certification of Simulation
The process of approving that a program meets certain published standards
Clinical Simulation Center
Environment that serves as a resource to learners and instructors during the development, implementation and evaluation of activities
Clinical Simulation Laboratory
Environment that serves as a resource to learners and instructors during the development, implementation and evaluation of activities
Confederate
A confederate is an individual other than the patient who is scripted in a simulation to provide realism, additional challenges, or additional information for the learner (e.g., paramedic, receptionist, family member, lab technician).
CSE, CSA, CPX
A clinical skills or clinical practice examination (CSE, CSA, CPX) is a station or series of stations designed to assess the key clinical competencies of history-taking, physical examination, communication, and interpersonal and professionalism skills. Learners are expected to structure the history, physical examination, and/or other tasks (counseling, education, etc.) necessary based on the presenting complaint. Documentation of findings, differential diagnosis/diagnosis, diagnostic work-up, and/or therapeutic work-up may be included. Learners are evaluated via direct observation, checklists, learner presentation or written follow-up exercises. The examinations can be either formative or summative and may involve feedback. Station length is typically 10-20 minutes, but can be longer.
Standardized Patient and Simulation Terminology Standards of Practice (retrieved from http://aspeducators.org/terminology-standards 04/14/2012)
Cues
Information provided by instructors or designated participants in the simulation that helps the student progress through the simulation activity by indicating the step the student is on or approaching
Debriefing
Activity that follows a simulation experience, led by a facilitator, wherein feedback is provided on the simulation participants’ performance while positive aspects of the completed simulation are discussed and reflective thinking is encouraged
Delphi Technique
A systematic process for eliciting written feedback from experts about a topic. This method may be done electronically and usually involves several rounds of questioning, with a summary of the participants’ feedback being circulated in an effort to work toward consensus or agreement on the topic (Polit & Beck, 2012).
Reliability
Reliability is essentially a measure of consistency. Reliability can exist in the absence of validity, but the interpretation of data cannot be valid if the data are not reliable. Furr and Bacharach (2008) define reliability as “the extent to which differences in individuals’ scores produced using an evaluation instrument are consistent with differences in their true abilities” (p. 82). For performance evaluation, three types of reliability are inter-rater reliability, intra-rater reliability, and inter-instrument reliability.
Validity
Refers to how well theory and evidence support the way evaluation scores are used (AERA, APA & NCME, 1999).