An Adaptive Testing Simulation for a Certifying Examination [electronic resource] / Rosemary A. Reshetar and Others.

Bibliographic Details
Online Access: Full Text (via ERIC)
Main Author: Reshetar, Rosemary A.
Format: Electronic eBook
Language: English
Published: [S.l.] : Distributed by ERIC Clearinghouse, 1992.
Description
Summary: This study examined the performance of a simulated computerized adaptive test designed to help direct the development of a medical recertification examination. The item pool consisted of 229 single-best-answer items calibrated with the two-parameter logistic model using responses from a random sample of 3,000 examinees; examinees' responses were known. For tests of 60, 120, and 180 items, estimation error and the accuracy of pass/fail classification decisions were studied. Ability estimates were stable across changes in test length, and accurate estimates were obtained with all three test lengths. However, it is recommended that overall pass/fail decisions be based on longer tests, especially when the cutscore is close to the mean. This initial application suggests that computerized adaptive testing has promise in professional evaluation settings. Six tables present study data, and there is a 10-item list of references. (Author/SLD)
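The mechanics the summary refers to (two-parameter logistic scoring and adaptive item selection) can be sketched as follows. This is a minimal illustration only, with made-up item parameters; it is not the study's actual item calibrations or simulation code:

```python
import math

def p_correct(theta, a, b):
    """2PL model: probability of a correct response at ability theta,
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item: a^2 * P * (1 - P)."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def select_next_item(theta, pool, administered):
    """Adaptive step: among unadministered items, pick the one that is
    most informative at the current ability estimate."""
    candidates = [i for i in range(len(pool)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *pool[i]))

def mle_theta(responses, items, theta=0.0, iters=25):
    """Newton-Raphson maximum-likelihood ability estimate for 2PL items.
    responses: 0/1 scores; items: (a, b) pairs. Assumes a mixed response
    pattern (all-correct or all-wrong patterns have no finite MLE)."""
    for _ in range(iters):
        grad = sum(a * (u - p_correct(theta, a, b))
                   for u, (a, b) in zip(responses, items))
        hess = -sum(item_information(theta, a, b) for (a, b) in items)
        theta -= grad / hess
    return theta

# Illustrative pool of (discrimination, difficulty) pairs:
pool = [(1.0, -1.0), (1.2, 0.0), (0.8, 1.0)]
first = select_next_item(0.0, pool, administered=set())
theta_hat = mle_theta([1, 1, 0], pool)
```

At a starting ability of 0, the selection rule picks the high-discrimination item whose difficulty matches the current estimate; the estimate is then refreshed after each response, which is what lets a shorter adaptive test match the precision of a longer fixed form.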
Item Description: ERIC Document Number: ED346119.
ERIC Note: Paper presented at the Annual Meeting of the American Educational Research Association (San Francisco, CA, April 20-24, 1992). Research supported by the American Board of Internal Medicine.
Physical Description:11 p.