Why Do SSAT Scores Take More than Two Days to Release to Schools?

EMA
January 17, 2015

From The Yield, Winter 2014

Logistics—Prompt Test Return Is Essential

First, logistics play a role. The Enrollment Management Association relies on test administrators to promptly return test materials and to report clearly any irregularities and how they were handled. As boxes from our more than 400 test centers worldwide arrive at our headquarters, materials are accounted for and scanning begins. All reported irregularities are also reviewed, and the necessary follow-up is taken with families and test administrators to determine whether scores can be released. Once 95% of the answer sheets are scanned and accounted for, the item analysis and equating processes can begin.

Ensuring Fairness—Item Analysis

After a test form is administered, and before tests are scored, preliminary item analysis is performed as a last check on the items. It serves as a further safeguard against errors that may have slipped through the pretest item analysis or that may have been introduced between test assembly and test scoring. Final item analysis supports the overall evaluation of the entire test, including its difficulty, reliability, and speededness, and allows individual items to be evaluated further against the full test-taking population.
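
The article does not say which statistics this check computes, but classical item analysis typically looks at item difficulty (the proportion of examinees answering correctly) and discrimination (how well an item separates high scorers from low scorers). A minimal sketch in Python, with simulated data and flagging thresholds of our own choosing, not EMA's:

```python
import numpy as np

def item_analysis(responses):
    """Classical item statistics for a 0/1-scored response matrix.

    responses: shape (n_examinees, n_items); 1 = correct, 0 = incorrect or omit.
    """
    total = responses.sum(axis=1)        # each examinee's raw score
    difficulty = responses.mean(axis=0)  # proportion correct per item

    # Point-biserial discrimination: correlate each item with the total score
    # *excluding* that item, so an item is not correlated with itself.
    n_items = responses.shape[1]
    discrimination = np.empty(n_items)
    for j in range(n_items):
        rest = total - responses[:, j]
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]

    return {"difficulty": difficulty, "discrimination": discrimination}

# Simulated answer sheets: higher-ability examinees answer more items correctly.
rng = np.random.default_rng(0)
ability = rng.normal(size=(500, 1))
item_b = rng.normal(size=(1, 40))
prob = 1 / (1 + np.exp(-(ability - item_b)))
data = (rng.random((500, 40)) < prob).astype(int)

stats = item_analysis(data)
# An item that almost everyone misses, or that correlates negatively with the
# total score, gets flagged for review (possible keying or printing error).
flagged = np.where((stats["difficulty"] < 0.2) | (stats["discrimination"] < 0.0))[0]
print("Items needing review:", flagged)
```

A flagged item would then be reviewed by content specialists before scoring proceeds, which is one reason this step cannot be skipped.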

Making All Forms Equal—The Equating Process

Although each Standard test form is built to detailed test specifications, some variation in form difficulty is inevitable. This means that raw scores, which are based on the number of right answers, wrong answers, and omissions, are not directly comparable across different forms. The Enrollment Management Association's psychometric team therefore uses a statistical procedure called score equating to adjust for differences in form difficulty.

First, an equating plan is built: a systematic way of linking new forms to old forms across multiple administrations and multiple years. The Enrollment Management Association's team of psychometricians applies multiple equating methods and evaluates the resulting conversion functions against a variety of criteria, including group ability comparisons, form difficulty comparisons, the relationship between the new and old forms, and historical data. The conversion deemed most reasonable is chosen as the operational conversion for the new form.
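
Which equating methods EMA's team uses is not stated here. As an illustration only, below is one of the simplest: linear (mean-sigma) equating under an assumed equivalent-groups design, with made-up data. Real operational equating involves more elaborate designs and checks.

```python
import numpy as np

def linear_equating(new_form_scores, old_form_scores):
    """Mean-sigma linear equating: place raw scores from the new form onto the
    old form's raw-score scale by matching means and standard deviations.
    Assumes comparable ability in the two groups (equivalent-groups design)."""
    mu_x, sd_x = np.mean(new_form_scores), np.std(new_form_scores)
    mu_y, sd_y = np.mean(old_form_scores), np.std(old_form_scores)
    return lambda x: mu_y + (sd_y / sd_x) * (np.asarray(x) - mu_x)

# Simulated raw scores: the new form ran about three points harder,
# so the conversion shifts new-form scores up to compensate.
rng = np.random.default_rng(1)
old = rng.normal(42, 8, 2000).round()
new = rng.normal(39, 8, 2000).round()

convert = linear_equating(new, old)
print(convert([30, 39, 50]))   # roughly [33, 42, 53] on the old form's scale
```

The point of the adjustment is that a student should not be rewarded or penalized for happening to take an easier or harder form.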

Applying the Equating Conversion

Once the equating plan is completed for a given Standard test, scoring can begin. At this point, additional quality controls are in place to ensure accurate reporting; for example, unusually large score differences for individual students are flagged for further investigation and action. Following the completion of all these statistical and quality control processes, scores are released to schools and are immediately available for viewing, exporting, and printing via the Member Access Portal (MAP).
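
Once the operational conversion exists, applying it amounts to a table lookup, and the large-score-difference check can be a simple comparison against a student's prior result. A sketch under stated assumptions: the scale, the conversion table, and the 60-point threshold below are all hypothetical, not EMA's actual values.

```python
import numpy as np

# Hypothetical conversion table (raw score -> scaled score); EMA's actual
# operational conversions are not public, so these numbers are illustrative.
conversion = {raw: 500 + 5 * raw for raw in range(0, 61)}

def apply_conversion(raw_scores):
    """Look up each raw score in the operational conversion table."""
    return np.array([conversion[r] for r in raw_scores])

def flag_large_differences(current, prior, threshold=60):
    """Flag examinees whose scaled score moved more than `threshold` points
    since a prior administration (the threshold is an assumption)."""
    return np.abs(np.asarray(current) - np.asarray(prior)) > threshold

scaled = apply_conversion([31, 45, 52])          # -> [655 725 760]
prior = np.array([640, 730, 650])
print(flag_large_differences(scaled, prior))     # [False False  True]
```

A flagged record would be held back for investigation rather than released automatically, which is why this step adds time before scores reach schools.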

