TY - JOUR
T1 - When assessment validation neglects any strand of validity evidence
T2 - An instructive example from PISA
AU - Pepper, David
PY - 2020/12/6
Y1 - 2020/12/6
N2 - The Standards for Educational and Psychological Testing identify several strands of validity evidence potentially needed as support for particular interpretations and uses of assessments. Yet assessment validation often does not seem guided by these Standards, with validations lacking a strand even when it appears relevant to an assessment. Consequently, the degree to which validity evidence supports the proposed interpretation and use of the assessment may be compromised. Guided by the Standards, this article presents an independent validation of OECD’s PISA assessment of mathematical self-efficacy (MSE) as an instructive example of this issue. OECD identifies MSE as one of a number of ‘factors’ explaining student performance in mathematics, thereby serving the ‘policy orientation’ of PISA. However, this independent validation identifies significant shortcomings in the strands of validity evidence available to support this interpretation and use of the assessment. The article therefore demonstrates how the Standards can guide the planning of a validation to ensure it generates the validity evidence relevant to an interpretive argument, particularly for an international large-scale assessment such as PISA. The implication is that assessment validation could yet benefit from the Standards as “a global force for testing” (Zumbo, 2014, p. 33).
KW - validation
KW - large-scale assessment
KW - response processes
KW - self-efficacy
KW - PISA
KW - mathematics
U2 - 10.1111/emip.12380
DO - 10.1111/emip.12380
M3 - Article
SN - 1745-3992
VL - 39
SP - 8
EP - 20
JO - Educational Measurement: Issues and Practice
JF - Educational Measurement: Issues and Practice
IS - 4
ER -