No Test Required: Demographic Factors Predict Common Core Proficiency
Christopher H. Tienken, Ed.D., Department of Education Leadership, Management and Policy, has published an article, "Predicting Middle Level State Standardized Test Results Using Family and Community Demographic Data," in the Journal of Research in Middle Level Education. The research shows that Common Core test results can be predicted using three factors: (1) the percentage of families in a community with income over $200,000 a year, (2) the percentage of people in a community living in poverty, and (3) the percentage of people in a community with bachelor's degrees. These factors accurately predicted proficiency rates for over 70% of the New Jersey middle schools evaluated in the study.
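To illustrate the general form of this kind of community-level prediction, here is a minimal sketch of a linear model that estimates a school's proficiency rate from the three demographic percentages. The coefficients and the function itself are entirely hypothetical, chosen only to show the direction of the relationships described in the article; they are not the study's actual model or values.

```python
def predict_proficiency(pct_income_over_200k, pct_poverty, pct_bachelors):
    """Estimate a school's proficiency rate (0-100) from community demographics.

    A hypothetical linear model of the kind the study describes:
    affluence and community education raise the prediction, poverty lowers it.
    All coefficients below are illustrative, not from the study.
    """
    intercept = 40.0  # hypothetical baseline proficiency rate
    return (intercept
            + 0.9 * pct_income_over_200k   # affluence raises the estimate
            - 0.7 * pct_poverty            # poverty lowers it
            + 0.5 * pct_bachelors)         # education raises it

# Example: a community with 10% high-income families, 15% poverty,
# and 30% bachelor's degree attainment.
estimate = predict_proficiency(10, 15, 30)
print(round(estimate, 1))  # 53.5
```

In practice, a model like this would be fit to real district data (for example, with ordinary least squares), and its accuracy would be judged by how often predicted proficiency matches reported proficiency, which is the kind of 70%-plus accuracy the study reports.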
School funding, teacher quality ratings, and judgments about school administrator effectiveness are often contingent upon standardized test results. Tienken and his team highlight the questionable nature of that relationship: the research reveals that factors outside the classroom drive some of the outcomes commonly attributed to the classroom.
"When you look at those three factors, they're a combination of family and community factors. They are proxies for variables that students are exposed to in their community. Life experiences, along with direct and indirect academic experiences that children receive are highly influential on their ultimate achievement on standardized tests. We're trying to build upon previous research and identify specific factors within the family and community to accurately predict test results," explained Tienken.
New Jersey middle schools in lower socio-economic areas have lower proficiency rates on the Common Core tests than those in affluent areas. This hints at inherent bias within the test and adds fuel to the debate: Is where you come from where you end up?
"Our results are not suggesting that demographics are destiny. What they do show is that these tests are not accurately capturing a student's full learning potential, a teacher's impact on learning, or the quality of the leadership at the school," said Tienken.
Tienken points out that a more accurate and representative picture of student learning can be obtained from a student's GPA, which reflects years of cumulative learning and has been shown to be a more reliable predictor of future success than a single test score.
The study raises concerns about fairness and transparency for teachers and administrators. According to the research, test scores influenced by factors outside the classroom are a poor basis for promotion and tenure decisions. The study describes standardized test results as "too unstable" and "not representative of the multifaceted job middle level administrators perform."
Aspiring educators may be less inclined to take a job in a lower-performing school district for fear of losing their position due to factors beyond their control. To avoid this, Tienken calls for a broader approach to evaluation: administrators and teachers should be assessed through portfolios tied to the mission and goals of their school district, which he argues is a more accurate way to demonstrate their contribution.
"These evaluations cost more money and take more time but they provide a better picture of the professional output than just a single test score," said Tienken.
Tienken collaborated on this study with Anthony Colella, Ph.D., Department of Education Leadership, Management and Policy, and former doctoral students Christian Angelillo, Ed.D., Boonton Township School District; Meredith Fox, Ed.D., Nanuet Union Free School District; Kevin R. McCahill, Ed.D., George W. Miller Elementary School; and Adam Wolfe, Ed.D., Peoria Unified School District.