MIT researchers used advanced natural language processing to analyze more than 800,000 online school reviews and found that the reviews were largely associated with schools’ test scores rather than with measures of student growth, which reflect how well schools actually help students learn. Test scores correlate closely with race and family income and tend to reinforce inequities in educational opportunity.
“Our hope is that parents who learn about our study will be highly discerning when they read school reviews and take what they are reading with a grain of salt, triangulating subjective assessments with a range of metrics that try to capture what’s really going on at the school,” says Nabeel Gillani, a doctoral student and research assistant in MIT’s Media Lab, and the lead author of the study, which was published this week in AERA Open, a peer-reviewed journal of the American Educational Research Association.
Gillani and his fellow researchers (his faculty advisor, Professor Deb Roy; MIT graduate student Eric Chu; Media Lab research scientist Doug Beeferman; and Rebecca Eynon of the University of Oxford) drew on approximately 830,000 reviews of more than 110,000 publicly funded K-12 schools across the United States. The reviews were posted by parents from 2009 to 2019 on the GreatSchools.org school information website. GreatSchools, which made the review data available for the study, has updated its rating systems in recent years to provide information that better addresses inequities in educational opportunity.
The study characterizing the reviews is the first of its kind. Gillani, whose volunteer work involves helping families who are unfamiliar with U.S. public education select high-quality schools for their children, first conceived of the study after a phone call with a mother who had recently immigrated to the United States. As the mother read online reviews to select a school for her daughter, Gillani says he was struck by one school in particular whose reviews were very positive, “but based on various quality metrics, the school itself didn’t appear to be a quality school” when judged by student learning and growth.
“Ever since then, I’ve been interested in what information reviews contain about different measures of school quality. What are they saying about the quality of education children have access to at their schools?”
Gillani says these questions “aligned well with our research group’s focus on using machine learning and natural language processing to understand discourse patterns and human behavior.”
To conduct the study, the authors linked the GreatSchools reviews with the Stanford Education Data Archive and census data on race and socioeconomic status by neighborhood. Their preliminary analyses revealed that reviews were posted largely by parents at urban schools and at schools serving more affluent families. The researchers then developed machine learning models that used the language in reviews to predict different attributes of schools, including test scores, measures of student growth, the percentage of students at the school who are white, and the percentage receiving free or reduced-price lunch. They found that the models were quite accurate in predicting test scores and school demographics but were virtually unable to predict student growth, suggesting that the information contained in reviews is closely associated with schools’ racial and demographic makeup rather than with how much students learn there.
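The study’s own models and data pipeline are not reproduced here; the following is a minimal, hypothetical sketch of the kind of text-to-attribute prediction described above, using TF-IDF features and a linear model in scikit-learn. The file name, column names, and model choice are illustrative assumptions, not the study’s actual setup.

# Hypothetical sketch: predict a school-level outcome (e.g., a test-score or
# growth measure) from the text of its reviews. Data file and columns are assumed.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Assumed input: one row per school, with all of its reviews concatenated
# into review_text and a numeric outcome column.
schools = pd.read_csv("school_reviews.csv")  # columns: school_id, review_text, outcome

X_train, X_test, y_train, y_test = train_test_split(
    schools["review_text"], schools["outcome"], test_size=0.2, random_state=0
)

# Bag-of-words / TF-IDF features over unigrams and bigrams.
vectorizer = TfidfVectorizer(ngram_range=(1, 2), min_df=5, max_features=50000)
X_train_vec = vectorizer.fit_transform(X_train)
X_test_vec = vectorizer.transform(X_test)

# Linear model: how well held-out predictions track the outcome indicates
# how much of that outcome the review language encodes.
model = Ridge(alpha=1.0)
model.fit(X_train_vec, y_train)
print("Held-out R^2:", r2_score(y_test, model.predict(X_test_vec)))

Under this framing, a high held-out score for test scores or demographics and a near-zero score for student growth would mirror the pattern the researchers report.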
To better understand these associations, the researchers then inspected the models’ decision-making processes, identifying the words and phrases most closely associated with the school performance measures and demographics. Many of these words and phrases, such as “the PTA,” “emails,” “private school,” and “we” and “us” versus “I” and “my,” were more closely associated with higher-performing, whiter, and more affluent schools. These associations reflect documented trends in education research, which show that parents at such schools often have more time and comfort to be involved in parent groups, better digital connectivity, more schooling options, and two-parent households, according to Roy, an MIT professor of media arts and sciences, director of MIT’s Center for Constructive Communication, and executive director of the MIT Media Lab. “Our study illustrates how techniques from machine learning, applied to large-scale datasets describing human thought and behavior, can surface subtle patterns that might otherwise be difficult to detect,” Roy says.
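Continuing the hypothetical sketch above (the study’s own interpretation methods for its models may differ), one simple way to surface the words and phrases most strongly associated with an outcome is to rank the linear model’s coefficients:

# Continuing the sketch: rank n-grams by their Ridge coefficients to see which
# words and phrases push the predicted outcome up or down.
import numpy as np

feature_names = np.array(vectorizer.get_feature_names_out())
order = np.argsort(model.coef_)

print("Phrases associated with a higher predicted outcome:", feature_names[order[-20:]][::-1])
print("Phrases associated with a lower predicted outcome:", feature_names[order[:20]])

In a linear model, large positive coefficients mark phrases whose presence raises the predicted outcome; this is a crude stand-in for the phrase-level associations, such as “the PTA” or “private school,” that the researchers examined.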
The findings led the authors to state that “parents who reference school reviews may be accessing and making decisions based on biased perspectives that reinforce achievement gaps.”
If reviews reflect test scores and demographics, and parents use them to decide where to send their children to school, then the reviews could even push schools to keep prioritizing high test scores over student progress and growth, Gillani says.
“In an education system where test scores are notoriously correlated with race and income, one concern is that reviews primarily associated with test scores could influence parent and school decision-making in ways that increasingly skew school demographics along racial and income lines,” he says. “Just like with any market, consumer reviews and preferences are likely to have a strong influence on what kinds of products are ultimately created.”
from MIT News https://ift.tt/3sSm5Tb