Beyond NAEP and PISA: Many U.S. Adults Lack Practical Skills, New Tests Show
Students also struggle with digital and information literacy.
The results from high-profile assessments issued this fall — both national (NAEP) and international (PISA) — show troubling academic outcomes for U.S. students. Drawing far less attention, however, are important findings from other exams, including a lack of practical literacy, numeracy, and problem-solving skills among many Americans ages 16 to 65.
For example, nearly 30 percent of U.S. adults performed at the lowest levels in numeracy (the ability to work with and understand numbers), a showing well below the average of the other industrialized nations that participated in the global exam, the Program for the International Assessment of Adult Competencies (PIAAC). Higher levels of numeracy among adults were found to be associated with higher wages. Results were issued in early November.
“You look at the direct test scores in numeracy and skill use, and you can see they add quite a lot of explanatory power to the wages people attain,” said Andreas Schleicher, the director for education and skills at the Paris-based Organization for Economic Cooperation and Development (OECD), which developed and oversees the assessment.
In March 2020, journalists will be able to sift through these PIAAC results at the state and county level, when the National Center for Education Statistics makes available a mapping tool.
Meanwhile, results from another under-the-radar assessment, the International Computer and Information Literacy Study (ICILS), are shining a spotlight on the “digital native” myth. This global assessment reveals that while young people use digital devices prolifically, they are not necessarily developing sophisticated digital skills. One in four U.S. eighth graders scored at the lowest levels of computer and information literacy — the ability to identify reliable and trustworthy information online.
It’s been a busy fall for assessment data. Earlier this week, a fresh round of results from PISA, the Program for International Student Assessment, found stalled achievement among U.S. 15-year-olds in reading, math and science, as Emily Richmond explains in this EWA blog post.
In addition, results issued in November on NAEP, the National Assessment of Educational Progress, showed declines in reading scores for both fourth and eighth graders. (See Word on the Beat: NAEP.)
The PIAAC results show that the literacy, numeracy, and technological skills of U.S. adults have not improved since 2012. Adult attainment in all three domains notably lagged behind international top performers Japan and Finland and fell below the international average in both numeracy and digital problem-solving.
While the literacy score for U.S. adults is above the OECD average, around 13 percent of adults are reading at the lowest level. Scoring at that level means they struggle to read relatively short texts to locate a single piece of information and may have difficulty successfully determining the meaning of sentences. Comparatively, only 5 percent of adults in Japan are reading at the lowest level.
To be considered proficient in problem solving in technology-rich environments, adults taking the PIAAC had to show at least rudimentary skills in using computer tools and applications. On this measure, about one in five U.S. adults falls below the basic level of proficiency.
The exam, administered by the OECD, tested 245,000 adults ages 16 to 65 in 39 countries that are part of the international partnership organization.
Reporting for Education Week, Sarah Sparks noted that Hispanic adults’ skills improved in both literacy and problem-solving, while just over half of U.S. adults are reading at the lowest two levels of five measured by the exam.
To help put the data in context, the PIAAC report includes specific examples of the types of questions participants must address. For example, when asked to calculate how much gas is in a 24-gallon tank if the gas gauge reads three-quarters full, almost one in three U.S. adults struggled to accurately estimate the remaining amount.
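(For reference, the correct answer takes a single multiplication: three-quarters of a 24-gallon tank is 24 × 0.75 = 18 gallons.)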
Questions in the problem-solving section of the exam focused on assessing practical digital skills needed for acquiring jobs and navigating the real world. One sample task involved filtering and bookmarking the websites of job search services based on cost and online registration requirements.
Soon, journalists will be able to sift through these results at the state and county levels. The National Center for Education Statistics has been working to provide a small-area estimation mapping tool, which will combine PIAAC data from 2012, 2014, and 2017. The tool, to be released in March 2020, will allow users to view levels of adult proficiency based on demographic characteristics, compare county results to state or national averages, and conduct side-by-side comparisons of two states or counties.
Rounding out this fall’s wave of assessment developments is a new profile of U.S. students’ and teachers’ digital skills and technology use, drawn from an international exam first administered in 2013.
The ICILS tested 46,000 eighth graders in educational systems representing 14 countries, including Chile, Finland, Germany and South Korea. The study also gathered information about how well students are being prepared for study, work, and life in a digital world from more than 26,000 teachers in over 2,200 schools.
A primary goal of ICILS is to measure students’ computer and information literacy, defined as the ability to evaluate the trustworthiness and reliability of online information. Only 2 percent of all U.S. students scored at the highest level of computer and information literacy. Three out of five students demonstrated a need for direct instruction when using computers and gathering online information.
Given the scores of devices at their fingertips at home and their increased access at school, young people are often assumed to be more technologically proficient than older generations, and the findings come as a shock to those who hold that view. Researchers point to the unclear purpose around how technology is used in schools as one factor in the failure to cultivate tech prowess.
“I wasn’t that surprised,” said Sara Dexter, a professor who teaches school leadership courses to teachers at the University of Virginia. “What I hear from [teachers] over and over and over and over is, ‘We don’t have any real clear purpose for why we have technology. It’s purchased, it’s at our school… but we don’t get very direct messages about what we are to do with it.’”
Further, while U.S. students scored higher than their international peers in digital skills, results indicated persistent disparities by gender and socioeconomic status.
Results from the test, which also measures computational thinking (the ability to think and solve problems like a computer scientist), suggest that male students were better at performing simple coding tasks, while female students did better when asked to identify and share reliable and trustworthy online information. Scores indicate that the digital skills divide persists along socioeconomic lines. (For indicators of socioeconomic background, ICILS considers parental occupation, parental educational attainment, and the number of books in the home.)
In a finding that might stoke fear in the hearts of all journalists, PISA results showed that only one in seven students in the U.S. was able to differentiate between fact and opinion. When asked in 2009 to respond to the statement “Several times a month I read newspapers,” just over 60 percent of 15-year-olds agreed. In 2018, that number had dropped to about one in four.