Frequently Asked Questions
1. What is PIAAC?
PIAAC is the Programme for the International Assessment of Adult Competencies, an international assessment of the foundational information-processing skills required to participate in the social and economic life of advanced economies in the 21st century.
An initiative of the Organisation for Economic Co-operation and Development (OECD), PIAAC provides a highly detailed survey of skills in literacy, numeracy, and adaptive problem solving among adults between the ages of 16 and 65 in over 30 countries and economies, along with all Canadian provinces. These core skills form the basis for cultivating the other, higher-level skills necessary to function at home, school, work, and in the community.
Data from PIAAC provide a measure of cognitive and workplace skills, a rich evidence base for policy-relevant analysis, a better measure of a country’s “stock” of skills, insights into whether education and training programs are focusing on the right competencies, and the capacity to compare skills across a broad sampling of countries around the world.
In 2022–23, PIAAC was administered a second time, allowing participating countries to track changes in the skill levels of their adult populations since the first administration in 2012.
2. What information does PIAAC collect?
PIAAC is made up of three main parts: a direct skills assessment, a background questionnaire, and a module on the use of skills.
- The skills assessment, administered on digital devices, examines individual proficiencies in the three foundational skills: literacy, numeracy, and adaptive problem solving. Each skill is measured along a continuum that has been divided into different levels of proficiency to help interpret the results.
- The background questionnaire puts the results of the skills assessment into context. It allows participant results to be classified according to a range of factors that can influence skills outcomes (e.g., age, education, employment status).
- The module on the use of skills collects information from each respondent on how he or she uses a range of skills at work and in everyday life. It looks at cognitive skills such as reading or computer use, as well as non-cognitive skills such as social interaction, cooperation, learning, organization and planning, and physical/motor activity.
3. How was this information collected?
PIAAC was conducted by Statistics Canada on behalf of the Council of Ministers of Education, Canada (CMEC), and Employment and Social Development Canada (ESDC).
Close to 11,700 adults aged 16 to 65 from across the 10 provinces responded to the survey, which was carried out in their own homes. The personal interview portion of the assessment was conducted with a representative from Statistics Canada, with both the participant and the StatCan representative using a tablet device to record answers. The rest of the assessment was completed on the tablet by the participants themselves.
For participants who were not able to communicate effectively with the interviewer in the language of the assessment (English or French in Canada), a “doorstep interview” questionnaire was offered instead. This questionnaire, available in different languages, gathered data on key background characteristics of these participants. It also allowed the proficiency of these participants in the three assessed domains to be estimated.
PIAAC data collection is designed to take place every 10 years.
4. Who funded PIAAC in Canada?
In Canada, PIAAC was funded by the following partners:
5. What countries participated in PIAAC Cycle II?
Austria, Belgium (Flanders), Canada, Chile, Croatia, Czechia, Denmark, Estonia, Finland, France, Germany, Hungary, Ireland, Israel, Italy, Japan, Korea, Latvia, Lithuania, Netherlands, New Zealand, Norway, Poland, Portugal, Singapore, Slovak Republic, Spain, Sweden, Switzerland, United Kingdom (England), and United States.
A second round of data collection is scheduled to take place between 2024 and 2029, with a list of countries still to be finalized.
6. Why is Canada’s sample size larger than that of many other countries?
A sample of 4,000 to 5,000 respondents is required by the OECD to obtain valid and reliable results at the level of a country.
In Canada, education policy is developed and decided at the provincial and territorial level, so a larger sample is required to obtain statistically reliable results for each province and territory.
In this cycle of PIAAC, the 10 provinces participated. In the 2012 cycle, the territories also participated and specific subpopulations were oversampled.
As a result, the sample for this second cycle of PIAAC, although bigger than that of many other countries, was much smaller than for the first cycle.
7. What is literacy?
Literacy is more than just reading the words on a page or a screen. Literacy in PIAAC is defined as “accessing, understanding, evaluating, and reflecting on written texts in order to achieve one’s goals, to develop one’s knowledge and potential, and to participate in society.”
Throughout personal, professional, and social activities, adults engage with a range of written communications in a variety of formats, such as emails, leaflets, timetables, newspaper articles, and instruction manuals.
The conceptual underpinnings of literacy have not changed between 2012 and 2023, allowing literacy results to be comparable over time. However, the assessment framework has been revised to account for a more contemporary view of the concepts being evaluated.
8. What is numeracy?
Numeracy is more than just the ability to count or perform basic mathematical operations. Numeracy in PIAAC is defined as “accessing, using, and reasoning critically with mathematical content, information, and ideas represented in multiple ways in order to engage in and manage the mathematical demands of a range of situations in adult life.” This definition highlights the importance of numeracy to a wide range of skills and knowledge used in everyday life, extending beyond quantity and numbers to include things like dimensions, shapes, patterns, and relationships. It recognizes that managing a situation or solving a problem in a real context—such as understanding purchases and receipts, reading maps, cooking, or engaging in home repairs—requires more than an understanding of basic mathematical operations. It also requires the ability to compute and interpret things like proportions, measurements, and statistics.
As is the case for literacy, the assessment framework for numeracy has been revised to account for the increasing prevalence of quantitative and mathematical information to which adults are exposed through online and technology-based resources. However, results from the numeracy assessment in 2023 can still be compared with those from 2012.
9. What is adaptive problem solving?
In the first cycle, PIAAC assessed problem solving in technology-rich environments (PS-TRE). The assessment focused mainly on measuring skills in using specific digital applications.
Now, more than ever, adults are required to both continually adapt to new circumstances and engage in lifelong learning as a result of an increasingly complex and changing society. This need to assess and adjust to changing conditions prompted the inclusion of adaptive problem solving as a new area of assessment. Adaptive problem solving is defined as “the capacity to achieve one’s goals in a dynamic situation in which a method for solution is not immediately available. It requires engaging in cognitive and metacognitive processes to define the problem, search for information, and apply a solution in a variety of information environments and contexts.”
The adaptive problem-solving assessment has three characteristics:
10. What do the different levels of proficiency mean?
Respondents are categorized by proficiency level for literacy, numeracy, and adaptive problem solving. The PIAAC proficiency levels correspond to the level of difficulty of the tasks they are able to complete.
If a respondent scores at a particular proficiency level, it does not mean that he or she cannot complete tasks at higher levels. It only means that even if he or she does successfully complete some tasks at a higher level, the probability of consistently doing so is low.
In literacy and numeracy, the continuum of performance is divided into five levels, whereas adaptive problem solving is divided into four levels. For respondents with very low levels of proficiency, PIAAC included separate reading and numeracy components aimed at measuring low performance more accurately.
11. What are the limitations/caveats when comparing results from PIAAC with previous household survey results?
PIAAC differs from previous household surveys in several important ways.
The most prominent is its use of information technology. PIAAC is administered completely on digital devices (tablets), whereas previous surveys were completed entirely on paper.
Compared with the first cycle of PIAAC in 2012, the second cycle introduced the “doorstep interview.” To minimize non-response from respondents with very limited proficiency in the language of assessment, the OECD introduced a shorter version of the assessment that is offered in a large number of languages. Its results are used to estimate the proficiency of these respondents in literacy, numeracy, and adaptive problem solving, and it also collects a small subset of personal background information. While the doorstep interview reduced the proportion of non-respondents in the second cycle, care should be taken when comparing results between the two cycles of PIAAC or with previous household surveys, because the samples are different. It is therefore advisable to remove the cases obtained from the doorstep interview when comparing results from the second cycle with previous surveys.
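For analysts working with the microdata, the practical implication is simply to exclude doorstep-interview cases before making such comparisons. The sketch below, in Python with pandas, is a minimal illustration only: the file name and the "DOORSTEP_FLAG" and "LIT_SCORE" variable names are assumptions rather than actual PIAAC variable names, and a real trend analysis would also use the plausible values and survey weights provided with the data.

```python
import pandas as pd

# Hypothetical illustration: drop doorstep-interview cases before comparing
# Cycle II results with earlier surveys. "DOORSTEP_FLAG" and "LIT_SCORE" are
# assumed variable names for this sketch, not actual PIAAC variables.
cycle2 = pd.read_csv("piaac_cycle2_canada.csv")

# Keep only respondents who completed the full assessment
comparable = cycle2[cycle2["DOORSTEP_FLAG"] == 0]

# Simple unweighted mean, for illustration only; a proper analysis would use
# the plausible values and replicate weights supplied with the data.
print(comparable["LIT_SCORE"].mean())
```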
PIAAC also takes a different approach to literacy. In recognition of the enormous increase in the use of digital technology over the past 10 years, it includes texts of multiple types — prose, document, digital, and mixed-format. This is different from earlier surveys, which used only paper-based texts in prose and document format. Starting with the first cycle, PIAAC carried this different approach into the way it presents results: it reports literacy on a single scale, whereas the earlier surveys reported literacy on two separate scales (prose literacy and document literacy).
Finally, PIAAC assesses skills with a finer level of detail than previous surveys did. New elements introduced in PIAAC permitted the development of the “below Level 1” category, a more detailed assessment of reading ability for individuals at low levels of literacy, and a richer source of data for constructing the numeracy scale.
As a result of all these differences, caution should be exercised when comparing PIAAC with previous surveys. For example, the use of a single scale for literacy in PIAAC meant that results from the previous OECD survey [the International Adult Literacy and Skills Survey (IALSS), completed in 2003] had to be re-scaled before the two could be compared. Without that re-scaling, the IALSS results cannot be compared with those for PIAAC, since the scales used in the two surveys are not the same.
12. What are the limitations/caveats when comparing results from Canada with those from other countries?
PIAAC was standardized to ensure that its content and implementation were the same across all participating countries. While such standardization allows results to be compared, they can only be fully understood by bearing in mind the context of the country in which the assessment is administered.
No two countries in PIAAC are identical. In age structure, educational attainment, immigration levels, and cultural background, the populations measured all display very different characteristics. Given that these characteristics can have a significant impact on skill levels, making direct comparisons between one country and another simply by looking at their average results is problematic.
Fortunately, PIAAC provides much more than average skill outcomes for entire populations. It also provides outcomes for different categories within those populations, such as age cohorts, educational attainment, and occupational status. This allows for much better comparisons between national populations, since it controls for the sociodemographic differences between them.
In short, PIAAC allows us to make comparisons between countries—but it is essential to understand what exactly is being compared.
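To make this concrete for readers who work with the data, the following is a minimal, hypothetical sketch of such a disaggregated comparison using Python and pandas. The file name and the variable names ("COUNTRY", "AGE_GROUP", "LIT_SCORE") are assumptions for illustration, not actual PIAAC variables, and a proper analysis would use the plausible values and survey weights supplied with the PIAAC data.

```python
import pandas as pd

# Hypothetical illustration of a disaggregated comparison: mean literacy score
# by country and age cohort, rather than a single national average.
# Column names are assumed for this sketch and are not actual PIAAC variables;
# real analyses should use the plausible values and replicate weights.
df = pd.read_csv("piaac_cycle2.csv")

by_cohort = (
    df.groupby(["COUNTRY", "AGE_GROUP"])["LIT_SCORE"]
      .mean()                     # simple unweighted mean, for illustration
      .unstack("AGE_GROUP")       # one column per age cohort
)
print(by_cohort.round(1))
```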
13. What are the limitations/caveats when comparing results between provinces?
PIAAC was standardized to ensure that its content and implementation were the same across all participating provinces and territories. While such standardization allows results to be compared, they can only be fully understood by bearing in mind the context of regional differences.
No two provinces (or, indeed, territories) in Canada are identical. In age structure, educational attainment, immigration levels, and cultural background, the populations measured each display a unique mix of characteristics. In addition, Canada has a complex linguistic profile: not only does the country have two official languages, but there are also substantial proportions of the population who have neither English nor French as a mother tongue. Given that these aspects can have a significant impact on skill levels, making direct comparisons between one province and another simply by looking at their average results is problematic.
Fortunately, PIAAC provides much more than average skill outcomes for entire populations. It also provides outcomes for different categories within those populations, such as age cohorts, educational attainment, and occupational status. Such categorization allows for much better comparisons between populations, since it controls for the sociodemographic differences between them.
Canadian provinces have much more in common with one another than different countries do, but it is still important to consider these key characteristics when comparing results between provinces or between provinces and countries.
14. Did the COVID-19 pandemic impact PIAAC results?
It is difficult to measure the impact that the global pandemic had on the skill level of respondents in literacy, numeracy, and adaptive problem solving. What we do know is that in most countries across the world, learning was disrupted, if not interrupted, to different degrees and for different durations.
Results from the OECD’s Programme for International Student Assessment (PISA) 2022 study suggest that many countries experienced declines in scores, and that countries with lower average scores and school systems with longer school closures were more affected.
Results from this second cycle of PIAAC indicate that skills remained stable or declined in most countries in the last decade. It is conceivable that the lack of improvement may be due, in part, to the impact of the pandemic on the global economy.
15. What are some of the key findings from the PIAAC 2023 cycle in Canada?
- Overall, Canada’s mean performance in literacy, numeracy, and adaptive problem solving is above the OECD average.
- Canada’s share of adults who are low performers (Level 1 or below) is also lower than the OECD average across the three domains of assessment.
- In Canada, average skill levels are higher among the 25–34 age group and lower among the 55–64 age group.
- As is the case in all participating countries, Canadian adults with higher levels of educational attainment demonstrated higher skills in the three assessed domains.
- In Canada, adults born in the country performed better on average in the three domains than those who were born elsewhere and had since immigrated. However, second-generation immigrants (those born in Canada whose parents were born outside Canada) performed as well as or better than non-immigrants on average.
- Canadian adults who were employed demonstrated higher skill levels in the three domains than those who were unemployed or out of the labour force.
- Compared with the previous cycle of PIAAC in 2012, the average proficiency of Canadians aged 16–65 remained stable in literacy and improved in numeracy.
16. How do PISA and PIAAC compare?
PIAAC assesses adults between the ages of 16 and 65, and thus the population it surveys is much broader than the one studied in PISA, which assesses only 15-year-olds. Moreover, PISA focuses on reading literacy and mathematical literacy as they apply to school-age students, while PIAAC reflects the kinds of tasks that adults encounter at home, in their social lives, and in work environments.
While Canada has consistently ranked among the top-performing countries in PISA since 2000, its PIAAC Cycle II results for those aged 16–24 are above the OECD average in literacy and adaptive problem solving and at the OECD average in numeracy.
The lower relative standing of Canada’s youth in PIAAC may be related to a number of factors, such as the different levels of education and employment attained after graduation from high school or an increased proportion of PIAAC respondents whose mother tongue was neither English nor French. At this stage, what we do know is that the answer is multifaceted. Over the coming months, we will be examining the PIAAC and PISA data carefully to understand why some results differ between the two assessments.