Introduction

The term adult education gained currency in the early 20th century, when thousands of immigrants arriving in the United States adopted an “Americanisation” strategy, which for many included learning to read and write. This resulted in a marked improvement of literacy rates (Richey 1939), demonstrating that literacy helps people integrate into society. It is estimated, however, that today, worldwide, “773 million adults and young people lack basic literacy skills” (UN 2020). The literacy rate in Latin America improved by only four percentage points from 2000 to 2015, 14 per cent below expectations (UNESCO 2017). This slow rate of progress presents a problem that must be addressed by the world’s governments.

Most countries with young populations and low literacy levels have a history of discouraging policies related to education and training (Bernhardt et al. 2014). Although the United Nations (UN) plays an active role in promoting adult education at the transnational level, policies are still lacking due to limited recognition of the complexity of the issue (Milana 2012). Low literacy is a visible issue on international agendas, and provides an opportunity to generate spaces of inclusion. It is vital that we reach people who have abandoned traditional formal education and wish to embark on a path of lifelong learningFootnote 1 (Hanemann 2019). Literacy programmes therefore not only address a social demand but also represent a powerful pedagogical strategy, created to meet the diverse needs of a substantial audience (Miller et al. 2011).

It is essential that adult education programmes are tailored to meet the needs and expectations of the young people and adults who choose to take them up (Pajaziti 2014). The research we present in this article emphasises the importance of evaluating the impact of literacy programmes for youth and adults in order to ensure their effectiveness. At the same time, it creates an instrument that allows us to do just that, based on the particularities of a youth and adult literacy programme being implemented in Colombia.

The impact of adult education

People who access adult education typically have significant restrictions on their time due to family and work commitments (Schuetze and Slowey 2002). As a result, they tend to have expectations of the educational process which differ from those of traditional students. It is therefore important to develop educational programmes which take a holistic approach to the teaching–learning process, taking into account users’ needs, age, habits, personal interests, and the amount of time they are able to dedicate to learning, among other factors (Castaño et al. 2013).

Effective adult education programmes are designed to facilitate the learning of concepts, skills and abilities. International programmes which seek to find systematic solutions to mitigate low literacy, and which are adapted to specific contexts and conditions, have already made substantial progress. A study by Eileen Brennan et al. (2016) details the curriculum available for these types of programmes. Brennan and her colleagues found that programmes which emphasised positive reinforcement and encouraged students to take a proactive attitude towards their learning resulted in improvements in students’ academic development. Ellis and Richardson (2012) identify four key competencies that adult education teachers should have: (1) disciplinary knowledge, (2) experience in adult education, (3) the ability to form positive interpersonal relationships with their students, and (4) an ethical and professional approach.

Studies have also highlighted the positive impact of quality adult education programmes on the health of their participants (Hamilton 2014; Yamashita et al. 2019). At their best, such programmes can enhance participants’ self-esteem and social networks, improving their overall social and psychological well-being (Lucas-Molina et al. 2015). Hal Beder (1999) studied the impact of adult literacy in the United States and noted an improvement in participants’ self-image and self-esteem, making students more likely to want to continue studying throughout their lives (Trudell and Cheffy 2019).

Rukmini Banerji et al. (2017) found an improvement in a number of areas, including self-esteem, analytical and problem-solving abilities, community participation, resource management, early childhood education, gender roles and health awareness. Students reported feeling empowered by having a role to play in a social context, whilst also taking greater responsibility for improving their quality of life. Social care-oriented educators have also shown that programmes with a social justice orientation enhance participants’ lives more on the emotional and social level than on the economic and labour level (Taber 2011).

Lifelong learning helps older people to adapt to today’s rapidly changing society, as conditions for good health in adulthood are declining (Tikkanen 2017). Continuing educationFootnote 2 provides older people with the resources they need to remain in the labour market, for example through upgrading their skills in the use of new technologies (Yamashita et al. 2019). Ensuring that people can access education at any time in their lives has been shown to enhance quality of life for both men and women. Countries worldwide should therefore commit themselves to guaranteeing access to education across the lifespan (Hamilton 2014).

In Cuba, for example, access to education for both young people and adults is constitutionally enshrined (Yuni and Urbano 2014). It is (a) flexible, adapted to the needs, motivations and interests of the beneficiary population; (b) massive, with nationwide coverage; (c) comprehensive, with a curriculum that includes personal, social and academic aspects; (d) coordinated, involving joint work between the Ministry of Education and social institutions, and (e) free, as education is considered a fundamental right (MinEdCu 2018). This explains why Cuba had a literacy rate of 99.8 per cent in 2012 among the population aged 15+ and serves as a reference point for all of Latin America (World Bank 2018).

Colombia

In Colombia, the National Programme for Literacy and Basic and Secondary Education for Youth and Adults (PNA), also constitutionally enshrined, contains two flexible and relevant teaching models: A Crecer [To grow] and Aprender a Aprender [Learning to learn], both sponsored by the Family Compensation Fund (CAFAM).Footnote 3 The programme helped to reduce the country’s rate of people with low literacy skills from 7.2 per cent in 2003 to 6.1 per cent in 2015, and to increase the literacy rate from 93.9 per cent in 2015 to 96.2 per cent by 2018. This represents 676,000 newly literate youth and adults (MinEducación 2005).

We chose to focus our own study on a third initiative entitled Avancemos. Like A Crecer and Aprender a Aprender, Avancemos was created in line with government policies and has contributed to reducing rates of people with low literacy skills in the region in which it operates. The Avancemos programme was created in 1991 and launched in 1993 under the same methodology as A Crecer to respond to the educational needs of adults in Tolima.Footnote 4 Due to social and economic disadvantages, including unemployment, poverty and forced displacement caused by armed conflict, many adults in this administrative department were not able to complete their formal education (Avancemos 2016). Since its inception, the programme has engaged about 25,000 students in basic and secondary education (Parra and Álvarez 2020).

Like other adult literacy programmes, Avancemos is evaluated by coverage indicators, such as number of enrollees and graduates, but its full impact is unknown. We wanted to find out whether the programme had an impact on the well-being and quality of life of the beneficiary population (Biencinto et al. 2005; Folgueiras and Marín 2009; Solórzano 2005) similar to that which other adult education programmes have been shown to have (Brennan et al. 2016; Lucas-Molina et al. 2015; Banerji et al. 2017).

Education is recognised as a fundamental part of people’s lives (Pajaziti 2014); it is therefore essential to evaluate existing programmes in order to identify best practice and continue to improve quality. Takashi Yamashita et al. (2015) conducted a study that identifies positive factors in the adult education programme of the Osher Lifelong Learning Institute in the United States. Their research involved the participation of 330 older adults and showed that participants’ satisfaction varied according to gender, number of households participating, income, religious affiliation, health self-assessment, and number of courses taken.

Maria Roxana Solórzano (2005) designed a Social impact assessment model for literacy programmes. She argues that while an accurate measurement of impact requires taking account of participants’ learning outcomes, it also needs to look at the direct and indirect social effects of the programmes they participated in. She proposes a system of variables, dimensions, indicators and sub-indicators, which includes analysis of (a) the personal potential variable, concerning individual action and interpersonal relationships; and (b) the social potentialities variable, comprising update and improvement and social participation. She applied this analysis to participants themselves as well as their families and their social groups.

As noted above, adult education programmes have a significant impact on the quality of life of youth and adults. Governments and ministries responsible for regulating the quality of education should therefore take an interest in ensuring that such programmes are indeed generating a positive impact on those who participate in them (Hamilton 2014; Yamashita et al. 2019). This is why we set out to design an instrument to assess the impact of literacy programmes (MinEducación 2010) for youth and adults, focusing on one such programme in Colombia. Our proposed instrument aims to achieve a holistic assessment of the overall effect that such programmes can have on students’ lives.

Method

We engaged in quantitative research (Creswell et al. 2008), combining its transversal descriptive design (León and Montero 2015) with our selected evaluation model, which is based on the non-experimental sólo después [only after] method (Cohen and Franco 1992).Footnote 5 Before elaborating the measurement scale, we defined the dimensions and sub-dimensions we were interested in, based on a bibliographic and retrospective review of the topic, and on expert analysis.

Description of the programme

Avancemos is a second-chance programme of education for youth and adults which was launched more than two decades ago and has since been developed by the University of Ibagué, Colombia. Today, it comprises four integrated, accelerated and flexible academic cycles, which together last two years. The cycles are divided into two levels: the first corresponds to primary and lower secondary level, while the second covers upper secondary (academic middle school) level and is made up of two-semester cycles. The methodology used by the Avancemos programme follows the guidelines of University of Ibagué’s Institutional Educational ProjectFootnote 6 (Avancemos 2016), which is supervised by the Ministry of Education each year.

The first level of the literacy programme for adults (Avancemos) provides basic knowledge in the following literacy areas: natural sciences and environmental education, social sciences, history, geography, political constitution and democracy, artistic education, ethical education and human values, physical education, recreation and sports, humanities, Spanish language and foreign languages (English), mathematics, and technology and computer science. The second level focuses on economics, political science and philosophy. Teaching content is regularly updated according to national regulations.

Developing the instrument

Our research team comprised four members (the authors of this article). Our combined expertise included having worked on the design and validation of psychometric instruments and on the elaboration of impact evaluation instruments, as well as on the evaluation of education programmes and working with communities. We developed our model in three phases. The first phase included the definition of dimensional and sub-dimensional features; the second phase proposed the design of the measurement scale, and the third phase involved finalising the survey questionnaire, distributing it for data gathering and processing, and conducting our analysis.

Phase 1: definition of a general framework and variables

This phase included the review of institutional documents such as the Social Responsibility Policy of the University of Ibagué (Universidad de Ibagué 2014), the Institutional Educational Plan of the Avancemos Programme (Avancemos 2016), and the doctoral thesis of Solórzano (2005). This made it easier to understand and clarify the purpose and scope of this type of programme.

After this review, we conducted three focus group discussions, one each with samples of the programme’s students (15), teachers/managers (14/2), and graduates (10) respectively.Footnote 7 These discussions were complemented by in-depth interviews to identify which areas the evaluation should focus on, from the perspective of the different actors (Hamui-Sutton and Varela-Ruiz 2013). We processed the information from the focus group discussions via content analysis (Bardin 2002), deriving descriptive analysis matrices that gathered the most relevant information as input for the study.

The two steps of this phase resulted in a document that outlined the first version of our six dimensions and their 33 sub-dimensions. The six dimensions were: (1) “personal sphere”; (2) “social skills”; (3) “relationship with the environment”; (4) “interpersonal relationships”; (5) “cognitive sphere”; and (6) “economic, labour and/or academic situation”. The literature we had reviewed provided theoretical and conceptual support for the dimensions we had identified during the discussions and interviews. In Tables 1, 2, 3, 4, 5, 6, we present summaries of the definitions and theoretical references for each dimension.

Table 1 Impact on personal sphere
Table 2 Impact on social skills
Table 3 Impact on students’ relationship with the environment
Table 4 Impact on interpersonal relationships
Table 5 Impact on cognitive sphere
Table 6 Impact on students’ economic, professional and/or academic situation

Phase 2: construction of items

Based on Phase 1, we then proceeded to compile a list of 83 items (not presented in this article) corresponding to the proposed 33 sub-dimensions. From this list, we devised a Likert scaleFootnote 8 survey (Downing and Haladyna 2006), attending to features such as representativeness, relevance, diversity, clarity, simplicity and comprehensibility (Muñiz et al. 2005). We evaluated the impact of the programme using the International Labour Organization’s Training impact assessment guide (ILO 2011) and asking: “What impact did the programme have?” We invited our participants to respond to each statement by completing the sentence: “My participation in the programme allowed me to improve in …”. Participants had to rate the impact of the programme for each item on a 5-point Likert scale ranging from 1 (no impact) to 5 (high impact). Given the ad hoc design of the instrument, efforts had to be made to guarantee validity and reliability (León and Montero 2015; Pérez-Escoda and Rodríguez-Conde 2016). To ascertain the level of reliability (in terms of the internal consistency of our scale), we applied the statistical formula of Cronbach’s alpha. The result, 0.989, showed that our proposed scale presented a high level of internal consistency (Pardo et al. 2015).
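As an illustration of this reliability check, Cronbach’s alpha can be computed directly from a respondents-by-items matrix of Likert scores. The following Python sketch uses small invented data, not our survey responses, purely to show the calculation:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented 5-point Likert responses: 6 respondents x 4 items
demo = np.array([
    [4, 4, 5, 4],
    [3, 3, 3, 3],
    [5, 5, 5, 5],
    [2, 2, 3, 2],
    [4, 5, 4, 4],
    [3, 3, 2, 3],
])
print(round(cronbach_alpha(demo), 3))
```

Values close to 1, such as the 0.989 we report above, indicate that the items move together and the scale is internally consistent.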

We also checked the validity of the content by carrying out an expert trial (and, in Phase 3, a pilot test) (Cubo et al. 2011). The expert trial was conducted with a panel of six specialists in education and psychology teaching at our university. These experts received the survey by e-mail and were invited to rate each item on a scale ranging from 1 to 4 (1 = deficient, 4 = excellent), according to three criteria:

  a) clarity (the item is clear and adequate in syntax and semantics);
  b) consistency (the item is logical and internally coherent); and
  c) relevance (the item is essential and/or important and should therefore be included).

As a result of this expert trial, we reduced the number of items to 65, keeping the 6 dimensions and 33 sub-dimensions (already shown in Tables 1, 2, 3, 4, 5, 6) in the scale. The 65 impact items are shown in Table 7.

Table 7 65 items of perceived impact validated by judges

Phase 3: pilot tests

Finally, in order to review the accuracy of the instrument, we conducted a pilot test with 20 graduates of the Avancemos programme. The results confirmed the relevance of the proposed items and prompted us to make a few adjustments to the order of their presentation (see Table 7).

To measure participants’ perception of the programme’s impact, we asked respondents to rate each item on a scale from 1 to 5 (1 = totally disagree; 5 = totally agree) in response to the statement: “My participation in the Avancemos programme allowed me to improve in …” Next, we contacted 132 Avancemos graduates via e-mail, phone and personal home visits, explaining the purpose of our research and asking them to participate in our survey. We ran into a few difficulties, such as an outdated contact database and limited internet access.

Participants

We distributed our finalised questionnaire to a non-probabilistic (i.e. non-random) convenience sample (León and Montero 2015) of 132 graduates of the Avancemos programme, all of whom lived in the city of Ibagué, Colombia. However, only 124 graduates completed the questionnaire in full. In terms of age and gender, 81 per cent of the graduates were between 18 and 33 years old; 51 per cent were women and 49 per cent men; 42 per cent had graduated before 2015.

This group was supplemented by the 14 professors, 15 students, 10 graduates, 2 former managers and a relative of a graduate who participated in focus groups in the first phase of developing our instrument, as well as 6 specialists in education and psychology in the second phase.

Results

In this section, we present the results from our survey using this instrument to measure the impact of the Avancemos programme on the students who participated in it. These results include impact perception measurement, item analysis, factor analysis of the instrument itself, reliability, descriptive statistics and correlations (see Figure 1).

Figure 1 Bar graph of standard scores of perceived impact

Notes: The numbers on the vertical axis (1–65) correspond to the item numbers shown in Table 7

Impact perception measurement and item analysis

The 65 items on the scale were applied to the 132 graduates, and 124 full responses were received, with an average of M = 4.00 and a standard deviation of SD = 0.139. From these data, we calculated standard scores (Z = (X − M) / SD) for each of the items to determine the impact of the programme as perceived by the respondents. Items with standard scores above +1 or below −1 indicate greater or lesser perceived impact respectively.
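This standardisation step can be sketched as follows; the item means below are hypothetical (the real study used 65 item means with M = 4.00 and SD = 0.139):

```python
import numpy as np

def item_z_scores(item_means: np.ndarray) -> np.ndarray:
    """Standard scores for item means: Z = (X - M) / SD, where M and SD are
    the mean and sample standard deviation of the item means themselves."""
    m = item_means.mean()
    sd = item_means.std(ddof=1)
    return (item_means - m) / sd

# Hypothetical per-item means on the 5-point scale
means = np.array([3.62, 3.95, 4.00, 4.05, 4.38])
z = item_z_scores(means)
print(np.round(z, 2))  # items below -1 or above +1 stand out
```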

Items with the least perceived impact

Based on the standard scores, we were able to determine the items on which graduates perceived the programme to have had the least impact (Z < −1). These items, in ascending order, were:

  1. Item 1 “The way I feel about my physical appearance” (Z = −2.73);
  2. Item 37 “Expand my group of friends” (Z = −2.49);
  3. Item 58 “The ability to generate business ideas” (Z = −2.38);
  4. Item 33 “Being interested in the needs of my community” (Z = −1.51);
  5. Item 47 “My ability to transform things with my hands” (Z = −1.51);
  6. Item 49 “How quickly I read a text” (Z = −1.51);
  7. Item 46 “Devising innovative solutions to problems” (Z = −1.33);
  8. Item 61 “Keeping the job I have” (Z = −1.28);
  9. Item 2 “Paying more attention to my personal care” (Z = −1.21);
  10. Item 24 “The skills to generate problem-solving alternatives” (Z = −1.16); and
  11. Item 50 “My ability to convey my ideas in writing” (Z = −1.04).

Items with greater perceived impact

At the other end of the scale were the items with scores of Z > 1, reflecting greater perceived impact. The items which received the highest scores, in ascending order, were:

  1. Item 11 “My performance in academic activities” (Z = 1.10);
  2. Item 43 “My ability to generate opinions on a topic” (Z = 1.10);
  3. Item 32 “The importance of working for the well-being of my community” (Z = 1.16);
  4. Item 34 “Recognising the importance of caring for the environment” (Z = 1.22);
  5. Item 3 “Recognising my personal qualities” (Z = 1.33);
  6. Item 40 “Valuing my family” (Z = 1.33);
  7. Item 64 “Persevering to achieve my goals” (Z = 1.33);
  8. Item 30 “My interest in motivating other people to finish their studies” (Z = 1.45);
  9. Item 61 “Setting myself goals to achieve what I want according to my life project” (Z = 1.57); and finally, with the highest score,
  10. Item 65 “Progressing to higher education” (Z = 1.74).

Factor analysis

Next, we carried out a Confirmatory Factor Analysis (CFA) using the Maximum Likelihood method in order to test the model obtained from the previous phases (see Table 8). We tested different fitting models through Chi-square (χ2) (Hu and Bentler 1999), the Goodness of Fit Index (GFI), the Comparative Fit Index (CFI) and the Normed Fit Index (NFI), for which values close to or above .90 are indicators of a good fit (Abad et al. 2011; Hair et al. 2004), and the Root Mean Square Error of Approximation (RMSEA), where values of .08 or less are acceptable (Byrne 2006). We carried out our analysis using the Statistical Package for the Social Sciences (SPSS) AMOS version 24 software for the review of fitting models in the CFAs. We reviewed several models. Some included all 65 items; others included only those items that showed item-total correlations greater than 0.30 and factor loadings greater than 0.40 in the Exploratory Factor Analysis (EFA) (Ferguson and Cox 1993). In this article, we present the results of the four models (1–4) we considered most representative of the final model (4).

Table 8 Fit models of the impact perception scale of a reduced graduate education programme with maximum likelihood estimation

The first model we present corresponds to the null hypothesis, where the perception of the impact of an educational programme is represented by a single factor and not by six areas or dimensions. The second model corresponds to the six-dimensional composition driven by the construction phases of the instrument described above. The third model was formed from a single factor of the items with weight factors lower than 0.40. The fourth is a five-factor model with 26 items that showed a better fit, with χ2 = 507.224, p < 0.001, df = 289 and AIC = 631.224. GFI fitting indexes were .762, CFI = .875, NFI = .756, RMSEA = .078 and SRMR = .06.
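As a sketch of how the RMSEA in Table 8 can be cross-checked from the reported χ2 and degrees of freedom (we assume the common (n − 1) denominator used by AMOS), consider:

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Point estimate of RMSEA: sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Model 4: chi2 = 507.224, df = 289, n = 124 complete responses
print(round(rmsea(507.224, 289, 124), 3))  # reproduces the reported .078
```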

The results for our final model (4) show that the scale now has a five-dimensional factorial structure: (1) “Personal sphere”, with six sub-dimensions and loadings fluctuating between 0.62 and 0.79; (2) “Social skills”, featuring six sub-dimensions with loadings between 0.73 and 0.83; (3) “Life project”, four sub-dimensions with loadings between 0.45 and 0.71; (4) “Knowledge”, five sub-dimensions with loadings between 0.67 and 0.70; and (5) “Economic situation”, five sub-dimensions with loadings between 0.51 and 0.75 (see Figure 2). We reduced the scale to five dimensions because the items they comprised had the best factor loadings and generated the best fit. The dimensions “relationship with the environment” and “interpersonal relationships” disappeared because their items generated multicollinearity.Footnote 9 In Figure 2, “Life project” is the same as “Life project, goal setting”.

Figure 2 Confirmatory factor analysis diagram of the perceived impact of an educational programme

Source: The authors, based on SPSS AMOS 24

Reliability, descriptive statistics and correlations

The five dimensions/factors in our final model showed acceptable internal consistencies of between 0.734 and 0.899. Judging by the responses obtained from Avancemos graduates, the programme had comparatively little impact on their economic situation (M = 3.86; SD = 0.72), but a high impact on their life project (M = 4.19; SD = 0.65) and their personal sphere (M = 4.08; SD = 0.79). Correlations reveal medium and high relationships between different factors, showing that, although the factors are related, they are nevertheless distinct.

Strong and positive correlations (r > 0.70) were identified between the perceived impact of the programme in the personal, knowledge and economic dimensions. Social skills and knowledge were also strongly related. We also found relatively strong relationships (r > 0.60) between the personal and life project dimensions; between social skills, knowledge and economic situation; between life project and knowledge; and between knowledge and economic situation. An average relationship (r > 0.5) was found between economic situation and life project (see Table 9).
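The pairwise relationships reported here are Pearson correlations between factor scores. A minimal sketch, using invented factor means for six graduates rather than our data:

```python
import numpy as np

# Invented mean factor scores (1-5 scale) for six graduates
personal  = np.array([4.2, 3.8, 4.5, 3.5, 4.0, 4.6])
knowledge = np.array([4.0, 3.6, 4.4, 3.4, 4.1, 4.5])

# Pearson's r between the two factors
r = np.corrcoef(personal, knowledge)[0, 1]
print(round(r, 2))  # r > 0.70 would count as a strong positive relationship
```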

Table 9 Relationships among evaluated variables

Discussion

This research led to our development of EduIMPACT, a measuring instrument for the evaluation of literacy programmes for youth and adults, designed from a theoretical construct of the impact perceived by graduates. We measured a total of 65 items, organised in terms of six central dimensions: (1) individual, (2) social, (3) relationship with the environment, (4) interpersonal, (5) cognitive and (6) economic, academic and labour-related; factor analysis subsequently refined these into a validated 26-item scale. These dimensions show that improving literacy also generates benefits in personal and social spheres (Hamilton 2014; Lucas-Molina et al. 2015; Banerji et al. 2017; Yamashita et al. 2019).

Literacy is one of the key goals of the United Nations (UN) and the Organization of Ibero-American States for Education, Science and Culture (OEI). In Latin American countries, adult education is one of the main strategies being employed to close the literacy gap. We need therefore to be sure that the programmes implemented in service of this task will have as great an impact as possible on the lives of those who participate in them. The instrument we have devised is thus an important contribution to the self-evaluation and continued improvement of educational institutions.

Our measurements show that the Avancemos programme has indeed had a positive impact on students’ lives. The strong impact of the programme on the dimensions of life project and personal sphere may indicate that this type of training can have a positive effect on students’ goals and objectives for life, as well as on their self-perception, self-efficacy, and contribution to the social environment. This resonates with other studies, especially those concerning the effects of improved literacy on youth and adults’ self-image, self-esteem, and their desire to pursue higher studies (Brennan et al. 2016; Beder 1999; Lucas-Molina et al. 2015).

Impact on learners’ economic situation, in terms of enhancing their employability and entrepreneurship, was more moderate. This can be seen as an opportunity to develop improvements and find better ways to further students’ skills in these areas. This recommendation coincides with insights of Beder (1999), who emphasises students’ desire to pursue higher studies and to improve their employability, job status and income. Relationship analysis allows us to infer that those who participate in adult education believe that it furnishes them with better skills and greater earning potential than they had before. This finding is supported by the work of Yamashita et al. (2019), who identify that one of the benefits of adult education lies in helping students to adapt to changes in the workplace and the wider world.

This research carries a number of implications for those who administer youth and adult education programmes. First, we have proposed a scale to measure the impact of adult education programmes. This instrument can be adapted/translated into other languages if required, provided that the programme to be evaluated operates in similar conditions. This scale has undergone a meticulous process of theoretical and empirical development and features the necessary dimensions to measure the impact of a programme on its beneficiaries.

Our second contribution consists in proposing dimensions that should be taken into account when developing education programmes for youth and adults. These dimensions are a means of organising the various intended impacts into a small number of overarching categories, according to which curricula and activities can then be constructed.

Third, our results provide evidence confirming insights of several studies (many of which have been cited in this document) that emphasise the impact of adult education programmes on students’ cognitive and economic capacities and well-being.

The instrument resulting from this research allows for the evaluation of impact on 65 separate items across 33 sub-dimensions, which, according to the theoretical review and the judgment of experts, should be interpreted separately. The reduced 26-item scale with validated factorial composition, on the other hand, allows for correlational analysis with other variables and/or dimensions.

One limitation of this study was the small sample size (n = 132), which may have affected the factor model we obtained. We recommend that other researchers test the model on larger samples and continue to adjust it in subsequent studies.

We believe that this study makes a positive contribution to the evaluation of youth and adult education programmes in Colombia and other Latin American countries. Such evaluations may make it easier to design targeted programmes effectively. Evaluation of the impact of training programmes usually makes use of objective indicators and qualitative techniques (ILO 2011). We present a scale that allows for the collection and quantification of qualitative first-hand information about students’ or graduates’ perceptions of a training programme. This instrument will facilitate descriptive and correlational research on the perception of impact, and will complement the process of creating and improving educational programmes. The scale is undergirded by a robust review that gives theoretical support to each dimension. The results of our study demonstrate the existence of a construct of perceived impact.