Colombia: Can a Management and Information System Improve Education Quality?

Most countries have been more successful in expanding access to education (getting children in school) than in improving the quality of education (producing better learning outcomes). While many factors contribute to learning outcomes, it is difficult for schools or governments to know which indicators of education quality to focus on. To address this, many countries have invested in monitoring and evaluation (M&E) systems to collect and disseminate information on the quality of education. However, this information is often highly fragmented, with different indicators used by different schools, regions, and programs. This makes it difficult for governments to design policies to address the most urgent priorities and to help the schools in greatest need and prevents coordination between the national, regional, and local levels of education management. Furthermore, many schools see M&E systems as punitive rather than as a way to help them to improve.
These issues have limited how useful these M&E systems have been in improving education quality. However, if well designed, an education quality monitoring system can have many benefits, including providing real-time information on the current state of education quality, improving managerial practices, identifying key areas that need improving and how to do so, fostering coordination, and informing evidence-based policies and programs. By re-orienting schools towards results instead of inputs, such a system can also help to establish the preconditions for implementing performance-based incentives at the school, local, regional, or national level.
The Results in Education for All Children (REACH) Trust Fund at the World Bank funded the development of a Management & Information System to monitor the quality of the education system in Colombia. This system builds on existing monitoring tools, which focus on outcome measures such as test scores but do not capture intermediate quality indicators that can shed light on how learning outcomes are achieved. The overarching purpose of this system is to foster improvement in the education system by informing the decision-making and everyday activities of education practitioners and policymakers. This can be achieved by: (a) gathering detailed and relevant information about activities within schools and (b) managing the information efficiently and making it accessible to users to enable them to analyze, understand, and provide evidence-based recommendations on how to improve education quality. This monitoring system is not intended to be an accountability mechanism for schools but rather a management tool for stakeholders to understand the strengths and weaknesses of the education system and take appropriate action.
The scope of this monitoring system was defined using the following three methods: (a) focus group discussions with national, regional, and local education stakeholders; (b) a review of the literature dealing with the causal relationship between various factors and education quality in other settings; and (c) quantitative analysis to further investigate the relationship between these factors and education quality in Colombia. The results of these three exercises were synthesized into questionnaires aimed at principals, teachers, students, and parents to conduct a self-diagnostic assessment of each school. The information in these assessments was then posted on a digital dashboard that allows principals and teachers to explore an individual school's results and enables government staff to view data aggregated to the local, regional, or national level.

CONTEXT
Colombia has recently undergone a radical social transformation, marked by the historic Peace Agreement with FARC in 2017 that improved the prospects for sustained peace, and several important macroeconomic, structural, and social reforms have also been implemented since 2003. Investment in education has increased significantly, rising from 2.7 percent of GDP in 1990 to 4.5 percent in 1999, and then remaining stable as a share of GDP since then. 1 However, while these investments have led to modest improvements in the performance of Colombian students on the Saber and PISA standardized tests, Colombia remains internationally uncompetitive, with 38.2 percent of students failing to reach baseline proficiency in 2015. 2 There is also significant inequality in scores between rural and urban areas and between high-income and low-income households. For example, the average science score for students from the lowest income quintile was 385 compared to 461 for those from the highest income quintile. 3 While the national government through the Ministry of National Education (Ministerio de Educación Nacional, or MEN) is in charge of all public policy related to education, regional authorities are responsible for the administration of educational services, and schools themselves are directly in charge of education provision. Since 1991, the education system has been moving towards a more decentralized structure, giving schools more decision-making power over curricula and pedagogy. While this approach was taken to ensure the provision of diverse pedagogies to meet the varying needs of a widely diverse population, it makes it difficult to ensure the quality of the curriculum and pedagogy in all of Colombia's 57,000 schools. Meanwhile, schools have very little autonomy in the management of education resources (human, physical, or monetary).
As a result, schools are being held accountable for their learning results but have little control over many of the factors that contribute to the achievement of these results. Given this situation, there is a critical need for a coordinated system for gathering information about schools to be used to design and target interventions to improve learning.

WHY WAS THE INTERVENTION CHOSEN?
The Management System for Education Quality (Sistema de Gestión de la Calidad Educativa, or SIGCE) in Colombia was developed to solve two related problems in the Colombian education system: information fragmentation and a lack of coordination among stakeholders, both of which limit the extent to which the education system can be improved. The way in which education ministries and local governments are typically organized makes it difficult to coordinate programs that span across units of the government, and they have few incentives to coordinate with each other. Each local government in Colombia has a different development plan with different objectives, which may or may not be aligned with the objectives of the MEN. This leads to fragmented programs, uneven implementation, and a lack of policy effectiveness. These different units also typically get their information from different sources for different topics, often with long lag times for data requests. There are 150 disconnected information systems within Colombia's MEN, and most deal only with enrollment, access, and learning outcomes rather than the more detailed indicators that would shed light on what factors contribute to education quality. As a result, while schools currently have access to data on their own learning outcomes, they have little information about how to improve them. Similarly, local, regional, and national governments lack a system-wide view of what schools need or how to devise policies to address those needs. Furthermore, even when data are readily available, technical staff may lack the expertise or management tools to analyze them effectively.
To address these problems, the SIGCE was designed to provide coordinated information to all stakeholders within the education system to enable them to track the system's progress and results, develop improvement plans at the school level, and design and prioritize policy interventions at the national level. A critical feature of the SIGCE was that it was designed with the aim of focusing the entire education system on results. By using the tool for planning as well as monitoring, schools and policymakers can be more effective in setting specific objectives, in selecting which interventions to prioritize to achieve these objectives, and in deploying resources effectively to achieve the desired results.
The first step in the development of the SIGCE was to define its scope. This involved defining: (a) the purpose of the system as a management tool to inform everyday decision-making; (b) the intended users (schools, local governments, and MEN officials); (c) what information would be included; and (d) how best to present this information to different types of users.

HOW DID THE INTERVENTION WORK?
To determine what information to include in the system, "education quality" was defined not just as learning outcomes but also as all of the factors that cause learning outcomes. Three methods were used to select which of these factors would be included in the monitoring system. First, focus group discussions were held with representative samples of principals, teachers, students, parents, government officials, and other education stakeholders to gather their input on which variables should be included based on their relevance to the participants and on how feasible it would be to collect them. Second, a literature review of recently published academic papers was conducted to gather the best available information on which variables already have been proven to have a causal relationship with education quality in other settings, using standardized test scores as a proxy for quality. Third, quantitative analysis was done to test hypotheses about which variables have a statistically significant relationship with student academic performance in Colombia, using Saber and PISA standardized test scores and Colombia's Synthetic Index of Education Quality as outcome measures. In defining the scope of the monitoring system, it was also necessary to identify which types of stakeholders and how many of each type would be included in the focus groups, the depth of information that would be included in the monitoring system, and the extent of the resources available to implement it.
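The screening step described above can be illustrated with a minimal sketch: testing whether a candidate variable (say, teacher training hours) is significantly correlated with average test scores across a sample of schools. The data, variable names, and the simple large-sample t-test below are illustrative assumptions only; the actual analysis used Saber and PISA scores and the Synthetic Index of Education Quality, with controls for student and school characteristics.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between a candidate quality variable and test scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def is_significant(xs, ys, t_crit=1.96):
    """Large-sample t-test of H0: no correlation; |t| > 1.96 ~ significant at 5%."""
    r = pearson_r(xs, ys)
    t = r * math.sqrt((len(xs) - 2) / (1 - r * r))
    return abs(t) > t_crit

# Hypothetical school-level sample: training hours vs. average test score.
training = [10, 4, 25, 8, 30, 2, 18, 12, 22, 6]
scores = [310, 280, 360, 300, 370, 260, 340, 315, 350, 290]
print(pearson_r(training, scores), is_significant(training, scores))
```

Variables that failed such a test could then be reported as not statistically significant, while significant ones became candidates for inclusion in the indicator list.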
The initial output of this three-part exercise was a comprehensive list of variables, categorized into 6 major dimensions, 25 areas, and 39 sub-areas. The variables were prioritized based on their demonstrated impact on education quality, the feasibility of collecting them, their ability to be easily understood by any actor, and their low susceptibility to self-reporting bias. Each variable was then mapped to a specific question to be posed to principals, teachers, students, and parents, for whom corresponding questionnaires were then developed. Furthermore, thresholds were set for each indicator that identified the desired level for a school to achieve as well as the level that would trigger an alert. The final output of this process was an indicator list with clearly defined logical links between the various dimensions, areas, sub-areas, variables, questions, and indicators of education quality.
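The two-threshold logic described above (a desired level and an alert level per indicator) can be sketched as follows. The indicator names and cutoff values are hypothetical, not taken from the actual SIGCE indicator list:

```python
# Illustrative thresholds: each indicator has a desired level and an alert level.
# Indicator names and numbers are hypothetical, not the actual SIGCE list.
THRESHOLDS = {
    "teacher_training_hours":  {"desired": 20, "alert": 5},
    "library_books_per_pupil": {"desired": 10, "alert": 2},
}

def classify(indicator, value):
    """Return 'on_track', 'needs_attention', or 'alert' for one school indicator."""
    t = THRESHOLDS[indicator]
    if value >= t["desired"]:
        return "on_track"
    if value <= t["alert"]:
        return "alert"
    return "needs_attention"

print(classify("teacher_training_hours", 12))
```

A middle band between the two thresholds flags indicators that fall short of the goal without yet triggering an alert, which is what makes the indicator list usable for prioritizing improvement plans.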
After the scope of the monitoring system had been defined, the SIGCE was implemented in three phases. The first phase involved each school developing a self-diagnosis using the questionnaires developed from the final indicator list. It was determined that principals, teachers, students, and parents would be the best sources of relevant information for the monitoring system, and, therefore, these stakeholders were tasked with completing the self-diagnostic assessment of each school. The second phase involved analyzing the information collected through these assessments to construct values for each indicator for each school. The third phase involved presenting the information captured in these indicators to all relevant stakeholders in a digital dashboard, which was designed to be comprehensive while at the same time user-friendly and easy to understand and to use for practical decision-making. The dashboard includes customized views for different types of users, giving an overview of the Colombian education system across the six dimensions of education quality and more detailed information on each area, sub-area, and indicator. The dashboard also allows government staff to develop detailed analyses by education level, school schedule, urban or rural location, and municipality to inform policymaking.
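The roll-up from school-level assessments to the aggregated views for local, regional, or national staff might look like the sketch below. The record layout, school IDs, and indicator name are illustrative assumptions:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical school-level records: (school_id, municipality, indicator, value).
records = [
    ("S1", "Cali",   "teacher_training_hours", 24),
    ("S2", "Cali",   "teacher_training_hours", 8),
    ("S3", "Bogotá", "teacher_training_hours", 16),
]

def aggregate_by_municipality(rows):
    """Average each indicator across the schools of a municipality, mirroring
    the dashboard's roll-up from school views to local-government views."""
    buckets = defaultdict(list)
    for _school, muni, indicator, value in rows:
        buckets[(muni, indicator)].append(value)
    return {key: mean(vals) for key, vals in buckets.items()}

print(aggregate_by_municipality(records))
```

The same grouping key could be extended with education level, school schedule, or urban/rural location to support the more detailed policy analyses the dashboard offers government staff.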

WHAT WERE THE RESULTS?
The first key result of the focus group discussions was the construction of six major dimensions to encompass all aspects of educational quality. Through the focus group discussions held with principals, teachers, students, and parents, the six dimensions of education quality were defined as: (a) Teachers, which captures variables related to the professional development and continuous improvement of teachers and school staff; (b) Academic and Pedagogical, which captures variables related to the process of knowledge acquisition at the student level; (c) School Environment and Wellbeing, which captures variables related to the emotional and physical environment of schools; (d) Family, School, and Community, which captures variables related to the relationship between the school and its local community; (e) Management and Administrative, which captures variables related to the smooth functioning of the school and the continuity of the learning process; and (f) Infrastructure, which captures variables related to the quality of schools' physical infrastructure.
The focus group discussions also yielded some key findings regarding schools' strengths and weaknesses:
• Principals and teachers believe that the curriculum needs to be improved to strengthen schools' academic performance.
• Students believe that their teachers have strong classroom skills.
• Parents believe that the pedagogic and academic dimension in schools is weak.
• Teachers and principals consider themselves to be a strength in their schools, and a key element for achieving excellent school quality.
• Students believe that their teachers' profiles are the main cause of both the advantages and difficulties that they face in school.
• Parents believe that schools are mostly inclusive and non-discriminatory.
• Students believe that the school climate is one of the most positive aspects of their learning experience.
• All respondents believe that complementary services for school communities are lacking.
• All respondents believe that school buildings are in poor condition.
• Principals believe there is a significant lack of administrative personnel in schools.
The literature review identified several types of interventions that have been proven to improve education quality across the six dimensions. See Table 1.
The quantitative analysis confirmed that several of these variables have a significant relationship with education quality in Colombia. Data from several databases within Colombia were analyzed to measure the relationships between education quality variables and student performance, while controlling for student and school characteristics that are known to influence learning outcomes. The significant relationships shown in Table 2 were identified, with statistically insignificant results reported as not applicable (N/A).
It is important to mention that other dimensions also affect the quality of education, such as the sociodemographic characteristics of students and their families, the location and the surroundings of schools, and the academic and professional experience of teachers and principals. However, because schools are not able to influence these factors directly, principals and teachers should consider these dimensions simply as contextual information, but regional and national policymakers could use them to develop comprehensive public policies.
The final result of the focus groups, literature review, and quantitative analysis was an indicator list to be included in the SIGCE. The final SIGCE indicator list consisted of 604 indicators that were then included in questionnaires to be distributed to the principals, teachers, students, and parents who would conduct the self-diagnostic assessments of their schools. The list included 51 indicators in the Teachers dimension, 163 in Academic and Pedagogical, 51 in Family, School, and Community, 208 in School Environment and Wellbeing, 29 in Management and Administrative, and 102 in Infrastructure.

WHAT WERE THE LESSONS LEARNED?
While the process of developing the SIGCE was careful and rigorous, an information system alone is not enough to guarantee results: the tools must actually be used. Thus, it was also crucial to actively promote the use of the system in planning and implementation. Furthermore, since the SIGCE relies on schools to conduct self-diagnostic assessments, the entire system depends on schools taking this process seriously.
It was critical to ensure that school managers understood that the system would help them to plan and to elicit more effective responses from local and central governments to address any shortcomings that were identified so they would see that the information that they entered into the SIGCE would produce useful results for them.
The design of the dashboard was also vital to ensure that the information contained in the SIGCE would prove to be valuable to both schools and governments. No matter how accurate and carefully measured the data were, they would not be used widely unless they were accessible and presented in an easily understood visual interface. Several interfaces were developed to allow users to see the data presented in different ways, and easy visual alerts were designed to help users to interpret the data, such as red-yellow-green stoplights to put a school's performance on each indicator in the appropriate context.
Throughout all of the steps of the development process, feedback was sought from all relevant stakeholders on the indicator list, the logical flow from dimension to indicator, the design of the dashboard, and other features of the system to test and validate the monitoring system and to ensure that the system's users were on board.
Principals and teachers pointed out that the SIGCE can also reduce the administrative burden on them caused by the large number of information requirements from public and private institutions that schools must fulfil. If the new dashboard can solve this problem, it will be a valuable management tool for them.
One of the main conclusions of the focus group discussions with principals, teachers, students, and parents was that schools should use the SIGCE to identify their own strengths and weaknesses, taking into account the perceptions of students, parents, teachers, and principals. Therefore, the SIGCE can be used by schools for self-reflection and evaluation, rather than looking to external entities to provide feedback and develop action items.
Finally, it is important to highlight that, during the development of the SIGCE, teachers and principals expressed an interest in continuing to use the SIGCE in the future and in receiving technical assistance to generate action plans to improve education quality at the beginning of the following school year.

CONCLUSION
This intervention has successfully created a management and information system for the public management of education in Colombia. The system, the SIGCE, is capable of collecting large amounts of school-level information at a relatively low cost and of aggregating and presenting this information in a user-friendly way. This system is expected to yield many benefits. First, it will improve school management by helping schools to identify their own strengths and weaknesses and to develop school improvement plans. Second, it constitutes a key source of information for evidence-based policymaking, helping policymakers at all levels of government to identify the factors that help to improve learning outcomes and to establish policies and programs that will help schools to improve in those areas. Third, it will help policymakers at all levels of government to monitor the progress of these policies and programs and to assess how well they are meeting schools' needs. Fourth, it will increase coordination between schools and the local, regional, and national education authorities in managing and improving education quality. Fifth, it will lay the foundation for the results-based evaluation of education policies based on their impact on both intermediate indicators of education quality and final learning outcomes. Sixth, it could potentially promote transparency in the public management of the education system by making information available to the public, including parents and other members of school communities. Finally, the creation of the SIGCE establishes the necessary precondition for introducing performance-based incentives in Colombia, for example, through the provision of incentives to local governments based on the performance of schools in their municipalities. This performance-based approach is likely to result in the more efficient use of resources by making it possible to target those schools and interventions that have the greatest potential for improving education quality.