Analysis of Mathematics National Exam Questions Based on TIMSS Taxonomy

The purpose of this research was to determine the distribution of the content and cognitive dimensions of the Mathematics National Exam questions for Junior High School (SMP) / Islamic Junior High School (MTs) in the 2005/2006 to 2018/2019 school years, and to determine the suitability of those questions against government guidelines and the TIMSS taxonomy. This research used a qualitative approach, with data collected through document analysis. The data analysis techniques consisted of data reduction, data display, and verification. The research subjects were 14 SMP/MTs national mathematics exam papers from the 2005/2006 to 2018/2019 academic years. The indicators for the content and cognitive dimensions were based on the TIMSS taxonomy. To check the validity of the data, the researcher used expert triangulation through an online FGD (Focus Group Discussion). The research results are as follows: 1) The analysis of the Mathematics National Exam questions for Junior High School in the 2005/2006-2018/2019 school years based on the TIMSS taxonomy shows that the percentage distribution across the content and cognitive dimensions is not yet appropriate to the TIMSS Assessment Framework. In the content dimension, the questions were dominated by the geometry domain, while the data and probability domain had very few questions; the algebra and numbers domains were fairly close to the proportions specified by TIMSS. 2) For the scope of material set by the government, no explicit percentage distribution of materials was given, while the government's cognitive levels yielded results that differed from the researcher's findings based on the TIMSS taxonomy, because the government's cognitive levels are defined differently from the TIMSS cognitive domains.
Some questions classified at the application level in the government's cognitive scheme fall under the knowing domain of the TIMSS taxonomy. The suitability of the exam questions from the 2005/2006 to 2018/2019 school years shows percentages increasing from year to year towards the proportions of the TIMSS Assessment Framework: in both the content and cognitive dimensions, each domain's percentage moved closer to the TIMSS proportions.


TIMSS (Trends in International Mathematics and Science Study) is an international study organized by the International Association for the Evaluation of Educational Achievement (IEA),
an international association that assesses mathematics and science achievement in education (Hadi & Novaliyosi, 2019). TIMSS measures the achievement of grade IV and VIII students in mathematics and science across participating countries. For Indonesia, the benefits include knowing the achievement of Indonesian students compared with students in other countries, as well as the factors that influence it. TIMSS aims to determine the improvement of mathematics and science learning, where the framework for assessing mathematical ability is described in terms of dimensions and domains (Pratiwi, 2016).
The assessment of mathematics and science achievement in TIMSS is categorized into two domains, namely the content domain and the cognitive domain, taking into account the curriculum of each participating country. The content dimension consists of four domains: numbers, algebra, geometry, and data and probability. Each domain is further detailed into several topics (Mullis & Martin, 2019).
The proportions of ability assessed in the content dimension are divided into four domains as follows: 1) Numbers (30%), with the topics of whole numbers, fractions, decimals, integers, ratios, proportions, and percent. 2) Algebra (30%), with the topics of algebraic expressions and operations, equations and inequalities, and relations and functions. 3) Geometry (20%), with the topics of geometric shapes, location, and movement. 4) Data and probability (20%), with the topics of data characteristics, data interpretation, and probability.
The cognitive dimension consists of three domains: knowing, applying, and reasoning. The cognitive dimension is defined as the behaviour expected of students when they deal with the mathematical content included in the content dimension. The proportions of ability tested in the cognitive dimension are divided into three domains as follows: 1) Knowing (35%), covering recall, recognize, classify/order, retrieve, and measure. 2) Applying (40%), covering determine, represent/model, and implement. 3) Reasoning (25%), covering analyse, integrate/synthesize, evaluate, draw conclusions, generalize, and justify.

Curriculum changes in schools or madrasas are very common in the educational world (Prastowo et al., 2018). In the history of primary and secondary education in Indonesia, at least 10 types of curriculum have been applied, from the post-independence era to the current curriculum. One of the government's roles in improving the education system in Indonesia is changing the curriculum. Prasetyo and Rudhito (2016) stated that the rationale behind launching the 2013 curriculum policy was the low competency of human resources, as reflected in the results of TIMSS (Trends in International Mathematics and Science Study), where the international mathematics competency test placed Indonesia in the lower ranks.
The implementation of the 2013 curriculum (Kemendikbud, 2013a) requires competency-based assessment covering attitudes, knowledge, and skills, integrated with the learning process, with the portfolio as the main instrument (Sumaryono, 2016). The main purpose of learning mathematics in SMP/MTs (Kemendikbud, 2013a) is for students to develop attitudes, understanding, and skills in accordance with the characteristics of mathematics. In developing attitudes, students are expected to think critically, logically, analytically, and creatively, and to appreciate mathematics in daily life by growing curiosity, attention, and interest in studying mathematics, as well as resilience and confidence in solving everyday problems.
The 2013 curriculum emphasizes the scientific approach in learning (Lestari, 2018). The scientific approach is believed to be a good way to develop students' attitudes, skills, and knowledge. It includes observing, questioning, trying, processing, presenting, concluding, and creating in all subjects. This process of thinking is in accordance with mathematical thinking, where mathematics has a structure with strong and clear internal relationships and a deductive, consistent mindset.
These government efforts are aimed at improving the intelligence of students, especially in mathematics, but the international mathematics achievement of junior high school students is still low (Mawarni, 2020). Learning outcome indicators can be used as a basis for assessing whether students achieve the expected learning outcomes and performance (Wulan & Rusdiana, 2013). An assessment or evaluation is needed to determine the achievement of learning outcomes. The results of the evaluation can describe the progress of education itself and, more specifically, the quality of education from time to time. They can also be used to compare the achievement of schools within one region or between regions.
In general, the purpose of learning evaluation is to determine the effectiveness and efficiency of the learning system (Asrul et al., 2014). The learning system here includes the objectives, material, methods, media, learning resources, environment, and assessment system. In addition, learning evaluation aims to assess the effectiveness of learning strategies, assess and improve the effectiveness of the curriculum program and of learning itself, help students in learning, identify students' strengths and weaknesses, and provide data that helps teachers and stakeholders make decisions.
The National Examination is the national assessment of student competencies at the basic and secondary education levels. According to the Kemendikbud Puspendik page (www.pusmenjar.kemdikbud.go.id, 2019a), the National Examination is held to measure the achievement of graduate competencies at an education unit as a result of the learning process, in accordance with the Graduate Competency Standards (SKL). In addition, the National Examination can be used to map students' achievement levels at their respective institutions. The national exam is the peak of students' learning achievement over six years at the elementary level and three years at the junior high school level (www.pusmenjar.kemdikbud.go.id, 2019b).
The success of curriculum implementation can be seen in the results of the national examination, namely the students' mastery of the basic competencies stipulated in the curriculum for their school level (Fahmi, 2011). The government's national exam policy is essentially an evaluation step that sets a value standard for mapping the quality and competence of graduates.
Therefore, the preparation of Pre National Exam questions should consider which cognitive levels are to be measured in order to prepare students for the real National Examination.
Recently, Indonesian students rejoiced because the national examination was permanently abolished by the Minister of Education and Culture (Mendikbud), Nadiem Makarim, in 2020 and replaced with a minimum competency assessment and character survey (Damaledo, 2019). He said the policy was a follow-up to President Joko Widodo's direction to improve the quality of human resources (HR), meaning that starting in 2021, the minimum competency assessment and character survey would be carried out for the first time. Totok said that in the years since the computer-based national exam (UNBK) was introduced, the average National Examination score fell, and according to him this reflects the true condition. The Ministry of Education and Culture hopes that students' ability will increase from previous years, as many of them still struggle with low scores. The influence of TIMSS on Indonesian education is very strong. It is reflected in the planned change of national education policy for 2021, in which the National Examination will be replaced with the minimum competency assessment and character surveys (Makdori, 2019). One of the factors behind this policy is TIMSS itself: the Minister of Education and Culture, Nadiem Makarim, said that the direction of the policy also refers to international assessments such as PISA and TIMSS.
The minimum competency assessment and character survey scheme will be similar to the TIMSS research scheme. The minimum competency assessment will be carried out on students in the middle of each school level: grade 4 at the elementary level or equivalent, grade 8 at the junior high school level or equivalent, and grade 11 at the senior high school level or equivalent.

Data analysis
The data were analysed using the Miles and Huberman model, which includes three activities: data reduction, data display, and drawing conclusions (verification). Data reduction refers to the process of selecting, focusing, simplifying, abstracting, and transforming the data that appear in written-up field notes or transcriptions. Data reduction is carried out to select the data that are needed and to remove unnecessary data, so that the research can proceed to conclusions and verification. The data were presented in the form of brief descriptions of the results of the analysis of national exam questions, making it easier for the researcher to understand what happened and what should be planned. The conclusions take the form of a description of an object that has been clearly assessed.
The assessment of mathematics and science achievement in TIMSS is categorized into two domains, namely the content domain and the cognitive domain, taking into account the curriculum applied in each country. The specification is distributed per domain, proportion, and topic; for the reasoning domain, the topics include: draw conclusions (make valid conclusions based on information and evidence), generalize (make statements that represent relationships in more general and more broadly applicable terms), and justify (provide mathematical arguments to support a strategy or solution).
The study was conducted on a large scale, identifying the questions and then mapping them based on the prepared guidelines. Based on the results above, the findings can be explained further as follows:

An Analysis of the National Exam questions academic year 2005/2006
For the content dimension, the results showed 16 geometry questions (53%), followed by 8 algebra questions (27%), 5 numbers questions (17%), and 1 probability question (3%). It can be seen that the data and probability domain falls far below the TIMSS proportion, so the questions have not been spread evenly according to TIMSS.
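The percentages above are simply each domain's item count divided by the 30 questions on the paper. The following is a minimal illustrative sketch of that computation and of the comparison with the TIMSS targets cited earlier; the variable names and the comparison logic are the writer's own and not part of the original study:

```python
# TIMSS content-dimension target proportions (per the framework cited above)
TIMSS_TARGETS = {"numbers": 30, "algebra": 30, "geometry": 20, "data_probability": 20}

# Item counts found for the 2005/2006 paper (30 questions in total)
item_counts = {"numbers": 5, "algebra": 8, "geometry": 16, "data_probability": 1}

def domain_percentages(counts):
    """Convert raw item counts into whole-number percentages of the paper."""
    total = sum(counts.values())
    return {domain: round(100 * n / total) for domain, n in counts.items()}

observed = domain_percentages(item_counts)
for domain, target in TIMSS_TARGETS.items():
    gap = observed[domain] - target
    print(f"{domain}: {observed[domain]}% (TIMSS target {target}%, gap {gap:+d}%)")
```

Run on the 2005/2006 counts, this reproduces the 17%, 27%, 53%, and 3% figures reported above and makes the over-representation of geometry explicit.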

An Analysis of the National Exam questions academic year 2006/2007
The study showed that for the content dimension, about 27% of the questions were for the numbers domain with 8 items, 27% likewise for the algebra domain with 8 items, while geometry dominated at about 40% with 12 items, and 6% were for the data and probability domain with 2 items. It can be seen that the data and probability domain has the lowest percentage and, conversely, the geometry domain has the highest; only the algebra domain came close to the TIMSS proportion. The questions therefore have not been spread evenly according to the TIMSS taxonomy.

An Analysis of the National Exam questions academic year 2008/2009
The study showed that the distribution of the content dimension was 25% for the numbers domain with 10 items, 30% for the algebra domain with 12 items, 37.5% for the geometry domain with 15 items, and 7.5% for the data and probability domain with 3 items. It can be seen that the data and probability domain has the lowest percentage, while the geometry domain has the highest.

An Analysis of the National Exam questions academic year 2009/2010
The taxonomy.

An Analysis of the National Exam questions academic year 2010/2011 (Package 15)
The results showed that 17.5% of the content-dimension questions were for the numbers domain with 7 items, about 32.5% for the algebra domain with 13 items, 42.5% for the geometry domain with 17 items, and only 7.5% for the data and probability domain with 3 items. It can be seen that the data and probability domain has the lowest percentage while the geometry domain has the highest. The geometry percentage was more than double the proportion set by TIMSS, the numbers domain (17.5%) was well below the 30% set by TIMSS, the algebra domain (32.5%) was slightly above it, and the data and probability domain was far below the predetermined proportion. This indicates that the mapping of the questions on the content dimension of the Mathematics National Exam (Package 15) for the academic year 2010/2011 was not evenly distributed according to the TIMSS taxonomy.
The results also showed that the percentage of the knowing domain in the cognitive dimension was 60% with 24 items, followed by 30% for the applying domain with 12 items, and 10% for the reasoning domain with 4 items.

An Analysis of the National Exam questions academic year 2011/2012 (Code: C32)
The findings showed that for the content dimension, about 22.5% of the questions were for the numbers domain with 9 items, 22.5% likewise for the algebra domain with 9 items, 40% for the geometry domain with 16 items, and 15% for the data and probability domain with 6 items. It can be seen that the data and probability domain has the lowest percentage while the geometry domain has the highest.

An Analysis of the National Exam questions academic year 2014/2015
The results showed that for the content dimension, 22.5% of the questions were for the numbers domain with 9 items, 27.5% for the algebra domain with 11 items, 40% for the geometry domain with 16 items, and 10% for the data and probability domain with 4 items. It can be seen that the data and probability domain has the lowest percentage while the geometry domain has the highest. The geometry percentage was double the proportion set by the TIMSS taxonomy, the numbers and algebra domains were closer to the standard proportions, while the data and probability domain was still below the specified proportion. This indicates that the mapping of the questions on the content dimension of the Mathematics National Exam for the academic year 2014/2015 was not evenly distributed according to the TIMSS taxonomy.
For the cognitive dimension, more than half of the questions (57.5%) came from the knowing domain with 23 items, followed by 32.5% for the applying domain with 13 items, and 12.5% for reasoning with only 5 items. The knowing domain thus has the highest percentage while the reasoning domain has the lowest.
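The same arithmetic underlies the cognitive-dimension figures: each percentage is the domain's item count over the 40 questions on the paper. A minimal illustrative sketch (the names are the writer's own, not part of the study) comparing these against the TIMSS cognitive targets cited earlier:

```python
# TIMSS cognitive-dimension target proportions (knowing/applying/reasoning)
TIMSS_COGNITIVE_TARGETS = {"knowing": 35, "applying": 40, "reasoning": 25}

# Item counts found for the 2014/2015 paper (40 questions in total)
item_counts = {"knowing": 23, "applying": 13, "reasoning": 5}

total = sum(item_counts.values())  # 40 questions in all
for domain, count in item_counts.items():
    observed = 100 * count / total
    target = TIMSS_COGNITIVE_TARGETS[domain]
    print(f"{domain}: {observed:.1f}% of items (TIMSS target {target}%)")
```

This reproduces the 57.5%, 32.5%, and 12.5% figures above and shows knowing well above, and reasoning well below, the TIMSS proportions.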

An Analysis of the National Exam questions academic year 2015/2016
It was found in this study that approximately 27.5% of the questions were for the numbers domain with 11 items, 27.5% for the algebra domain with 11 items, 30% for the geometry domain with 12 items, and 15% for the data and probability domain with 6 items. It can be seen that the data and probability domain has the lowest percentage while the geometry domain has the highest.

An Analysis of the National Exam questions academic year 2017/2018
The results showed that for the content dimension, about 35% of the questions were for the numbers domain with 14 items, 25% for the algebra domain with 10 items, 27.5% for the geometry domain with 11 items, and 12.5% for the data and probability domain with 5 items. It can be seen that the data and probability domain has the lowest percentage.

An Analysis of the National Exam questions academic year 2018/2019
The findings showed that 30% of the content-dimension questions were for the numbers domain with 12 items, 25% for the algebra domain with 10 items, 30% for the geometry domain with 12 items, and 15% for the data and probability domain with 6 items. For the 2019 National Examination, the government's distribution for algebra material was 22.5% and for geometry 32.5%, which differed from the findings of this study of 25% and 30% respectively. For the 2018 National Examination, the government set algebra material at 27.5% and geometry at 25%, but the findings of this study were 25% and 27.5% respectively. For the 2017 National Examination, the government set algebra material at 25% and geometry at 37.5%, while the findings of this study were 27.5% and 35% respectively. Meanwhile, for the 2016 and 2015 National Examinations, the distribution set by the government matched the findings of this study: for numbers, algebra, geometry, and data and probability, the percentages were 27.5%, 27.5%, 30%, and 15% respectively in 2016, and 22.5%, 27.5%, 40%, and 10% in 2015. Because the government did not set the proportion of questions clearly, it is difficult for both teachers and students to anticipate the questions in the national examination. This may also have confused educators in helping students succeed in the national examination, because the government did not determine the percentage of material in detail, even though material guidelines have been given.
The cognitive levels determined by the government are categorized into three levels: knowing and understanding (25-30%), application (50-60%), and reasoning. However, the analysis showed that the questions were dominated by the knowing domain of TIMSS, which is equivalent to the government's knowledge-and-understanding level. This difference occurred because of differences in perception between the government's cognitive levels and the TIMSS cognitive dimension: some questions that the government classifies at the application level fall, under TIMSS, within the knowing domain, so the analysis showed that the knowing domain dominated the questions. This is shown in the fraction story questions: at the government's cognitive level the questions are considered to be at the application level, while TIMSS categorizes them in the knowing domain. Therefore, this study found differences between what has been set by TIMSS and what has been set by the government. Based on the conclusions above, the writer can provide the following suggestions: 1) For other researchers, this research instrument can be used as a consideration for assessing national exam questions more deeply, as an effort to develop higher-quality questions. 2) For educational evaluators, the results of this study can be used as a reference in determining criteria for developing questions to improve the mathematics achievement of Indonesian students, both domestically and internationally.