A Short Instrument for Measuring Students' Confidence with 'Key Skills' (SICKS): Development, Validation and Initial Results

Despite a shift in the educational landscape towards more competence-based pedagogies that focus on developing students' higher order thinking skills, there remains a dearth of pragmatic measures to evaluate the skills deemed desirable in emerging curricula. The "Student Instrument for measuring Confidence in 'Key Skills'" (SICKS) is based on a pre-existing, teacher-focused instrument. SICKS can be used to assess post-primary students' (ages 12-19) confidence levels across six variables corresponding to what are commonly designated 'key skills': communication, collaboration, critical thinking, creativity and innovation, self-direction, and technology for learning. This paper presents the rationale for the development of the instrument, and a psychometric analysis of its validity. It also reports on a preliminary analysis of the responses of 507 students from 20 schools, demonstrating the power of the instrument to provide insightful information for practitioners and policymakers. Gender differences and disparities between socio-economic groups in confidence levels were revealed, as were the positive implications that increased confidence in key skills may have for students' wellbeing, aspirations, and other experiences in school.


Introduction
The necessity to prepare young people for full participation in a rapidly changing society has been noted by educational policy makers at an international level (Shear, Novais, Means, Gallagher, & Langworthy, 2009), leading to increased emphasis being placed on the development of higher order thinking and 'key skills', also known as '21st Century skills and competencies' (Ananiadou & Claro, 2009). There is a significant body of research that points to the need for '21st Century' methods of teaching and learning (Dede, 2010b; Voogt & Pelgrum, 2005; Voogt & Roblin, 2012), but mainstream institutions have been slow to change their approach (Fullan & Langworthy, 2014). Despite moves to change the curricular focus, the traditional model of schooling, with its exam-driven emphasis on content knowledge, is still prevalent, and does not provide adequate preparation for the modern, knowledge-based society (Claxton, 2013; Fullan & Langworthy, 2014).
A number of reasons have been cited for the lack of uptake of these new pedagogies, including a lack of resources, inadequate professional development, and systemic difficulties relating to curriculum and assessment (Bray, Bauer, & Oldham, 2018; Euler & Maaß, 2011). Teachers are at the frontline of any educational reform, and in order to increase the likelihood that it is enacted as intended, it is important to ensure that practitioners agree with the reasoning behind the reform as well as its implications for their students (Kärkkäinen, 2012). According to Fullan and Langworthy (2014): 'One of the biggest systemic challenges to the spread of the new pedagogies is that they are not yet being measured in any coherent way' (p. 9). The work presented in this paper aims to go some way towards addressing these issues, through the provision of a short, practical scale for measuring students' confidence with six constructs commonly designated as 'key skills', and an exploration of the impact of differing levels of confidence on students' engagement with education.
This paper first provides a literature review that clarifies what we mean by 'key skills', discusses related curriculum changes and provides a foundation for the research aims of this work, examining the need for a scale to measure students' confidence in this area (Section 2). In Section 3, the key concepts underpinning the scale are described along with the process of validation of the scale for use with post-primary students. Section 4 presents the results of analysis of data gathered using the scale, highlighting the relevance of a scale to measure students' confidence with key skills in the current educational landscape.

Literature Review
This literature review addresses three areas. The first aims to outline what is meant by 'key skills' through an examination of various frameworks and meta-analyses. The second examines related changes in curriculum and pedagogy, and the third outlines assessment practices in relation to key skills. Based on this review, an argument for the work presented in this paper is provided.

What are 'key skills'?
While there is no single, globally agreed-upon definition of key skills, there are a number of frameworks that aim to outline the most relevant of the competencies. Such frameworks include the UNESCO four pillars of learning (Delors, 1996), the OECD DeSeCo (OECD, 2005), and the EU Key Competencies for Lifelong Learning (European Commission, 2006), as well as those provided by the Organisation for Economic Cooperation and Development (Ananiadou & Claro, 2009), the Partnership for 21st Century Learning (2015), and Ravitz, Hixson, English, and Mergendoller (2012).
The meta-analyses of international frameworks for key skills provided by Dede (2010a), Voogt and Pelgrum (2005), and Voogt and Roblin (2012) highlight a common recognition of the importance of certain competences. These incorporate traditional academic skills as well as communication and collaboration, problem-solving, creativity, technological fluency, and self-direction (Fullan & Langworthy, 2014; Shear et al., 2009). Such skills are seen as being transversal (not subject-specific) and multi-dimensional (impacting on attitudes and knowledge), and are generally classified as higher-order thinking and learning skills (Voogt & Roblin, 2012).
For the purposes of this study, we combine this broad conception of key skills with the definitions provided by the Irish Junior Cycle 1 curriculum and the assessment framework of Ravitz (2014) described in Section 4. Six distinct key skills have been identified. The six skills, together with descriptions of their scope, are: (1) Critical Thinking (CT): analysis of complex problems, investigation of questions for which there are no definitive answers, evaluation of information sources and use of appropriate evidence to draw conclusions; (2) Collaboration (CO): ability to work together on projects or to solve problems, to work effectively and respectfully in teams to accomplish a common goal, while assuming shared responsibility for the completion of tasks; (3) Communication (CM): ability to organise thoughts, data and findings and to share these effectively through a variety of media, including written reports, oral and digital presentations, film, etc.; (4) Creativity & Innovation (CR): generation of solutions to complex problems or tasks based on analysis and synthesis of information, and the combination or presentation of the results in new and original ways; (5) Self-direction (S): taking responsibility for one's own learning through the identification of topics to pursue and processes for learning, and for reviewing one's own work and responding to feedback; (6) Using Technology for Learning (T): creation of products and management of learning using appropriate digital technologies.

A shifting educational landscape
The broad discourse around a European education that focuses on key competencies or skills can be traced back to the Lisbon Strategy in 2000, which prioritised the establishment of a competence-based education at all levels and forms of public education by 2013 (Rodriguez et al., 2010). This transformed educational context prioritises a model of learning partnerships between and among students and teachers, that aims to develop knowledge, skills and competences that are relevant for the individual in society, and are enabled by ubiquitous digital access (Cort, 2014; Fullan & Langworthy, 2014).
As such ideas have gained traction, various educational systems have responded with attempts to transform their objectives, pedagogies and curricula in order to help students in their development of these key skills (Dede, 2010b). However, given that the curriculum is often already crowded, decisions regarding how to incorporate the new practices are challenging (Dede, 2010a). While there have always been outstanding teachers who were capable of incorporating a skills-based approach into their pedagogy, the challenge for curriculum development is to embed such practices within the design of curricula, in a broad, deliberate, and strategic manner (Dede, 2010a).
At international and European level, policy documents can be seen to recognise the need to embrace key skills within competency-based curricula (Cort, 2014; Drew, 2012; European Centre for the Development of Vocational Training (Cedefop), 2013).

1 https://www.curriculumonline.ie/getmedia/def48e3f-68f9-42e4-95de-f30086321fd0/JSEC_Key_Skills_of-JC_English).pdf

A. Bray, et al. Thinking Skills and Creativity 37 (2020) 100700

This paper has been written within the context of a period of systemic curriculum overhaul at lower secondary school level (age 12-15) in Ireland. One of the key changes illustrated in the statement on principles and implementation of the new Junior Cycle (O'Sullivan, Quinn, Irwin, MacGabhann, & King, 2015) refers to the need to recognise a wide range of learning. This concept forms the heart of the reform measures, and can be elaborated on as the promotion of active and collaborative learning that can facilitate a balance between the development of subject content knowledge and the development of important life skills and thinking abilities (p. i). In order to achieve this, it is recognised that assessment practices will need to change, with a reduction in focus on a terminal examination and the introduction of classroom-based assessments. However, for such a shift in pedagogic focus to be successful, it is essential that teachers understand and are convinced of its benefits.

Assessment of Key Skills
Although references to key skills are now well established in curricular frameworks and policy documents internationally, practical methods to assess them remain underdeveloped (Adamson & Darling-Hammond, 2015). This gap in assessment can be attributed to a number of factors, such as: trying to assess skills that are not yet well understood; an absence of reliable, valid and practical assessment measures; and students and teachers being more attuned to traditional, high-stakes assessments.
The second decade of the 21st Century has seen pockets of innovation in the assessment of key skills (Care & Kim, 2018). Three projects of note that have attempted to measure these skills using innovative pedagogies and digital tools are: the Assessment and Teaching of Twenty-First Century Skills Project (ATC21S); the UNESCO-supported Education Research Institutes Network (ERI-NET) and Network on Education Quality Monitoring in the Asia-Pacific (NEQMAP) Project; and The Organisation for Economic Co-operation and Development (OECD)'s Programme for International Student Assessment (PISA).
During the years 2009-2017, the ATC21S project worked with education systems in Australia, Finland, Portugal, Singapore, England and the USA. They targeted two areas that had not been previously explored for assessment purposes: learning through digital networks, and collaborative problem solving (Griffin, Care, & McGaw, 2012). The project published three volumes of research, the last of which draws attention to the concerns and questions of researchers and practitioners. In particular, it was highlighted that although 'the shift towards twenty-first century skills in national education systems is occurring, [it] is raising implementation issues in terms of teacher education and strategies', and that although new approaches to assessment are being explored, 'transfer of these new approaches remain to transition into the classroom' (Care, 2018, p. 15).
Since 2013, ERI-NET and NEQMAP have been exploring the status and reach of transversal competencies in the Asia Pacific region. Of particular interest is a study that explored the assessment of these skills. Findings revealed that there were high levels of awareness, both at school and policy levels, of the need to assess these skills, but that implementation of such assessment practices was curbed by teachers' lack of understanding of the key skills, as well as by inadequate materials and resources to assess them (Care & Luo, 2016).
In 2015, PISA signalled its recognition of the importance of assessment of key skills, with the administration of an innovative, online assessment of collaborative problem-solving (OECD, 2017). This initiative has garnered a good deal of attention and provided a major impetus for similar initiatives worldwide. However, while current state-of-the-art technology like that used in PISA can capture problem-solving processes such as strategies used, the nature and number of attempts and the time taken, the argument remains that practical, robust and meaningful assessments of the key skills are still some way off (Bennett, 2015).

Research Aims: A need for pragmatic measures
In order to be able to assess whether the curriculum changes are having an impact on the taught curriculum, and thus on students, it is necessary to have reliable and practical tools available for use in authentic school settings. As educational researchers and policymakers explore new and innovative approaches, the instrument developed in this research can be viewed as a tool to create a common understanding and language around the key skills, as well as to illuminate some of the benefits that development of confidence in these areas may have.
It is important to highlight that the SICKS is not intended to be used to measure individual students' varying levels of acquisition of key skills. Rather, it is intended to provide support for academics and practitioners interested in engaging in a critique of current practices, and to explore ways in which those practices can be redesigned to better support the needs of students. From a research perspective, the instrument can be used to gauge the effectiveness of interventions that aim to increase students' confidence with key skills, while from a practitioner point of view, it can be utilised for formative assessment within the classroom, as a tool to engage students in discussion relating to skills, or to inform the development of an assessment rubric as practitioners become more familiar with the types of activities that embody these skills.
The initial requirement for a confidence measure of this type stemmed from a need to evaluate the outcomes of a particular large-scale school intervention. The Trinity Access (TA) programme is a university in-reach programme that works with twenty schools in areas of low progression to Higher Education. TA provides opportunities for professional development to teachers in partner schools, as well as a variety of student programmes. The desired outcomes of student engagement with the programme are multi-faceted and include the development of confidence with key skills. This necessitated the development of a short and practical scale that could measure such constructs and be combined with other scales into a larger overall instrument. Having a measure of this kind has permitted an exploration of the influence that confidence with key skills may have on other aspects of education such as engagement, aspirations and goals, and wellbeing, thereby providing a rationale for teachers to engage with teaching and learning practices that focus on building students' confidence and competence in these areas.
Thus, the aims of the research presented in this paper are to develop a short, valid instrument that can be used by students to self-report on their levels of confidence with key skills, and to explore the value that its use may bring, particularly when it is used in conjunction with other, dependent measures relating to students' experiences in school.
The following sections describe the development, validation and evaluation of the SICKS as a short and reliable measure of students' levels of confidence with key skills. This process involved the steps outlined in Fig. 1.

A framework for measuring key skills
As educational policy has shifted in emphasis towards a more competency-based approach to education, various frameworks have identified and described the desirable skills or competencies, incorporating the ability to think critically and creatively, to collaborate and communicate effectively, to select and utilise suitable information and communications technologies (ICTs), and to use these skills in a self-directed manner.
This shift in focus is reflected in recent Irish post-primary curriculum reforms, particularly with respect to the new Junior Cycle Framework (ages ~12-15), which outlines eight key skills that ought to be prioritised across the curriculum (National Council of Curriculum & Assessment, 2014) (Fig. 2). Having explored the various frameworks outlined in Section 2.1, it was recognised that, for the purpose of this work, the framework provided by Ravitz et al. (2012) was most appropriate. It presents a concise and comprehensive definition of key skills, and provides a questionnaire, validated for use by teachers and practitioners, which measures frequency of implementation of eight specific key skills. The eight skills include the six described in Section 2.1 (critical thinking, collaboration, communication, creativity and innovation, self-direction and using technology for learning), as well as (7) Global Connections (G): understanding global and geo-political issues including the history, politics, geography, culture, and literature of other countries; and (8) Local Connections (L): application of what has been learned within local contexts and communities.
These eight constructs are measured through the use of a 5-point, Likert-type scoring system to generate quantitative data about teachers' frequency of usage of each of the skills. Each set of items is preceded by the question "How often do you ask your students to do the following?". A total of forty-eight items are used in the scales.
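The scoring of Likert-type items into per-construct scale scores can be sketched as follows. This is a minimal illustration with invented item keys and response values, not the actual SICKS items or data:

```python
# Hypothetical sketch of Likert-type scale scoring: each skill score is
# the mean of its constituent 1-5 item ratings. Item keys are invented.

SCALES = {
    "CT": ["ct1", "ct2", "ct3"],  # critical thinking items (illustrative)
    "CO": ["co1", "co2"],         # collaboration items (illustrative)
}

def scale_scores(response):
    """Average the 1-5 ratings of the items belonging to each scale."""
    return {
        skill: sum(response[item] for item in items) / len(items)
        for skill, items in SCALES.items()
    }

response = {"ct1": 4, "ct2": 5, "ct3": 3, "co1": 2, "co2": 4}
scores = scale_scores(response)
print(scores)  # {'CT': 4.0, 'CO': 3.0}
```

Averaging (rather than summing) keeps each scale on the original 1-5 metric, which makes scores comparable across scales with different numbers of items.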
Items from the first six of the skills classified by Ravitz et al. (2012) can be easily mapped onto five of the key skills as defined by the Irish Junior Cycle (Table 1). (It should be noted that the Junior Cycle key skills also include the constructs of numeracy, literacy and staying well, which are not considered in the current framework.) For example, the critical thinking items include: 'Try to solve problems or answer questions that have no single correct solution or answer'; 'Compare information from different sources before completing a task or assignment'; 'Draw your own ideas based on analysis of numbers, facts, or relevant information'; 'Summarize or create your own interpretation of what you have read or been taught'; 'Analyze different arguments, perspectives or solutions to a problem'; and 'Use evidence to develop arguments'; while 'Judge how good and useful online resources are' falls under using technology for learning. The process of mapping the items from the original instrument to the Junior Cycle key skills removed the constructs of Local and Global Connections, reducing the total number of statements from forty-eight to thirty-four.

Instrument Development
In the validity and reliability analysis of teacher data presented in the original paper (Ravitz et al., 2012), factor analysis verified that use of technology for learning and self-direction emerged as discrete factors. However, the constructs of critical thinking, collaboration, communication, and creativity and innovation were less empirically distinct, and while each of these conceptually linked items loaded on their own factor, there were also some overlapping concepts. For example, the item 'invent a solution to difficult problems', while intended to represent creativity and innovation, also loaded onto critical thinking. The authors of the original instrument suggested that there was scope to increase validity by refining and shortening the scales. As our own requirement was for a short student instrument that could be incorporated into a longer survey and still be practical in terms of administration, the research team decided to select items from the original survey that we believed best captured what we were attempting to measure. Additionally, although the original instrument was intended for use as a measure of the frequency of integration of key skills in classrooms, the process of validation that follows explores whether a sub-section of the items can be used in a valid and reliable instrument that measures students' levels of confidence with key skills.
In order to attempt to establish construct and sampling validity, the choice of items for the SICKS was discussed with a panel of experts (Rubio, Berg-Weger, Tebb, Lee, & Rauch, 2003), including academics and practitioners, from the fields of sociology, education and computer science. Having agreed that the selected items were the best choices to capture the desirable concepts, a pilot instrument was developed (see Appendix A for the items selected for inclusion). It is important to note that an exploration of convergent/divergent validity was beyond the scope of this study as multiple measures of the constructs under consideration were not collected (Campbell & Fiske, 1959). This will, however, be considered in future efforts to strengthen the construct validity of the scale.
Once the pilot instrument was agreed upon, it was piloted with a class of twenty grade 10 students (ages 13/14) from a co-educational school that has been designated as 'disadvantaged'. The reason for this was to ensure that the language of the instrument could be easily understood by a group of young people who were roughly representative of our target participants. No issues emerged with the language in the SICKS.

Context and Participants
The development of the Student Instrument for measuring Confidence in Key Skills was carried out as part of a large research and evaluation programme for a major longitudinal educational initiative in Ireland that aims to improve students' active participation in education, and to increase the progression of students from lower socio-economic backgrounds to further and higher education. The SICKS makes up a part of a more comprehensive instrument, designed to gather information on participating students' experiences in school. In addition to data relating to key skills, the other sections of the survey gather information relating to: demographics, subject levels and choices, students' views on their schooling, their understanding of the opportunities available to them, mentoring, leadership, and wellbeing. The intention is to re-administer the survey at a number of points during the project, in order to ascertain changes in students' experiences, attitudes and aspirations as a result of the project.
The survey was created online, and was distributed to twenty-one schools, eighteen in an urban setting (Dublin, Ireland) and three in a rural location (Kerry, Ireland). Twenty of the schools are classified as 'disadvantaged'; the remaining school is a private, fee-paying school and was chosen as a high-socioeconomic status comparison (HSSC). In each school, all grade 10 students were requested to complete the questionnaire. It was recommended that administration of the questionnaire should occur during class time, but if students were absent, it was acceptable for them to complete the survey as homework. A total of 1240 students completed the full survey, with 1013 valid responses in the key skills section. The gender breakdown of respondents was 50% male and 50% female.
In addition to the validation presented herein, the analysis presented in this paper provides baseline information for comparative purposes at a later stage in the overarching project. It has enabled a comparison of confidence levels between genders and school types, as well as the identification of higher confidence levels in certain key skills as significant predictors of other desirable variables.

Validity and Reliability
A cross-validation approach was used to analyse the factor structure of each of the constituent scales of the SICKS. Two random samples of approximately equal size were obtained from the gathered data using the IBM Statistical Package for the Social Sciences 24 (SPSS). Missing values were addressed through a process of multiple imputation (Jakobsen, Gluud, Wetterslev, & Winkel, 2017).
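The cross-validation split can be illustrated with a minimal sketch. The study itself performed this step in SPSS; the pool of respondent IDs below is illustrative:

```python
import random

def split_sample(rows, seed=42):
    """Randomly partition respondents into two roughly equal subsamples:
    one for exploratory and one for confirmatory factor analysis."""
    rows = list(rows)
    random.Random(seed).shuffle(rows)  # fixed seed for reproducibility
    half = len(rows) // 2
    return rows[:half], rows[half:]

# An illustrative pool of 1012 respondents split into two samples of 506.
efa_sample, cfa_sample = split_sample(range(1012))
print(len(efa_sample), len(cfa_sample))  # 506 506
```

Fixing the random seed makes the partition reproducible, which matters when the same split must support both the exploratory and confirmatory stages.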
The first random sample (N = 506) was initially explored to ensure suitability for factor analysis: the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy was 0.913, exceeding the recommended minimum value of 0.5, and Bartlett's test of sphericity was highly significant (p < 0.001); these results led us to conclude that exploratory factor analysis (EFA) was appropriate (Field, 2009).
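Bartlett's test of sphericity checks whether the correlation matrix differs from the identity, a precondition for factor analysis. A sketch of the standard formula follows, on simulated (not the study's) data; the KMO statistic additionally requires partial correlations and is omitted here:

```python
import math
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(data):
    """Bartlett's test of sphericity: chi-square statistic and p-value
    testing whether the correlation matrix is an identity matrix."""
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    statistic = -(n - 1 - (2 * p + 5) / 6) * math.log(np.linalg.det(corr))
    df = p * (p - 1) / 2
    return statistic, chi2.sf(statistic, df)

# Simulated, strongly correlated items: the test should be significant.
rng = np.random.default_rng(0)
base = rng.normal(size=(500, 1))
data = np.hstack([base + 0.5 * rng.normal(size=(500, 1)) for _ in range(6)])
stat, p_value = bartlett_sphericity(data)
print(p_value < 0.001)  # True
```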
The Principal Axis Factoring (PAF) extraction method with Promax rotation was used to identify the underlying factor structure of each of the key skills scales. This extraction method was selected as levels of skewness of the data were outside the recommended levels of ±1.96 for some items. A Promax rotation was utilised as the items were not expected to be orthogonal; scrutiny of the factor correlation matrix confirmed that this was the case (Brown, 2009; Field, 2009; Worthington & Whittaker, 2006). Through the process of EFA, six factors were extracted, with the rotated solution (Fig. 3) explaining a total of 67% of the variance. The number of factors was confirmed by the scree plot.
The factors 1 to 6 relate to using technology for learning, collaboration, creativity and innovation, critical thinking, self-direction and communication, respectively.
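The proportion of variance captured by the leading factors can be approximated from the eigenvalues of the correlation matrix, the same quantities inspected in a scree plot. This is a simplified principal-components view on simulated two-factor data, not the PAF/Promax solution used in the study:

```python
import numpy as np

def explained_variance(data, n_factors):
    """Share of total variance captured by the leading eigenvalues of the
    correlation matrix (the values plotted in a scree plot)."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending order
    return eigvals[:n_factors].sum() / eigvals.sum()

# Simulate 8 items driven by 2 latent factors plus noise.
rng = np.random.default_rng(1)
latent = rng.normal(size=(400, 2))
loadings = rng.normal(size=(2, 8))
data = latent @ loadings + 0.3 * rng.normal(size=(400, 8))

share = explained_variance(data, 2)
print(round(share, 2))  # most variance sits in the first two factors
```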
Cronbach's alpha scores were calculated for each of the scales, in order to verify their internal consistency. The six scales were found to have high levels of reliability, with alpha coefficients of 0.8 and above.
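Cronbach's alpha can be computed directly from item scores; a pure-Python sketch with toy data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a set of items.
    items: list of equal-length lists, one list of scores per item."""
    k = len(items)

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sums
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Perfectly consistent toy items yield an alpha of 1.
items = [[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]
alpha = cronbach_alpha(items)
print(round(alpha, 3))  # 1.0
```

Real item sets are never perfectly consistent, so observed alphas sit below 1; the 0.8 threshold reported above is a common benchmark for good internal consistency.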
The results of the EFA indicated the presence of six distinct factors relating to the key skills. In order to verify the factor structure, the second random sample (N = 506) was used to conduct a confirmatory factor analysis (CFA) using SPSS AMOS Graphics software. A chi-square test, the root-mean-square error of approximation (RMSEA), the comparative fit index (CFI), and the Tucker-Lewis index (TLI, sometimes known as the non-normed fit index, NNFI) were used to assess goodness-of-fit of the model (Fig. 4). The overall fit of the six-factor model was good. While the chi-square test was significant, it is recognised that this test should not serve as the sole basis on which to judge goodness of fit, as it is highly sensitive to sample size and skewness (Schermelleh-Engel, Moosbrugger, & Müller, 2003). In fact, Wheaton, Muthen, Alwin, and Summers (1977) suggest that, for large sample sizes, values of the relative chi-square (chi-square/df) less than 5 are reasonable, particularly when considered in relation to other, descriptive goodness-of-fit indices. The values of the goodness-of-fit indices considered appropriate for this study are outlined in Table 2 (Hu & Bentler, 1999; Schermelleh-Engel et al., 2003; Wheaton et al., 1977).
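The descriptive fit indices named above can all be derived from the model and baseline chi-square statistics. The formulas below are the standard ones, but the numeric inputs are invented for illustration and are not the SICKS results:

```python
import math

def fit_indices(chi2_model, df_model, chi2_null, df_null, n):
    """RMSEA, CFI and TLI computed from the fitted model's chi-square
    and the baseline (independence) model's chi-square."""
    rmsea = math.sqrt(max(chi2_model - df_model, 0) / (df_model * (n - 1)))
    cfi = 1 - max(chi2_model - df_model, 0) / max(
        chi2_null - df_null, chi2_model - df_model, 1e-12)
    tli = ((chi2_null / df_null) - (chi2_model / df_model)) / (
        (chi2_null / df_null) - 1)
    return rmsea, cfi, tli

# Invented chi-square values for a sample of 506 (illustrative only).
rmsea, cfi, tli = fit_indices(900.0, 500, 9000.0, 550, 506)
print(round(rmsea, 3), round(cfi, 3), round(tli, 3))
```

With these illustrative inputs the indices fall within conventionally acceptable ranges (RMSEA below 0.06, CFI and TLI above 0.9), matching the kind of cut-offs tabulated in Table 2.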
The CFA process outlined above permitted identification of the underlying structure in the data, and confirmation that the items used in SICKS can be used to measure students' confidence with the six key skills. Inferential analysis was then conducted on the second sample in order to assess the impact of confidence levels in key skills on students' experiences in school.

Findings
In order to explore possible factors that might account for some of the variance in students' levels of confidence with key skills, and to ascertain relationships between such confidence levels and students' experiences in school, this section first provides a descriptive overview of the data before looking at analysis of variance by gender and school type, and finally exploring relationships through correlation and regression analysis.

Descriptive statistics
General descriptive statistics indicate that across all of the six scales, the mean confidence levels fall between neutral (3) and confident (4). It is clear from Table 3 that mean levels of confidence are highest with respect to using technology for learning, and lowest in relation to communication.

Analysis of variance across gender
Without exception, the values of the dependent variables of confidence with each of the key skills were all within the acceptable range (±1) of skewness and kurtosis for an approximately normal univariate distribution (Darren & Mallery, 2016). Thus, independent samples t-tests were used to explore how much of the variation in the means of each of the scales can be associated with the independent variable of gender.
The independent samples t-tests indicated that gender had a significant impact on four of the six confidence scales, with male students having significantly higher levels of confidence in relation to the constructs of communication, creativity, critical thinking, and the use of technology for learning than females. No significant differences were identified in relation to gender and confidence with collaboration or self-direction (Fig. 5).
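An independent samples t-test of this kind can be run with SciPy. The scores below are invented for illustration and are not the study's data:

```python
from scipy.stats import ttest_ind

# Hypothetical scale scores (1-5) for two gender groups.
male = [4.1, 3.8, 4.4, 3.9, 4.2, 4.0, 4.3, 3.7]
female = [3.5, 3.2, 3.8, 3.4, 3.6, 3.3, 3.7, 3.1]

# Two-sided test of the difference in group means.
t_stat, p_value = ttest_ind(male, female)
print(p_value < 0.05)  # True: a significant difference on this toy sample
```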

Analysis of variance across school types
The second random sample in the study included students from 18 urban schools (N = 453), 3 in a rural setting (n = 24), and 1 high-socioeconomic status comparison school (HSSC; n = 29). In order to provide an indication of the variance across the three types of school, use was made of one-way, between-subjects ANOVAs on each of the six key skills scales, with Bonferroni tests identifying where any significant differences lay. Table 4 presents the mean ratings of each of the six SICKS scales, as well as the ANOVA F- and p-values.
In relation to each of the key skills, no significant differences were identified between the rural and urban cohorts from disadvantaged schools (Fig. 6). However, the HSSC cohort reported significantly higher levels of confidence than rural participants in communication and technology, higher than urban participants in relation to creativity and critical thinking, and higher than both of the other groups with respect to collaboration and self-direction.
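The one-way ANOVA with Bonferroni-corrected follow-up comparisons can be sketched as follows, again with invented cohort scores rather than the study's data:

```python
from itertools import combinations
from scipy.stats import f_oneway, ttest_ind

# Invented confidence scores for three school-type cohorts.
groups = {
    "urban": [3.4, 3.6, 3.5, 3.3, 3.7, 3.5],
    "rural": [3.5, 3.4, 3.6, 3.3, 3.5, 3.4],
    "hssc":  [4.2, 4.4, 4.1, 4.3, 4.5, 4.2],
}

# Omnibus test: do the group means differ at all?
f_stat, p_value = f_oneway(*groups.values())
print("ANOVA significant:", p_value < 0.001)

# Bonferroni correction: divide alpha by the number of pairwise tests.
pairs = list(combinations(groups, 2))
alpha = 0.05 / len(pairs)
for a, b in pairs:
    _, p = ttest_ind(groups[a], groups[b])
    print(a, "vs", b, "significant:", p < alpha)
```

On this toy data only the comparisons involving the "hssc" group reach significance, mirroring the pattern of results described above.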

Key skills as predictors
Correlations, linear regressions and multiple linear regressions were performed in order to identify whether levels of confidence with the key skills could be used as predictors for changes in students' scores on the eight dependent variables of self-worth, active engagement with learning, sense of purpose in education, student-teacher relationship, student voice, aspirations and goals, engagement with the community through education, and well-being. Each of the dependent variables was measured using validated, Likert-type scales. The well-being scale was measured from 1 to 7 and all of the other variables were measured on scales of 1 to 5.

Correlations
In order to apply linear and multiple linear regression to a dataset, the independent variables must have a linear relationship with the dependent variable. In this case, each of the dependent variables was significantly positively correlated with each of the independent variables (confidence with key skills). This indicates that as confidence with each of the key skills increases, so too do students' levels of self-worth, active engagement with learning, sense of purpose in education, student-teacher relationship, student voice, aspirations and goals, engagement with the community through education, and well-being.
The magnitude, or strength, of the association in the majority of cases is moderate (0.3 < |r| < 0.6), with 2 of the dependent variables exhibiting weak correlations (0 < |r| < 0.25) with some of the key skills (student-teacher relationship with collaboration, communication, creativity, critical thinking and technology for learning, and student voice with collaboration, communication and technology for learning).
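A Pearson correlation of this kind can be computed with SciPy; the paired scores below are invented for illustration:

```python
from scipy.stats import pearsonr

# Invented paired scores: key-skill confidence vs. aspirations and goals.
confidence = [2.8, 3.1, 3.4, 3.0, 3.9, 4.2, 3.6, 4.0]
aspirations = [3.0, 3.3, 3.5, 3.1, 4.0, 4.4, 3.6, 4.1]

# r measures the strength of the linear association; p tests r != 0.
r, p_value = pearsonr(confidence, aspirations)
print(round(r, 2))  # a strong positive association on this toy sample
```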

Simple linear regressions
After establishing that all necessary data assumptions were met, simple linear regressions were conducted for each pairing of the dependent and independent variables. In each case (48 regressions in total), the independent key skills variables were identified as statistically significant predictors of the dependent variables. To provide an example of this, an overview of the relationship between the key skills and the dependent variable of aspirations and goals is presented (Table 5). A detailed breakdown of the relationship between aspirations and goals and the predictor (independent) variable of self-direction is also provided for illustrative purposes.

Example 1. Confidence with self-direction as a predictor of aspirations and goals.
The level of confidence with practices relating to self-direction statistically significantly predicted the level of a student's aspirations and goals, F(1, 996) = 271.720, p < .001, with an adjusted R² of .219. According to this model, the level of confidence with self-direction accounted for roughly 22% of the variance in levels of aspirations and goals. Participants' predicted level on the aspirations and goals scale is equal to 2.591 + 0.445(confidence with self-direction score). That is, a one unit increase on the confidence with self-direction scale leads to an increase of 0.445 on the aspirations and goals scale.
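The simple regression reported above can be reproduced in outline. The sketch below fits a closed-form ordinary least squares line (with R² and the adjusted R² used in the paper's reporting) and applies the published coefficients (intercept 2.591, slope 0.445) to predict an aspirations-and-goals score; the helper names are illustrative only:

```python
def simple_ols(x, y):
    """Closed-form OLS fit of y = a + b*x; returns (a, b, R^2, adjusted R^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1 - ss_res / ss_tot
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - 2)  # one predictor, so p = 1
    return a, b, r2, adj_r2

def predict_aspirations(self_direction_score, a=2.591, b=0.445):
    """Apply the model reported in Example 1 to a self-direction score (1-5)."""
    return a + b * self_direction_score
```

For example, a student scoring 3 on the self-direction scale would be predicted 2.591 + 0.445 × 3 = 3.926 on the aspirations and goals scale.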

Multiple linear regressions
Multiple Linear Regressions were performed to test whether the 6-scale model of confidence with key skills had a significant bearing on students' levels of self-worth, active engagement with learning, sense of purpose in education, student-teacher relationship, student voice, aspirations and goals, engagement with the community through education, and well-being.
In each case, analysis showed the 6-scale model to be a significant predictor of students' levels in the dependent variables, with a greater effect size than any of the R² values calculated for the individual linear regression models (Table 6). For each of the dependent variables, between two and five of the key skills were found to be statistically significant predictors. Once again, the relationship with the dependent variable of aspirations and goals is expanded upon as an example.
Example 2. Key skills as predictors of aspirations and goals.
Upon fitting a multiple regression model with aspirations and goals as the dependent variable and the six key skills as independent variables, it was found that three of the key skills variables could be considered statistically significant predictors. The results of the regression indicated that the model explained 29% of the variance (R² = .29, F(6, 946) = 64.424, p < .001). This is a greater effect size than any of the R² values calculated for the individual linear regression models. The key skills that significantly predicted aspirations and goals scores were confidence in self-direction (β = .270, p < .001), critical thinking (β = .111, p = .001), and technology (β = .160, p < .001), with the β value indicating the increase on the aspirations and goals scale given a 1-point increase on the associated key skill scale.
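For readers who wish to replicate the multiple-regression step, the following is a minimal sketch that solves the normal equations for an intercept plus several predictors in pure Python. In practice a statistics package would be used, and would additionally report standard errors and p-values; all names here are illustrative:

```python
def multiple_ols(X, y):
    """OLS for y = b0 + b1*x1 + ... + bk*xk via the normal equations.
    X is a list of rows of predictor values; y is the list of outcomes.
    Returns [intercept, b1, ..., bk]."""
    # Augment each row with a leading 1 for the intercept term.
    A = [[1.0] + list(row) for row in X]
    n, k = len(A), len(A[0])
    # Build X^T X and X^T y.
    xtx = [[sum(A[r][i] * A[r][j] for r in range(n)) for j in range(k)]
           for i in range(k)]
    xty = [sum(A[r][i] * y[r] for r in range(n)) for i in range(k)]
    # Solve by Gaussian elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, k))) / xtx[r][r]
    return beta
```

In the study's setting, each row of `X` would hold a student's six key-skill confidence scores and `y` the corresponding dependent-variable scores.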
It is clear from these data that increasing students' levels of confidence with the key skills is likely to have a positive effect on their experiences in school.

Discussion
The picture that has emerged as a result of the data analysis in this paper indicates that the stereotypical disparity between the confidence levels of male and female students persists in many areas (Beyer & Bowden, 1997; Jakobsson, Levin, & Kotsadam, 2013), with male students reporting higher confidence with communication, creativity, critical thinking, and using technology for learning. Statistically significant differences were also recorded in each of the key skills constructs between students from the schools in disadvantaged areas and those from the school with high progression rates to post-secondary education, with low-SES students scoring lower on all scales.
The analysis presented in this paper also draws attention to the strongly positive relationship between high levels of confidence with the key skills and higher post-secondary school aspirations, greater levels of well-being, and more active engagement with education, among other outcomes. This should serve to emphasise the importance of bolstering students' confidence in these areas, and provides a strong argument for teachers to engage with pedagogies that promote confidence and competence with these skills.

Limitations and future work
As with any research, there are limitations to this study. In relation to the development of the instrument, we have not been able to establish divergent/convergent validity by comparing the instrument to others measuring apposite variables. The reason for this is two-fold: firstly, the rationale for the development of the instrument was the dearth of existing scales that measure relevant constructs; and secondly, from a practical perspective, as the SICKS makes up a part of a larger instrument, it would not be feasible to add to that in order to collect data using various scales that measure similar constructs.
In relation to the findings outlined in this paper, it should be noted that this research provides the foundation for a larger study (as noted in Section 2.4), in which the relationships between the variables will be explored in much greater depth. The data and analysis presented herein have permitted the validation of an instrument that is now being used to collect data from students across 20 schools as part of a longitudinal study. The findings have allowed us to establish hypotheses in relation to the positive impact that higher levels of confidence with key skills may have on students' experiences in school, and to begin to identify practices and approaches that should encourage their development. Our intention is to further develop this research in order to provide teachers with a rationale for engaging with curriculum reform that prioritises these skills, as well as the foundations of rubrics that could be used to explore and assess the development of these key competencies in their students.

Conclusion
Traditional assessment practices have primarily focused on the measurement of students' fluency with routine skills and their ability to reproduce subject content (Dede, 2010a), and this has, without doubt, had an impact on classroom practice. In order to move towards a competency-based curriculum, it is fundamental that practitioners understand and agree with the reasoning behind the reform, its implications for their practice, and the consequences for their students (Kärkkäinen, 2012). To achieve this, it is important that practitioners are provided with robust evidence that clarifies the potential benefits of the curriculum for their students, as well as furnishing them with the language and tools to understand and assess the objectives of the reform.
One of the primary challenges to the implementation of the objectives of the competency-based curriculum is that traditional, high-stakes testing does not generally assess key skills and competences (Fullan & Langworthy, 2014). Thus, there is a need to align our assessment processes with the objectives of the reformed curricula (Dede, 2010a; Voogt & Roblin, 2012). As we reach the beginning of the third decade of the 21st Century, it is clear that progress has been made in relation to assessment of key skills. However, to date this progress has been primarily theoretical and conceptual; there is an urgent need to move toward more pragmatic solutions to the teaching and assessment of key skills (Care, 2018). As the stakeholders in educational assessment chart these new territories, the SICKS provides an easy-to-understand framework to measure and compare confidence with these skills. This instrument could function as a starting point for the development of practical assessment methods and resources to support practitioners and students as they await the transition of more cutting-edge assessment methods, such as those in PISA 2015, into the classroom.
Policy documents, curricula, and society are all calling for the development of the kinds of competencies described in this paper, and yet, in an education system that is based on high-stakes terminal examinations, it is difficult to convince teachers and students that these are highly prized skills. It is essential that we provide evidence of the value of these skills within the context of the classroom, and that we move towards some form of assessment of such competencies. There is a requirement for a variety of measures that can be used to evaluate students' development of key skills, and the SICKS has the potential to play an important part in this. The instrument can be used by students to self-report on their confidence with the key skills, and this, in conjunction with other formative and summative measures, can be used to develop a clear picture of students' progress in these areas. It is our view that the SICKS offers a common language for practitioners and students to develop their understanding of the kinds of activities that reflect the key skills, and it can therefore form the basis for an assessment rubric that could be negotiated between teacher and students. It is clear from the analysis emerging from this research that the value of an educational approach with the potential to increase students' confidence with key skills cannot be overestimated. Indeed, a system of education that not only values, but teaches, the key skills intentionally and well can only have a positive impact on our society.

Declaration of Competing Interest
None.

Appendix A. A Short Instrument for Measuring Students' Confidence with 'Key Skills' (SICKS): Development, Validation and Initial Results
Each item is scored using a Likert-type scale (1-5): Strongly disagree, Disagree, Don't know, Agree, Strongly agree.
• Working with others - collaboration. How confident are you to:
o Work in pairs or small groups to complete a task together
o Work with other students to set goals and create a plan for your team
o Create joint products using contributions from each student
• Communication. How confident are you to:
o Communicate your ideas using media other than a written paper (e.g., posters, video, blogs, etc.)
o Use technology to work in a team (e.g., shared work spaces, email exchanges, giving and receiving feedback, etc.)
o Use technology to keep track of your work on assignments
o Use technology to help to share information (e.g., multi-media presentations using sound or video, presentation software, blogs, podcasts, etc.)
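The appendix does not prescribe a scoring procedure for the instrument. A minimal sketch, assuming the common approach of scoring each key-skill scale as the mean of its item codes (1-5), might look like the following; the names are illustrative only:

```python
# Assumed Likert codes: 1 = Strongly disagree ... 5 = Strongly agree.
LIKERT = {"strongly disagree": 1, "disagree": 2, "don't know": 3,
          "agree": 4, "strongly agree": 5}

def scale_score(item_responses):
    """Score one key-skill scale as the mean of its item codes (range 1-5)."""
    codes = [LIKERT[r.lower()] for r in item_responses]
    return sum(codes) / len(codes)

# e.g. responses coded 4, 5 and 3 average to a scale score of 4.0
```

A mean-of-items score keeps each skill on the same 1-5 metric as the individual items, which is consistent with how the confidence scales are described in the paper.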