Analytics for Action: Assessing effectiveness and impact of data-informed interventions on online modules

Investigating the effectiveness of learning analytics is a major topic of research, with a recent systematic review finding 689 papers in this field (Larrabee Sonderlund et al., 2019). Few of these (11 out of 689) highlight the potential of interventions based on learning analytics. The Open University UK (OU) is one of the few institutions to systematically develop and implement a learning analytics framework at scale. This paper reviews the impact of one part of this framework, the Analytics for Action (A4A) process, focusing on the 2017-18 academic year and reviewing both feedback from module teams and interventions coming out of the process. The A4A process includes hands-on training for staff, followed by data support meetings with educators while the course is live to students. The aim is to help educators make informed, evidence-based interventions to aid student retention and engagement. Findings from this study indicate that participants are satisfied with the training and that the data support meetings are helping to provide new perspectives on the data. The scope and nature of actions taken by module teams vary widely, ranging from no intervention at all to interventions spanning multiple presentations. In some cases, measuring the impact of the actions taken will require data analysis from further presentations. The paper also presents findings indicating room for improvement in the follow-up of the agreed actions, the support given to module teams to implement such actions, and the final evaluation of impact on student outcomes.

The Open University (OU) is the largest university in the UK, offering its students high-quality higher education via distance learning. Since its creation in 1969, over 2 million students from 157 countries worldwide have registered to study at the OU.
The OU offers undergraduate and postgraduate degrees. All degrees, except some research doctorates, are studied in the distance learning modality. The curriculum is organised into modules (courses).
Each module is produced and managed by a multidisciplinary team, led by an academic leader: the Module Team Chair (MTC). This team is known as the Module Team (MT). A typical undergraduate module is worth 30 or 60 credits, and a typical Bachelor's degree is awarded when the student has completed 360 credits. All module content and activities are available online via the OU's Virtual Learning Environment (VLE), which is a customised version of Moodle. Some modules still provide printed material, but this content is always also available online. Modules are assessed using a combination of quizzes, interactive Computer-Marked Assessments (iCMAs), Tutor-Marked Assessments (TMAs), End-of-Module Assessments (EMAs), projects and exams.
The OU has been systematically using learning analytics to improve students' outcomes since at least 2014, when the OU initiated its Learning Analytics programme. One of the main components of the programme was the Analytics for Action (A4A) process, which promoted the systematic collection and analysis of data with the objective of improving the design of the University's modules and, subsequently, students' outcomes, using the A4A evaluation framework to structure the process (Rienties et al., 2016). After a two-year pilot, a decision was made to mainstream the A4A approach into business-as-usual activity in the 2016-17 academic year. The Learning Design team (LDT) was selected to run the process as the team members had expertise in Learning Design, were familiar with the data and, in most cases, had already worked on the design of the participating modules.
The A4A process included the provision of training for participating staff and a series of data support meetings (DSMs) among academics, support staff, data analysts and learning designers. At these meetings the available data were reviewed, and specific actions were agreed to address the issues found. The training covered the basics of the A4A framework and the use of the basic data tools.
In a typical data support meeting, the data reviewed include the Key Performance Indicators (KPIs), the profile and study record of the students registered on the module, the assessment submissions and results, the retention and withdrawal data, and the students' interaction with the VLE. Additional data could also be included for discussion at the meeting. For each of these meetings, the LDT prepared an analysis of the data, and a comprehensive report was circulated afterwards. These reports contained a summary of the data and the discussions, plus recommendations and possible actions for both the MT and the LDT.
In the 2017-18 A4A cycle, the LDT provided support to 49 modules across all faculties, reaching over 35,000 students. This represented an increase of 69% in the number of modules and 40% in the total student population reached compared to the previous cycle. A total of 136 module support meetings were held, and 20 training sessions were delivered to a total of 128 staff. Of the 49 modules included in A4A, 43 were offered three DSMs during their presentation. The remaining six modules were offered an alternative 'light touch' process. As no formal records are kept from these surgery-style sessions, the data from participating modules in that modality are not included in this report.

LITERATURE REVIEW
There is a wealth of studies looking at learning analytics in its broadest sense. Furthermore, it is now embedded in the plans of numerous higher education institutions worldwide. A recent systematic review (Larrabee Sonderlund et al., 2019) found 689 papers relating to the effectiveness of learning analytics. The same review found only 11 that highlighted the potential of interventions. As noted in the review, each of the final papers analysed more closely takes a similar approach of using analytics to identify at-risk students and to disseminate that information to students and tutors (Larrabee Sonderlund et al., 2019).
The approach taken at the OU toward ongoing analysis and developing interventions is based on the Analytics for Action framework (Rienties et al., 2016) and utilises the Community of Inquiry methodology, initially developed by Garrison et al. (2000; 2007), as a guiding principle for categorising types of intervention.
Looking more closely at the existing literature around learning analytics programmes, there are investigations of impact at many levels and with differing results. Drachsler et al. (2014) look at the impact across the whole Dutch education system, whereas others such as Dawson et al. (2017) examine the impact of a specific learning analytics programme on student retention, using a predictive model to identify at-risk students and to make supportive interventions. Their work found a positive association between the intervention and retention, but statistical methods found low to no effect of the intervention. A study by Kostagiolas et al. (2019) undertook a survey of students at a Greek university to explore the relationship between student satisfaction, self-efficacy and retention. The work found a correlation between student satisfaction, self-efficacy and student retention, whilst also evaluating how academic information resources fulfil student information needs. Coming back to the OU context, a further study by Rienties and Toetenel (2016) also used learning analytics to analyse the impact of different Learning Design approaches on student retention, indicating that student behaviour was strongly predicted by the learning design of the course and that communicative activities and social learning were particularly strong predictors of student success.
A number of other studies have been successful in finding a link between specific interventions at course level and improved retention or student performance (Fritz, 2011; Kim et al., 2016; Lu et al., 2017). Lu et al. (2017) found 17.4% better performance from an experimental group, where instructors were receiving analytics reports to inform their advice to students, than from the control group, where no such reports were provided. The study by Kim et al. (2016) investigated student use of learning analytics dashboards and found that lower-performing students were more motivated by their use of the dashboard than higher-performing students. Finally, Fritz (2011) evaluated a "check my activity" (CMA) tool enabling students to compare their LMS engagement with that of other students, and found that 91.5% of students used CMA at least once; compared to students who did not use the tool throughout the semester, these students were 1.92 times more likely to earn a C or above (Fritz, 2011).
As Rienties et al. (2016) flag in their paper about three case studies of learning analytics interventions, one of the largest challenges for the field of learning analytics research and practice is how to put the power of learning analytics into the hands of teachers and administrators. This points to the question of adoption at both institutional and practitioner level. Ferguson et al. (2015) identify that analytics implementation requires a change of practice across educators, learners, support staff, library staff, administrators and IT staff. The study also links back to findings from 40 years ago highlighting that "Researchers should get clients politically, emotionally, and financially committed to the outcome of the research. They are then more likely to take notice of its results" (McIntosh, 1979, cited in Ferguson et al., 2015). Dawson et al. (2018) unpick this further by drawing on complexity theory, seeing the need for institutions to implement learning analytics with an awareness both of the complexity of the institution and of the change to be brought about by implementing learning analytics.

Aim, objectives and research questions
Our objective with this review was to answer the following three research questions:
• RQ1. Are the DSMs matching the expectations of the faculty staff involved in the process?
• RQ2. Are faculty staff satisfied with the content and delivery of the training sessions?
• RQ3. Is there any measurable impact of the actions taken after advice provided to faculty staff at the DSMs?

Methodological approach
In order to answer the research questions above, three key activities were undertaken to provide the required evidence. For the first question (RQ1), relating to the expectations of faculty staff for the DSMs: to evaluate whether the meetings matched the expectations of faculty staff, we invited the faculty staff involved (usually the MTC and Curriculum Manager) to complete an anonymous online survey after the final support meeting. We received a total of 17 responses to this online survey; the respondents comprised 13 MTCs and 4 Curriculum Managers. The online survey included Likert scale questions as well as free text answers.
For the second question (RQ2), relating to satisfaction with the training sessions: to evaluate the quality and pertinence of the training sessions, we asked the trainees to complete a questionnaire at the end of each session. This questionnaire contained 10 Likert scale questions (where 1 = totally disagree and 5 = strongly agree) and two free text response questions. We received and analysed 106 responses.
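As an illustration of this kind of questionnaire analysis, the summary statistics reported later (average score and proportion agreeing) can be computed directly from raw Likert responses. This is a minimal sketch with invented example data; it is not the actual survey data or the analysis code used by the LDT:

```python
# Summarise one Likert-scale question (1 = totally disagree ... 5 = strongly agree):
# mean score and percentage of respondents choosing "agree" (4) or higher.
from statistics import mean

def summarise_likert(responses):
    """responses: list of integer ratings 1-5 for a single question."""
    return {
        "n": len(responses),
        "mean": round(mean(responses), 2),
        "pct_agree": round(100 * sum(r >= 4 for r in responses) / len(responses), 1),
    }

# Illustrative ratings only, not real survey responses.
q_example = [5, 4, 4, 5, 3, 5, 4, 4]
print(summarise_likert(q_example))
```

Running the same summary per question would reproduce the per-question averages and agreement percentages of the kind shown in Tables 1 and 3.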
For the third question (RQ3), relating to the impact of actions taken following DSMs: after each DSM, the LDT circulated a full report that included the data covered at the meeting, the discussions about the data in the context of the module's performance, and the actions agreed. The actions were allocated to either the MT or the LDT.
For a sample (7 out of 43, all from the STEM Faculty) of the participant modules we reviewed the meeting reports and identified the actions agreed.We then asked MT members whether these actions were taken and reviewed the available data in search of any measurable impact.

RESULTS AND DISCUSSIONS
For the DSMs: A total of 136 DSMs were held in 2017-18. A large proportion of the second meetings needed to be rescheduled due to strike action at the OU. Whilst all meetings were successfully rescheduled, the knock-on effect of these delays meant that the third and final meeting happened much later than originally planned, affecting the chances of introducing changes within the presentation.
Most attendees were satisfied with the data support meeting provision. The attitude and knowledge of the facilitators were highly regarded. No respondent expressed dissatisfaction with the meetings, and only two provided a 'neutral' response.
While 88% of all respondents agreed the facilitators provided clear interpretation of the data, this figure dropped to 76% when asked about identifying actions.
Table 1 shows the answers to the Likert scale questions (Q1, Q2, Q3 and Q5) in the survey (Table 1: Proportion of respondents that strongly agree/agree with the statements quoted). Question 4 was a free text answer relating to whether new perspectives on the data were identified in the interpretation by the meeting facilitators. Table 2 shows the positive and negative comments on the data interpretation provided by the facilitators (Table 2: Proportion of positive/negative comments on data interpretation from all respondents). The survey also included two questions (Q6 and Q7) with open/free text answers, which asked about the aspects of the meetings that worked well and what could be improved.
Q6. What did you like about the support meeting? When asked this question, attendees made comments related to:
a) Facilitators' attitude and knowledge (10 comments). For example, "Clear presentation of data, clear understanding of what data sources meant, everyone on similar page as to what we're trying to achieve, willingness to go beyond standard analytics for us".
b) The time and space to review the data (4 comments). For example, "The meetings were a good place to sound out ideas and different theories as to why students withdraw or don't progress as we expected. The atmosphere was one of learning by doing, and by learning together with colleagues who were supportive" and "They provided clear guidance on the use of analytics that can be used to improve modules".
c) Data provision and visualisation (4 comments). For example, "They took on board our queries and found ways of reporting back at the next session with additional information, very helpful" and "Nice to see some visualisations of the data".
Q7. What could we do to improve the support meetings? When asked this question, the attendees made comments related to:
a) Data systems and data provision (4 responses). For example, "The technology didn't work all the time, so in the meeting the analytics SAS website went down. Some meetings were joined on-line, and it was difficult to see the 'live' data remotely".
b) Meeting preparation/customisation (4 responses). For example, "In some cases I felt a bit rushed and wished we had more time to look over and analyse the results and trends, but I do respect the idea that it is difficult to get all these meetings into the presentation diary. Having more cross-referenced data would be great - for example knowing all the characteristics of students likely to not succeed (e.g. who are our target groups for support?) could be really useful, especially at the front end of the module. Also knowing when and how to share findings with our tutors could also be considered - we need to progress in that area if we can".
c) Follow-up (3 responses). For example, "I also felt that we sometimes left the meeting without a clear plan of what we were going to do. Obviously, there wasn't time to cover that in the meeting, but follow-up meetings between ourselves to discuss actual changes should have been built in to the approach as a 'requirement' for participation. Obviously, it was up to (us to) put this as an agenda on our own meetings, but these were not necessarily at a good time to fit in with the data support meetings".
d) Nothing to improve (3 responses). For example, "Can't think of anything. Facilitators were open to suggestions and followed up with actions after each meeting".
e) Institutional constraints (2 responses). For example, "Perhaps make a bigger deal out of them, e.g. promote them a bit more with MT".
f) Facilitators' knowledge/attitude (1 response). For example, "There were some aspects that the facilitators were not clear about, for example, from what point the retention data was calculated. They also had some preconceptions about things, e.g. that a lot of participation in the Student Forum was a good thing, when, in fact, more participation was usually related to issues and problems - students have a tutor group forum for interaction with each other as well as Facebook and other self-initiated groups for mutual support".
For the training sessions: a total of 128 staff attended the 20 regular A4A training sessions between October 2017 and July 2018. From October to December there were weekly sessions exclusively for MT members of those modules selected for A4A, followed by bi-weekly sessions open to all staff. The training offered trainees the opportunity to use the data tools on live, current data related to any specific module of interest. The sessions were restricted to a maximum of 12 trainees per session, and the ratio of trainees to trainers was kept to a maximum of 6:1. Table 3 shows the average score for each question and the percentage of respondents that totally agreed/agreed with each question statement (Table 3: Average scores and percentage of trainees totally agreeing/agreeing with the corresponding statements, for all respondents). Additionally, 8.6% provided a neutral response for Q10. No trainee expressed dissatisfaction with the training sessions. The trainers' attitude and the instructions they provided (Q8 and Q9) were highly regarded, with at least 92.5% of trainees agreeing with the corresponding statements and average scores of 4.46 (89.2%) for Q8 and 4.54 (90.8%) for Q9.
Trainees found the data tools easy to use and reported that they could get the tools to do what they wanted, with an average of 87.6% of trainees totally agreeing or agreeing with the statements in Q1-Q3.
Both the scores and the proportion of trainees totally agreeing or agreeing with the statements in questions Q4, Q5 and Q6 are consistently lower than for the rest of the questions. This could be because these questions related to improving productivity and effectiveness in teaching, and a significant proportion of trainees are academic support staff rather than teachers. Two thirds of all respondents agreed with the statement in Q7, that most staff will need formal training to use these tools.
Q11 and Q12 were free text response questions, which were answered as follows:
Q11. What did you like about the training session? When asked this question, trainees commented on the hands-on approach and practical experience of the training exercise, the instructions provided, the opportunity to explore and experiment with the data tools, the quality and relevance of the advice provided by the trainers, the relevance of the data (particularly the fact that the data came from their own modules) and the pace of the session. A few trainees mentioned the workbook provided to each trainee and considered it to be a good idea. Table 4 shows the frequency of each comment (Table 4: Frequency of positive comments by attendees).
Q12. What could we do to further improve the training sessions? When asked this question, trainees mentioned:
a) 'System issues' (25 responses): the overall speed of the system was the most mentioned issue and was considered a blocker for working with the data.
b) 'Training style' and 'materials' were mentioned 20 times: more worked examples, sending the slides after the session, providing a printout of the presentation and/or providing it in advance.
c) 'Session length' and requests for sessions targeted at advanced and beginner users were mentioned 12 times, with more data analysis mentioned several times.
Table 5 shows the frequency of each comment (Table 5: Frequency of comments suggesting improvements).

For the impact of the actions taken after advice provided via DSMs: in this section, we review the actions taken by a sample of the participating MTs which related to the discussions held within the A4A process. Each example outlines the actions taken, the results and the future planned actions, and provides information on how those future planned actions will be evaluated. Rather than summarising the data, we have taken the decision to present the findings from each of the modules separately. This provides some insight for the reader into how the discussion unfolded for each module and demonstrates the link between each section. It also demonstrates the difference in scale between the interventions.
As some of the data being presented contains sensitive information, the data have been anonymised.

Module 1 2017
Actions taken by MT: Assignment 1 for the following presentation (2018) was changed significantly (reduced in size and scope) based on the submission data and feedback from Associate Lecturers. The submission date was changed from week 7 to week 6, and the weighting remained the same as before (3%).
Results: The assignment submission rates increased by 0.8% for Assignment 1, whilst Assignment 2 also saw an increase of 1.31%.
Future planned actions: a) The MT will do similar work on Assignment 2 for 2019. The MT and LDT will monitor submission rates for both assignments and investigate any other factors that could have had an impact on submission rates, beyond the changes introduced. b) For students going on to Level 2, the MT is working to secure proactive interventions from the Student Support Team (SST) to encourage registrations next year. c) The MT is looking to get more of the Level 2 MTs to talk to its students too. Students had some choice-point tutorials in 2018, and the MT intends to include more of these in 2019.
Assessing results by: Monitoring submission rates for both assignments, as outlined in the future planned actions.

Module 2 2017
Actions taken by MT: At the second data support meeting, a declining trend in VLE usage was detected between weeks 9 and 12 (the Christmas break). The LDT suggested a review of the timing of the first two assignments, with the hypothesis that bringing them closer together would have a positive impact on the trend. However, it was considered too difficult to change the timing of assignments for 2018. Instead, a bridging video was introduced to "… prepare and reassure students, and to dispel some of the anxiety around anatomical terminology related to the human nervous system". The LDT also reported an increase in student withdrawals after the cut-off for Assignment 01. The possible reasons for this were discussed but, after further investigation, it became clear that most of these students had not submitted the first assignment. Concurrency issues were also reported, particularly with Module 2b and Module 2c, as these modules have very similar assessment dates around weeks 7-8. After consideration, the MT concluded that this situation will cease to be an issue when Modules 2b and 2c are replaced by new modules in 2020.
Results: Although the bridging video was introduced in 2018, there was no change in the VLE engagement pattern for the same period in that year. The decline in VLE engagement between the Assignment 01 submission date and Christmas can be seen in Figure 2. Engagement levels bounce back but never reach the pre-Christmas levels.

Module 3 2017
Actions taken by MT:
i. Updated some text in the Python weeks to clarify things, especially how to study Python (taking notes), and gave more specific guidance on the activities for Assignment 04.
ii. Produced an additional screencast video to demonstrate how to build up larger Python programs.
iii. Changed Activity 3.2 in Python Activity 1 to be about explaining a program (rather than writing one) and included the response to this activity as an extra question in Assignment 02.
iv. Added a question 10 to the exam (and specimen exam paper) about explaining a brief Python program related to the screencast mentioned in (ii).
c) Module Chair to make weekly videocasts (1-1.5 minutes), filmed on iPhone and uploaded to the VLE (with transcripts), pointing out the key things coming up in the next week of study.
Results:
• The Python cluster forum shows an increase in student engagement from week 7 to week 22 in the second presentation of 2018 (18J) compared to the first (18B) (Figure: Percentage of students visiting the Module 3 Python cluster forum per week). The Maths support forum also shows a slight increase in usage compared to 18B.
• Python Activity 1 shows an increase - from 35% to 48% - in the level of student engagement with the corresponding resource in the week the activity was due. It also shows students in 18J revisiting the resource when preparing for Assignment 02 (Figure: Percentage of students visiting the Module 3 Python Activity 1 resource per week). Figure 5 shows that VLE engagement has been slightly higher for 18J, but differences may be related to the presentation pattern (J vs B) (Figure 5: Percentage of students visiting the Module 3 VLE site per week).
Future/planned actions: LDT to develop a report that shows current progress vs planned progress for a cohort and links it to final student outcomes.
Assessing results: Monitor whether there is any increase in the proportion of students progressing as planned.
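The cohort comparisons described for Module 3 (18B vs 18J) amount to differencing two weekly engagement time series. A minimal sketch of that comparison, using invented figures rather than the actual Module 3 data:

```python
# Compare weekly engagement percentages between two presentations of a module.
def weekly_difference(pct_b, pct_j):
    """pct_b, pct_j: {week: % of students visiting}. Returns (week, 18J - 18B) pairs
    for the weeks present in both series."""
    weeks = sorted(set(pct_b) & set(pct_j))
    return [(w, round(pct_j[w] - pct_b[w], 1)) for w in weeks]

# Illustrative values only, not the real cohort data.
pct_18b = {7: 35.0, 8: 30.0, 9: 28.0}
pct_18j = {7: 48.0, 8: 41.0, 9: 33.0}
print(weekly_difference(pct_18b, pct_18j))
```

A consistently positive difference across weeks would correspond to the higher 18J engagement pattern the MT observed, while week-level spikes would point at specific activities or assignment deadlines.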

Module 4 2017
Actions taken by MT:
a) Concurrent study was monitored, and 'at risk' students were referred to the SST.
b) The MT developed additional preparatory materials linked to a diagnostic quiz.
c) The MT produced additional mathematics support for Blocks 4 and 5.
d) Workload-signposting materials were produced, covering: i. pre-requisite knowledge/conceptual understanding required to study the block; ii. key points regularly examined; iii. material that is crucial to learning but not directly tested; iv. material written for interest only.
e) Regular assessment reminders were provided via module news.
Results:
• By week 23, 45% of the 2017 student cohort were studying more than one module concurrently, and 17% were studying 120 credits. At the same point in the 2018 presentation, these indicators were 41% and 23%, respectively.
• Access to the new diagnostic was provided via a link to a PDF on the module website. However, engagement with the 'Are you ready for' (AYRF) resource was still very low (Figure: Number and percentage of students visiting the AYRF resource per week).
• Additional mathematics support for Blocks 4 and 5 was provided via an additional module-wide online tutorial just prior to the start of Block 4 (12 November). The tutors provided a set of questions to be completed before the session. Figure 7 shows that this online tutorial - in orange - was attended by 21 students (out of 186 registered) (Figure 7: Number of Module 4 students that engaged with online tutorials on the dates shown).
• Clear peaks in workload with excessive direct teaching were identified. The effectiveness of the changes is expected to be visible, and measurable, in the next presentation.
Future planned actions:
• MT and LDT to review the 'Are you ready for' material for 2019, in order to increase student engagement.
• MT to consider reviewing assessment timings before and after the Christmas break, with the aim of shortening the periods between Assignment 02, the Christmas break and Assignment 03.
• MT actively looking for ways to trim materials to reduce workload. Tutor feedback about topics that can be removed has been sought.
Assessing results: Measuring the Assignment 03 submission rate.

Module 5 2017
Actions taken by MT:
a) Using predictive analytics data, in 2018 the MT identified students who did not complete Assignment 01, or who achieved a low score, as being at risk. In total, 25 at-risk students were identified.
b) The MT wanted to investigate pass rates for students in Scotland, to understand whether the lower fee level paid by Scottish students has any impact on the incentive to pass the module.
c) The MT collated assessment dates for all relevant modules and presented them in a one-page grid view. Conflicts are worst for Module 5b and Module 5c.
d) The MT considered moving Assignment 02 to an earlier date, considering changes made to the Module 5d and Module 5e assessment (removal of Assignment 03). However, it was found that the Module 5 Assignment 02 date is difficult to move, since students must study a specific topic in the preceding week. If it were moved closer to Assignment 03, the desired equal spacing between the assignments would not be achieved.
e) The Module 5 exam questions were reviewed by the MT, as suggested by Learning Design, but they appear fair and stable.
f) The MT planned to provide extra support to students who have banked assessment. In the 2018 presentation, 13 students were identified, and the MT has started to compile their progress and propensity to pass the module, but further work is required.
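The at-risk rule described in (a) can be sketched as a simple filter over assessment records. The field names, score threshold and data below are illustrative assumptions for this sketch, not the OU's actual predictive analytics model:

```python
# Flag students as at risk if they did not submit Assignment 01 or scored low.
LOW_SCORE = 40  # assumed threshold for "low score" - an illustration only

def flag_at_risk(students):
    """students: list of dicts with 'id' and 'a01_score' (None = no submission).
    Returns the ids of students matching the at-risk rule."""
    return [s["id"] for s in students
            if s["a01_score"] is None or s["a01_score"] < LOW_SCORE]

# Hypothetical cohort records, not real student data.
cohort = [
    {"id": "S1", "a01_score": 72},
    {"id": "S2", "a01_score": None},  # did not submit Assignment 01
    {"id": "S3", "a01_score": 35},    # low score
]
print(flag_at_risk(cohort))
```

In practice such a flag would only be the trigger for referral to the Student Support Team, not an automated intervention.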

Module 6 2017
Actions taken by MT: e MTC actively monitored formative quiz scores and submission rates, as there were concerns about student engagement levels.ese assessments are formative but there is a threshold relating to the number of quizzes students need to complete in order to pass the module.is was a concern because Module 6b and 6c had reported problems with students failing solely due to not completing enough formative quizzes.
Results: a) Students were using the quizzes to recap and revise the module material.In the end, the number of students and the number of attempts of quizzes were both high.Although students interacted with the quizzes differently to how the MT envisaged, there was no need to change them because overall engagement was good.b) e issue with formative quizzes observed in Module 6b and Module 6c was not present in Module 6b.No students failed Module 6b due to non-completion of the required number of formative quizzes.
Future/planned actions: MT to continue actively monitoring formative quizzes.LDT to analyse the end of module survey results.
Assessing results: Measuring and comparing formative quiz submission rates.

Module 7 2017
Actions taken by MT: Module 7b and Module 7 should not be studied at the same time; however, a few students were registered to study both concurrently. The LDT suggested the use of an 'Are you ready for' diagnostic quiz. Due to decreasing student satisfaction scores, the LDT also suggested the MT may wish to consider using the other data collection tools available to explore students' experiences of learning on Module 7.
Results: Since Module 7 has only one presentation left (2019), evaluation based on questionnaires will be considered for the replacement module, Module 7c.
Future/planned actions: Use the data collected on Module 7 to inform the design of Module 7c.
Assessing results: Measuring pass, completion and satisfaction rates for Module 7c and comparing them with historical figures for Module 7.

CONCLUSIONS
To revisit the research questions from earlier, this paper has found favourable results relating to both research questions 1 and 2. In terms of research question 3, the results are more variable, as action has not always been taken by MTs; however, there is evidence of impact where actions have been taken, and one example of a lack of impact where a team took a different action to that recommended to them.
The responses to the training questionnaire show a clear positive response toward the training, with 91.5% of staff saying that they were satisfied with it. This is also reflected in the positive comments outlining that staff appreciated being able to experiment and explore, and valued the hands-on and practical exercises.
Satisfaction levels with the DSMs are also strong, with 87% of staff expressing satisfaction with the meetings. The analysis of the qualitative feedback has also given the LDT insight into which aspects of the meetings are working and where there is room for improvement. A theme that stands out is the limited time available in the meetings and the fact that there was not always a clear plan for the MT coming out of them. This is being taken forward in the recommendations for ongoing work: to allow more time for the meetings and to include space for outlining the recommended next steps. On the positive side, there is clear recognition of the knowledge of the facilitators, with 10 responses flagging that the facilitator was knowledgeable and willing to push deep into the data to support the team with their analysis.
Assessing the impact of the actions taken as a result of the process (research question 3) reveals a mixed situation.
On Modules 1 and 3 there was an impact on retention (Module 1) and student engagement (Module 3). These were positive impacts that can be traced back to the interventions made by the module teams as a result of acting on learning analytics findings. On Module 2, there was no discernible impact. However, this was an action the module team took having decided that the approach suggested in the data support meeting was too complex. In itself this was a useful finding for the OU, as it provides evidence that such an action does not lead to any impact. This can now be taken forward as part of an internal evidence bank for similar future scenarios.
Modules 4, 5 and 6 are useful examples of an emerging conversation led by review of the learning analytics. We can see these as examples of teams who are not quite ready to take action because they do not fully understand the issues and need the analytics to provide more data, or because they see a different reason for the problems being encountered. Again, these are useful examples, as the learning analytics have been able to prove or disprove hypotheses and enable the teams to focus in on the problem.
Module 7 is a different case study and shows the importance of targeting the A4A resource at the correct modules. In this case a recommendation was made to the module team; however, they deferred it to the upcoming replacement module. There are similar experiences where a team toward the end of the module lifecycle has positively engaged with data capture and used the analysis to inform the new module. There is a task here to refine the selection process of modules for A4A to ensure teams buy in to the need to respond positively to proposed actions.
Module teams that engage more fully with the A4A process are likely to get more insight and results from it and could become champions at faculty or Board of Studies level.Academic staff involvement is essential for the success of A4A, as they are responsible for ensuring that agreed interventions are actioned.
Beyond the research questions and looking at impact within the OU, other University-wide organisational processes related to quality control have adopted elements of the A4A methodology in their approach, such as the regular monitoring of data, the provision of basic data training and the use of a menu of available actions.
The information regarding the actions taken was collected by contacting the corresponding MTs and asking them to report on their modules. This may not be scalable to all modules. A more systematic process for the follow-up and evaluation of the actions taken is required to assess the overall impact of A4A. The details of an enhanced process should be agreed between the LDT and the faculties. This would involve the provision of further resources from both the faculties and the LDT.
Further research is required to obtain a better understanding of the benefits derived from the process. In-depth interviews with key participants and independent evaluation are recommended.
Annex 1 provides an outline of the conclusions and recommendations fed back into the University from the internal version of the report. Annexes 2 and 3 show the questions included in the surveys used to assess the training and the DSMs.
Figure 1 shows the number of Module 1 2017 students achieving a pass grade, sorted by their 2018 module registration choice.

FIGURE 1 Number of Module 1-2017 students that passed and registered for a Level 2 module

FIGURE 2 Percentage of Module 2 students visiting the VLE per week for 2017 and 2018 presentations

FIGURE 6
Results: a) 25 students were contacted by the SST. b) Data reviewed showed this was not the case: retention for Module 5 2017 in Scotland is 76%, compared to 73% for the whole cohort, and the 2018 cohort shows a similar trend. c) MT using and updating the grid to minimise clashes. d) No changes to Assignment 02 dates. e) No major changes to exams. f) Preliminary data suggests that students who bank their assessments for both summative quiz 01 and Assignment 01 are much more likely to succeed than those who only banked summative quiz 01.
Future/planned actions: a) MT to keep tracking student behaviour around assessment banking and the associated outcomes, and then consider whether a change of rules is required for assessment banking. b) Single component assessment will be implemented for Module 5 in 2019. c) MT to contact other modules in the grid to inform them about assessment clashes.
Assessing results: a) Measuring overall retention. b) Comparing the outcomes for students in 2017 who didn't attempt, or achieved a low score in, Assignment 01 with the results of the 2018 students contacted by the SST.