Learning Analytics in Medical Education

The purpose of this article is to propose a framework for thinking about the use of learning analytics in medical education. This MedEdPublish article is presented in three sections: 1) a single-topic summary illustration; 2) a short "elevator pitch"; and 3) a more "traditional" article-format piece (in four main parts). It is hoped that this presentation format will give readers the option to review this topic visually, to review the "essence" of the topic in the form of a brief "corridor chat" or "elevator pitch", or to read a more expansive quasi-traditional article (with four main complementary parts which deliberately have overlapping content, with the intention of stimulating thinking on this topic and forming the starting point for further discussion on the MedEdPublish platform).


Introduction
The purpose of this article is to propose a framework for thinking about the use of learning analytics in medical education. This MedEdPublish article is presented in three sections: 1) a single-topic summary illustration; 2) a short "elevator pitch"; and 3) a more "traditional" article-format piece (in four main parts). It is hoped that this presentation format will give readers the option to review this topic visually, to review the "essence" of the topic in the form of a brief "corridor chat" or "elevator pitch", or to read a more expansive quasi-traditional article (with four main complementary parts which deliberately have overlapping content, with the intention of stimulating thinking on this topic and forming the starting point for further discussion on the MedEdPublish platform).

Section 3: Abstract and Introduction
The purpose of this article is to propose a framework for thinking about the use of learning analytics in medical education. This article will examine the topic iteratively in four parts: 1) by posing a series of questions; 2) followed by a reflection on practice, and personal reflections; 3) then examining the topic from the viewpoint of student behaviour and instructor observation; 4) followed by a look at possible indicators that an educator might use (including data analytics). Key observations include: the importance of both "big data" and "small data"; the usefulness of quantitative and qualitative data, used within a mixed-methods research and evaluation paradigm; the idea of a longitudinal learning and training process; the close analogy between observations and data obtained from "traditional" teaching, learning, and practice settings and their online-mobile learning equivalents; and the need to observe not just engagement and activity but, more importantly, indices of actual learning: maturation and deepening of cognitive development and thinking, applied problem-solving ability, skill development, and the development of emotional and empathetic sensitivity.

Part 1
How we spend our time, and our bank balance, shows what we value. Sustained, significant time and effort consciously applied to a program of learning and training has a transformative effect on who we are, and who we become. How do we go beyond "marking" class attendance; online page views; paying attention in class (asking questions in class or online, in both quantity and quality); online page "click-throughs"; and a teacher's or observer's review of the quantity and quality of a student's notes taken in class, or soon after class, in both hardcopy and digital formats, toward indicators and measures of learning, and of the ability to apply and transfer classroom learning to the workplace setting? How do we go beyond "spot quizzes", both in class and online (or within an eTextbook, online video, or eCourseware), toward more accurate measures and indicators of comprehension, the ability to integrate knowledge, ever-increasing skill levels, and higher levels of emotional development and empathy? How does the move to the workplace setting, and to workplace training and apprenticeships, as well as alternating classroom and workplace (skill, implying application) training programs, take advantage of classroom time to integrate and stimulate reflection, and of workplace settings to practise knowledge application and increase skill development? How can we best use pre-class preparation and pre-reading; in-class interactive and reflective-integrative activities, in both traditional and online class settings; and after-class note reviews, assignments, and projects to facilitate and reinforce learning?
How do we take advantage of different one-off teaching-learning-training encounters (lectures, presentations, workshops, symposia, panel discussions), combined with longitudinal, programmatic learning and training programs, to increase knowledge, understanding, and insight; build skills, both basic and higher-level; and increase our "feel" for, and emotional-empathetic understanding of, a topic? What are the signs, indicators, and evidence of higher levels of professional development; of increasing competency and proficiency; and of expert-level performance and mastery of a topic or field?

Part 2
Learning analytics (and adaptive learning) was highlighted in the 2016 Horizon Report as an important area of development in educational technology, with a time to adoption of one year or less. The purpose of this article is to propose a framework for thinking about learning analytics, or "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs" (1st International Conference on Learning Analytics and Knowledge). We can examine this topic from the point of view of an instructor observing the learning activities and learning outcomes of one student, and of a cohort, group, or class of students. Throughout this process, an underlying theme is an examination of the value and impact of a blended online-face-to-face educational process, with the following used as case studies: a recent faculty development presentation (for around 20 faculty of a single university school, including the Dean); a symposium (APMEC main conference symposium, for around 150 participants); a workshop (APMEC pre-conference workshop, for around 35 participants); a postgraduate small group tutorial (for around 10 radiology residents); an undergraduate small group tutorial (for around 24 students); and an undergraduate large group lecture (for 300 students).
These case studies illustrate and review the teaching and engagement practice of a single teacher-academic practitioner at the tertiary and professional level (faculty of medicine, topic of clinical radiology; and faculty development for health professions educators), with blogs + Padlet (over 5 years), as well as Instagram + WhatsApp more recently (in the last year), used as the teaching and engagement platforms; and with "off the shelf", free-to-use Google Analytics, SlideShare analytics, and embedded SurveyMonkey results, together with documented (anonymised) discussion and session output on Padlet and WhatsApp, used as indicators and evidence of student engagement and learning. An attempt will be made to examine the insights one might reasonably glean from this data on online activity, and how this can be supplemented and complemented by more traditional classroom and assignment observations, and outcome data.
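As a concrete illustration of the "off the shelf" analytics described above, the sketch below aggregates per-resource view counts and time-on-page from an exported list of page-view records. This is a minimal, hypothetical example: the record shape (`resource`, `seconds_on_page`) and the example resource names are assumptions for illustration, not the format of any particular analytics export.

```python
from collections import defaultdict

def summarise_views(rows):
    """Aggregate per-resource view counts and total time-on-page (seconds)."""
    summary = defaultdict(lambda: {"views": 0, "seconds": 0})
    for row in rows:
        entry = summary[row["resource"]]
        entry["views"] += 1
        entry["seconds"] += row["seconds_on_page"]
    return dict(summary)

# Hypothetical rows, shaped like a much-simplified analytics export
rows = [
    {"resource": "radiology-blog/chest-xray", "seconds_on_page": 180},
    {"resource": "radiology-blog/chest-xray", "seconds_on_page": 60},
    {"resource": "radiology-blog/ct-head", "seconds_on_page": 300},
]
summary = summarise_views(rows)
```

A summary like this answers only the "who, where, when and what" questions; the "why" still requires the qualitative, small-data approaches the article turns to later.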
A reflection on my personal experience (as both participant-student, and now as local Singapore faculty for the MHPE-S program) with the Maastricht MHPE program over two years, run as a blended offline and online program, is also instructive. To faculty, a student's attendance, punctuality, class and module contribution (quantity, frequency, and quality), and ongoing development and maturity as an educational practitioner and scholar (through class participation; specifically by assessment and evaluation of the 3 to 4 submitted individual and group assignments per module; and after graduation by practice and academic output) are visible both in the (live) class and online. Online, this visibility comes through examining the frequency and duration of log-ins to the learning management system (which incorporates not only a class management system, but also a discussion and interaction platform, and module instructional content); participation in the online discussion forums; the degree and quality of Q and A with tutors and faculty by email; and the quality of the submitted module assignments and Master Thesis (including use of recommended module learning resources, as well as additional resources used and cited by the student); and, after graduation, through evaluation of the quantity and quality of educational-professional and academic output.

Part 3 Traditional metrics of student engagement and learning
Let's start this process by examining what an instructor might observe, or want to observe, if we had access to, and were able to observe, the whole continuum of a single student's engagement with a single large group teaching session, or small group tutorial-discussion. The following would be useful: time spent; how time was spent (including material reviewed, for how long, and to what extent; discussion time; and problem solving, including what types of problems, assignments, and scenarios); how the material reviewed was used in practice (through arguments, citations, and referencing); evidence of increasing interest in, and engagement with, the material (from additional reading and online review-search for new material not on recommended reading-reference lists, including innovative search strategies and areas, going beyond initial topic parameters to adjacent and unexpected but related areas, and use of analogies); and evidence of longitudinal increase in knowledge, insight, practice and practical application, transfer, and innovative new use/application.

Online equivalent (of traditional metrics of student engagement and learning)
What analogies might a teacher or instructor use for a "live" classroom setting? How might this translate into online metrics and data? What might one look for, and look at? We could look at: time spent on a webpage or on online content; follow-through from online viewing onto linked and cited online content; same-session note-taking and knowledge building, as viewed through concept maps, and the quality of note-taking and online note-reflections; use of online content to support assignments, and in case-based scenario discussions and problem solving; the quantity and quality of online discussions, including the degree and depth of participation in online discussion platforms and forums (including the number, type, and nature of questions asked by students, and the quality of "live" answers to questions and points posed and raised by either the instructor or other student participants); the type/nature, degree, and depth of synchronous online search for additional resources during, and in parallel with, "on-site" and "off-site" individual and group-based learning activities; performance on formative and summative assignments, problems, and projects; portfolio construction and items; and "workplace" and realistic scenario performance, in the knowledge, skill, and attitudinal-affective domains.
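Several of the online equivalents listed above are simple tallies over an activity log. The sketch below builds a crude per-student "quantity" profile from a hypothetical list of LMS-style events; the event types (`page_view`, `forum_post`, `question`, `assignment`) and student identifiers are invented for illustration, and, as the article stresses, quantity counts say nothing about quality, which still needs human review of the underlying work.

```python
from collections import Counter

def engagement_profile(events, student_id):
    """Tally one student's online activity by event type.

    A quantity measure only; the quality of posts, questions, and
    assignments must still be judged by a teacher or observer.
    """
    kinds = Counter(e["type"] for e in events if e["student"] == student_id)
    return {
        "page_views": kinds.get("page_view", 0),
        "forum_posts": kinds.get("forum_post", 0),
        "questions_asked": kinds.get("question", 0),
        "assignments_submitted": kinds.get("assignment", 0),
    }

# Hypothetical activity log
events = [
    {"student": "s01", "type": "page_view"},
    {"student": "s01", "type": "forum_post"},
    {"student": "s01", "type": "question"},
    {"student": "s02", "type": "page_view"},
]
profile = engagement_profile(events, "s01")
```

Run over a whole cohort, profiles like this give the class-level "engagement and activity" picture; the indices of actual learning the article calls for would come from the assignments and scenarios behind the counts.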

Part 4 Online Data -Analytics (Quantitative Data, or Big Data), Informing "Who, Where, When and What"
At this juncture, it might be helpful to highlight how Social Media Analytics and Web (e.g. Google) Analytics can assist an educator, focusing on the features, functions-functionality, and differences between the two (note that each of these terms in the preceding sentence is hyperlinked to a separate webpage-article).

Small Data (Personal Observation, Focus Groups, Rich Data, Qualitative Data), Informing "Why, How" and "To What Extent"
The key idea is not only to have visibility of what is used often and found to be useful, but to evaluate why this is so, for an individual student or a student cohort, with the aim of providing more of this content or learning process-activity. Conversely, for content and learning processes that are "unpopular" (on which students spend little time, which they do not use, and which by implication they do not find useful), the aim is to investigate why this is so, and either replace or refine the material or learning process-activity.
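The triage described above can be sketched as a simple rule: items with healthy usage go on a "provide more like this" list, and low-usage items go on an "investigate why" list for follow-up with focus groups or personal observation. The thresholds, item names, and usage fields below are illustrative assumptions, not recommended cut-offs; in practice the thresholds would be set relative to the cohort and the content's purpose.

```python
def flag_for_review(usage, view_threshold=10, seconds_threshold=60):
    """Split content into 'popular' (provide more like this) and
    'unpopular' (investigate why, then refine or replace) lists."""
    popular, unpopular = [], []
    for item, stats in usage.items():
        if stats["views"] < view_threshold or stats["avg_seconds"] < seconds_threshold:
            unpopular.append(item)
        else:
            popular.append(item)
    return popular, unpopular

# Hypothetical per-item usage summary
usage = {
    "padlet-case-1": {"views": 120, "avg_seconds": 240},
    "slideshare-deck-3": {"views": 4, "avg_seconds": 30},
}
popular, unpopular = flag_for_review(usage)
```

The quantitative flag only selects candidates; the "why" and "to what extent" questions still belong to the qualitative, small-data methods this section describes.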
How can data analytics inform our teaching practice? Below is a framework (in quasi-table form) for thinking about and evaluating the use of Web 2.0 technologies as an online teaching/engagement tool and platform. (TeL = technology-enhanced learning; eL = eLearning)

When is TeL or eL used? Where is it used?
For example: 1) Views, in volume and timing (quantitative); 2) Reviews (qualitative and quantitative); and 3) Preference over other competing content.

Why is TeL or eL used?
1) Access, convenience; 2) Accessibility; 3) Useful, fit for purpose; 4) Easy to use, intuitive; 5) Simple to use; and 6) Recommended by others.
How is TeL or eL used? 1) Before, during, or after class; 2) Before an examination/assessment; 3) To look up something (on-demand reference/learning, performance support); 4) For assessment; 5) To illustrate theory; and 6) For mastery training.

Traditional, TeL or blended approaches?
1) When to use?
2) How to use?

Where and what is the value, and impact, of TeL? What is the evidence for this?
1) Views; 2) Reviews; 3) Recommendations; 4) Actual use (how, when, where, with whom, why); and 5) Evaluation of the integration and use of knowledge and skills learnt, and the creation of new knowledge, insights, and applications/skills.
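One way to make the quasi-table above operational is to encode it as a data structure, so that each question can be paired with the indicators an educator records against it, for example on an evaluation form. The encoding below is a minimal, hypothetical sketch; the key names and the flattening helper are inventions for illustration, not part of the article's framework itself.

```python
# Hypothetical encoding of the quasi-table framework above
TEL_FRAMEWORK = {
    "when_where_used": [
        "views (volume, timing)", "reviews",
        "preference over competing content",
    ],
    "why_used": [
        "access/convenience", "accessibility", "fit for purpose",
        "easy and intuitive", "simple to use", "recommended by others",
    ],
    "how_used": [
        "before/during/after class", "before assessment",
        "on-demand reference", "assessment",
        "illustrating theory", "mastery training",
    ],
    "value_impact_evidence": [
        "views", "reviews", "recommendations", "actual use",
        "integration and creation of new knowledge/skills",
    ],
}

def evaluation_checklist(framework):
    """Flatten the framework into (question, indicator) pairs for a form."""
    return [(q, ind) for q, inds in framework.items() for ind in inds]

checklist = evaluation_checklist(TEL_FRAMEWORK)
```

Each (question, indicator) pair then becomes one row of an evaluation instrument, against which both "big data" counts and "small data" observations can be recorded.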

Conclusion and Take Home Points
It is hoped that this article (with its multi-section, multi-part format) has provided useful ideas, and a framework with which to think about and use learning analytics, both "big data" and "small or rich data", in medical education.