Investigating engagement in a blended learning course

Proponents of a blended course paint an ideal picture of participants leisurely learning and reflecting on how they can apply their new knowledge. The reality is, of course, much more complex, especially in the lives of working adults. This study sought to understand this complexity better by analysing the experience of 123 participants enrolled in a 9-h in-service blended course. In particular, it investigated participants' engagement by examining their experience as they interacted with elements of the blended environment. A mixed methods approach was employed, with quantitative data from the course analytics and responses from the 34 participants who returned the evaluation questionnaire at the end of the course. This was complemented with one-to-one interviews with 10 participants. The findings suggest that designers of blended professional development courses should bear in mind the characteristics of both the learner and the online platform to achieve greater cognitive, behavioural and affective engagement.
Subjects: Adult Education; Educational Technology; Technology in Education


ABOUT THE AUTHOR
Tay Hui Yong joined the Curriculum, Teaching and Learning (CTL) Academic Group as a lecturer at the end of 2013. In the many years serving in secondary schools prior to joining CTL, she was an English and Literature teacher, HOD (EL), dean (Curriculum) and vice principal. Her research interests grew out of her line of work: her MEd thesis looked into role conflict experienced by HODs; her PhD focused on self-regulated learning and authentic assessment, important areas to her as driver of the Integrated Programme in her school. Above all, she is interested in all things that will enhance students' learning experience in school. Since joining CTL, she has had the pleasure of working with a whole range of teachers: preservice student teachers, in-service and experienced teachers including senior and lead teachers, as well as school management such as heads of department and school leaders.

PUBLIC INTEREST STATEMENT
Despite the increasing publicity on "Blended Learning" (BL), there is a lack of clarity on what constitutes BL and how it impacts learning. The present study sought to provide a deeper understanding of the learner's experience of BL through examining learners' experience using surveys, interviews and data on their participation in the course. Conducted in the context of an in-service training course on basic assessment literacy, it investigated specifically how the learner's engagement is influenced by the elements unique to the BL experience: the online content delivery mode, the 24/7 access to resources and the interaction (both online and face to face).

Introduction
The term "blended learning" (BL) is relatively new on the pedagogical scene, having been coined and popularised only in the last decade. As in other emergent fields, there is as yet no commonly accepted definition (Means, Toyama, Murphy, & Bakia, 2013; Oliver & Trigwell, 2005). At best, there is a broad consensual understanding that BL is the purposeful integration of asynchronous online learning experience with face-to-face learning. This is the definition adopted for this paper.
BL has been on the rise, with the literature generally supporting it. Proponents cite benefits that include flexibility for learners (Cheung & Hew, 2011) and access to learning for large numbers of students (Garrison & Kanuka, 2004). They suggest that younger learners, often called Digital Natives (Prensky, 2001), will be more engaged in BL, drawn to its interactive and media-rich online component. Some of these claims have been borne out by studies (Giannousi, Vernadakis, & Derri, 2009; Means et al., 2013).
The present study aimed to look beyond the rhetoric to seek a deeper understanding of the learner's experience of BL. Conducted in the context of an in-service training course on basic assessment literacy, it investigated specifically how the learner's engagement is influenced by the elements unique to the BL experience: the online content delivery mode, the 24/7 access to resources and the interaction (both online and face to face).

Definitions
Either because of its short history or because it was initially more widely used in corporate training (Driscoll, n.d.), BL is currently under-theorised (Halverson, Graham, Spring, Drysdale, & Henrie, 2014). The term itself is not well explicated, with definitions often stated in operational terms of how it is used. For example, BL has been variously defined in terms of a combination of:
• activities: "learning that mixes various event-based activities: self-paced learning, live e-learning, and face-to-face classrooms" (Alonso, López, Manrique, & Viñes, 2005, p. 231);
• locations: "any time a student learns at least in part in a supervised brick-and-mortar location away from home and at least in part through online delivery with some element of student control over time, place, path and/or pace" (Horn & Staker, 2011, p. 3);
• delivery modes: when "25% or more, but not all, of the instruction on the content to be assessed occurred online" (Means et al., 2013, p. 6);
• experiences: "the thoughtful integration of classroom face-to-face learning experiences with online learning experiences" (Garrison & Kanuka, 2004, p. 96).
These descriptions may help explicate the part of the phenomenon that is blended, but they do not help us understand the learning in BL: how it takes place and why. For example, it is not clear whether BL merely adds two different types of learning (traditional and online) or whether it is more than the sum of its parts, as suggested by writers such as Garrison and Kanuka (2004).
The present paper argues that at the heart of BL is the learning experience and hence draws upon Dewey's notion of an educative experience as a way of understanding it (Dewey, 1938). Dewey posits that any experience is constituted by the interaction between the learner and his environment. Hence, in designing experiences to be educative, we should attend to the environment, be they external ("objective", p. 42) conditions or personal ("internal", p. 42) conditions within the learner. This perspective thus frames BL in terms of the learner's experience of the blended environment, both online and face to face, as well as the interaction between the two. As such, this paper, adapting Garrison and Kanuka's definition, delimits BL as the purposeful integration of asynchronous online learning experience with face-to-face learning.
In this study, the BL experience is analysed in terms of its unique constituent parts: the online content delivery, the 24/7 access to resources, and the interaction (both face to face and online). This by no means detracts from Dewey's view that experience is essentially holistic, but the approach helps in examining the contribution of each part in engaging the learner. Also consistent with Dewey's notion of experience being shaped by the environment, engagement is conceptualised as a state of being that is highly influenced by contextual factors, rather than an inherent attribute of the learner (Fredricks, Blumenfeld, & Paris, 2004; Furlong & Christenson, 2008). Engagement is also increasingly seen as a multi-dimensional construct. Anderson, Christenson, Sinclair, and Lehr (2004) proposed a taxonomy of engagement comprising behavioural (e.g. attendance and class participation), academic (e.g. time on task and academic learning time) and psychological (e.g. sense of belonging and relationships with teachers and peers) dimensions. This is largely consistent with the three types of engagement (behavioural, cognitive and emotional) presented by Fredricks et al. (2004). These latter writers explicated behavioural engagement as "doing the work and following the rules"; cognitive engagement as incorporating "motivation, effort and strategy use"; and emotional engagement as "interest, values and emotions" (p. 65). While both sets of writers cite many sources, they do not offer any theorising to explain the choice of constructs. A theoretical perspective will help in understanding the constructs and their connections, and hence offer insights into possible causal effects that will inform practice (Halverson, 2002). This paper posits that such a theoretical lens for understanding engagement lies in Bandura's triadic reciprocality (Bandura, 2001). It proposes that three constructs (behaviour, personal factors such as cognitive processes and affect, and environment) mutually interact to influence each other. As an example, a learner chooses to participate in the online forum (behaviour) to clarify a doubt or to connect with other participants in the course (personal factors). Also, elements in the environment can influence the learner's behaviour and feelings (Zimmerman, 1989). For instance, the learner's continued participation in the course may be facilitated by the 24/7 availability of resources or hindered by a lack of social connection with the rest of the class, whom they have not met face to face. The triadic reciprocality also means that personal factors can influence the environment, as when learners act to change their learning situation (e.g. fast-forwarding the video in the online lesson unit) if they perceive it to be unhelpful. As can be seen, Bandura's theory helps us rationalise the focus on cognition, affect and behaviour, and explicates the connections among them. Its emphasis on the situational task context of the learner is also particularly suited to the present study, which seeks to examine how elements in the BL environment influence the learner's behaviour and affect.

Empirical studies
What light have empirical studies shed on the effect of BL? In a recent meta-analysis, Means et al. (2013) found that, on average, students in online learning conditions (either purely online or blended) performed modestly better than those receiving face-to-face instruction. Specifically, the effect size for blended approaches was larger than that for purely online approaches when contrasted against face-to-face instruction. It is also interesting that there appears to be a differentiated effect of BL on learner types: the weighted effect size for undergraduates was .309 (p < .001), compared to .100 for graduates and those in professional training. This suggests that studies should look into the use of BL specifically in in-service courses. However, only 3.5% of the top-cited BL papers investigated this area (Halverson et al., 2014). The lack of empirical evidence on how working adults experience BL is unfortunate, since such learners are recognised to have their own unique needs. These include the immediacy of application, the role of experience as a source of knowledge, self-direction and ownership of their learning (Knowles, 1990).
It also appears that learner engagement and learner-content interaction in BL are two areas currently not well researched (Halverson et al., 2014). Fortunately, the literature does suggest a few leads. Firstly, a narrative approach using stories or anecdotes has been shown to yield better learning outcomes with both children (Dalton, 2011) and adults (Butcher, 2006; Jensen et al., 2014). Proponents suggest that such an approach helps learners grasp concepts through the context in which the new knowledge is embedded (Doyle & Carter, 2003). In addition, stories can engage the learner at "more than a cognitive level; they engage our spirit, our imagination" (Clark & Rossiter, 2008, p. 65) because they evoke the learner's prior experiences. As already noted, this focus on experience is particularly relevant in the present study because of the adult participants involved.
Secondly, a study that focussed on what American adult learners valued in a blended environment found that the top two elements participants felt would most contribute to their success in BL were provisions for individualisation or customisation of learning, and conditions that facilitated self-directed learning (Ausburn, 2004). One simple and cost-effective way of meeting these needs is to make resources available online so that busy adult learners can access them at their own convenience. The same study also found that adults placed high value on effective two-way communication with their classmates. Hence, an implication for BL design is to supplement face-to-face interaction with online forums and other Web 2.0 tools to facilitate such communication. Building such a community of learners has also been recommended elsewhere by designers of BL (Helms, 2012) and of professional development courses in general (Schneider & Randel, 2009).
As a result of the literature review, the present study was designed to investigate the engagement of local working adults in a BL course. Learner engagement was delimited in the following terms:
• cognitive engagement: the extent of participants' interest in and understanding of the concepts presented;
• behavioural engagement: reflected in active participation in the course; and
• affective engagement: the participants' sense of connection to and support from tutors and other participants in the course.
The study also specifically focused on the following elements in the BL environment: (1) online delivery of content through a narrative approach using animated videos; (2) self-access resources (e.g. readings) and quizzes to check understanding after each session; and (3) interaction through online forums, online chats and face-to-face seminars.

Context and participants
The context of the study was a BL course on basic assessment literacy designed by two tutors, one of whom was the present writer. Most of the 123 participants were teachers from various institutions and levels across Singapore as well as officers from the Ministry of Education (see Table 1). One notes that almost half of the participants came from secondary schools.
The content was delivered in six online sessions, each estimated to take a participant 1 h to complete. Typically, each session started with an overview before participants were invited to watch a video designed to explain the core concepts for that unit. This was followed by a quick self-assessment quiz and recommended readings. Lastly, participants were invited to contribute their questions or experience pertaining to the unit in the online forum. A new session was launched each Monday, starting 2 June 2014; however, participants were free to log in anytime, at their convenience, to view the videos that delivered the content, attempt the self-check tests (with answers provided) and follow up on the recommended readings at the end of each session. As a way of promoting a community of learners, participants were asked to introduce themselves in the online forum at the end of the first session and were reminded at the end of each session to contribute views and questions. Some questions in the self-check tests were deliberately controversial to encourage discussion. The two tutors took the opportunity to address these, as well as any areas of uncertainty identified by analysing participants' submitted answers, in the online chats midway through the course (1 July 5-6 pm, 1 July 8-9 pm and 29 July 8-9 pm). These chats used a shared Google document as a platform. The online sessions culminated in the only face-to-face session on 8 September, which lasted 3 h, with participants engaging in discussions based on frequently asked questions and good assessment practices. It is to be noted that participants took the course for their own professional development and that no credits were attached to the course.

Research design
The study sought to answer the following research questions:
• Did the use of a narrative approach increase participants' cognitive engagement?
• Did the ease of access to resources increase participants' behavioural engagement?
• Did the use of Web 2.0 tools help in increasing participants' affective engagement?
A mixed methods approach was adopted so that the triangulation of data added to our confidence in the inferences made. Quantitative data were collected from the course analytics: learners' participation in the activities in each unit was tracked. In particular, their viewing of the videos in each unit was tracked using YouTube analytics. This objective set of data complemented participants' responses in a self-report questionnaire at the end of the course. The questionnaire asked participants to rate ("Strongly agree", "Agree", "Disagree" or "Strongly disagree") whether the following enhanced their learning in the course:
(1) The organisation of each unit (Introduction-Video-Test-Resources)
(2) The use of videos
(3) The use of animations
(4) The ease of access to resources (e.g. additional reading)
(5) The self-assessment ("Test your understanding")
(6) The use of forum/Google chat
(7) The face-to-face session
Their responses were converted to a four-point Likert scale: 4 (Strongly agree), 3 (Agree), 2 (Disagree) and 1 (Strongly disagree), and the data were analysed using paired sample t-tests to locate any significant differences (p < .01). Data analysis also included Pearson's correlation to investigate any significant correlations between questions (p < .01). This was to check for any consistencies or inconsistencies in participants' responses across different questions.
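For readers who wish to see the mechanics of this analysis, the paired comparison and correlation can be sketched in a few lines. The Likert ratings below are synthetic (the study's raw questionnaire responses are not reproduced here), and the helper functions implement the standard textbook formulas rather than the particular statistical software the study used:

```python
import math

def paired_t(x, y):
    """Paired-samples t statistic: mean of the differences over its standard error."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((v - mean_d) ** 2 for v in d) / (n - 1)  # sample variance of differences
    return mean_d / math.sqrt(var_d / n)

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical ratings on the 4-point scale (4 = Strongly agree ... 1 = Strongly disagree)
q2 = [4, 3, 3, 4, 2, 3, 4, 3]   # e.g. "the use of videos"
q3 = [4, 3, 4, 4, 2, 3, 4, 3]   # e.g. "the use of animations"
print(round(paired_t(q3, q2), 2))   # 1.0
print(round(pearson_r(q2, q3), 2))  # 0.88
```

In practice, a statistical package would also supply the degrees of freedom and p-values needed to apply the study's significance threshold (p < .01).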
Participants were also invited to write remarks to the following questions:
(1) What made you sign up for this blended course?
(2) Was it important for you to complete this course?
(3) What elements in the course helped in making you continue with the course?
(4) Describe your experience on this course in a few words/sentences.
The understanding from the quantitative data was enriched with one-to-one interviews with 10 participants, an equal mix of males and females from different age groups and different year levels (see Table 2). Proportionately more interviewees were shortlisted from secondary school teachers because they comprised half of the course participants. By analysing data on online participation, interviewees were shortlisted one week before the face-to-face session, based on the four types of online learners identified by Kizilcec, Piech, and Schneider (2013):
• "Completing" learners, who complete all or a majority of the course, including planned activities (for example, self-check tests and readings);
• "Auditing" learners, who engage by watching the video lectures but rarely participate in activities;
• "Disengaging" learners, who start off as "completing" learners but show a marked decrease in engagement over time; and
• "Sampling" learners, who typically watch video lectures only once or twice to find out what the course is about, before deciding the course is not what they are looking for and disengaging completely.
Three of the participants originally shortlisted as Sampling, Auditing and Disengaging did not reply to the invitation to be interviewed and were replaced by Teachers GM, J and ST respectively, whose profiles of learning matched those who did not reply. The replacements were also acquaintances of the researcher. Aware that this could threaten the validity of inferences, the researcher took extra care to seek both confirming and disconfirming evidence from the interviewees.
The semi-structured interview drew upon the following main questions, based on the research questions:
(1) What made you sign up for this blended course?
(2) Was it important for you to complete this course?
(3) What made you continue/discontinue with the course?
(4) What elements in the course helped in making you continue with the course?
The interview transcripts were analysed for themes based on the research questions. Where a new theme emerged through recurring mention in the interviewees' comments, an inductive code was assigned. Microsoft Office Excel software was used to record and manage excerpts that supported these themes.

Findings
This section will begin with some quick observations gleaned from the quantitative data before proceeding to answer the research questions in detail.
In general, as seen from Table 3, participation declined as the course progressed: participants engaged less in the activities, including watching the videos in the units. It was noted that many participants logged on just before or shortly after the face-to-face session. (The course was closed one week after the face-to-face session.) The final proportions of the four types of learners were Completing (51.7%), Auditing (1.4%), Disengaging (19.2%) and Sampling (27.7%).
The YouTube analytics found that when the videos used a narrative approach (using animations) to deliver content, participants were slightly more likely to watch them to the end. The average completion percentage for animations was 37.7%, compared to 32.8% for videos using PowerPoint presentations with a voice-over. However, this longer engagement could be due to the fact that the animated videos were shorter: their average length was 6 min, less than half the average length of the rest (13.5 min).
Secondly, the responses from the 34 participants who completed the questionnaire showed that the elements that most enhanced their learning were the organisation of the unit (Q1) and ease of access (Q4) (see Table 4). They were positive about the use of videos and animations, with 88.2 and 94.1% respectively agreeing or strongly agreeing that these enhanced their learning. However, there was no statistically significant difference between the two mean scores (t = .00, df = 33, p > .01). In fact, the scores for Q2 and Q3 were statistically significantly correlated (r = .66, p < .01), suggesting that respondents considered animations no different from the other videos in terms of their learning. There was, however, a statistically significant difference in the means of Q6 and Q7 (t = −2.82, df = 33, p < .01), suggesting that the respondents preferred live to online interaction. But with Cohen's d at −.46, this can be considered a small to medium effect. There was a greater difference between Q5 and Q6 (t = 2.89, df = 33, p < .01, d = .57), which suggested that respondents found the self-assessment more helpful than online interaction. However, given that very few participated in the forum and Google chat sessions (estimated at about 20 in total), this comparison is probably based more on belief than actual experience.
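As a rough check on the effect sizes quoted above, Cohen's d for a paired design is often approximated as d = t/√n. Applying this to the reported t = −2.82 with n = 34 respondents gives a value close to (though not exactly matching) the reported −.46; conventions for paired-samples d vary, so the snippet below is an illustration of the approximation, not the study's own computation:

```python
import math

def cohens_d_paired(t, n):
    """One common approximation of Cohen's d for a paired design: d = t / sqrt(n).

    t and n here are taken from the questionnaire analysis reported in the text
    (t = -2.82, n = 34); other formulas for paired-samples d exist, so results
    can differ slightly from published values.
    """
    return t / math.sqrt(n)

d = cohens_d_paired(-2.82, 34)
print(round(d, 2))  # -0.48, a small-to-medium effect by Cohen's benchmarks
```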
Though generalisability is probably limited by the low return rate of the questionnaire (27.6%), the quantitative findings are largely consistent with the qualitative findings from the participants' written remarks in the questionnaire as well as the interview data. I will organise these qualitative findings around the three research questions.

RQ 1: Did the use of a narrative approach increase participants' cognitive engagement?
As mentioned earlier, there appears to be no advantage from using animations in enhancing interest and understanding. However, interviewees were more likely to recall what they had watched in the animated videos. Three interviewees specifically mentioned an anecdote used in the first unit, of an encounter in the canteen, that illustrated the concepts of reliable and valid conclusions. They mentioned that the story helped, making the concepts "simple enough" and "easier to follow" so that it "really stuck". In comparison, PowerPoint presentations were less engaging, particularly as they tended to be longer.
However, the mode of delivery was less important to the participants than the search for clarity and suggestions that they had hoped they could easily apply to their school context. Reasons for registering for the course were in connection to their classroom teaching ("to improve and vary the types of assessment that I conduct for my own students in N(T) stream") or to their work as part of school team looking into assessment ("I'm in a … team on assessment for learning through rubrics"). As such, participants felt that it was important to complete the course to gain enough learning "to improve assessment matters for (my) dept".
These reasons for attending the course were echoed by the interviewed participants. Whether by coincidence or as a reflection of the larger group of participants, the interviewees all held key positions (e.g. senior teachers or subject heads) and were already overseeing or embarking on reviews of assessment in their schools. They were charged with leading other teachers and so wanted to "understand further" assessment principles so that, as one interviewee put it, they could "share with the teachers when (they) do lesson observations". This sense of responsibility kept them engaged in the course. What was also apparent through the interviews was that they were already familiar with basic assessment concepts (e.g. reliability and validity) because these were part of their work. However, what was almost palpable was the participants' concern with how to better apply what they knew. As Teacher J shared: when it comes to … rubrics … We know the theory … but then again … I really do not know what type of rubrics are suitable for what kind of … assessment … So sometimes … we know these are some examples, but what measures and what is more suitable and how can we better craft our descriptors?
In short, with regard to what would facilitate their cognitive engagement, the participants reflected that they were less interested in the delivery mode than in how to use what they had learnt in their work. They looked to the course for fresh insight into their practice. They also frequently mentioned that they were looking for "bite-sized", "practical and implementable" suggestions that could close "the gap between theoretical and application" (Teacher W).

RQ 2: Did the ease of access to resources increase participants' behavioural engagement?
The qualitative feedback from the questionnaire as well as the interviews was consistent with the quantitative data: the 24/7 access contributed much to participants' continued participation in the course. One interviewee even remarked, "If I had to go down for all the different sessions, I wouldn't have signed up". The convenience went beyond saving on travelling time; it included the flexibility to cater to their individual schedules, as elaborated upon by Teacher J: "…there might be days that you are unavailable. So for this, it was just nice. Days that we are free we will just go in and do our own self studying". There was also the added convenience of learning at their own pace. Such an "own time, own target" (OTOT) approach (as Teacher Z called it) meant they could selectively focus on areas they found less familiar. They could also rewind or fast forward the videos to better understand the lesson. Another interviewee, Teacher C, elaborated: Sometimes after watching a part I will stop then I will start thinking about certain things … sometimes I watch it with another fellow teacher who's not in the course … then we start talking about it. And what we are doing currently and how is it related to what we watched on the video. So actually it took longer than what it should be but at least it gave us the time to pause and then talk about it … Or the problem that we are facing … can somehow we see some light by watching the videos.
However, the convenience of learning at home was a double-edged sword. While participants appreciated the flexibility, they had to wrestle time from family commitments after work to complete the online units. Teacher J had to wait till her young child was napping before she got to sit in front of the computer to access the course. Another mother, Teacher SK, shared how she was not able to participate in the Google chat as it was her "cooking time" for the family dinner.
The easy access could also mean procrastination or lower priority, since the course was "available somewhere, so we just leave it there until when we need it then we use it". The fact that it was accessed online meant that participants had to contend with "all sorts of other distractions", for example, email. In fact, Teacher SK, mentioned in the earlier paragraph, commented that at least with a face-to-face course, "Fully we will be there, our mind will be fully there. Not spread with other things occupied with other things". Another participant suggested that there should be several half-day sessions built into the course "for legitimate time away from work to review readings".
As pointed out earlier, either due to procrastination or packed schedules, many participants finished the six units only just in time for the face-to-face session. Interestingly, the interviewees who were identified as "sampling" or "auditing" learners one week before the face-to-face session (because of their low participation rate) did "a crash course" to finish most of the units. Even the interviewee identified as a disengaging learner rushed through the materials because he felt that "by the time we came to the face to face session, (he) needed to have some basic understanding. Otherwise, (he) wouldn't know what's going on". One notes here the relation between the online and face-to-face experiences, with the latter influencing participants' engagement in the former. There was a sense that they had to be prepared for the face-to-face session ("confrontation", as one participant phrased it), and this motivated them to finish the online sessions as the date drew near.

RQ 3: Did the use of Web 2.0 tools help in increasing participants' affective engagement?
The relatively less positive response to the use of Web 2.0 tools (see Table 4) was not surprising because there were a few technical hitches. Firstly, the forum was less easily accessed because it was sited outside the portal where the course resided. The two Google chats using a live Google document met with a lukewarm response because of the timing (at 5 pm and 8 pm). Those who participated in the Google chats had different sorts of misgivings. Some concerned the process, as one interviewee shared: "I went in but there was some lag. I didn't know how to really type and get people to respond. People were responding but to what questions, I got no idea". But the issue could be even more fundamental, as seen in this case: "There was this reservation about how much I should reveal or how much I should sort of go in and discuss". For these various reasons, the course was not able to create the sense of a community of learners that the tutors had hoped for. Such a community was described by two interviewees who had previously enrolled in the Teaching for Understanding (TfU) course conducted entirely online by Harvard. Every week, participants were to post their responses to assignments online. Teacher W felt that the learning was "quite rich" because of the coach's feedback on their postings as well as the responses from others. Participants were encouraged to give feedback to each other, so each posting generated "easily about 8, 9, 10" responses. He summed up his experience of that course thus: That's why I find that TFU, even though there is nobody talking and I don't hear any voice, the written words give you a lot of ideas or sometimes it inspires you and you provide feedback and generate certain things … You need to be inspired by others along the way. The dialogue personally, I feel is much more important than the lessons online.
Other interviewees who had not experienced such support from an online community frequently mentioned that they preferred face-to-face sessions, where they could see the other participants and raise questions with the tutors. They often mentioned that though the resources were easily accessible, help was not: "when (participants) have a question, (they) can't ask (their) questions then and there. So if there are things that (they) need to clarify, there is no one there to clarify with".
The findings suggest that participants valued dialogue with others about the problems they faced during the course or on the job. This is not surprising considering the kinds of assessment issues the participants faced. Some examples that they raised during the Google chat involved raising teacher competency in classroom assessments and trading reliability for validity. These are complex issues that are arguably best addressed in an extended face-to-face discussion with a tutor who can facilitate critical reflection by all present. That is not to say that there is no value in online interaction. As Teacher S pointed out, … sometimes through writing, you are forced to clarify exactly what you are trying to say. Whereas sometimes you try to express yourself to someone but you're trying to do it at a moment's time, you can't exactly articulate what you are trying to say, so it may not be the most useful all the time.
Unfortunately, as mentioned earlier, the technical glitches prevented a fuller investigation of the contribution of Web 2.0 tools.

Discussion
The present study sought to better understand the learner's experience of BL, defined here as the purposeful integration of asynchronous online learning experiences with face-to-face learning. In particular, the study set out to examine learner engagement, conceived here as a multi-dimensional construct comprising interacting cognitive, behavioural and affective elements that are influenced by the situational context of the learning. The context of interest in this study focussed on unique constituent parts of the blended environment: the narrative approach used to deliver content, 24/7 access to resources and Web 2.0 tools. The implications of the findings for each of the research questions are presented before concluding with an overall synthesis.
Firstly, the finding that participants were more concerned about the theory-practice nexus is consistent with adult education theory, which highlights adults' expectation of personal relevance in what they learn. They typically enrol on professional development courses because they face some problem on the job. Hence, their orientation to learning is problem centred, with an emphasis on applying what they have learnt immediately to alleviate their problems (Knowles, 1990). This suggests that any in-service course, including one delivered in BL mode, should be designed around the practical problems these adults face and how theory can inform their consideration of solutions. Such an approach will help keep participants interested and facilitate their understanding of the content.
Secondly, it is not surprising that the ease of access to resources facilitated participants' active or continued participation in the course. The finding is consistent with those from other BL studies (Ausburn, 2004; Maclachlan et al., 2014). The findings also support the view that engagement is malleable to contextual factors. As mentioned, interviewees with initially low or decreasing participation were very engaged online just before the live session. Hence, there were two types of "completing" learners: those who took their time to engage deeply with the content (like Teacher C quoted earlier) and those who did a "crash course". The lack of online participation among sampling and auditing learners may be due to reasons that had less to do with the course than with other contextual factors. For example, Teacher SK, a Mother Tongue Language teacher, skipped viewing the videos because she had difficulty understanding the language used, while Teacher GM and Teacher J skipped parts that they had covered in their recently completed Masters in Education. In essence, the findings are consistent with the literature that adult learners value self-direction in terms of their learning resources. However, they also suggest that though participants may be willing to learn, these busy working adults need some form of accountability that will help them prioritise the course above the many other competing demands.
The findings on the research question of whether the use of Web 2.0 tools increased participants' affective engagement were inconclusive. The returns from the self-report questionnaire suggested that participants did not find online interaction helpful. However, the interview data indicated that participants valued opportunities to discuss the assessment issues that they faced or to learn how others had resolved them. Certainly, more research is needed to understand this area, as well as to test whether Web 2.0 tools indeed provide platforms "where participants can confront questionable ideas and faulty thinking in more objective and reflective ways than might be possible in a face-to-face context" (Garrison & Kanuka, 2004, p. 99).
Viewed as a whole, the findings give us a better understanding of how the environment both motivates and constrains these adult learners. The issues that they face in their job environment motivated them to sign up for the course, in the hope that they would find insight into, or practical solutions to, their problems. At the same time, their many other commitments and distractions often make it difficult for them to devote time after working hours to the online sessions.
There are also two sides to the BL environment. Firstly, there is an advantage in using videos to enhance learning. However, from the findings, there appears to be no particular advantage in using a narrative approach in the videos. At best, there is some limited evidence of longer viewing and better retention, in that interviewees were able to recall the anecdotes used. Secondly, the availability of the online resources 24/7 means that learners can access the learning flexibly, in their own time and at their own pace. However, the benefits of this accessibility can be undermined by procrastination or busy schedules. The learning can also be constrained unless there is ready online access to the course tutor or peers; otherwise, participants cannot clarify their doubts or reflect with others on possible ways of applying their learning.

Limitations
The findings may be limited, though, by a few factors. The first limitation involves the low return rate of the questionnaire (27%) and the small sample of interviewees (10). Their views may under-represent those of the majority of the participants, even though care was taken to ensure that the interviewees were a fair representation of the rest. Secondly, inferences about participants' BL experience may be limited by the inadequate Web 2.0 tools in this study. As mentioned earlier, participants' experience of the online community might have been enhanced if the online forum had been more accessible. Lastly, it would have been more ideal if the researcher had been a disinterested third party. Instead, because the interviews were conducted by one of the two course tutors, this could conceivably have influenced the interviewees' reports of their experience. Though there was no advantage to be gained by the interviewees, since no credits were attached to the course, the findings from the interviews would have been strengthened had they been conducted by someone not so closely associated with the course.
However, to the extent that they are generalisable, the findings of the study could contribute to the needed theoretical understanding of learner engagement in blended settings (Halverson et al., 2014). Drawing on Dewey's perspective, the study viewed BL not as a stand-alone event but as part of the continuity of the learners' experience. The learners bring into the BL experience their work-related contexts, which impact their thinking and behaviour during the course. By focussing on adult learners, the study also adds to the currently sparse knowledge base in this area.

Conclusion
The past decade has seen the rapid rise of blended learning. Proponents paint an ideal picture in which participants can learn leisurely and reflect on how they can apply their learning. The reality is, of course, much more complex, especially in the case of working adults. This study sought to understand these complexities by examining participants' engagement in BL in terms of the learners' experience as they interacted with the blended environment.
The findings have practical implications for institutions serving adult learners and planning to design BL courses for them. Such courses cannot be online reincarnations of traditional courses conducted in brick-and-mortar settings. Neither should they be mere occasions of online learning supplemented by face-to-face interactions (or vice versa). They should be designed to capitalise on the relative strengths of online and human interactions to enhance the overall learning experience. The online platform offers the advantages of flexibility and self-directedness in learning, especially valued by working adults. However, these learners also value dialogue with others about their workplace problems, whether online or live. These considerations will help contribute to participants' cognitive, behavioural and social engagement in their blended learning.