DOI: 10.1145/3613904.3642283 · CHI Conference Proceedings
Research Article · Open Access · Artifacts Available / v1.1

KOALA Hero Toolkit: A New Approach to Inform Families of Mobile Datafication Risks

Published: 11 May 2024

Abstract

Children today are deeply immersed in the online world, where their activities are routinely tracked, analysed, and monetised. This exposes them to various datafication risks, including harmful profiling, micro-targeting and behavioural engineering. Most existing measures focus on immediate online threats, rather than informing children about these implicit risks. In this paper, we present The KOALA Hero Toolkit, a hybrid toolkit designed to help children and parents jointly understand the datafication risks posed by their mobile apps. Through user studies involving 17 families we evaluate how the toolkit influenced families’ thought processes, perceptions and decision-making regarding mobile datafication risks. Our findings show that KOALA Hero supports families’ critical thinking and promotes family engagement. We identify future design recommendations for family support, featuring ideas such as integrating triggering moments and bonding moments in toolkit designs. This work provides timely inputs on global efforts aimed at addressing datafication risks and underscores the importance of strengthening legislative and policy enforcement of ethical data governance.


1 INTRODUCTION

Children today are engaging with the online world more than ever before, and this trend continues to escalate. Recent statistics indicate that in the United States, online media use by 8- to 18-year-olds grew faster during the two years of the pandemic than it had over the four years before the pandemic [66]. In the UK, an astounding 97% of children between the ages of 3 and 17 are active online, with over 60% of ten-year-olds owning their own smartphone or tablet [72]. Alongside this accelerating increase in children’s online media usage is the alarming trend of younger users (under 13) flocking to social media platforms, an age group that, according to platform terms and conditions, is not permitted to use most social media platforms. A recent US survey revealed that social media use among 8- to 12-year-olds has risen from 31% in 2019 to 38% currently [67]. Similarly, a report on UK children showed that YouTube was the most used online platform among 3- to 17-year-olds (88%), followed by WhatsApp (55%), TikTok (53%), Snapchat (46%), Instagram (41%) and Facebook (34%).

The ongoing and increasing adoption of online services by children raises concerns about children’s online data privacy and the associated datafication risks: children’s actions online are pervasively recorded, tracked, aggregated, analysed, and exploited by online services in multiple ways, including behavioural engineering and monetisation [61, 68, 116]. Central to these datafication risks is the capability of online service providers to infer details about users. They analyse user data, aided by algorithms, to assess personal attributes associated with an individual [58]. Specifically, they seek to assess or predict factors like an individual’s behaviours, preferences, or health. This gathered information is then strategically used to steer users’ online activities, engagement, and content choices, significantly shaping their online experience [61, 68]. Such datafication practices largely operate behind the scenes of apps or services, remaining invisible to users. As a result, they are less understood or discussed than more straightforward data privacy concerns, like the direct collection or disclosure of user data.

To tackle growing concerns about children’s online safety, legislators worldwide have increasingly introduced child-specific regulations such as the UK’s Children’s Code [9], the US’s COPPA [5] and EU’s GDPR-K [7]. These laws tend to set strict requirements for the processing of data relating to young users (i.e. those under 13 in the US and the UK, and under 16 in the EU), including parental consent before data processing. Yet, enforcement remains challenging due to the vast number of apps and services regularly used by children and widespread non-compliant practices [74, 80]. For instance, while there is an age limit of 13 for registering an account on many apps [45], several reports showed that underage children still heavily use these platforms [25, 67, 72]. Beyond legal and policy protections, there has also emerged a new genre of safety apps, known as parental control apps, which are specifically designed to help parents oversee their children’s online activities and protect them from potential online harm [112]. However, such approaches largely focus on direct online harms, such as cyber-bullying or inappropriate content, and provide little support for children to comprehend risks regarding the more implicit datafication harms. Furthermore, monitoring or surveillance-based approaches that are typically employed by current parental control apps not only diminish mutual trust between parents and children [39, 99, 105], but also place a significant burden on parents to possess a deep understanding of relevant issues [55, 86]. This is particularly challenging given that most adults have limited awareness of how their own data is collected, processed, and exploited to shape their digital experiences [28].

To address these challenges, we developed The KOALA Hero Toolkit, a hybrid (digital and physical) toolkit that comprises a mobile tracker app, a set of data cards, and a task sheet accompanied by worksheets. This toolkit is designed to help children and parents collaborate to better understand implicit datafication practices online, especially those associated with the use of mobile apps, while fostering trust and communication between parents and children. More specifically, we aim to explore three research questions:

RQ1: How do families with children perceive and navigate risks associated with datafication on mobile devices?

RQ2: In what ways, if any, might the KOALA Hero Toolkit influence families’ perceptions of such risks, and the thought processes undertaken in risk evaluation?

RQ3: To what extent, if any, might the KOALA Hero Toolkit support families in becoming more informed about datafication decisions? What additional support might they require?

Through 17 user studies with 17 families (17 parents and 23 children aged 10 – 14), we found that the KOALA Hero Toolkit strongly influenced both parents and children’s perceptions of datafication risks, and prompted families to critically reflect and introspect about potential datafication risks associated with mobile apps. Families also demonstrated changes in how they made data-related decisions as supported by the toolkit, often demonstrating more democratic and interactive family-joint decision-making processes as a result. These findings provide timely input to the current global effort aimed at addressing datafication risks of minors through providing better support for families, and highlight an urgent need for more active implementation of legislative and policy advancement of ethical data frameworks.


2 BACKGROUND AND RELATED WORK

2.1 Datafication Risks of Mobile Apps

Most smartphone apps collect and share information with various first and third parties through snippets of code, so-called trackers, that collect and send information about a user’s online activities to other companies [22]. A study analysing 1 million Android apps showed that trackers are found in almost every single app [96]. Research has also indicated that 90% of websites include at least one tracking script [32], fuelling a multi-billion dollar business in which many companies earn huge amounts of money by selling or leveraging the data collected from users [53, 62]. A more recent 2023 study of popular children’s mobile applications revealed that 13 out of 15 apps shared more user information with third parties than was disclosed in their respective privacy policies [30].

Along with the huge amount of data being collected from users, there is growing worry about how online platforms may further exploit this data [22], giving rise to a set of datafication risks that extend beyond commonly addressed concerns like inappropriate online content and excessive screen time [57, 72]. Datafication here refers to the process by which children’s actions are pervasively recorded, tracked, aggregated, analysed, and exploited by online services in multiple ways, including behavioural engineering and monetisation [61, 68, 116]. Such practices can be found in almost every online platform, such as how Facebook provides personalised group recommendations to users [83], how Google tailors news for different users [16], and how Instagram presented “idealised body images” to teenage girls [94]. While data tracking and subsequent datafication can result in enhanced services and potentially improved user experiences, they may also lead to harmful profiling of children’s data, which is exacerbated by increasingly sophisticated online surveillance [84, 116], leading to more nuanced and individualised harms. A 2021 study showed that Facebook profiles underage users using personal, often sensitive, data, directing them to specific content providers, including potentially harmful or risky interests such as smoking, gambling, alcohol, or extreme weight loss [104]. In fact, such datafication is core to the business models of numerous online service providers [44, 79, 104]. A recent report revealed that advertisers pay around $3.03 to target a thousand youths interested in alcohol, $38.46 for those into extreme weight loss, and $127.88 for those curious about smoking [44], intensifying concerns about harmful micro-targeting and behavioural engineering.

On the other hand, efforts have been made to mitigate issues related to datafication. Prior research has led to advanced privacy control techniques in mobile apps, such as reverse-engineering app source code and network traffic analysis [22, 50, 70, 78], which enable users to trace personal data flows from first to third parties, thus offering a clearer view of data management [31, 90, 97]. Self-regulatory initiatives by industry leaders have also emerged with the intention to offer users greater transparency and control over their personal information. Apple, for example, has introduced privacy labels to inform users about data tracking and linkage [3]. Google followed suit with its own privacy labels for apps, enabling developers to declare their adherence to data security best practices [12]. However, these efforts are mainly tailored to general users with some prior knowledge, making it difficult for specific groups like children to understand and use the provided information. At present, the majority of child-focused solutions are parental control apps designed for parents, which primarily concentrate on general online safety for children, providing limited guidance on the more specific issues of datafication (see Section 2.3).

2.2 Children and Parents’ Perceptions of Risks Around Their Data Online

In response to these growing concerns about children’s data privacy, researchers have looked into how we may better support children and their parents in gaining a better understanding or awareness of such practices. Zhao et al. [115] found that children aged 6 to 10 were capable of recognising privacy risks like oversharing or revealing identities online, but had a limited grasp of issues like online tracking or personalised advertisements. Kumar et al. [51] found that children between the ages of 8 and 11 began to grasp the idea that data collection on online platforms may pose certain risks to them, but tended to link such risks solely to “stranger danger”. Older children, aged 14 to 18, had interpersonal concerns but frequently overlooked potential privacy threats from first and third parties using their data [75]. Children have also been found to struggle particularly with drawing personal connections to “data trackers” [14], focusing more on data they knowingly provide than on data extracted without their awareness or consent [58], and finding it difficult to view the ongoing flow of their data as a dynamic process [24]. Some more recent studies found that children below the age of 13 have some basic understanding of datafication, like how their personal data could be processed to make “assumptions” about them. However, they often lack the ability to understand the mechanisms and reasons for data sharing across different platforms and the resulting collective profiling that occurs [100]. Another study demonstrated a significant desire among children to receive support in managing datafication risks, not limited to the collection of their data, but particularly those related to the processing and profiling of their data [101].

Meanwhile, researchers have also been looking into how parents perceive their children’s online privacy. A survey of 2,032 UK parents showed that online privacy is the top barrier to parents’ internet use. The survey also found that digital privacy skills are not uniformly distributed among parents and children, with only approximately half of the parents reporting knowledge of how to modify privacy settings for their children [54]. Another survey, of 1,300 parents from 51 countries, showed that although parents and guardians might have a certain level of awareness and knowledge about their children’s online privacy, this knowledge was confined to understanding how online platforms collect their children’s data and did not extend to the potential risks associated with other datafication practices such as sharing, processing, and profiling [59]. Studies have found that parents reported being confused and concerned regarding online privacy issues [57, 72]. Although children often turn to them for guidance about privacy online, parents felt ill-prepared and were sometimes equally confused about the digital environment and the dangers it poses to privacy, let alone the more implicit risks related to the datafication of children’s online activities [57].

2.3 Parental Control, Parental Mediation, and Family Joint Media Engagement

Traditionally, many existing interventions were crafted under the premise that parents would guide their children through the digital landscape [42, 57]. Parental control apps, for instance, are mobile apps that allow parents to monitor and restrict their children’s activities online [112]. Research has been trying to assess how effective these apps are for safeguarding children online. A review of 75 parental control apps revealed that the commercial market primarily adopts restriction and monitoring approaches [105], such as limiting access or implementing strict screen time rules. Children frequently found these apps to be invasive and overly restrictive, leading to increased tension between them and their parents [39, 40, 99]. On the other hand, active mediation approaches, like promoting communication [41], collaborative rule-setting [43], and co-learning experiences [47], were preferred by both children and parents [99], leading to heightened privacy awareness and stronger protective attitudes in children [91, 106].

Some more recent research has shifted focus from specific parental mediation strategies to exploring varied patterns of co-engagement between parents and children. Joint Media Engagement (JME) here refers to the shared experience of individuals, such as parents and children, siblings, or peers, interacting with media content together, deepening their collective understanding. Research on JME, including activities like co-viewing [92, 93], playing digital games together [60, 110], co-searching [60, 77], co-reading [65, 82, 109, 114], and co-designing [19, 107, 111] between family members, demonstrated that it significantly enhances family learning by fostering shared understanding, promoting linguistic and social development, and supporting discourse and media practices. This research direction also resonates with recent calls emphasising a “family-centered approach” [29], which advocates for creating meaningful and contextual experiences for both children and their families.

While there is a wealth of research focusing on the dynamics between parents and children concerning children’s online wellbeing, a noticeable gap persists in the literature. Prior studies have primarily focused on the immediate online safety issues, such as inappropriate content, or have taken a broader approach to digital literacy [17, 18, 112]. Few have delved into how to assist parents in addressing the subtler risks of datafication tied to children’s use of mobile apps. These risks encompass not only data collection and processing but also extend to threats posed to a child’s online engagement and digital autonomy through practices like profiling and related behavioural manipulations. We posit that studying collaborative interactions within families, including parents and siblings, can enrich our understanding of these risks and enhance our support for intra-family communication and trust.


3 DESIGNING THE KOALA HERO TOOLKIT

As discussed in related work, existing support for family privacy and datafication rarely considers fostering family communication and co-development, missing opportunities to facilitate the development of children’s risk-coping skills. Furthermore, research also shows that a critical understanding of the implications associated with datafication is crucial for users, particularly children, to take action [100, 101]. To address these vital needs, we believe it is essential to create practical support mechanisms to: 1) raise families’ critical awareness about the implications of datafication by drawing on theoretical frameworks; and 2) promote family joint engagement by encouraging open family discussions on the subject and supporting collaborative family approaches to privacy and datafication issues. With these objectives, we designed and developed the KOALA Hero Toolkit. This toolkit encompasses three key components:

A mobile tracker app, designed as a practical tool for families to navigate and control mobile datafication risks.

A set of data cards, designed to facilitate discussion and support situated understanding.

A task sheet accompanied by worksheets, designed to facilitate interactive family engagement activities.

In this section, we first introduce the design considerations behind the toolkit, followed by the design specifics of each component. Please note that we do not position this toolkit as a tool for educational purposes, as such a claim may necessitate structured assessments and clear benchmarks [63, 85], which are beyond the scope of this study. Instead, its design aimed to raise and support the families’ awareness of potential datafication risks, as well as stimulate and guide their thoughts and discussions concerning data-related issues around them.

3.1 Design Considerations

We drew inspiration from Kafai et al.’s computational thinking framework [46], which advocates a multiple-perspective approach to supporting children’s development of computational thinking and contains three key frames: i) Cognitive thinking focuses on the understanding of key computational concepts, practices, and perspectives and the associated skill building and competencies; ii) Situated thinking encourages learning to take place in contexts that the learner cares about, so that they include their personal expression and social engagement in their pathway of learning; and finally iii) Critical thinking emphasises the importance of supporting the questioning of the larger structures and processes behind computational phenomena. While existing online safety and privacy measures often prioritise children’s cognitive understanding, they frequently overlook contextualisation in personal situations or the promotion of critical examination of observable phenomena [33]. While we do not claim the approach from Kafai et al. to be definitive, we used it as inspiration to enhance our family support design.

Our design goal is to create a toolkit with an app that enables families to discuss and collaboratively understand the datafication risks on mobile devices. We began by outlining the app’s key components, such as the ability to detect and disable trackers, or to gain an overview of the trackers associated with all the apps on the device. We ensured that these components comprehensively address all aspects of Kafai et al.’s framework. All co-authors actively contributed to the ideation of these fundamental components, which are detailed in Table 1. The first three co-authors, each with extensive design experience, then proposed design features for each component to foster various aspects of Kafai et al.’s computational thinking. These features were then jointly evaluated by all co-authors, considering factors like user impact, innovation, and technical feasibility. The top features for each component were then implemented.

Table 1:

Table 1: Mapping the KOALA Hero Toolkit features to Kafai et al.’s computational thinking framework. The toolkit is intended to be age-appropriate and to encourage open-ended play in its design choices.

3.2 KOALA Hero Tracker App

As a result of these design considerations, the KOALA Hero Toolkit contains the following key components, summarised in Table 1. Figure 1 illustrates the six main screens of the KOALA Hero tracker app, which are described in detail below:

Figure 1:

Figure 1: KOALA Hero tracker app. (a) Intro Video & Help Button; (b) Apps Dashboard; (c) Trackers Dashboard; (d) Trackers View for Individual App; (e) Data Destinations View for Individual App; (f) Trackers Control for Individual App.

Intro Video & Help Button, shown in Figure 1(a), supports children and parents with cognitive understanding of datafication risks. It provides a 1-min short story video that presents introductory information about data trackers, portrayed as different types of “elves” (e.g., functional/essential, social, or advertising), providing children and parents with basic information about trackers and their functions. We used animated and colourful elves to engage children. Users can access this introductory video at the app’s launch or by clicking the help button anytime.

Apps Dashboard, shown in Figure 1(b), provides a summary of all the apps on the children’s device and the number of trackers associated with each app. Because the KOALA Hero tracker app is designed as an app that children can install on their own devices, children can see the trackers associated with their own apps. This encourages children’s situated understanding of the datafication risks they are currently experiencing during their use of devices. It aims to create scenarios that may be more relevant to the families, and allows children to be more in control of their app choices.

Trackers Dashboard, shown in Figure 1(c), provides a summary and ranking of all the trackers found on the children’s device. Similar to the Apps Dashboard, this feature also fosters situated thinking and encourages families to explore apps and trackers associated with higher risk factors. As families configure which trackers to block in Trackers View, this interactive element can give children a sense of achievement by letting them observe how the overall ranking of trackers changes.

Trackers View, shown in Figure 1(d), provides a summary of all the trackers of each individual app, grouped into four major types: essential (necessary for basic app functionality), advertising (used for marketing purposes), social (for social media integration), and others (trackers with functions that are not immediately recognisable). For each tracker category, we provide a basic explanation of that type of tracker, enhancing children’s cognitive understanding of the concepts. As users click into each of the tracker categories, we provide a list of the exact trackers that have been collecting their data (Figure 1(f)), fostering a more situated reflection on the datafication risks in their daily lives.
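The four-way grouping used by the Trackers View can be sketched as a simple mapping from each detected tracker to one of the four types, with anything unrecognised falling into “others”. The following Python sketch is illustrative only; the tracker names and their category assignments are hypothetical examples, not the app’s actual classification data:

```python
from collections import defaultdict

# Hypothetical tracker-to-category mapping; real classifications would come
# from a curated tracker database, not this hard-coded example.
TRACKER_CATEGORIES = {
    "Crashlytics": "essential",
    "AdMob": "advertising",
    "Facebook Share": "social",
}

def group_trackers(detected):
    """Group detected tracker names into the four types shown in the
    Trackers View; unknown trackers fall into 'others'."""
    groups = defaultdict(list)
    for name in detected:
        groups[TRACKER_CATEGORIES.get(name, "others")].append(name)
    return dict(groups)

print(group_trackers(["AdMob", "Crashlytics", "MysteryTracker"]))
```

The "others" bucket mirrors the design choice described above: rather than hiding unclassified trackers, they are surfaced explicitly so families can still discuss them.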

Data Destinations View, shown in Figure 1(e), displays a map showing global destinations of their data for each app, supporting both the cognitive and situated thinking on their datafication risks. Using this intuitive map presentation, we aim to foster an environment that encourages children to freely explore and engage at their own pace.

Trackers Control for individual app is shown in Figure 1(f). Children and parents can use the “block” buttons to block data collection by certain companies. By enabling children to explore this function alongside parents and exercise agency over their own data choices, we aim to support families’ critical thinking and enhance their sense of control.

Implementation. The KOALA Hero app is an Android app that is built, for the purposes of tracking analysis, on top of TrackerControl (TC) [49], which itself expands on the popular Android firewall NetGuard [23]. It offers dynamic and static tracking analysis of apps. Using Android’s VPN functionality, the dynamic analysis examines and blocks app network communications on the device. Network traffic is cross-referenced with lists of tracker domains, such as those by Disconnect.me [34] and App X-Ray [22, 48]. For static analysis, it evaluates tracking libraries within apps using the Exodus Privacy tracker database [37], bypassing the need to monitor app network traffic.
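At its core, the dynamic analysis described above reduces to matching each outgoing hostname against lists of known tracker domains. The following Python sketch illustrates that matching step; the domain entries, company names, and categories are illustrative assumptions, not TrackerControl’s actual lists or code:

```python
# Hypothetical blocklist: maps tracker domains to (company, category).
# Entries are illustrative examples, not a real tracker-protection list.
TRACKER_DOMAINS = {
    "graph.facebook.com": ("Facebook", "social"),
    "app-measurement.com": ("Google", "advertising"),
}

def effective_domains(hostname: str):
    """Yield the hostname and each parent domain, so a blocklist entry
    like 'app-measurement.com' also matches 'ssl.app-measurement.com'."""
    parts = hostname.lower().rstrip(".").split(".")
    for i in range(len(parts) - 1):
        yield ".".join(parts[i:])

def classify(hostname: str):
    """Return (company, category) if the hostname matches a known tracker
    domain, or None if it does not."""
    for domain in effective_domains(hostname):
        if domain in TRACKER_DOMAINS:
            return TRACKER_DOMAINS[domain]
    return None

print(classify("ssl.app-measurement.com"))  # → ('Google', 'advertising')
print(classify("mysite.org"))               # → None
```

Matching parent domains as well as exact hostnames is what lets a single blocklist entry cover the many subdomains a tracking company may use.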

Figure 2:

Figure 2: Examples of the KOALA Hero Data Cards (18 in total).

3.3 KOALA Hero Data Cards

To better facilitate hands-on exploration for families and foster deeper, more situated reflections, we incorporated physical elements into our toolkit design. The second component of our toolkit is a set of 18 data cards (see Figure 2), each illustrating a different type of data that might be collected from children’s apps, ranging from account usernames and personal information such as age, gender and location, to online behavioural data such as browsing and search history. While these 18 data types do not cover all the data online platforms collect, they represent the most significant categories from our analysis of the privacy policies of major platforms (Google [8], Amazon [2], and Meta [11]). The data cards, designed to be physically printed and played with by children and parents, were created to support cognitive comprehension of the data associated with their mobile apps and online activities. Furthermore, we curated the data card descriptions as examples to help children and parents contextualise their own activities, such as “You are friends with Sarah and Tom on Snapchat”, facilitating their situated understanding of how the data integrates into their daily lives.

Figure 3:

Figure 3: Tasksheet and worksheets for families to complete.

3.4 KOALA Hero Tasksheet & Worksheets

The third component of the hybrid KOALA Hero Toolkit is a tasksheet, with accompanying worksheets, for families to complete together. The tasksheet (Figure 3 a) is designed to weave together and offer directions for families to effectively navigate the KOALA Hero tracker app and utilise the KOALA Hero data cards. The tasks include selecting the children’s three most used apps (one social media app, one gaming app, and one educational app) on their device, creating a context for children and parents to reflect in a more situated way on their own experience. For each selected app, participants were first instructed to go through their chosen app using the KOALA Hero tracker app to learn about the different trackers associated with the app and where their data is being sent, supporting their cognitive understanding of the datafication risks around them. They were then instructed to play around with the data cards and to discuss topics around the collection, transmission and processing of their data. Finally, participants were instructed to control and manage the trackers associated with the app through the KOALA Hero tracker app and discuss their decisions. By combining digital (KOALA Hero tracker app) and physical (KOALA Hero data cards) resources, this component fosters meaningful discussions between children and parents, encouraging critical reflection on the data collected and its potential implications for data transmission and processing. It also encourages exploration of the larger structures and processes behind mobile data, uncovering deeper datafication risk considerations. Participants were given a set of three worksheets (Figure 3 b), identical in format, with each worksheet designated for recording their observations and thoughts about one of the three apps they explored, ensuring organised and specific feedback for each app.


4 METHODS

4.1 Study Overview

In our study, we aim to address our three research questions: firstly, to explore how families currently perceive and navigate datafication risks; secondly, to assess how our toolkit might influence families’ perceptions of datafication risks; and thirdly, to explore how our toolkit can help families become better informed about these risks, while also identifying any additional support needed. Participants were invited to our lab for the study and were asked to bring the children’s most used Android device. Our user study consisted of three parts: 1) an onboarding session, 2) a session of activities with the KOALA Hero Toolkit, and 3) an open-ended family reflection session. To complement our data, we also used pre-study surveys (for parents and children to complete separately) in the onboarding session to establish a baseline of their individual perceptions and experiences of mobile privacy, and post-study surveys (for families to complete together) in the reflection session to reflect on their joint experiences and thoughts. Each study was designed to last about 1.5 hours, with one parent and 1 – 3 children from the same household, and was facilitated by 1 – 2 researchers, who took observation notes and intervened only to provide technical support when needed, otherwise minimising their involvement in the family’s activities unless assistance was requested.

4.1.1 Onboarding Session (20 minutes).

We began with a “whose fave is it” game [35], where parents and children guessed each other’s favourite mobile app. This aimed to put participants at ease and create a balanced power dynamic for the study. Parents and children were then asked to separately complete a pre-study survey, available in versions tailored for each group (see supplementary materials). The surveys contained questions about basic mobile privacy knowledge, including data collection, sharing, and processing, and about their current understanding and practices on these topics. The questions in both versions of the surveys were nearly identical, with the child version employing more child-friendly language to enhance accessibility. We did not use the surveys to measure participants’ knowledge but to help us establish a baseline of their existing perceptions and experiences of mobile privacy. Participants were encouraged to think aloud and explain their choices as they filled out the surveys. The 1-min intro video (see Figure 1 a) was then played to familiarise participants with basic concepts around data trackers and datafication (e.g., what trackers are, their different types, and what they can do with your data). After the video, families were encouraged to discuss what happened in it. This helped us confirm whether participants understood the content of the video, and to facilitate discussions to clarify it if needed. The video was not meant for educational purposes but to familiarise participants with concepts for their upcoming activities.

4.1.2 Activities with the KOALA Hero Toolkit (40 minutes).

In this session, participants began to engage with the KOALA Hero toolkit (as introduced in Section 3). They were first asked to install the KOALA Hero tracker app on the children’s most used Android device. The KOALA Hero data cards were also presented to them. The participants were then given 3 minutes to navigate the app and acquaint themselves with its functionality, and to read through each of the data cards. After this initial setup, each parent-child pair was asked to go through the tasksheet and complete the tasks on the accompanying worksheets. Participants were told that there were no right or wrong answers, and that the goal of these tasks was not to evaluate the usability of the KOALA Hero tracker app but to facilitate better use of the hybrid toolkit. Participants were provided with pens and pencils to jot down their observations and ideas on the worksheets. While writing was optional, they were encouraged to verbalise their thoughts aloud for the audio recording. Researchers primarily took an observational role and only intervened for technical support when needed.

4.1.3 Family Reflection Session (30 minutes).

This session was designed as a wrap-up for parents and children to conclude their thoughts and observations from the day, and to bring up any topics not yet addressed. Participants were encouraged to share any surprising or interesting discoveries and to reflect on how the toolkit had influenced their perceptions and thought processes concerning datafication risks. An exit survey was presented to the parent-child pairs, completed collectively as a family unit, fostering a shared dialogue on their experiences and conclusions from the study. The survey contained summary questions about their thoughts on the day’s experiences and their sentiments towards datafication (see supplementary materials). Its purpose was to help participants articulate their thoughts rather than to quantify responses. Participants were encouraged to think aloud as they navigated the survey. As they did so, researchers paid close attention to noteworthy comments and followed up with further questions or discussion as needed. Finally, families were encouraged to keep the app on their devices and to take the toolkit, including data cards and tasksheets/worksheets, home for continued use.

4.2 Participants

Participants were sourced from local schools and a public family recruitment forum starting in April 2023, after obtaining institutional research ethics approval. We conducted 17 user study sessions with 23 children and 17 parents between May and June 2023. Each session involved a single family, typically comprising one parent and one to three children. We recognise that “family” can be a broad term, encompassing relationships like grandparents, aunts, and cousins. Yet this study focuses on what is typically the nucleus of a family: parents, children, and siblings, aiming for an understanding of primary dynamics before potentially expanding to extended family in future studies. Each participant received a £15 e-gift card as a thank-you gift for their participation.

To participate in the study, each child was required to have access to an Android device and a parent, aged at least 18, to participate alongside them. We carefully selected the age range of child participants to be between 10 and 14 for several reasons: previous research has shown that from 10 onward, children gradually transition away from mainly parent-guided online activities [71]; evidence has also shown that children under 13 are active users on many social media platforms despite the age restrictions claimed in these platforms’ terms and conditions [73, 81], exposing them to a wide range of risks online [15, 88, 89, 116]. Of the 23 children, 12 were aged 10–12 and 11 were 13–14, averaging 12 years (range 10–14, s.d. = 2.06); 14 identified as boys and 9 as girls. Of the 17 parents, most were aged 35–44 (9), followed by 45–54 (6) and 25–34 (2); 12 were moms and 5 were dads.

Beyond considering participants’ ages, we ensured a diverse demographic background. Participants were sourced from three local schools: one private school (which charges attendance fees instead of being publicly funded), one grammar school (a government-funded school that selects its pupils by academic performance), and one state school (a government-funded school that provides inclusive education free of charge). For those recruited from public forums, we recorded their ethnicity, children’s school type, and parents’ education and employment status. Of the 17 families, 7 identified as Asian, 5 as White, 2 as Black, and 3 as mixed-ethnicity families. 10 children were in private schools, 9 in state schools, and 4 in grammar schools. Parents held master’s (9), PhD (3), bachelor’s (3), or high school degrees (2). 12 parents worked full-time, while others were part-time employed (2), self-employed (2), or full-time parents (1).

Table 2: Overview of Participant Demographics

Demographic Info | Children (n=23)                        | Parents (n=17)
Age              | 10–14 (Avg: 12, SD: 2.06)              | 25–34 (2), 35–44 (9), 45–54 (6)
Gender           | Boys: 14, Girls: 9                     | Moms: 12, Dads: 5
School Type      | Private: 10, State: 9, Grammar: 4      | –
Ethnicity        | Asian: 7, White: 5, Black: 2, Mixed: 3 | –
Education        | –                                      | Master’s: 9, PhD: 3, Bachelor’s: 3, High School: 2
Employment       | –                                      | Full-time: 12, Part-time: 2, Self-employed: 2, Full-time parent: 1

4.3 Data Collection

Data collection took place throughout May and June 2023, during which we conducted 17 user study sessions with 23 children and 17 parents. All sessions were audio-recorded with the participants’ consent/assent, obtained through signed physical forms. For the children’s assent forms, we ensured each child had a clear understanding of the study, including the fact that their interactions would be recorded and anonymised. The first and second authors transcribed the audio recordings, systematically removing all personally identifiable information pertaining to the participants, or any individuals they mentioned, from session recordings, notes and transcripts. There was a total of 1586 minutes of audio data, of which 86.3% consisted of participant speech and the remainder of researcher speech. The median session length was 96 minutes.

4.4 Data Analysis

We analysed the data using a thematic approach to develop codes and themes [26]. Photographs of families’ activity sheets and their use of data cards were also consulted to complement our analysis. The thematic coding process started by dividing the transcriptions into two equal-sized sets. The first two authors independently analysed the first set of transcriptions to derive an initial set of codes. They then met to consolidate and reconcile the codes into a final codebook, reaching a coding agreement of Cohen’s kappa = 0.81. The first author then completed the coding of the remaining transcripts using this final codebook. Our final codebook included themes related to families’ existing perceptions of risks, families’ thought process development, noticeable changes in families’ perceptions of datafication and decision-making, and their desire for additional support. More specifically, results from part 1 of the study (Onboarding Session) mostly captured families’ existing perceptions and practices of navigating mobile datafication risks (RQ1). Results from part 2 (Activities with the KOALA Hero Toolkit) mostly captured how the toolkit influenced families’ perceptions and thought processes on handling mobile datafication risks (RQ2), as well as our observations on their decision-making processes and the kind of support they required (RQ3). Results from part 3 (Family Reflection Session) further enriched our understanding of the support families needed (RQ3). However, it is important to note that the findings for each of the three research questions are not strictly confined to any single part of the study; for example, families might discuss their existing perceptions and practices at any point. The study is best viewed as an integrated process in which themes could emerge and be explored at any stage.
The survey data was employed as descriptive information, offering an overview of families’ perceptions and practices around mobile datafication risks and enriching our analysis; it was not used to identify direct correlations or measurable effects, nor as a source of direct quantitative measurements.
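For readers unfamiliar with the agreement statistic reported above, Cohen’s kappa is computed as κ = (p_o − p_e)/(1 − p_e), where p_o is the observed agreement between the two coders and p_e is the agreement expected by chance from their marginal label frequencies. The following is a minimal illustrative sketch; the code labels shown are hypothetical and not drawn from the study’s actual codebook:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders labelling the same items."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of items both coders labelled identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: from each coder's marginal label frequencies,
    # assuming the coders labelled independently.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Toy example with hypothetical codes (not the study's codebook):
a = ["risk", "risk", "support", "risk", "support", "support"]
b = ["risk", "support", "support", "risk", "support", "support"]
print(round(cohens_kappa(a, b), 2))  # 0.67
```

Kappa corrects raw percent agreement for chance: here the coders agree on 5/6 items (83%), but because both use only two labels, half that agreement is expected by chance, yielding κ ≈ 0.67 rather than 0.83.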


5 RESULTS

We present our results by first outlining children and parents’ existing overall perceptions and practices of datafication on mobile devices (RQ1). Next, we present families’ thought processes regarding datafication risks and the subsequent changes in perceptions (RQ2). Finally, we explore the impact the KOALA Hero toolkit had on families’ joint decision-making and the additional support they required (RQ3). We provide quotes from individual children and parents, identified by their participant ID, along with the age of each child. A child participant is represented by (C#, age x) and a parent participant by P#. Child participants who are siblings from the same household are denoted as C#a, C#b, and so forth. Individual quotes are presented as italicised sentences within the text; dialogues are presented as block quotes with speakers identified at the beginning.

5.1 Families’ Existing Perceptions and Practices on Datafication Risks

Here we present families’ existing perceptions and practices on datafication risks prior to using the KOALA Hero Toolkit, as reflected in their pre-study surveys and their articulations of choices during the Onboarding Session (RQ1).

Most families reported that children had their own devices, with only one using a parent’s device and another sharing among siblings. Of the children, 9 out of 23 used their devices for 1–3 hours weekly, 5 for 4–6 or 6–8 hours, and 4 for over 9 hours weekly. Most children (17/23) reported being not very concerned about apps and companies collecting their data, and 13 children selected neither agree nor disagree or disagree in response to I would like to discuss privacy issues with my parents. Parents exhibited a higher awareness of online risks than their children, but their primary focus was on online safety rather than data and privacy. For instance, 11 out of 17 parents used parental controls, mainly to limit access to age-specific apps. When discussing how they select apps for their children, 12 parents prioritised functionality and educational value, while only one considered privacy aspects. However, a majority (14/17) expressed concern over data collection practices by apps and their parent companies. While 15 parents wished to discuss this with their children, only 3 felt confident doing so. Although nearly all parents had held prior online safety talks with their children, the discussions rarely touched on data and privacy, with 16 out of 17 parents admitting they had rarely or never addressed these issues at home.

Furthermore, as illustrated in Figure 4, families demonstrated a diverse range of understanding when it comes to the more specific data concerns, such as the collection, transmission, and processing of their data by their mobile apps:

In terms of data collection, almost all children and parents mentioned how they “kind of knew” this before. 17/23 children and 14/17 parents selected agree or strongly agree to I know that some of my data, such as my usage of the apps, and some of my personal data (such as name and age) will be collected by mobile apps. 11/23 children also reported having heard of terms such as “trackers/cookies”, and some parents (12/17) mentioned knowing that apps can get data from their phones but not knowing how.

In terms of data transmission, fewer children and parents knew data could be shared between platforms. Only 8/23 children and 9/17 parents agreed with I know that my data may be used by other companies, who may have an agreement with the app. This was also reflected later in their conversations, with a fair number of children (11/23) believing “My data will only stay within the app.” (C10, age 10).

As for data processing, almost all families held the initial idea that “Data processing is solely for offering us better services” (C7, age 12). Most parents (12/17) and children (19/23) selected disagree or strongly disagree to the questions I know how data can be used to learn about personal aspects about me (e.g., whether I’m a boy or girl, the type of school I go to) and I know how data can be used to make inference on personal aspects about my family (e.g., relationship status, parent or not, favourite family holiday destinations).

Figure 4: An illustration of existing perceptions about datafication risks among 23 children and 17 parents, with numbers above each bar indicating instances where children or parents share that perception.

5.2 Families’ Thought Process: From Cognitive Understanding, Situated Reflection to Critical Thinking

As families engaged in the Activities with the KOALA Hero Toolkit session, they demonstrated a variety of ways of utilising the toolkit, exhibiting many playful ways to explore and make sense of datafication with its assistance. Through this process, we observed how this usage influenced their thought processes about datafication and its associated risks (RQ2).

5.2.1 Cognitive discussions.

Families mostly started by trying to establish a cognitive understanding of trackers and associated datafication risks. For example, they often took some time to review the tracking information linked to each app via the KOALA Hero tracker app, and used this data as a springboard for their discussions. Families also tried to make sense of what data could possibly be collected by navigating the KOALA Hero data cards. Some families spontaneously used the data cards for role play: parents acted as the app, telling the children, “I’m going to take this and this from you.” (P9) Following this, the entire family would begin deeper discussions on what this meant in the context of their daily experiences.

5.2.2 Contextualised discussions.

While interacting with the KOALA Hero Toolkit, families frequently engaged in contextual reflections. Most families related the study’s topics to their personal experiences to understand them better. Siblings often recalled shared experiences, with one commenting: “Remember when we liked socks on Instagram and then saw them on Amazon? That’s them sharing our info.” (C15b, age 12). Additionally, children related the toolkit’s information to what they already knew, noting, “Friend list? We discussed this in school. They always say, ‘Don’t expose yourself on Facebook.”’ (C7, age 12). Apart from reflecting on their own past experiences, families also tried to make sense of the topics by linking them to everyday scenarios: “You wouldn’t share everything about yourself with a stranger at a party, but yet we’re giving all our stuff out online.” (C9, age 13). We also noticed a trend among parents of using scary examples to alert their children: “If someone really wants to know it’s very easy. [Children’s name], where does he go to school? When is he online? If some bad person uses this information to do some bad things about you!” (P1)

5.2.3 Critical thinking from both parents and children.

Along with their cognitive understanding and situated reflections, families also showcased various forms of critical thinking. For instance, families would combine the KOALA Hero data cards to reflect on the impact of losing control of various data points: “Let’s look at this card, location; and then this card here, ‘10 Years Old.’ What they might find out about you? Maybe what school you go to?” (P3). Some families related and applied the data cards to real-world issues they may have encountered: “To me, each data card is like a box they’ve created for people; they make assumptions and put people into distinct boxes.” (C12c, age 13), “Looking at these data cards, I see supermarket products. They’re like items on shelves that big companies can freely pick and buy about people.” (C14a, age 13). We observed numerous instances where children critically reflected on these observations: “Wait, if they merge your friend list, language, and browsing history, they could probably deduce your ethnicity.” (C12a, age 11).

Apart from critically reflecting on the datafication risks and relevant concerns around them, families sometimes went on to reflect on what this meant for the whole society and the future of technology. In particular, many parents brought up the concept of “tradeoff”, such that they believed “Nothing is truly free. If a game is offered at no cost, they’re likely selling your information.” (P9), “It’s the future of technology, everything is digitalised and monetised, and we rely on these services. Of course there are both good side and bad side about it, but it’s crucial to be aware of them.” (P15)

5.3 Families’ Change in Perceptions on Datafication

The various thought processes of families also affected their perceptions about mobile datafication risks (RQ2).

5.3.1 Families becoming more surprised/concerned/confused.

To start with, families demonstrated great surprise as they explored their chosen apps using the KOALA Hero tracker app, especially when viewing the associated trackers and the destination countries of their data: “Oh wow, I thought it’s illegal for them to share the data, especially to a different country.” (P12). Meanwhile, almost all families were shocked at the number of trackers associated with their apps:

C12a, age 11: Let’s look at Magic Tiles. Oh my God! 21?!

C12c, age 13: I did kind of know they are collecting from us. But I just didn’t realise it’s this many! I thought like, maybe everyday I run into 2 trackers, but it’s like 20 just in this app.

P12: At night you lock your door and keep yourself secure, but you never think about this opening you up.

They also demonstrated great confusion and concern around the mismatch between their perceived purpose of the apps and the types of the associated trackers: “Tracker ‘Amazon.com’. Amazon is very commercial, isn’t it? So why would an educational application have an essential need to send your data to someone who wants to sell you stuff? That’s like going to a school, and before you go into the front gate of the school, you are given a brochure of stuff to buy.” (P15)

5.3.2 Families becoming more aware of data inferences.

As families continued with the activities supported by the KOALA Hero Toolkit, we noticed that they started to become more aware of the power of data inference, and that more personal things about them could be learnt: “Wait, now they’re using Google tracker, and what you do on Duolingo will get sent to Google. But then I also use Google Chrome, so that means they also know what I search for and all that stuff. They might even able to pinpoint where I go to school, and like what they teach in school. This is overwhelming.” (C3, age 10).

Meanwhile, some families started to realise that data can be used not only for deducing the personal details of individuals, but also for collective profiling, “I think, because they have so much data from such a big range of people, it makes the data more useful. The things would have more impact on a particular group of people.” (C13b, age 13).

Families also demonstrated a deeper level of concern about how their data could be used to push and nudge them into certain things and opinions: “If they wanted to influence you, they could easily do it. They can get you hooked on something or got you looking at something or someone favourably, because they know so much about you.” (P9).

5.3.3 Families becoming more aware of stakeholders’ roles in datafication.

Both parents and children demonstrated increased concern about what companies and platforms could do with their data: “It’s an uneasy feeling. They know all these thing about you and it’s completely up to them now.” (C11, age 14). Almost all families expressed surprise and concern about the dominance of big companies such as Google, Amazon and Meta. The families became aware of this issue as they looked through the trackers of their apps, soon finding that the major companies were present in every single one: “Oh Google again. I swear this is the tenth time I’ve seen their trackers today.” (C8a, age 10), “If we look at the trackers, literally everything is sent to either Google or Amazon.” (P11). Families talked about their concerns about how powerful the data would be in the “hands of giants” (C14b, age 14):

C14a, age 10: Google knows everything, and Facebook also knows everything. It’s crazy!

C14b, age 14: Our data is now in the hands of giants. Is there anybody monitoring what they do?

P14: Yes it does reminds me that in essence, Google, Amazon, Meta, they are data companies. They make money off people’s data and make huge profits.

Meanwhile, some families talked about the “unbalanced power structure” between big companies such as Google, Amazon and Meta, and the users: “In the future, the internet will record all our actions, and big companies might know us better than we know ourselves. As more people use it, their analysis will sharpen. There’s going to be some point where you will just need to listen to the computer to decide your next step.” (C1, age 13). Families emphasised how they wanted to have a “stronger mind” when dealing with datafication risks: “I don’t want to be distracted and manipulated by their little games, it’s my data!” (C4, age 11).

5.3.4 Families becoming more attuned to the impact of datafication on children.

Families started to contemplate the actual harm such practices could pose to children. Parents in particular expressed concerns about data being taken from children from such a young age, and what that would mean for their future: “They’ll track you until you grow up. How do we know if they’ll use that data against your future?” (P8). Some parents also expressed concerns that children could become normalised and indifferent to these risks as they grow up: “What you just said sounds like normalisation to me. ‘Data tracking, it’s gonna happen. I’m normalised to it’.” (P10).

We also noticed a consensus among all families on the importance of children being aware and informed of such datafication risks from a young age: “We want to protect them, not bubble-wrap them. Awareness education, like this toolkit, is essential. They need to understand the risks and that data collection is often for others’ benefit. Especially at this age, as they start exploring social media and online platforms, they need this awareness, and it takes time.” (P5). “It’s now in the back of our mind, and it’s funny how before we were like, not concerned at all, and now we become concern of these things cause you realise things.” (C3, age 10).

5.4 Families’ Change in Decision-making and Desire for Additional Support

In this section, we outline our observations of how the KOALA Hero Toolkit facilitated collaborative decision-making among families, and the additional support they desired, as reflected upon during the Family Reflection Session (RQ3).

5.4.1 Families felt more equipped.

To start with, families expressed that they felt more in control of their app privacy choices, and a great sense of achievement from observing how tracker activities changed according to their decisions: “When I disabled the trackers on Angry Birds, there’s a sudden decrease in the Facebook trackers activity. Glad to see that not all my information is going to Facebook!” (C14c, age 14). Families also expressed how they now felt better equipped to make decisions once back home and moving forward. Both parents and children appreciated the way the KOALA Hero tracker app made information readily visible and understandable: “In general, I think it’s just good to see things in front of your face. I was suspicious of all this data tracking and stuff before, but now I can actually see that 21 trackers in front me, and I can better decide next time.” (C8b, age 13).

Parents also reported that the toolkit provided a structure for them to start talking about these issues with their children and to engage them more effectively: “It really helps me to have a structure to talk to my son. It’s a great opportunity to get them interested, so they want to learn more about it when they see these practices again.” (P3). Families further talked about feeling better equipped to draw on the day’s discussion in future scenarios, particularly if they encounter similar situations and need to make decisions: “Next time you get a recommendation. Think about they might think of you, and what they might know about you, not just, oh, okay, I’m just gonna look at it.” (P9)

We observed instances in which families made a series of weighted decisions based on information exchange and family discussions enabled by the toolkit. Some families chose to allow only the essential trackers and block all others across all their apps. Others weighed the significance of a specific app and its function (e.g., educational apps), choosing to allow all trackers for that app but not for others. A few families made more nuanced decisions by considering the companies behind the trackers: they opted to block all trackers from “dominant” companies such as Google and Amazon, while thoughtfully assessing each tracker’s purpose to ensure blocking did not interfere with their apps’ essential functions.

5.4.2 Families having more balanced family engagement for joint decision-making.

Through interactions with KOALA Hero, families exhibited more balanced engagement during activities. The toolkit design, especially the tasksheet’s emphasis on children’s favourite apps, often led children to take the lead in discussions, since they were the experts on their apps. Even in instances where parents initially assumed the lead role, they often found themselves asking questions like “Is it true that this app does this?” and “What do you think?”, indicating a shift towards more balanced family engagement. At the same time, we observed a recurring pattern across all our studies where families began to explore unfamiliar subjects together, sharing their insights and helping each other throughout the process:

P15: I’m not sure what to think about Piano Tiles, they have so many trackers, but you don’t do anything with it?

C15a, age 10: You don’t really browse anything with Piano Tile. I think it doesn’t matter if we block it or not.

C15b, age 12: No. Look at this trackers list, it collects social media stuff which can link to you, that’s dangerous.

C15a, age 10: Oh, but I really don’t want to block it, it’s such fun.

C15b, age 12: Let’s only block the social trackers, but you can still play it.

We noticed that families continuously engaged in a process of discussion, negotiation, and conflict resolution as they carried out joint decision-making. Different family members might have varied viewpoints on aspects like which trackers to block or which apps need inspection. As they exchanged thoughts, new ideas and perspectives emerged:

C14a, age 13: I would absolutely turn off all the trackers.

P14: Why? I wouldn’t. They just send you stuff you like but it’s up to you.

C14a, age 13: No! It’s concerning that everyone thinks they’re unaffected when they are. I’ve long felt that this data collection isn’t right, but others around me just don’t feel that. It should be a scary thing.

C14c, age 14: Yes, I agree with you. What mom said sounds like normalisation to me.

5.4.3 Additional support desired by the families.

Families, particularly parents, expressed a need for more data-centric controls, including the ability to govern the specific data being collected, to manage data for real-time activities, and to gain a better understanding of precisely how their data would be utilised by companies. Families also wanted more contextualised information on what can go wrong. Many families in our study expressed a desire for real-world examples and news to educate their children and other siblings on data misuse: “Have examples here, like news on how our data are not being used legally. What can go wrong. As a way to educate.” (P12), “It’s frustrating being the only concerned one while others are indifferent. They should see the actual negative impacts on lives.” (C14a, age 13).

Besides the support from the KOALA Hero Toolkit, families sought external assistance, notably from schools and potential legislation. Both parents and children reported wanting schools to take up more responsibilities, such as checking on the privacy practices of the educational systems they use. They also called for an expansion of the existing curriculum to include data awareness education, going beyond the current focus on online safety issues such as stranger danger: “The school needs to become more aware and hold hands of parents. We can then talk to kids at home about it.” (P10). Meanwhile, families also expressed the need for future legislation that actually works: “Honestly, I don’t think GDPR is effectively protecting our data, especially with all these trackers and I’m under 13! The internet is constantly evolving, so maybe the laws should too.” (C7, age 12).


6 DISCUSSION

6.1 Impact on Families’ Thought Process

Throughout our observations we noticed a consistent progression in family participants’ thought processes. It is noteworthy that this pattern remained consistent across all the families, despite the varying ages (10–14) of the child participants. We also observed that regardless of families’ initial understanding of datafication risks, this progression mirrored Kafai et al.’s three forms of computational thinking (Section 3): from cognitive thinking (i.e., understanding of basic computational concepts) – families figuring out basic concepts such as what trackers are and what data are being collected; to situated thinking (i.e., situating abstract computational concepts in contexts children know and care about) – children connecting with their real-life experiences, parents reflecting on their data practices for the children; and finally to critical thinking (i.e., supporting the questioning of larger structures and processes behind the computational phenomenon) – families questioning the dominance of tech giants, reflecting on how users are inferred and monetised in the datafied future, and making informed decisions based on their consolidation of information. While this is not a linear progression, we noticed that the cognitive and situated abilities are critical for enabling users’ critical thinking.

How was such progression achieved? While we do not claim that the KOALA Hero toolkit is the sole factor driving this change, we noticed a strong relationship between some specific design features we included and the triggering moments experienced by families, which appeared to fuel the progression of their thought processes. We identified four critical triggering moments during families’ interactions with the toolkit. The first triggering moment was when parents and children saw things right before their eyes on the tracker app, supported by its information disclosure features (e.g., trackers view and data destination view). This was the moment when almost all participants expressed surprise for the first time, noting that it either contradicted their previous beliefs or confirmed unverified suspicions. The second was when families started to read through examples on the data cards. The sentences are formulated in such a manner (e.g., “You are xxx / You did xxx last Friday”) that all families spontaneously started to connect with their own experience and reflect on what they had done before. The third moment was when families documented their responses on the worksheet. This formal requirement to write down answers encouraged families to integrate all their observations and speculations derived from a range of activities, such as experimenting with data cards, identifying top trackers in the tracker dashboard, observing the total number of trackers on their phone, and speculating about potential data collection based on their activities. And finally, the fourth moment was when families exercised controls in the tracker app for each of their favourite mobile apps. This transition from passive observation to active control prompted an array of discussions and critical reflections.

These four triggering moments correspond closely to the four stages of Kolb’s learning cycle [64]: Concrete Experience, where learners encounter a new experience (e.g., seeing trackers’ existence for the first time); Reflective Observation, where learners reflect, question, and discuss after an experience (e.g., connecting examples to real-life data experiences); Abstract Conceptualisation, where learners classify concepts and draw conclusions from events (e.g., families systematically jotting down their observations and refining their thoughts); and Active Experimentation, where learners test out new ideas and lessons gathered from the experience (e.g., families experimenting with controlling trackers based on their reasoning).

While we do not position KOALA Hero as an education tool, our experience has shown that the use of what we call triggering moments can be valuable in promoting moments of reflection, conceptualisation and experimentation. These triggering moments can take various forms: they can be embedded as “pause” moments within systems to motivate user engagement and raise awareness; set up as “game rules” that nudge users to explore existing information; introduced as “consolidation phases” for users to conceptualise their experiences; or provided as opportunities for users to test their hypotheses. These ideas also resonate with several recent design principles for children, such as the Four Lenses of Play by Bekker et al. [21] and Project Zero’s Agency by Design [1], which encourage the prioritisation of playful elements in digital designs to enhance children’s engagement and exploration of new knowledge. We encourage future research to explore further the nuances of these design choices for offering effective engagement for families.

6.2 Implications for Family Joint Engagement

One of our research questions concerned how KOALA Hero could better support families in making informed, joint decisions. Our aim was to explore whether we could provide an alternative to existing parent-led approaches, which can undermine children’s autonomy, hinder the development of their risk-coping abilities, and potentially damage trust and communication within families. Hence, we integrated design elements that fostered families’ situated reflection and enriched their experience with playful physical components and activity sheets.

Our observations indicate that KOALA Hero enhanced family engagement, with several instances of active negotiation and collaboration for collective decision-making, suggesting the toolkit’s positive influence on family joint engagement. Previously, it was commonly assumed that parents, due to their greater expertise, would take the lead in guiding their children’s navigation of the digital space [87]. However, our observations indicate that, particularly in the context of datafication risks, this assumption may not always apply. We noted a shift in expertise dynamics, with parents not always leading and children sometimes initiating discussions. While we observed some tensions in families making joint decisions (for instance, when parents noticed the large number of trackers related to an app and wanted to disable it completely, despite it being the child’s favourite and most-used app), the friction tended to be minor, and families generally reached a resolution. This was mainly because our toolkit offered a more moderate approach, allowing families to turn off only certain types of trackers. Moreover, the toolkit provided a joint learning experience, enabling family members to explore and collaborate with each other during the process. In general, all families in our study showed a trend towards more balanced engagement over time, with initially dominant members, such as parents or older siblings, increasingly valuing others’ inputs, and quieter members becoming more vocal.

In our study, we noted instances that could be described as ‘bonding moments’, which seemed to encourage joint decision-making among family members. Some of these occurred during tasks in which children might have superior expertise or greater personal interest, such as clarifying the function of their apps or navigating the data destination view of trackers. Another set of bonding moments occurred when family members encountered points of disagreement and had to negotiate with each other. For example, when families used the Data Cards to discuss datafication risks, they often brought up real-life examples and put forward critical arguments. Fisher [38] and Wegerif [102] proposed three distinct types of conversation when guiding children’s involvement in collaborative activities: disputational talk (i.e., disagreements and counter-assertions), cumulative talk (i.e., speakers build positively but uncritically on what the other has said), and exploratory talk (i.e., partners critically yet constructively engage with each other’s ideas, providing justifications and alternatives). All three types of talk were observed among our family participants. In particular, we noticed that when families engaged with our toolkit, such as using Data Cards and relating app information to real-life scenarios, there appeared to be a shift from initial disputational talk (simply expressing disagreement without deep engagement or reasoning) [102] towards more exploratory discussions, in which statements and suggestions were not just exchanged but also critically examined, with challenges justified and alternative ideas presented, aligning with the principles of exploratory talk as described by Mercer and Barnes [20, 69].

Though exploratory talk is often seen as an ideal outcome for constructive interactions between parents and children [20, 103], it does not occur naturally simply because they are using the same device [27, 113]. Factors such as parental dominance [52, 99] and challenges in recognising shared goals [95, 108] can impede this process. Guidelines such as Playful by Design [56] suggest considering age-appropriateness and open-ended play to promote a balanced and explorative learning environment in families. Our observations of ‘bonding moments’ offer insights that complement these guidelines, highlighting the potential of designing shared experiences to enhance family joint engagement. However, these are preliminary findings that require further empirical investigation, and we recommend that future designs incorporate features that might support these dynamics for further study.

6.3 Implications for Legislative and Policy Development

Our study unveiled grave concerns among families regarding the impact of datafication on children. Families expressed their apprehensions about the influence of data inference and collective profiling on society at large. They voiced discomfort about the apparent data monopoly held by a select few tech giants. Children were particularly anxious that every aspect of their lives was being datafied and used in hidden ways to shape their thoughts and actions (“have to have a strong mind” C4, age 11). Parents echoed these concerns and extended them to worries about their children’s future in a heavily data-driven society, with inadequate safeguards in place (“legislation that actually works” P7). Both parents and children yearned for greater transparency regarding how major platforms (e.g., Google, Meta, Amazon) utilise their data. At the same time, we observed how families transitioned from initially knowing little about the implications of datafication risks to developing heightened cognitive awareness of these issues and a strong desire for change. While our toolkit shows promising impact on users’ perceptions, addressing all these complex issues likely exceeds the scope of a single toolkit.

The robust demand from families necessitates a fundamental reassessment of current legislative and policy development regarding children. However, current regulations across the globe mainly address traditional online safety issues for children, such as harmful content and stranger dangers [76, 98]. Examples include the US’s Kids Online Safety Act [10], mandating account setting safeguards for minors, the UK’s Online Safety Bill [76], focusing on illegal content and age verification, and China’s Cyberspace Protection Regulations [6], enforcing protection against inappropriate content with strict penalties. While there is an increasing number of legislative efforts globally specifically targeting children’s online safety, subtler risks associated with datafication largely remain under the radar. Recent initiatives such as ‘Child Rights by Design’ by the Digital Futures Commission [4], and ‘Responsible AI for Social Empowerment and Education’ at MIT [13] provided useful starting points by addressing child-centric AI technologies, though with less emphasis on data-centric perspectives. Meanwhile, projects such as ‘Agile-EDU’ [36] explore data in educational systems, laying a solid foundation for understanding data in everyday contexts. Our research highlights a clear demand from children and their families for enhanced ability to access and control, particularly from a data perspective, and an immediate impact on their perception of datafication risks through raised awareness. This calls for a comprehensive revision of the current data governance framework related to technologies accessible to children and more targeted legislation addressing families’ specific datafication concerns. 
We advocate for sustained exploration into the creation of comprehensive ethical data governance systems, replacing the current data-driven approach to innovation and re-balancing the power between users and platforms, allowing families to assert their data rights and setting the groundwork for a more ethical data landscape in our society.


7 LIMITATIONS AND FUTURE WORK

The first limitation of this study is its reliance on self-reporting in the pre-study and exit surveys. Participants may have had varying baselines: some might lean towards extreme responses such as “strongly agree”, while others might gravitate towards neutral options such as “somewhat concerned”. To address this potential bias, we refrained from using survey outcomes as quantitative measures. We also attempted to craft the survey questions and choices as neutrally as possible to avoid leading participants. For instance, we used statements like “I know how data can be used to learn personal aspects about me (e.g., whether I’m a boy or girl, the type of school I go to)” instead of more suggestive phrasing such as “I know how data collection can be invasive and violate our personal privacy”, to prevent influencing participants’ responses. Participants were also encouraged to verbalise their choices while completing surveys, providing deeper insight into their thoughts and motivations. While we did not collect information about families’ parenting styles, it is possible that parents’ pre-existing parenting approaches may have influenced how family discussions were conducted at the beginning of the study. Nevertheless, with the introduction of the KOALA Hero Toolkit, all families were observed to increasingly engage in participative dialogues, aligning with our objective of promoting family joint engagement on datafication issues. A thorough mapping of families’ parenting styles, however, lies beyond the scope of this study. Future research may encompass longitudinal studies to observe any notable behavioural changes, with a more detailed examination of family dynamics.

Future work also aims to explore how we can design approaches that enable families to conduct such activities in more accessible ways, and to investigate the potential long-term behavioural impact of such a toolkit on families if deployed in the wild. We intend to run field tests with families in the form of diary studies, with the goal of deepening our understanding of the day-to-day usage and effects of our toolkit on family dynamics, thereby capturing a real-life picture of how such tools may help shape families’ future digital experiences. Additionally, while our current study structure does not offer concrete measurements that would allow us to claim educational gains, we recognise the great educational potential of our toolkit. Moving forward, we aim to collaborate with schools and educational institutions to develop formal learning programmes, integrating our toolkit into structured educational settings.


8 CONCLUSION

In this paper, we introduced KOALA Hero, a multi-component hybrid toolkit comprising a mobile tracker app, a set of data cards, and a tasksheet supplemented with worksheets, which informs families of the mobile datafication risks around them and encourages enriched discussion of relevant issues. Our goal was to examine whether we could change families’ perceptions of datafication risks and their approaches to discussing these risks. Through user studies with 17 parents and 23 children aged 10–14, we identified greater awareness of datafication risks among families, a progression in their thought processes, and a more collaborative family decision-making approach. We identified critical triggering moments and bonding moments that can nurture family data literacy development and cultivate the collaboration and negotiation needed for joint informed decision-making. Ultimately, we hope this work can advance the current understanding of families’ perceptions and decision-making regarding mobile datafication risks, and inspire future legislative and policy development for more ethical data frameworks.

Footnotes

  1. Corresponding author

  2. URL: redacted (The full documentation of the toolkit will be made available upon camera-ready, to protect the authors’ anonymity. KOALA Hero is not yet available in any public app store.)

Supplemental Material

Video Presentation (mp4, 129.6 MB)

References

  1. 2020. Agency By Design. https://pz.harvard.edu/projects/agency-by-design
  2. 2023. Amazon Privacy Notice. https://www.amazon.com/gp/help/customer/
  3. 2023. App privacy details on the App Store. https://developer.apple.com/app-store/app-privacy-details/
  4. 2023. Child Rights by Design. https://childrightsbydesign.digitalfuturescommission.org.uk/
  5. 2023. Children’s Online Privacy Protection Rule. https://www.ftc.gov/legal-library/browse/rules/childrens-online-privacy-protection-rule-coppa
  6. 2023. China releases regulations to protect minors in cyberspace. https://english.www.gov.cn/policies/latestreleases/202310/24/content_WS6537a5d2c6d0868f4e8e095e.html
  7. 2023. Complete guide to GDPR compliance. https://gdpr.eu/
  8. 2023. Google Privacy & Terms. https://policies.google.com/privacy?hl=en-US
  9. 2023. Introduction to the Children’s code. https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/introduction-to-the-childrens-code
  10. 2023. Kids Online Safety Act: Legislation to impose responsibility on online platforms and equip children and parents with tools. https://www.commonsensemedia.org/sites/default/files/featured-content/files/kosa-one-pager.pdf
  11. 2023. Meta Privacy Policy. https://en-gb.facebook.com/privacy/policy/
  12. 2023. Provide information for Google Play’s Data safety section. https://support.google.com/googleplay/android-developer/answer/10787469?hl=en
  13. 2023. Responsible AI for Social Empowerment and Education. https://raise.mit.edu/
  14. Amelia Acker and Leanne Bowler. 2017. What is your Data Silhouette? Raising teen awareness of their data traces in social media. In Proceedings of the 8th International Conference on Social Media & Society. 1–5.
  15. A. Acquisti, L. Brandimarte, and G. Loewenstein. 2015. Privacy and Human Behavior in the Age of Information. Science 347, 6221 (Jan. 2015), 509–514. https://doi.org/10.1126/science.aaa1465
  16. Awais Akbar, Simon Caton, and Ralf Bierig. 2022. Personalised Filter Bias with Google and DuckDuckGo: An Exploratory Study. In Irish Conference on Artificial Intelligence and Cognitive Science. Springer, 502–513.
  17. Mamtaj Akter, Amy J Godfrey, Jess Kropczynski, Heather R Lipford, and Pamela J Wisniewski. 2022. From Parental Control to Joint Family Oversight: Can Parents and Teens Manage Mobile Online Safety and Privacy as Equals? Proceedings of the ACM on Human-Computer Interaction 6, CSCW1 (2022), 1–28.
  18. Karla Badillo-Urquiola, Chhaya Chouhan, Stevie Chancellor, Munmun De Choudhury, and Pamela Wisniewski. 2020. Beyond parental control: designing adolescent online safety apps using value sensitive design. Journal of Adolescent Research 35, 1 (2020), 147–175.
  19. Rafael Ballagas, Thérèse E Dugan, Glenda Revelle, Koichi Mori, Maria Sandberg, Janet Go, Emily Reardon, and Mirjana Spasojevic. 2013. Electric agents: fostering sibling joint media engagement through interactive television and augmented reality. In Proceedings of the 2013 Conference on Computer Supported Cooperative Work. 225–236.
  20. Douglas Barnes. 2008. Exploratory talk for learning. Exploring Talk in School (2008), 1–15.
  21. Tilde Bekker, Linda De Valk, and Berry Eggen. 2014. A toolkit for designing playful interactions: The four lenses of play. Journal of Ambient Intelligence and Smart Environments 6, 3 (2014), 263–276.
  22. Reuben Binns, Ulrik Lyngs, Max Van Kleek, Jun Zhao, Timothy Libert, and Nigel Shadbolt. 2018. Third party tracking in the mobile ecosystem. In Proceedings of the 10th ACM Conference on Web Science. 23–31.
  23. Marcel Bokhorst. 2021. NetGuard. https://github.com/M66B/NetGuard.
  24. Leanne Bowler, Amelia Acker, Wei Jeng, and Yu Chi. 2017. “It lives all around us”: Aspects of data literacy in teens’ lives. Proceedings of the Association for Information Science and Technology 54, 1 (2017), 27–35.
  25. Elena Bozzola, Giulia Spina, Rino Agostiniani, Sarah Barni, Rocco Russo, Elena Scarpato, Antonio Di Mauro, Antonella Vita Di Stefano, Cinthia Caruso, Giovanni Corsello, et al. 2022. The use of social media in children and adolescents: Scoping review on the potential risks. International Journal of Environmental Research and Public Health 19, 16 (2022), 9960.
  26. Virginia Braun and Victoria Clarke. 2006. Using thematic analysis in psychology. Qualitative Research in Psychology 3, 2 (2006), 77–101.
  27. Rita Brito, Rita Francisco, Patrícia Dias, and Stephane Chaudron. 2017. Family dynamics in digital homes: The role played by parental mediation in young children’s digital practices around 14 European countries. Contemporary Family Therapy 39, 4 (2017), 271–280.
  28. Moritz Büchi, Eduard Fosch-Villaronga, Christoph Lutz, Aurelia Tamò-Larrieux, and Shruthi Velidi. 2021. Making sense of algorithmic profiling: user perceptions on Facebook. Information, Communication & Society (2021), 1–17.
  29. Bengisu Cagiltay, Rabia Ibtasar, Joseph E Michaelis, Sarah Sebo, and Bilge Mutlu. 2023. From Child-Centered to Family-Centered Interaction Design. In Proceedings of the 22nd Annual ACM Interaction Design and Children Conference (Chicago, IL, USA) (IDC ’23). Association for Computing Machinery, New York, NY, USA, 789–791. https://doi.org/10.1145/3585088.3589930
  30. Robin Carlsson, Sampsa Rauti, Samuli Laato, Timi Heino, and Ville Leppänen. 2023. Privacy in Popular Children’s Mobile Applications: A Network Traffic Analysis. In 2023 46th MIPRO ICT and Electronics Convention (MIPRO). IEEE, 1213–1218.
  31. Saksham Chitkara, Nishad Gothoskar, Suhas Harish, Jason I Hong, and Yuvraj Agarwal. 2017. Does this app really need my location? Context-aware privacy management for smartphones. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 1, 3 (2017), 1–22.
  32. Savino Dambra, Iskander Sanchez-Rola, Leyla Bilge, and Davide Balzarotti. 2022. When Sally Met Trackers: Web Tracking From the Users’ Perspective. In 31st USENIX Security Symposium (USENIX Security 22). 2189–2206.
  33. Sayamindu Dasgupta and Benjamin Mako Hill. 2021. Designing for critical algorithmic literacies. Algorithmic Rights and Protections for Children (2021).
  34. Disconnect.me and Mozilla. [n. d.]. Firefox Blocklist. https://github.com/mozilla-services/shavar-prod-lists.
  35. Amy Laura Dombro, Judy R Jablon, and Charlotte Stetson. 2011. Powerful interactions: How to connect with children to extend their learning. National Association for the Education of Young Children, Washington, DC.
  36. Ola Erstad, Øystein Gilje, Greta Björk Gudmundsdottir, Rebekka Baunbæk Wagstaffe, Kristiina Kumpulainen, Olga Viberg, Ben Williamson, Jo Tondeur, and Sarah Howard. 2023. Datafication in and of Education – a literature review. (2023).
  37. Exodus. [n. d.]. Exodus Privacy. https://exodus-privacy.eu.org/en/.
  38. Eunice Fisher. 1993. Characteristics of children’s talk at the computer and its relationship to the computer software. Language and Education 7, 2 (1993), 97–114.
  39. Arup Kumar Ghosh, Karla Badillo-Urquiola, Shion Guha, Joseph J LaViola Jr, and Pamela J Wisniewski. 2018. Safety vs. surveillance: what children have to say about mobile apps for parental control. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–14.
  40. Arup Kumar Ghosh, Karla Badillo-Urquiola, Mary Beth Rosson, Heng Xu, John M Carroll, and Pamela J Wisniewski. 2018. A matter of control or safety? Examining parental use of technical monitoring apps on teens’ mobile devices. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–14.
  41. Arup Kumar Ghosh, Charles E Hughes, and Pamela J Wisniewski. 2020. Circle of trust: a new approach to mobile online safety for families. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1–14.
  42. Wendy S Grolnick and Eva M Pomerantz. 2009. Issues and challenges in studying parental control: Toward a new conceptualization. Child Development Perspectives 3, 3 (2009), 165–170.
  43. Yasmeen Hashish, Andrea Bunt, and James E Young. 2014. Involving children in content control: a collaborative and education-oriented content filtering approach. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 1797–1806.
  44. Elena Yi-Ching Ho and Rys Farthing. 2021. How Facebook still targets surveillance ads to teens.
  45. Julie Jargon. 2019. How 13 became the internet’s age of adulthood. The Wall Street Journal (2019).
  46. Yasmin Kafai, Chris Proctor, and Debora Lui. 2020. From theory bias to theory dialogue: embracing cognitive, situated, and critical framings of computational thinking in K-12 CS education. ACM Inroads 11, 1 (2020), 44–53.
  47. Minsam Ko, Seungwoo Choi, Subin Yang, Joonwon Lee, and Uichin Lee. 2015. FamiLync: facilitating participatory parental mediation of adolescents’ smartphone use. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing. 867–878.
  48. Konrad Kollnig, Reuben Binns, Max Van Kleek, Ulrik Lyngs, Jun Zhao, Claudine Tinsman, and Nigel Shadbolt. 2021. Before and after GDPR: tracking in mobile apps. arXiv preprint arXiv:2112.11117 (2021).
  49. Konrad Kollnig and Nigel Shadbolt. 2022. TrackerControl: Transparency and Choice around App Tracking. Journal of Open Source Software 7, 75 (2022), 4270. https://doi.org/10.21105/joss.04270
  50. Brian Krupp, Joshua Hadden, and Malik Matthews. 2021. An analysis of web tracking domains in mobile applications. In Proceedings of the 13th ACM Web Science Conference 2021. 291–298.
  51. Priya Kumar, Shalmali Milind Naik, Utkarsha Ramesh Devkar, Marshini Chetty, Tamara L Clegg, and Jessica Vitak. 2017. ‘No Telling Passcodes Out Because They’re Private’: Understanding Children’s Mental Models of Privacy and Security Online. Proceedings of the ACM on Human-Computer Interaction 1, CSCW (2017), 1–21.
  52. Kitti Kutrovátz. 2022. Parental mediation of adolescents’ technology use. Unequal parenting practices. Intersections. East European Journal of Society and Politics 8, 3 (2022), 99–117.
  53. Yan Lau. 2020. A brief primer on the economics of targeted advertising. Economic Issues (2020).
  54. Sonia Livingstone, Alicia Blum-Ross, and Dongmiao Zhang. 2018. What do parents think, and do, about their children’s online privacy? (2018).
  55. Sonia Livingstone and Jasmina Byrne. 2018. Parenting in the digital age: The challenges of parental responsibility in comparative perspective. (2018).
  56. S. Livingstone and K. Pothong. 2021. Playful by Design: A Vision of Free Play in a Digital World. Technical Report. Digital Futures Commission.
  57. Sonia Livingstone, Mariya Stoilova, and Rishita Nandagiri. 2019. Children’s data and privacy online: growing up in a digital age: an evidence review. (2019).
  58. Sonia Livingstone, Mariya Stoilova, and Rishita Nandagiri. 2019. Children’s data and privacy online. Technology 58, 2 (2019), 157–65.
  59. Paweena Manotipya and Kambiz Ghazinour. 2020. Children’s Online Privacy from Parents’ Perspective. Procedia Computer Science 177 (2020), 178–185.
  60. Jesse J Martinez, Travis W Windleharth, Qisheng Li, Arpita Bhattacharya, Katy E Pearce, Jason Yip, and Jin Ha Lee. 2022. Joint Media Engagement in Families Playing Animal Crossing: New Horizons during the COVID-19 Pandemic. Proceedings of the ACM on Human-Computer Interaction 6, CSCW1 (2022), 1–22.
  61. Giovanna Mascheroni. 2020. Datafied childhoods: Contextualising datafication in everyday life. Current Sociology 68, 6 (2020), 798–813.
  62. Jonathan R Mayer and John C Mitchell. 2012. Third-party web tracking: Policy and technology. In 2012 IEEE Symposium on Security and Privacy. IEEE, 413–427.
  63. Cecile Hoareau McGrath, Benoit Guerin, Emma Harte, Michael Frearson, and Catriona Manville. 2015. Learning gain in higher education. Santa Monica, CA: RAND Corporation (2015).
  64. Saul McLeod. 2017. Kolb’s learning styles and experiential learning cycle. Simply Psychology 5 (2017).
  65. Julieta Medawar, Ángel Javier Tabullo, and Lucas Gustavo Gago-Galvagno. 2023. Early language outcomes in Argentinean toddlers: Associations with home literacy, screen exposure and joint media engagement. British Journal of Developmental Psychology 41, 1 (2023), 13–30.
  66. Common Sense Media. 2022. The Common Sense Census: Media Use by Tweens and Teens. https://www.commonsensemedia.org/sites/default/files/research/report/8-18-census-integrated-report-final-web_0.pdf
  67. Common Sense Media. 2022. Two Years Into the Pandemic, Media Use Has Increased 17% Among Tweens and Teens. https://www.commonsensemedia.org/press-releases/two-years-into-the-pandemic-media-use-has-increased-17-among-tweens-and-teens
  68. Ulises A Mejias and Nick Couldry. 2019. Datafication. Internet Policy Review 8, 4 (2019).
  69. Neil Mercer and Rupert Wegerif. 2002. Is ‘exploratory talk’ productive talk? Routledge.
  70. Georg Merzdovnik, Markus Huber, Damjan Buhov, Nick Nikiforakis, Sebastian Neuner, Martin Schmiedecker, and Edgar Weippl. 2017. Block me if you can: A large-scale study of tracker-blocking tools. In 2017 IEEE European Symposium on Security and Privacy (EuroS&P). IEEE, 319–333.
  71. Ofcom. 2021. Children and parents: media use and attitudes report 2020/21. https://www.ofcom.org.uk/__data/assets/pdf_file/0025/217825/children-and-parents-media-use-and-attitudes-report-2020-21.pdf
  72. Ofcom. 2023. Children and parents: media use and attitudes report 2023. https://www.ofcom.org.uk/__data/assets/pdf_file/0027/255852/childrens-media-use-and-attitudes-report-2023.pdf
  73. Gwenn Schurgin O’Keeffe, Kathleen Clarke-Pearson, and Council on Communications and Media. 2011. The impact of social media on children, adolescents, and families. Pediatrics 127, 4 (2011), 800–804.
  74. Ehimare Okoyomon, Nikita Samarin, Primal Wijesekera, Amit Elazari, Narseo Vallina-Rodriguez, Irwin Reyes, Álvaro Feal, and Serge Egelman. 2019. On The Ridiculousness of Notice and Consent: Contradictions in App Privacy Policies. In The Workshop on Technology and Consumer Protection (ConPro ’19).
  75. Luci Pangrazio and Neil Selwyn. 2017. ’My Data, My Bad...’ Young People’s Personal Data Understandings and (Counter) Practices. In Proceedings of the 8th International Conference on Social Media & Society. 1–5.
  76. UK Parliament. 2022. Online Safety Bill. (2022).
  77. Laura R Pina, Carmen Gonzalez, Carolina Nieto, Wendy Roldan, Edgar Onofre, and Jason C Yip. 2018. How Latino children in the US engage in collaborative online information problem solving with their families. Proceedings of the ACM on Human-Computer Interaction 2, CSCW (2018), 1–26.
  78. Lingzhi Qiu, Zixiong Zhang, Ziyi Shen, and Guozi Sun. 2015. AppTrace: Dynamic trace on Android devices. In 2015 IEEE International Conference on Communications (ICC). IEEE, 7145–7150.
  79. Jenny Radesky, Yolanda Linda Reid Chassiakos, Nusheen Ameenuddin, Dipesh Navsaria, et al. 2020. Digital advertising to children. Pediatrics 146, 1 (2020).
  80. Irwin Reyes, Primal Wijesekera, Joel Reardon, Amit Elazari Bar On, Abbas Razaghpanah, Narseo Vallina-Rodriguez, Serge Egelman, et al. 2018. “Won’t somebody think of the children?” Examining COPPA compliance at scale. In The 18th Privacy Enhancing Technologies Symposium (PETS 2018).
  81. Deborah Richards, Patrina HY Caldwell, and Henry Go. 2015. Impact of social media on the health of children and young people. Journal of Paediatrics and Child Health 51, 12 (2015), 1152–1157.
  82. Victoria Rideout. 2014. Learning at home: Families’ educational media use in America. Joan Ganz Cooney Center at Sesame Workshop. ERIC.
  83. Bernhard Rieder. 2017. Scrutinizing an algorithmic technique: The Bayes classifier as interested reading of reality. Information, Communication & Society 20, 1 (2017), 100–117.
  84. Marijn Sax. 2016. Big data: Finders keepers, losers weepers? Ethics and Information Technology 18 (2016), 25–31.
  85. Kathleen Scalise, Michelle Douskey, and Angelica Stacy. 2018. Measuring learning gains and examining implications for student success in STEM. Higher Education Pedagogies 3, 1 (2018), 183–195.
  86. Diane J Schiano and Christine Burg. 2017. Parental controls: Oxymoron and design opportunity. In HCI International 2017 – Posters’ Extended Abstracts: 19th International Conference, HCI International 2017, Vancouver, BC, Canada, July 9–14, 2017, Proceedings, Part II 19. Springer, 645–652.
  87. Erica D Shifflet-Chila, Rena D Harold, Victoria A Fitton, and Brian K Ahmedani. 2016. Adolescent and family development: Autonomy and identity in the digital age. Children and Youth Services Review 70 (2016), 364–368.
  88. Frank M. Shipman and Catherine C. Marshall. 2020. Ownership, Privacy, and Control in the Wake of Cambridge Analytica: The Relationship between Attitudes and Awareness. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. ACM, Honolulu, HI, USA, 1–12. https://doi.org/10.1145/3313831.3376662
  89. Irina Shklovski, Scott D. Mainwaring, Halla Hrund Skúladóttir, and Höskuldur Borgthorsson. 2014. Leakiness and Creepiness in App Space: Perceptions of Privacy and Mobile App Use. In Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems - CHI ’14. ACM Press, Toronto, Ontario, Canada, 2347–2356. https://doi.org/10.1145/2556288.2557421Google ScholarGoogle ScholarDigital LibraryDigital Library
  90. Anastasia Shuba and Athina Markopoulou. 2020. Nomoats: Towards automatic detection of mobile tracking. Proceedings on Privacy Enhancing Technologies 2020, 2 (2020).Google ScholarGoogle ScholarCross RefCross Ref
  91. Nili Steinfeld. 2021. Parental mediation of adolescent Internet use: Combining strategies to promote awareness, autonomy and self-regulation in preparing youth for life on the web. Education and Information Technologies 26, 2 (2021), 1897–1920.Google ScholarGoogle ScholarDigital LibraryDigital Library
  92. Annette Sundqvist and Mikael Heimann. 2021. Digital media content and co-viewing amongst Swedish 4-to 6-year-olds during COVID-19 pandemic. Acta Paediatrica (Oslo, Norway: 1992) 110, 12 (2021), 3329.Google ScholarGoogle Scholar
  93. L Takeuchi and R Stevens. 2011. The new co-viewing: Designing for learning through joint media engagement. New York, NY: Joan Ganz Cooney Center. In Sesame Workshop.Google ScholarGoogle Scholar
  94. Marika Tiggemann and Ksenia Zinoviev. 2019. The effect of# enhancement-free Instagram images and hashtags on women’s body image. Body Image 31 (2019), 131–138.Google ScholarGoogle ScholarCross RefCross Ref
  95. Michael Tomasello and Katharina Hamann. 2012. The 37th sir frederick bartlett lecture: Collaboration in young children. Quarterly Journal of Experimental Psychology 65, 1 (2012), 1–12.Google ScholarGoogle ScholarCross RefCross Ref
  96. Max Van Kleek, Reuben Binns, Jun Zhao, Adam Slack, Sauyon Lee, Dean Ottewell, and Nigel Shadbolt. 2018. X-ray refine: Supporting the exploration and refinement of information exposure resulting from smartphone apps. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–13.Google ScholarGoogle ScholarDigital LibraryDigital Library
  97. Max Van Kleek, Ilaria Liccardi, Reuben Binns, Jun Zhao, Daniel J Weitzner, and Nigel Shadbolt. 2017. Better the devil you know: Exposing the data sharing practices of smartphone apps. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. 5208–5220.Google ScholarGoogle ScholarDigital LibraryDigital Library
  98. Michael Veale and Frederik Zuiderveen Borgesius. 2021. Demystifying the Draft EU Artificial Intelligence Act—Analysing the good, the bad, and the unclear elements of the proposed approach. Computer Law Review International 22, 4 (2021), 97–112.Google ScholarGoogle ScholarCross RefCross Ref
  99. Ge Wang, Jun Zhao, Max Van Kleek, and Nigel Shadbolt. 2021. Protection or punishment? relating the design space of parental control apps and perceptions about them to support parenting for online safety. Proceedings of the ACM on Human-Computer Interaction 5, CSCW2 (2021), 1–26.Google ScholarGoogle ScholarDigital LibraryDigital Library
  100. Ge Wang, Jun Zhao, Max Van Kleek, and Nigel Shadbolt. 2022. ’Don’t make assumptions about me!’: Understanding Children’s Perception of Datafication Online. Proceedings of the ACM on Human-Computer Interaction 6, CSCW2 (2022), 1–24.Google ScholarGoogle Scholar
  101. Ge Wang, Jun Zhao, Max Van Kleek, and Nigel Shadbolt. 2023. ‘Treat Me as Your Friend, Not a Number in Your Database’: Co-Designing with Children to Cope with Datafication Online. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (Hamburg, Germany) (CHI ’23). Association for Computing Machinery, New York, NY, USA, Article 95, 21 pages. https://doi.org/10.1145/3544548.3580933Google ScholarGoogle ScholarDigital LibraryDigital Library
  102. Rupert Wegerif and Neil Mercer. 1996. Computers and reasoning through talk in the classroom. Language and education 10, 1 (1996), 47–64.Google ScholarGoogle Scholar
  103. Aiyana K Willard, Justin TA Busch, Katherine A Cullum, Susan M Letourneau, David M Sobel, Maureen Callanan, and Cristine H Legare. 2019. Explain this, explore that: A study of parent–child interaction in a children’s museum. Child Development 90, 5 (2019), e598–e617.Google ScholarGoogle ScholarCross RefCross Ref
  104. Dylan Williams, Alexandra McIntosh, and Rys Farthing. 2021. Profiling Children for Advertising: Facebook’s Monetisation of Young People’s Personal Data. Reset Australia. https:/au. reset. tech/uploads/resettechaustralia_profiling-children-for-advertising-1. pdf (accessed April 2021) (2021).Google ScholarGoogle Scholar
  105. Pamela Wisniewski, Arup Kumar Ghosh, Heng Xu, Mary Beth Rosson, and John M Carroll. 2017. Parental control vs. teen self-regulation: Is there a middle ground for mobile online safety?. In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing. 51–69.Google ScholarGoogle ScholarDigital LibraryDigital Library
  106. Pamela Wisniewski, Haiyan Jia, Heng Xu, Mary Beth Rosson, and John M Carroll. 2015. " Preventative" vs." Reactive" How Parental Mediation Influences Teens’ Social Media Privacy Behaviors. In Proceedings of the 18th ACM conference on computer supported cooperative work & social computing. 302–316.Google ScholarGoogle Scholar
  107. Julia Woodward, Feben Alemu, Natalia E. López Adames, Lisa Anthony, Jason C. Yip, and Jaime Ruiz. 2022. “It Would Be Cool to Get Stampeded by Dinosaurs”: Analyzing Children’s Conceptual Model of AR Headsets Through Co-Design. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (New Orleans, LA, USA) (CHI ’22). Association for Computing Machinery, New York, NY, USA, Article 152, 13 pages. https://doi.org/10.1145/3491102.3501979Google ScholarGoogle ScholarDigital LibraryDigital Library
  108. Julia Woodward, Shaghayegh Esmaeili, Ayushi Jain, John Bell, Jaime Ruiz, and Lisa Anthony. 2018. Investigating Separation of Territories and Activity Roles in Children’s Collaboration around Tabletops. Proc. ACM Hum.-Comput. Interact. 2, CSCW, Article 185 (nov 2018), 21 pages. https://doi.org/10.1145/3274454Google ScholarGoogle ScholarDigital LibraryDigital Library
  109. Ying Xu, Kunlei He, Valery Vigil, Santiago Ojeda-Ramirez, Xuechen Liu, Julian Levine, Kelsyann Cervera, and Mark Warschauer. 2023. “Rosita Reads With My Family”: Developing A Bilingual Conversational Agent to Support Parent-Child Shared Reading. In Proceedings of the 22nd Annual ACM Interaction Design and Children Conference (Chicago, IL, USA) (IDC ’23). Association for Computing Machinery, New York, NY, USA, 160–172. https://doi.org/10.1145/3585088.3589354Google ScholarGoogle ScholarDigital LibraryDigital Library
  110. Kate Yen, Yeqi Chen, Yi Cheng, Sijin Chen, Ying-Yu Chen, Yiran Ni, and Alexis Hiniker. 2018. Joint media engagement between parents and preschoolers in the US, China, and Taiwan. Proceedings of the ACM on human-computer interaction 2, CSCW (2018), 1–19.Google ScholarGoogle Scholar
  111. Junnan Yu, Sari Widman, and Ricarose Roque. 2023. Family Negotiation in Joint Media Engagement with Creative Computing. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 1–15.Google ScholarGoogle ScholarDigital LibraryDigital Library
  112. Bieke Zaman and Marije Nouwen. 2016. Parental controls: advice for parents, researchers and industry. EU Kids Online (2016), 1–9.Google ScholarGoogle Scholar
  113. Bieke Zaman, Marije Nouwen, Jeroen Vanattenhoven, Evelien De Ferrerre, and Jan Van Looy. 2016. A qualitative inquiry into the contextualized parental mediation practices of young children’s digital media use at home. Journal of Broadcasting & Electronic Media 60, 1 (2016), 1–22.Google ScholarGoogle ScholarCross RefCross Ref
  114. Zheng Zhang, Ying Xu, Yanhao Wang, Bingsheng Yao, Daniel Ritchie, Tongshuang Wu, Mo Yu, Dakuo Wang, and Toby Li. 2022. Storybuddy: A human-ai collaborative agent for parent-child interactive storytelling with flexible parent involvement. In ACM CHI Conference on Human Factors in Computing Systems.Google ScholarGoogle ScholarDigital LibraryDigital Library
  115. Jun Zhao, Ge Wang, Carys Dally, Petr Slovak, Julian Edbrooke-Childs, Max Van Kleek, and Nigel Shadbolt. 2019. I make up a silly name’ Understanding Children’s Perception of Privacy Risks Online. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. 1–13.Google ScholarGoogle ScholarDigital LibraryDigital Library
  116. Shoshana Zuboff. 2019. The age of surveillance capitalism: The fight for a human future at the new frontier of power: Barack Obama’s books of 2019. Profile books.Google ScholarGoogle Scholar

Published in

  CHI '24: Proceedings of the CHI Conference on Human Factors in Computing Systems
  May 2024, 18961 pages
  ISBN: 9798400703300
  DOI: 10.1145/3613904

  Copyright © 2024 ACM

  Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

  Publisher: Association for Computing Machinery, New York, NY, United States

  Published: 11 May 2024
