VentTok: Exploring the mental health narrative on TikTok.

TikTok is a hugely popular social media platform that has become an important space for individuals, especially youth, to express their attitudes and feelings toward issues. This study seeks to understand the narrative surrounding mental health and illness on TikTok, including the proportion of stigmatizing or trivializing content. A content analysis was performed on a sample of 400 TikTok posts collected via #mentalhealth and #mentalillness over a 15-day period. The posts were coded for type of content and emotional expression, as well as subtypes of stigmatization and trivialization. Linguistic analysis was performed on the 10 most "liked" comments from each post. Of the 400 posts, the majority described a personal experience of mental ill-health (n = 297, 74.25%), 29.25% (n = 117) were supportive of those with mental ill-health, and 34.50% (n = 138) intended to raise awareness. However, the most prominent emotional expression was one of sadness, loneliness, or despair (n = 164, 41.00%). A portion of posts included content describing a lived experience of stigma (n = 42, 10.50%) or trivialization (n = 25, 6.25%), while 11.50% (n = 46) and 12.50% (n = 50) of posts contained stigmatizing and trivializing content, respectively. These results suggest TikTok is primarily a space used to share feelings and experiences surrounding mental health and illness. Yet, the potential impacts on those living with mental ill-health remain unclear. Future research should therefore seek to deepen our understanding of the ways in which social media narratives impact the lived experience of mental ill-health.

The use of social media has become near ubiquitous in the modern digital age, with over 4.95 billion global users (Statista, 2024c). "Social media" refers to the Web 2.0 online social networking applications which allow individuals to communicate and share information (Kanchan & Gaidhane, 2023). The typical social media user is thought to visit an average of 6.7 different platforms each month and spend over 2 hr using social media each day, equating to approximately 15% of their waking lives (Data Reportal, 2024).
While the popularity of various platforms shifts over time, there has been consistent growth in overall social media use since its inception (Data Reportal, 2024).
Specifically, TikTok has seen a meteoric rise in popularity since the COVID-19 pandemic, attracting an audience of over 1.5 billion active users as of January 2024 (Statista, 2024b; Weimann & Masri, 2023). First developed by Beijing-based company ByteDance in 2016, TikTok is an application which allows users to create, view, and interact with short-form looping video content, generally 15-60 s in length (Vaterlaus & Winter, 2021; Weimann & Masri, 2023). Unlike most social media platforms, in which newsfeeds center around the profiles users follow, the TikTok "For You Page" harnesses a complex algorithm to curate content based on analysis of user behaviors, interactions, and apparent interests (Vaterlaus & Winter, 2021). The platform has become an important space for individuals, especially youth, to express their attitudes and feelings toward a vast array of subjects and issues, including those related to mental health and illness (Stahl & Literat, 2023; Wongkoblap et al., 2017).
Mental health conditions are a worldwide public health issue, imposing significant disease burden across the entire life course (World Health Organization, 2022). As over 13% of the global population live with a mental health condition, these conditions are the leading cause of years lived with disability and remain within the top 10 leading causes of global disease burden (Global Burden of Disease 2019 Mental Disorders Collaborators, 2022; World Health Organization, 2022). Early help-seeking for mental health problems has a significant positive impact on health outcomes, financial costs, social functioning, and quality of life (Campion et al., 2012). Yet, there remains a significant treatment gap, with only 22.5% of individuals experiencing mental health problems seeking help (Doll et al., 2021). Although this is in large part due to the lack of appropriate services and funding for mental health care, research across a range of contexts and age groups also consistently highlights stigma as one of the most significant barriers to help-seeking (Aguirre Velasco et al., 2020; Clement et al., 2015; Gulliver et al., 2010; Yap et al., 2013).
Mental illness has been described to "strike with a two-edged sword" (P. W. Corrigan, 2004a, p. 1). On the one hand, those living with a mental illness experience the mental and physical symptoms of their condition; on the other hand, they face the social impacts of stigma and discrimination (Torales et al., 2023). In addition to hindering help-seeking behavior, the stigma of mental illness and resultant discriminatory behaviors mean that those living with mental illness face added challenges in many aspects of life: for example, social exclusion which limits employment, housing, and economic opportunities, as well as significant emotional burden resulting in reduced self-efficacy, self-esteem, and life satisfaction (Park et al., 2019; Verhaeghe et al., 2008; World Health Organization, 2022). Beyond the individual, stigma can impact the families of those living with mental illness, health care professionals, and the wider population, with ramifications for the allocation of funding and resources toward improved mental health care (Ahad et al., 2023).
Although research into the representation of mental illness has to date focused primarily on the concept of stigma, "trivialization" has more recently emerged as an important but understudied phenomenon. Where stigmatization may deter people from those with mental illness, trivialization is a diminishing behavior or attitude which serves to make conditions seem less severe, less complex, or even funny (R. Pavelko & Myrick, 2016). It is important that we gain a deeper understanding of this more recent trend as well as further our understanding of mental illness stigma. These representations have significant implications for those living with mental illness and how others interact with them, along with deleterious consequences for mental health policy and program support (Thornicroft et al., 2016).
Historically, research investigating the representation of mental illness has primarily focused on traditional media sources, such as print or television news (Hildersley et al., 2020; Kenez et al., 2015; McGinty et al., 2016; Stuart, 2006). However, as highlighted by a recent systematic review, there is an emerging field of research investigating portrayals of mental illness on social media (Tudehope et al., 2024). These platforms can provide a data-rich window into the perceptions and attitudes of the public, in particular youths, toward mental illness. People living with mental illness are also known to have even higher rates of social media use than the general population, and it is important that we develop a deeper understanding of the content these potentially vulnerable individuals are exposed to online (Budenz et al., 2019).
Although some research has been done in this field, a systematic review of the literature by Tudehope et al. (2024) revealed that the overwhelming majority of studies have focused on text-based social media platforms such as Twitter and Sina Weibo, with a dearth of studies investigating video-based platforms. There is previous research examining mental health-related content on YouTube, focused on schizophrenia (Athanasopoulou et al., 2016), depression (Devendorf et al., 2020), trichotillomania (Ghate et al., 2022), borderline personality disorder (King & McCashin, 2022), and mental illness more broadly (McLellan et al., 2022; Naslund et al., 2014). However, while both YouTube and TikTok are primarily video-based platforms, they offer vastly different types of content which are consumed by viewers in different ways, with the latter restricted to rapid short-form content. To date, only one published study has analyzed mental health-related content on TikTok, but it was limited to coding only for the presence or absence of stigma, without more nuanced analysis (Basch, Donelle, et al., 2022). Further, only a few studies have explored the trivialization of mental illness on social media, all of which focused on Twitter and did not code for more specific forms of trivialization (Kara & Şenel Kara, 2022; Passerello et al., 2019; Reavley & Pilkington, 2014; Robinson et al., 2019).

Method

Study Design and Data Collection
This cross-sectional study involved a content analysis of TikTok posts related to mental health and illness collected using nonprobability sampling. Collection took place over a 15-day period between January 23 and February 2, 2024. A new TikTok account was set up for research purposes to ensure tailoring of content through algorithms was minimized. Relevant TikTok posts were identified via two hashtags, #mentalhealth and #mentalillness, where the most popular (i.e., the most "liked") posts under each hashtag were collected. To avoid any significant daily variation in content, 20 posts were collected per day for each hashtag. These dates did not coincide with any prominent international awareness days or events for mental health which could temporarily skew social media discourse (Makita et al., 2021; Nelson, 2019).
Posts were excluded based on the following criteria: (a) despite including one of the hashtags, the post appeared to have no relevance to either mental health or mental illness; (b) the spoken language, video text, and/or caption were in a language other than English; and (c) there were issues with accessibility (e.g., some videos had sound removed or had become unavailable). While the majority of TikTok posts are in a video format, in 2022, TikTok introduced a feature allowing users to post photos in a carousel-style format (TikTok, 2022). Given that photo posts may still appear under the hashtags and contain relevant content, these were included for further analysis. Collection of 20 posts per day continued until a total of 200 posts were collected for each hashtag, with a final sample of 400 posts for analysis (refer to Figure 1). The selected sample size is in line with the number of posts analyzed in previous TikTok research employing manual content analysis, which ranges anywhere from 100 to 342 total posts (Basch, Donelle, et al., 2022; Basch, Hillyer, et al., 2022; Harriger et al., 2023; Li et al., 2021; Lookingbill, 2022; Marynak et al., 2022; Yalamanchili et al., 2023).
The Uniform Resource Locator for each of the included videos was copied into an Excel file, along with associated metrics including the date posted, video duration, likes, comments, saves, shares, and cooccurring hashtags, as well as the number of followers and profile likes on the content creator profile. A TikTok comment scraping software (TTCommentExporter) was also utilized to collect comments associated with each of the included posts. The software was run for each post, and the data exported into Microsoft Excel, where comments were sorted in descending order based on the number of "likes." The top 10 most "liked" comments were then extracted for linguistic analysis. In the rare instance where the last few comments contained the same number of likes, the comment that appeared "highest" underneath the social media post was selected.
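The selection rule described above (sort by likes, break ties by on-screen position) can be sketched as follows. This is an illustrative sketch only; the field names are hypothetical and the actual TTCommentExporter export layout may differ.

```python
# Sketch of the top-comment selection rule: most-liked first, with ties
# broken by the position in which the comment appeared under the post.
# Field names ("text", "likes") are hypothetical.

def top_comments(comments, n=10):
    ranked = sorted(
        enumerate(comments),                      # keep original position for tie-breaking
        key=lambda item: (-item[1]["likes"], item[0]),
    )
    return [comment for _, comment in ranked[:n]]

comments = [
    {"text": "so true", "likes": 120},
    {"text": "sending love", "likes": 300},
    {"text": "same here", "likes": 120},         # tied with "so true"; appeared lower
    {"text": "this helped me", "likes": 45},
]
for c in top_comments(comments, n=3):
    print(c["text"], c["likes"])
```

With the tie on 120 likes, "so true" is kept ahead of "same here" because it appeared higher under the post, matching the rule stated above.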

Coding Framework
The coding framework was developed by the research team through a combination of deductive and inductive approaches. An initial coding framework was drafted based on previous literature and was subsequently modified after coding a sample of posts to better fit the TikTok corpus. First, various visual and audio production features were coded, including whether the post included text, a caption, spoken language, people, music, subtitles, or animations. Coders also viewed the profile of each content creator and coded for gender, approximate age, and the creator type/affiliation. If the content creator had not specified their gender in the profile description, coders noted the creator's presenting gender, either feminine, masculine, or other, while respectfully acknowledging that these labels may not reflect actual identities (Lookingbill, 2022).
TikTok posts were then coded for type of mental health/illness content, with categories based upon the work of Reavley and Pilkington (2014), including personal experience, awareness promotion, advice, news, advertising, research findings, antistigma, personal opinion, and supportive content (refer to Table 1 for coding definitions). The overall vibe and emotional expression portrayed in each post was coded, as well as any mention of specific mental health conditions, the hospital system, psychiatric medications, or nonpharmacological treatments and therapies. While much of the research in this field has coded simply for the presence or absence of stigmatizing mental health content, the framework developed by Reavley and Pilkington (2014) includes more nuance, with specific subcodes of stigma, for example, social distance, dangerousness, "snap out of it," and personal weakness. Thus, the categories developed by Reavley and Pilkington (2014) formed the basis of the coding framework for the presence of stigma in posts. However, after coding an initial sample, it was noted that there was a distinct difference between content describing an experience of stigma versus the content itself conveying a stigmatizing message. Therefore, the coding framework was modified to include separate categories for the experience of stigma and for stigmatizing content, each containing the same subcodes (refer to Table 1 for coding definitions). The only exception to this coding was the subcategory "internalized stigma," where an individual applies a negative mental illness stereotype, prejudice, or discrimination toward themselves (P. Corrigan, 2004b). The coders discussed this category and came to the decision that, in the context of a TikTok post, it could only be reasonably coded as a "type of content" and not necessarily an "experience of stigma," so it was coded only for the former.
Similarly, in prior research examining trivializing mental health content on social media, observations often focus on the mere presence or absence of trivialization without offering more specific categorization (Kara & Şenel Kara, 2022). The present study drew upon research by Myrick and Pavelko (2017) and R. L. Pavelko and Myrick (2020) to create coding categories for subtypes of trivialization, including lack of severity, comedic/funny, oversimplification, and perceived benefit. Like the coding framework for stigma, posts were coded for both the experience of trivialization and the content itself being trivializing, for each of the four subcodes.

Coding Protocol and Intercoder Reliability
After the initial coding framework was developed, the research team met to discuss each code and their interpretation of its meaning to ensure common understanding. Two researchers then independently coded a sample of five TikTok posts, after which they met again to discuss whether their application of the framework aligned and made modifications to the framework if necessary. This rigorous process of independent coding of a sample, discussion, and inductive refinement was repeated multiple times until both researchers were satisfied that the coding framework could be applied with sufficient reliability and would provide a meaningful analysis of the TikTok content.
Once the coding framework was finalized, each coder independently coded a 10% sample of the TikTok posts for each hashtag (n = 40), selected via random number generator. Using this sample, Cohen's κ was calculated to assess intercoder reliability (refer to Supplemental Tables 1 and 2). Coders sought at least a moderate level of agreement for each code (κ > 0.60) but referenced the more stringent thresholds of acceptability proposed by McHugh (2012) as opposed to the more lenient cutoff values originally suggested by Cohen (1960). The code "personal opinion" under content type did not achieve an acceptable intercoder reliability score, and coders noted that it was challenging to determine what could be considered personal opinion. Due to this ambiguity and the intercoder reliability score, the code was excluded from the framework moving forward. Once reliability was established, one researcher coded the remaining TikTok posts.
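For readers unfamiliar with the statistic, Cohen's κ compares the observed agreement between two coders against the agreement expected by chance. A minimal illustration (with invented coding data, not the study's actual sample):

```python
# Cohen's kappa for two coders' judgments on the same items:
# kappa = (observed agreement - chance agreement) / (1 - chance agreement)

def cohens_kappa(coder_a, coder_b):
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    labels = set(coder_a) | set(coder_b)
    # Chance agreement: product of each coder's marginal label proportions.
    expected = sum(
        (coder_a.count(lab) / n) * (coder_b.count(lab) / n) for lab in labels
    )
    return (observed - expected) / (1 - expected)

# Hypothetical binary codes (1 = code present, 0 = absent) for ten posts.
a = [1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
b = [1, 1, 0, 0, 0, 1, 0, 1, 1, 1]
print(round(cohens_kappa(a, b), 3))  # 0.583: below the kappa > 0.60 threshold
```

Here the coders agree on 8 of 10 posts (observed = 0.80), but with chance agreement of 0.52 the resulting κ of 0.583 would fall just short of the κ > 0.60 cutoff the study applied.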

Linguistic Analysis of Comments
The top 10 most "liked" comments from each of the 400 collected TikTok posts were analyzed using Linguistic Inquiry and Word Count (LIWC) software (LIWC-22 Version 1.7.1). This program performs linguistic analysis based on the notion that the way people use words and write can be reflective of both their social and psychological states (Pennebaker & Ireland, 2011). The software compares inputted text against an internal dictionary of over 12,000 words, word stems, emoticons, and other verbal constructions and reports word frequencies in psychologically meaningful categories (Boyd et al., 2022). After analyzing the text against the LIWC-22 dictionary, the software calculates the percentage of words in the text that match the dictionary words from each category (Boyd et al., 2022). For example, if 50 words associated with "anger" were identified in an analysis of text containing 1,000 words, this would be presented as an "anger" score of 5.00%. Categories most relevant to the present study's research questions were selected for presentation in the results, including "emotion" (i.e., the percentage of words conveying an emotion) and more specific categories such as "positive tone," "negative tone," "positive emotion," "negative emotion," "anxiety," "anger," and "sadness." The summary variable "Tone" was also included, which incorporates both the positive and negative tone dimensions in an algorithmically calculated summary score, where a higher number reflects a more positive tone and a number <50 indicates a negative tone. The validity and reliability of these linguistic dimensions have previously been reported as acceptable (Boyd et al., 2022).
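The dictionary-based scoring described above can be illustrated with a toy implementation. This is not the LIWC-22 software (whose dictionary has 12,000+ entries and handles word stems); the category word lists here are invented for illustration only.

```python
# Toy version of LIWC-style scoring: for each category, the percentage of
# words in the text that match that category's dictionary.
# Category vocabularies below are illustrative, not the real LIWC dictionary.

import re

CATEGORIES = {
    "anger":   {"hate", "angry", "furious"},
    "sadness": {"sad", "cry", "lonely", "alone"},
}

def liwc_style_scores(text):
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    return {
        cat: 100.0 * sum(w in vocab for w in words) / total
        for cat, vocab in CATEGORIES.items()
    }

scores = liwc_style_scores("I feel so sad and alone, I just cry all day")
print(scores)  # 3 of 11 words match "sadness" -> about 27.27%
```

This mirrors the worked example in the text: 50 "anger" words in a 1,000-word text would yield an anger score of 5.00%.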

Statistical Analyses
Coding was performed in Microsoft Excel and later imported into SPSS (Statistical Product and Service Solutions; Version 29.0.2.0) for analysis. A descriptive analysis of the coding sample was performed, using frequency and percentage for categorical coding data and mean and standard deviation (SD) for continuous data, for example, reach metrics such as likes, shares, and followers. Bivariate analysis using Spearman's correlation was conducted to examine the association between profile metrics and comment linguistic analysis results.
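Spearman's correlation is Pearson's correlation applied to ranks; with no ties it reduces to ρ = 1 − 6Σd²/(n(n² − 1)). A minimal from-scratch sketch (the data are invented; SPSS, like most statistical packages, additionally averages tied ranks):

```python
# Spearman's rank correlation from scratch (assumes no tied values).

def spearman_rho(x, y):
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - (6 * d_squared) / (n * (n ** 2 - 1))

# Hypothetical data: follower counts vs. negative-tone scores for 5 posts.
followers = [1200, 45000, 3100, 880, 9600]
neg_tone  = [2.1, 4.8, 1.7, 2.9, 3.5]
print(spearman_rho(followers, neg_tone))  # rho = 1 - 6*8/120 = 0.6
```

Because the statistic uses ranks rather than raw values, it suits skewed reach metrics such as likes and follower counts, which is presumably why it was chosen over Pearson's correlation here.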

Ethical Considerations
Ethics approval was not required for the study, as this was a content analysis of publicly available social media data and therefore did not involve human participants or intervention. Although this type of study is exempt from formal ethical review, there is still the possibility of harm if individual users learn they were inadvertently included in the research study without their knowledge or consent. Therefore, care was taken to avoid publishing any screenshots of individual TikTok posts or potentially identifying information.
Results

Of the 400 posts analyzed, 163 posts did not refer to a specific mental health condition or issue (40.75%). However, of the posts which did mention a mental health condition(s) (n = 237), the most common were suicide (n = 40, 10.00%), depression (n = 32, 8.00%), and anxiety (n = 31, 7.75%). Figure 2 illustrates that the number of posts mentioning specific mental health conditions is relatively consistent between #mentalhealth and #mentalillness. Of particular note, 10 posts under #mentalillness referred to schizophrenia/psychosis, while only one post mentioned this condition under #mentalhealth.
Coding revealed that a total of 42 posts (10.50%) included content in which someone described a lived experience of stigma related to mental illness (refer to Table 3). Of these posts, the most frequently communicated subtypes of stigma included rude or derogatory attitudes toward those with mental illness (n = 20, 5.00%), a lack of knowledge or inaccurate beliefs about mental illness (n = 17, 4.25%), and suggestions that people living with mental illness "snap out of it" (n = 16, 4.00%). In addition to content which described a lived experience of stigma, the coding also highlighted content which itself was stigmatizing toward mental illness (n = 46, 11.50%). Of these posts, the most frequent subtypes of stigma were portrayal of those with mental illness as dangerous, unsafe, or unpredictable (n = 13, 3.25%); posts demonstrating a lack of knowledge or inaccurate beliefs surrounding mental illness (n = 11, 2.75%); and posts suggesting those with mental illness are "to blame" (n = 9, 2.25%). A total of 29 posts (7.25%) were also found to include internalized or self-stigmatizing content.

Principal Findings
This study performed a content analysis to understand the narrative created by TikTok users surrounding mental health and illness. Coding of TikTok videos revealed that 10.50% of posts contained content communicating a lived experience of stigma, while 11.50% of posts contained stigmatizing content. The only previous study to quantify stigmatizing mental health content on TikTok, which took a binary approach, found a much lower proportion of stigmatizing posts (1.00%; Basch, Donelle, et al., 2022). Although this comparison of results could suggest our definition of stigma was more sensitive, we also theorize that the more detailed stigma coding framework adopted by our study could assist coders in thinking more deeply about the ways in which content could be perceived as stigmatizing. While 11.50% of posts may not seem like a significant proportion, it must be noted that exposure to any negative representations could be harmful to those living with mental illness, potentially impacting self-esteem, medication adherence, recovery, and help-seeking behaviors (Stuart, 2006). Empirical evidence also suggests adults possess an in-built negativity bias, in which negative experiences and stimuli are remembered and attended to more so than positive ones (Vaish et al., 2008). Persons living with mental illness may be exposed to these stigmatizing posts and internalize attitudes in a process of self-stigma or develop increased perceived public stigma (P. Corrigan, 2004b; P. W. Corrigan et al., 2006). Similarly, while posts detailing a lived experience of stigma could help those living with mental illness feel less alone in their own journey, we also theorize that such posts could counterproductively inflate an individual's perceptions of public stigma. However, targeted research into this phenomenon is needed to further explicate and support this initial theory.
To the best of our knowledge, this was also the first study to quantify the proportion of trivializing mental health-related posts on TikTok. Analysis suggested 12.50% of posts were trivializing, and 6.25% alluded to a lived experience of trivialization. While there are no previous TikTok-based studies with which to compare these results, a Twitter study by Robinson et al. (2019) revealed similar rates of trivialization (14.3%). Although much of the previous research in this field has focused on stigma, these results reinforce suggestions that trivialization of mental illness is an important emerging trend, particularly on social media (Kara & Şenel Kara, 2022; R. Pavelko & Myrick, 2016; Robinson et al., 2019). Notably, the most frequently coded subtype of trivialization was the suggestion that mental illness is "not serious" (8.50%), downplaying the severity and seriousness of these conditions. Such representations may lead TikTok users to believe mental health conditions are not serious enough to warrant seeking formal support and can be self-managed, a preference that is already prevalent among young people (Truscott et al., 2023). Additionally, such representations may make users less likely to show empathy toward those living with mental illness and, in the longer term, may undermine the prioritization of public policies and research focused on mental health (R. Pavelko & Myrick, 2016).
In addition to coding for stigma and trivialization, the framework also served to analyze the type and features of TikTok posts. The results suggest that posts which are supportive of those with mental ill-health (29.25% of posts), help to raise awareness of mental health conditions (34.50%), and reduce the stigma of mental ill-health (11.00%) are prevalent on the social media platform. Given both the speed and breadth at which social media content can be disseminated, the impact of such positive content could be significant in shifting societal attitudes toward mental illness (Gough et al., 2017). Awareness helps develop public understanding of mental ill-health, allowing individuals to better recognize signs and symptoms in themselves, family, or friends and ultimately improve mental health outcomes (Latha et al., 2020). Online support for this cause can also serve as a catalyst for further mental health-focused advocacy, funding, and policy beyond the bounds of the online space (Betton et al., 2015).
The results also demonstrated most posts (74.25%) shared a personal experience of mental ill-health. These findings are in stark contrast to those of a Twitter-based study analyzing content from #depression and #schizophrenia, which found only 16% and 10% of tweets shared a personal experience, respectively (Reavley & Pilkington, 2014). While these results could merely highlight the differences in how TikTok is used compared to Twitter, the previous study was also performed 10 years earlier and may demonstrate a significant shift in online behavior. Social media users, particularly the younger generations dominant on platforms like TikTok (Statista, 2024a), may have moved from primarily sharing information online to sharing experiences and feelings. This is also supported by the low proportion of posts mentioning research findings (n = 8, 2.00%), indicating TikTok is not a platform on which users are providing evidence-based information. Mentions of the hospital system (8.50%), psychiatric medications (6.25%), and nonpharmacological therapies (5.25%) were also very infrequent. The lack of discussion of these treatments and services, on a platform with such extensive audience reach and engagement (Statista, 2024b), is a missed opportunity for social media to act as a positive catalyst for individuals to seek in-person or formal support.
Another key research question for this study related to the vibe and emotional expression of mental health-related content on TikTok.
Analysis suggested a very high proportion of posts involved emotional expressions of sadness, loneliness, and despair (n = 164, 41.0%), and far fewer posts expressed feelings of happiness, hope, or encouragement (n = 58, 14.5%). These findings, together with the high proportion of posts conveying personal experience, suggest that people are using TikTok as a space to express the negative feelings and challenges of their lived experience. We also note that hashtags such as #vent, #sadtok, and #relatable commonly co-occurred with the chosen study hashtags (see Supplemental Table 3 and Supplemental Figure 1). Venting is a form of emotion-focused coping, which involves fixating on an issue and subsequently expressing negative feelings or experiences (Trần et al., 2023). On the one hand, the relatable nature of such posts, in which users share their own lived experience of mental ill-health and its challenges, could be a comfort to other users, making them feel less alone in their own journey. Previous research examining motivations for social media use suggests that those with lower self-esteem and higher loneliness are more likely to post content sharing personal experiences and memories for their own therapeutic reasons (Stone et al., 2022). However, many studies also suggest that emotional venting in this way is a maladaptive avoidant coping strategy (Gurvich et al., 2021; Marr et al., 2022). Temporary relief may be found in emotional expression and the sharing of experiences; research monitoring adolescent brain activity has demonstrated activity in the reward centers of the brain when users receive positive feedback or "likes" (Sherman et al., 2016). However, this relief may come at the cost of actively working to address the underlying problems and affect (Marr et al., 2022). We theorize that such emotional venting could create a downward spiral in affect for not just the individual posting the TikTok content, but potentially also those experiencing their own mental health crises while viewing such negative content.
While it is beyond the scope of this study to investigate the mental health status of those viewing and engaging with mental health-related content on TikTok, the linguistic analysis performed on post comments does provide some insight into the response of viewers. The mean "Tone" score for comments was 37.21, where scores below 50 indicate a skew toward negative emotional tone (Boyd et al., 2022). In comparison to a "test corpus" of randomly collected tweets analyzed by the LIWC-22 software developers (Boyd et al., 2022), the mental health-related TikTok posts had a much more negative Tone summary score (37.21 compared to 68.00 for the test corpus), more negative emotion (3.08% of words compared to 0.76%), and higher levels of anxiety (0.51% compared to 0.10%) and sadness (0.64% compared to 0.17%). These results indicate that not only are negative emotional expressions frequent in TikTok posts, but they may also be prompting similar expressions of sadness and negativity in comments. The findings suggest mental health-related posts on TikTok may be creating a negative narrative around mental health and provide some insight into the potential impact on users exposed to such content, although far more research is needed in this area.

Limitations and Future Directions
It must be recognized that the present study did have several limitations. First, the method used to identify relevant TikTok posts meant that only the "top" posts under #mentalhealth and #mentalillness were collected for analysis. In reality, this is not how the vast majority of users navigate TikTok, as content is primarily presented to users on the "For You Page" dictated by a complex algorithm (Vaterlaus & Winter, 2021). Given that only the most popular posts under these hashtags were analyzed, perhaps fewer posts with negative representations were collected, as these would not be popular and "liked" by viewers. To circumvent these issues, we suggest future researchers attempt to replicate the user experience and navigation of the TikTok algorithm to collect a sample of posts more reflective of the mental health content seen by the average user.
Similarly, linguistic analysis was limited to the top 10 most "liked" comments under each of the selected posts. The analysis was restricted to the categories built into the LIWC dictionary (Version 1.7.1), which provide a more granular examination of negative emotions such as "anxiety," "anger," and "sadness," but not of more positive aspects of the comments such as "support" or "empathy." It is suggested that future iterations of the LIWC software include additional positive categories to assist researchers in providing a more balanced linguistic analysis. It must also be acknowledged that linguistic analysis has limitations; for example, it does not consider factors such as word order, syntax, or the context in which words are used, which could lead to some degree of misinterpretation (Mattias et al., 2013; Windsor et al., 2019).
Second, previous research suggests that stigma toward mental illness is strongly influenced by culture (Ahad et al., 2023), and thus the exclusion of non-English posts may have influenced results. Notably, previous content analyses of Finnish and Greek YouTube (Athanasopoulou et al., 2016), Greek Twitter (Athanasopoulou & Sakellari, 2016), and Turkish Twitter (Kara & Şenel Kara, 2022) all demonstrated very high proportions of negative representations of schizophrenia. The exclusion criteria also meant that TikTok posts were omitted if they were irrelevant to the topic despite including a relevant hashtag. However, the coders noted that in a few cases, the use of #mentalhealth or #mentalillness on irrelevant posts was in and of itself trivializing or stigmatizing. For example, one video showed security footage of an individual stealing from a retail store and was tagged with #mentalillness despite no clear indication that this person was experiencing mental ill-health. The inclusion of this hashtag on such a video implies an association between mental illness and criminal behavior and thereby contributes to negative representations of mental illness.
Finally, it must also be acknowledged that the opinions and attitudes toward mental health and illness expressed on social media do not necessarily reflect the wider views of the public and cannot be interpreted as such. However, the results of this content analysis do help to uncover the narrative present on TikTok and the type of content to which general users, and more specifically those with mental illness, may be exposed on the platform.

Conclusion
The study findings have shown that while positive and supportive mental health-related content on TikTok is prevalent, mental health stigma is still evident, and trivialization is an emerging but important negative representation. We believe health professionals should remain cognizant that their patients may be exposed to such content online. This study has shed light on the specific subtypes of stigma and trivialization present on social media. We suggest these findings could therefore assist in informing the development of social media campaigns that specifically and effectively target the types of stigma that remain pervasive online. The study also highlighted that TikTok is primarily used as a space to vent and share feelings and experiences surrounding mental health and illness. While the study has furthered knowledge of mental health-related content on TikTok, empirical research into the perspectives of those with lived experience of mental illness is needed to understand the potential impacts of such content.
Figure 1. Flow Diagram of Included TikTok
Figure 2. Mention of Specific Mental Health Condition or Topic

Table 1
Coding Definitions for Types of Content, Stigmatization, and Trivialization

Stigmatization
… to have social contact with people who live with mental health issues.
Dangerous/unsafe/unpredictable: Portrays people who live with mental health issues as dangerous, unsafe, or unpredictable.
To blame: Portrays people who live with mental health issues as being to blame for their issues.
Cannot recover: Suggests that people who live with mental health issues cannot recover or get better.
Snap out of it: Suggests that people who live with mental health issues could snap out of it.
Personal weakness: Suggests mental health issues are a sign of personal weakness.
Lack knowledge/inaccurate: Indicates a lack of knowledge or inaccurate understanding of mental health issues.
Rude/derogatory: Is rude or derogatory toward people who live with mental health issues.
Internalized stigma: Indicates that the individual has internalized a stigmatizing attitude toward people living with mental health issues.

Trivialization
Not serious: Suggests mental health issues are not a serious problem.
Benefit: Portrays symptom/s of mental ill-health as a benefit that could potentially help or improve quality of life.
Comedic/funny: Portrays mental health issues as comedic or funny.
Oversimplified: Oversimplifies the experience of mental health issues.

Table 2
Features of Mental Health Content and Creator Profiles on TikTok
Note. a n = 373 (27 posts were photo carousels with no duration). b n = 392 (eight posts had comments turned off).

Table 3
Stigma and Trivialization in Mental Health Content on TikTok
Note. a Post contained at least one form of stigmatization/trivialization.