Technology-Facilitated Violence Against Women and Girls in Public and Private Spheres: Moving from Enemy to Ally

While research on digital dangers has been growing, studies on their respective solutions and justice responses have not kept pace. The agathokakological nature of technology demands that we pay attention to not only harms associated with interconnectivity, but also the potential for technology to counter offenses and “do good.” This chapter discusses technology as both a weapon and a shield when it comes to violence against women and girls in public spaces and private places. First, we review the complex and varied manifestations of technological gender violence, ranging from the use of technology to exploit, harass, stalk, and otherwise harm women and girls in communal spaces, to offenses that occur behind closed doors. Second, we discuss justice-related responses, underscoring how women and girls have “flipped the script” when their needs are not met. By developing innovative ways to respond to the wrongs committed against them and creating alternate systems that offer a voice, victims/survivors have repurposed technology to redress harms and unite in solidarity with others in an ongoing quest for justice.


Introduction
Literature on digital dangers has been growing, yet the research on their respective solutions and justice system responses has not kept pace. The agathokakological nature of technology demands that we pay attention to not only harms associated with interconnectivity but also the potential for technology to counter offenses and "do good." The dichotic nature of digital progress in relation to violence against women and girls (VAWG) suggests that much more needs to be done to address and prevent increasingly common episodes of technology-facilitated gender violence, especially for vulnerable and marginalized groups, to exact positive cultural change. Further, we must be concerned with online displays of sexism and misogyny in communal spaces and private technologized violence (i.e., when electronic means are used "behind closed doors" or in exclusive communities as a tool for surveillance) as they may co-occur, and also share commonalities with wider public violations including street harassment and nonpublic violations like family violence.
This chapter reviews VAWG in public spaces and private places, and considers technology as both a weapon against women and girls and a shield for them. We first explore the complex and varied manifestations of technological transgressions, which include the use of technology to harass (e.g., cyberbullying) or nonconsensually record (e.g., "upskirting") women and girls (hereafter referred to as WG), to violate positions of power and authority when electronically tracking them during mediated encounters (e.g., stalking by Uber drivers), and to monitor/control "loved" ones (e.g., home surveillance systems). We then reflect on formal and informal "justice" responses, underscoring the ways by which WG have "flipped the script" when their needs have not been met, to move from victim to survivor, utilizing the very tools that have been misappropriated by others to violate them. Last, we highlight recent, innovative technological approaches to combat online VAWG, closing with comprehensive recommendations on how to better serve victims/survivors of gender-related violence.

Violence against Women and Girls
Despite years of advocacy work, legislative change, and support services, VAWG affects millions across the globe (WHO, 2017) and includes offenses ranging from daily encounters tinged with benevolent sexism (see, e.g., discussion of gender microaggressions in Capodilupo et al., 2010) to less common yet more egregious and hostile events (see, e.g., discussion of mass murder in Marganski, 2019a). One in three women will have experienced an act of physical or sexual violence in her lifetime (WHO, 2013), with rates of physical violence, sexual violence, and stalking higher among bisexual (Walters, Chen, & Breiding, 2013) and transgender (Stotzer, 2009) women, and those belonging to other marginalized groups (e.g., Plummer & Findley, 2012; Rosay, 2016). Further, approximately 50% of women experience intimate partner-perpetrated psychological aggression and/or coercive control in their lifetime (Black et al., 2011). Yet, VAWG has been challenging to study due to definitional and methodological differences that sometimes fail to capture a range of experiences, including street harassment (Vera-Gray, 2017), workplace harassment (Westmarland, 2002), and other offenses (see, e.g., discussion of groping at pubs, clubs, and concert venues in Fileborn, 2016; Fileborn, Wadds, & Tomsen, 2019) by different perpetrators in a variety of settings.
Beyond in-person victimizations, VAWG is also technology-facilitated, consisting of overt and covert behaviors (e.g., aggression and surveillance) where perpetrators need not be present to aggress against their targets. Technology-facilitated VAWG has been defined by the United Nations as "acts of gender-based violence that are committed, abetted or aggravated, in part or fully, by the use of information and communication technologies (ICTs), such as phones, the internet, social media platforms, and email" (APC, 2020, para 1). With internet access being widely available and smart-device ownership overwhelmingly common (Anderson & Jiang, 2018), various social problems seep from the real world into the digital realm.
It is now estimated that three-quarters of WG have experienced or been exposed to online violence (Tandon & Pritchard, 2015), including threats of death or rape, bullying/harassment, and stalking, among other offenses, making this type of victimization more common than any in-person form. Even when these acts are limited to those perpetrated by intimate or dating partners, research indicates most young adults in college (Marganski & Melander, 2018) and youth in schools (Zweig & Dank, 2013; Zweig, Dank, Yahner, & Lachman, 2013) report technology-facilitated relationship violence. These online experiences have been associated with depression, substance use, antisocial behavior (Melander & Marganski, 2020), and increased risk for in-person partner violence victimizations (Marganski & Melander, 2018; NNEDV, 2014).
Technology-facilitated VAWG occurs in public spaces and private places, and it varies between and within groups (e.g., adults and minors, persons of color, those of different abilities and social classes, etc.). Younger women, as well as those who are sexual minorities, are at higher risk for technology-facilitated sexual violence victimization, for example, than their counterparts (Henry & Powell, 2018). While the reasons for these patterns are complex, they point to the importance of social location in the study of these contemporary issues. Such violence falls on a continuum of illegal and legal behaviors (see, e.g., Kelly, 1987), and these transgressions threaten the privacy and freedoms of those on the receiving end (Citron, 2014). Further, it is a global phenomenon (see, e.g., Bailey & Steeves, 2015; Tandon & Pritchard, 2015) extending beyond the United States (US), as researchers in Australia (Dragiewicz et al., 2019), the United Kingdom (UK) (Mendes, Ringrose, & Keller, 2019), Canada (Bailey & Mathen, 2019), and elsewhere have noted. In all, technology's abuse, along with associated harms, creates an urgency for investigating VAWG comprehensively so we may derive informed and appropriate solutions.

Technology as a Weapon
Electronically facilitated VAWG has been dubbed an "old behavior in a new guise" (Campbell, 2005). Inherent in the virtual world are larger systems and interactions within those systems that imitate those in the real world; these large-scale structural forces impact power dynamics and abuses in ICTs, including online sexual exploitation and image-based sexual abuse, cyberstalking, and other forms of cyberviolence (Marganski, 2018, 2019b). While WG dominate social media platforms and are more likely to use them than male counterparts (78% vs. 65%), they are not necessarily protected in these spaces and are far more likely to experience certain kinds of victimization, such as technology-facilitated sexual violence and cyberstalking (Flynn & Henry, 2019; Henry & Powell, 2018; Powell, Scott, Henry, & Flynn, 2020). Perpetrators of image-based sexual violence, for example, are commonly ex-husbands and ex-boyfriends who seek out and receive male peer support online that encourages violence against former female lovers (see, e.g., DeKeseredy & Schwartz, 2016; Henry & Flynn, 2019). The reality, then, is that many online harms are gendered in nature.
Some acts of technological VAWG resemble those that occur offline, albeit new offenses have emerged that present novel threats. Such technocrimes include photo/video recordings taken in public or private settings without consent, such as "upskirting" and "down-blousing," which are sometimes distributed widely thereafter online (Henry et al., 2020; McGlynn & Downes, 2015; Powell, Henry, Flynn, & Scott, 2019), and cases where persons in power violate positions of trust by electronically tracking victims they meet during the course of their work (see, e.g., discussion of stalking and assaults of passengers by Uber and Lyft drivers in Osterheldt, 2019). Additional examples include, but are not limited to: e-trafficking, cyber-trolling, online impersonation, doxing, swatting, cyber-mob attacks, deep-fakes, and other image-based abuse (Henry & Flynn, 2019).
Similar to in-person VAWG, perpetrators may use technology to objectify, demean, intimidate, and/or control others. In contrast to face-to-face violence, however, perpetrators of technology-facilitated VAWG may be unknown because they may be unseen. They could be strangers, classmates, coworkers, friends, or loved ones (Henry et al., 2020). This may help explain high rates of these transgressions, as the veil of anonymity protects and may embolden perpetrators; however, even when perpetrators are known, high rates of aggression still exist (see, e.g., Marganski & Melander, 2018; Melander & Hughes, 2018; Zweig & Dank, 2013).
Individuals have used technologies in invasive and vituperative ways: monitoring, controlling, and punishing others. Domestic and sexual violence programs have reported that technology-facilitated VAWG is a real concern, affecting nearly all who seek their services (NNEDV, 2014). Adolescent girls are also vulnerable to cybervictimization due to the amount of time spent online while navigating through difficult developmental life stages (Chisholm, 2006). Culturally extolled femininity (Connell, 1987) collides with the normalization of VAWG (Klein, 2006), producing conditions conducive to these events. Girls more than boys experience pressure to sext (Crofts, Lee, McGovern, & Milivojevic, 2015), and those who do may be shamed later on, a by-product of contradictory cultural codes that judge girls on sex appeal while also policing their sexuality. Technology further amplifies harms, permitting perpetrators to reach a much wider audience than ever before and extending the reach of those who gaze, shame, and inflict harm. Thus, the ease, accessibility, and speed of our connections hold serious implications and present unique challenges for not only victims, who may experience heightened fear, anger, depression, and/or suicidal ideation (see, e.g., Bates, 2017; Powell, Henry, & Flynn, 2018), but also justice systems, whose investigations grow ever more complicated.
Gendered cyberhate has become common in digital spaces, and behaviors comprising it have grown increasingly threatening to recipients (Jane, 2017; Mantilla, 2013). Sexually charged, highly misogynistic content and other forms of vitriol have a strong presence in segments of the Manosphere (see, e.g., discussion of incels, the red pill subreddit, etc. in Center on Extremism, 2018; Ging, 2017; Scaptura & Boyle, 2019; Zimmerman, Ryan, & Duriesmith, 2018), and though there have been challenges to discriminatory language and obscene material in the US (see, e.g., Miller v California, 1973; Pope v Illinois, 1987; Smith v US, 1977), many prejudicial posts have found protection under the guise of free speech. Some online spaces are known to house hostile viewpoints. For example, thousands of insecure men and boys who fail to live up to an ideal (e.g., hegemonic masculinity) have found solace in online communities that call for the subjugation of WG (Woolf, 2014), fostering animosity, violence, and even lethal events. The end result is toxic, as unregulated messages perpetuate myths, misinformation, and hate that influence various kinds of VAWG.
In all, technology-facilitated VAWG comes in many forms and has the potential to result in physical, psychological, behavioral, social, and financial problems. It constitutes legal and moral violations, and it infringes on individuals' active engagement in the digital world, which perpetuates and maintains inequalities that, in turn, continue to breed violence (e.g., Gorman, 2019). This can have devastating consequences on not only the persons targeted but also their families, social networks, and society.

Technology as a Shield
While technology has been used to harm WG, it has also been used to counter transgressions, hold perpetrators accountable for their actions, and protect individuals from future episodes of violence. Far from ideal, justice system responses have fallen short in meeting the justice needs of victims/survivors and, in some cases, contributed to further harms via secondary victimization (Flynn, 2015). WG have been pressured by police, courts, and other social service providers to turn off or alter the settings on their electronic devices, close social media accounts, and change financial and other accounts online, ignoring how difficult, impractical, and silencing/isolating such measures can be (Dragiewicz et al., 2018; Woodlock, McKenzie, Western, & Harris, 2019). When elements fundamental to safety, healing, and recovery are not included in justice practices or otherwise absent from the lives of those who suffer transgressions, the result is untreated, prolonged, and continued suffering. To move forward, it is imperative that structures incorporate trauma-informed care into operating systems and practices to improve users' experiences and set the tone for electronic environments in ways that shape expectations and promote responsible behavior.
For far too long, the burden of stopping or preventing VAWG has disproportionately fallen onto WG who are on the receiving end of violence (and more recently, bystanders, as found in Mentors in Violence Prevention, Green Dot, etc.), rather than the men and boys perpetrating problematic behaviors. Due to long-standing patriarchal practices, WG who are victims of gender-based violence are often blamed for perpetrator actions or bear the weight of innovating the problem away.
The failure of justice systems to address VAWG online (see, e.g., Dunn, Lalonde, & Bailey, 2017) is symptomatic of larger social failures to treat such violence and discrimination seriously. Feminists have brought light to the issues and absurdities associated with perspectives that blame victims for others' actions, which absolves the perpetrator, as well as the larger culture, of responsibility and feeds into misbehavior as well as the inability to change. Satire pieces such as "The Rape of Mr. Smith" (n.d.) or "Tips Guaranteed to Prevent Rape" (Tarrant, 2009) highlight unjust practices and the burdens that WG face. Moving from chastity belts to self-defense classes, to antirape technologies that place the onus on potential targets of violence, it seems as though the more things change, the more they stay the same (Flynn, 2015). It should come as no surprise, then, that our reliance on technology for everyday interactions has led to the proliferation of technological strategies to combat VAWG.
Contemporary "solutions" to VAWG are geared toward potential victims, making rounds on social media as real fixes for large, complex problems. Examples include wearable technologies designed to detect, deter, and even punish violators, including drug-detecting nail polish (Zikalala, 2017), wristbands that emit a foul odor to ward off predators (Cuddy, 2019), pepper-spray stilettos (EFE, 2017), internal barbed condoms (Karimi, 2010), and jackets/underwear that deliver powerful jolts of electricity or are nearly impossible to pull off (Euronews, 2013; Kahney, 2003). Noncorporeal technologies also exist, including straws to detect spiked drinks (Steffen, 2020), stamps to mark public transit gropers (Koizumi, 2019), and consent condoms that require four hands to open (Cuddy, 2019). When existing patriarchal structures combine with capitalism at a point in history when technology reliance is at an all-time high, we see private companies advertising, marketing, and profiting from strategies that claim to reduce risk, yet ultimately fail to target perpetration and may inadvertently result in target transference or create unexpected harms as a consequence of usage. Further, antirape technologies create a false sense of safety and security, especially considering the plethora of evidence pointing to alcohol as the leading date rape drug (Anderson, Flynn, & Pilgrim, 2017; Hindmarch & Brinkmann, 1999). These technologies, while driven by good intentions, have unanticipated and undesirable effects. They misplace responsibility by targeting victims rather than perpetrators, perpetuate and reinforce rape myths, and undermine victims'/survivors' agency (Bivens & Hasinoff, 2017; White & McMillan, 2019). We would be remiss to overlook this in discussions of effective, evidence-based solutions.
Despite its limitations, technology has been used to counteract women's perceived and actual vulnerability. Cell and mobile phones may be used to connect with others in prearranged calls to avoid "stranger danger," reach emergency services, and document evidence of violations, while Apple watches may activate alarms. Technology can also facilitate the sharing of experiences and access to resources. The SmartSafe app, for example, assists with collecting and storing evidence that can be used in criminal justice proceedings (Caneva, 2016), whereas Circle of 6 not only provides immediate access to local and national rape crisis hotlines and resources but also automatically alerts up to six preselected friends that the person needs help getting home safely, a phone call to interrupt a potentially dangerous situation, or someone to talk to immediately, all with the touch of a button (Circle of 6, 2020). EmergenSee and LiveSafe operate in similar ways.
Beyond responding to hazardous circumstances or events, victims/survivors have found support in the aftermath of trauma through digitized social networks, and some have taken action online to have their justice needs at least in some part met, while others have advocated more broadly via ICTs (see, e.g., Fascendini & Fialová, 2011). Dissatisfied by formal justice (in)actions, some individuals have sought informal justice online. Referred to as "digital vigilantism," "feminist digilantism," or "DIY justice online" (Jane, 2016; Powell, 2017), women, girls, and allies have taken to technology to fight against online harassment, to put perpetrators on notice, and to hold them accountable for their actions when the criminal justice system has not (Al-Alosi, 2020). They have explicitly named their rapists online to warn others (Pryor, 2017), responded to unsolicited dick pics with the same kind of content to give perpetrators "a taste of their own medicine" (Hockaday, 2019, para 4), and shared screenshots of misbehavior with perpetrators' mothers, partners, and employers, in the hopes that informal sanctions might follow (e.g., Hawken, 2019; Payton, 2014).
Additionally, victims/survivors have used technology to gain support and organize "speak outs," demonstrations, and "take-back-the-night" rallies (Fileborn, 2014). They have also engaged in hashtag activism through campaigns like #rapedneverreported, #whenIwas, and #NotOkay (Powell & Henry, 2017). Safe spaces such as Women.com, the internet's first women-only social network (Cox, 2014), have also been created, and victims/survivors have collaborated with law enforcement to develop apps like VictimsVoice (Ridley, 2019) to safely collect and store evidence that may later be used to press charges. In rare yet noteworthy instances, institutions have stepped up in solidarity by using positions of power to advocate for victims/survivors while sending a message to perpetrators. For example, The Bristol Post posted pictures of the adult men who threatened and harassed teen climate activist Greta Thunberg online (Staples, 2020).
In contrast to the dark corners of the web, there are bright ones dedicated to collective resistance, healing, and empowerment. Activism projects and online communities such as Project Unbreakable, Everyday Sexism, Take Back the Tech!, and Hollaback! exist to not only document transgressions but also inform and mobilize victims/survivors of gender-based violence (Mendes et al., 2019; Powell, 2017). As such, WG have become inspired online to discuss experiences in their own words and from their own perspective, which is something traditionally denied through more formal routes. These expressions can be beneficial to victims/survivors, providing them with emotional release and an opportunity to feel validated, help others in similar circumstances, and encourage others to bear the weight of intervention and advocacy (Al-Alosi, 2020). These efforts demonstrate the enormity of VAWG as a broad social problem, drawing attention to the issue in a more public way than could ever be possible through regular, in-person justice routes.

Discussion
In much the same way that offline offenses have not been treated seriously or have been met with ineffective responses and solutions, online assaults too have been minimized or dismissed, leaving victims/survivors to handle these transgressions with little to no formal support. To counter the inadequacies of criminal justice (in)actions, victims/survivors have taken to technology to pursue their own justice by calling out abuses and collaborating with one another to provide information, education, and support. Although victims of technology-facilitated violence are often without legal recourse, recent news headlines, such as when a judge awarded $13 million to 22 women who were coerced into making sexually explicit videos sold on porn sites without their consent (Grant, 2020), highlight major victories. Yet challenges remain, as inequalities are evident for groups of people in terms of technology access, ease of use, and overall engagement (Al-Alosi, 2018, 2020; Silver, 2019). Furthermore, some countries restrict or regulate user-generated content or platform access, presenting challenges for victims/survivors, along with researchers, advocates, and others who wish to have their voices heard and address VAWG. This therefore impacts victims'/survivors' options, choices, and pathways to recovery.
Additional issues exist when considering the role of culture in the kinds of transgressions we see, and the tools that render aid. The technology industry and its innovations are still male-dominated, male-oriented, and male-controlled, which poses difficulties in meeting WG user needs, and the harms they experience may go unrecognized or be dismissed. Preventing and responding to VAWG therefore requires that we become more inclusive and stretch beyond traditional design to dismantle problematic norms (Bivens & Hasinoff, 2017; Jewkes, Flood, & Lang, 2015). Solutions must engage various actors, including marginalized and oppressed persons, through inclusive collaborations between not only victims/survivors and victim advocates but also tech companies, healthcare providers, legal/criminal justice personnel, and educators to integrate various voices and target the complexity of the issues.

Technology
Technology companies, social networking executives, and software engineers are well poised to make important advances in victim safety and protection. When these powerful institutions and employees combine with advocacy groups, they can better meet the needs of victims/survivors. The Safety Net team at the National Network to End Domestic Violence (NNEDV, 2014) demonstrates the promise these multifaceted collaborations have through its Tech Safety app, which aids individuals in recognizing warning signs of stalking, harassment, and abuse, while offering tips for documentation and resources for support. Beyond education and victim/survivor support, they have targeted perpetration via the Coalition Against Stalkerware (NNEDV, 2019), which advocates for the eradication of spyware due to this tool being used by perpetrators to track victims' locations and activities. Tech entrepreneurs can benefit from education and training on VAWG, which may then influence innovation design in ways that protect users, improve safeguards, lessen perpetration, and provide valuable resources to users, thereby creating a friendlier sociotechnological space.
Instead of having the ultimate goal of keeping people logged in without consideration of their experience, satisfaction, and well-being, prioritization should be given to "regenerative technology," which refers to healthy online interactions that emphasize having empathy and kindness at the foundation (Zaki, 2019). As such, with any technological advancement, it is important to consider ethics. That is, just because it can be done, does it mean that it should? What are the potential outcomes? Are there ways to offset or respond to harm? It is imperative that those at the helm of these technological innovations, such as software engineers, receive adequate ethics training while reflecting on human rights and the costly consequences of inequality and discrimination that may restrict or impede the digital inclusion and progress of all persons in this era (Beduschi, 2018). Beduschi (2018) warns that the very software engineers who are creating the algorithms behind programs that make our lives easier may also be breaching our rights without even knowing it, invading and recording private conversations, and channeling people into haphazard spaces. Thus, we need to ask technology creators to step up, listen to, and learn from users, including victims/survivors, in order to secure safe digital spaces that maximize experiences and overall connectedness. Tech companies should have clear policies in place for violators, but they should also be willing to work on restorative practices in ways that modify behavior and produce actual change. Further, although much offensive online content enjoys protections, software engineers should be charged with preventing harmful content from being freely shared and finding user-friendly and intuitive ways in which victims/survivors can opt to filter out or limit their exposure to triggering materials. Such collaborations between technology users and designers could prove transformative and reduce initial, repeat, and vicarious victimizations.

Healthcare
Healthcare providers can play vital roles in combating the effects of technology-facilitated VAWG. Physicians, primary care providers, counselors, and others involved in the healthcare system should be educated to recognize the signs and symptoms of tech-related violence, have screening tools available to aid in assessments, and have online and offline treatment plans in place. They may inform parents, as well as their patients, on ways to stay safe online and connect patients with other service providers, such as social workers, to identify sources of local support (Waseem et al., 2017). Much like with other illnesses and injuries, early intervention could be the key to producing more positive outcomes. Physical health, mental health, and wellness services can provide patients with vital services that promote healing and recovery in ways that work best for them. Medical interventions are often quite costly for domestic and sexual violence victims in general, which may deter some from seeking services. As such, it is critical to have online resources such as the Compensation Compass app to alert users of compensation funds to which they may be entitled, which helps alleviate financial strains (Masters, 2019). Such combined efforts signal to the public the serious nature of technological abuse.

Legal/Criminal Justice System
Increasing legislation in the US has been enacted to protect victims of technology-facilitated violence and abuse. There are now 45 laws related to adult cyberstalking/harassment (Working to Halt Online Abuse, 2020). In the US, there are two federal statutes, the Interstate Communications Act of 1994 and the Federal Interstate Stalking Punishment and Prevention Act of 1996, as well as many other underutilized statutes that can be applied to technology-facilitated violence (Cox, 2014). Recognizing that these offenses can have severe consequences, almost every state in the US has implemented varied laws that criminalize technology-facilitated violence or modify existing law to include cyber-related provisions (Cox, 2014). Nonetheless, there are many challenges and limitations to these laws.
To prevent technology-facilitated violence and abuse, a clear understanding of the specific offenses comprising it (e.g., cyberstalking, online harassment, and image-based sexual abuse) and articulation of what constitutes each type of offense is needed to inform justice systems and personnel. The disseminated information should also include information on the dynamics of these harms, including the co-occurrence of digital and in-person offenses (see, e.g., discussion of nonlethal events in Marganski & Melander, 2018; and lethal events in Todd, Bryce, & Franqueira, 2020). Such training has the potential to improve arrest and prosecution practices that impact victim safety and perpetrator accountability. Importantly, this means taking reports of digital harms seriously. Women's/girls' experiences of technology-facilitated violence have been minimized by law enforcement and others in power, much like in-person experiences of gender violence have historically been (Citron, 2009).
Even when recognition of the problem exists and reports are taken seriously, perpetrators of technology-facilitated crimes continue to evade detection and otherwise fall through the cracks of the justice system. Some cannot be traced; others may be dismissed. Conflicts in law further complicate matters, such as when a perpetrator resides in one jurisdiction while the target resides in another, thereby creating the need for expansion and collaboration when it comes to domestic and international law differences (Flynn & Henry, 2019; Henry, Flynn, & Powell, 2018). The multijurisdictional nature of many crimes calls for federal law enforcement and prosecutors to work in tandem with local agencies and offer resources necessary for investigations (Wilkinson, 2016). Beyond further criminalizing and responding to offenses, however, it is essential that the criminal justice system collaborates with victims' advocates and victims/survivors to outline effective remedies and protect human rights. Effective community responses, therefore, require the convergence of multiple stakeholders to share knowledge and purposively consider lived experiences in contemplating solutions.

Educators
Education on the nature and consequences of technology-facilitated VAWG is crucial and should come from a variety of sources. Teachers at all levels can raise awareness of this social problem. School-based cyberbullying prevention and intervention programs for children and university students, which include individual-level, multi-level systemic, and universal school approaches, have demonstrable effectiveness in reducing perpetration and victimization (Doane, Kelley, & Pearson, 2016; Gaffney, Farrington, Espelage, & Ttofi, 2019). Cyber harassment programs have also been implemented in the workforce, and it is recommended that not only should there be education/awareness regarding how to respond to specific instances of cyber aggression but also attention to the role of organizational culture in shaping individual behavior (Faucher, Cassidy, & Jackson, 2015). It is imperative that prevention programming includes elements of bystander intervention to reduce these modern transgressions and hold those who may offend to a higher standard.

Limitations
Although technology shows promise in addressing VAWG, it has limitations. While it can provide access to much-needed services for WG in remote locations, not all women can afford, or even know how to operate, the latest technological innovations (Al-Alosi, 2018, 2020). Further, technology is not failsafe: routine maintenance, lack of internet access or connectivity, and other service disruptions can interfere with regular performance, isolating women from sources of support (Al-Alosi, 2020). There are also issues with the corporate model of technology that collects and sells user data, opening users up to privacy risks. As such, overreliance on these devices and platforms may contribute to erroneous conclusions about safety and security that endanger some users and contribute to violence. Performance aside, technological advancements may have other unintended consequences, including victim blaming. Just as a woman might be told to switch off her phone if she does not want harassing text messages, it is not inconceivable that a victim would be asked why she did not activate her safety app or gather video evidence of her assault (Henry et al., 2018; Powell, 2017). This modern-day victim blaming may revictimize the woman as she is interrogated about why she did not alter her behavior to prevent the assault (Woodlock et al., 2019). The very technology women may use to reach out for help may also be used against them by partners who own and control access to their mobile phone accounts, using the same devices and programs to track and monitor activity (Al-Alosi, 2020; Freed et al., 2018). Those who reach out online for community support may also face a different kind of backlash from anonymous strangers who seek to revictimize, shame, and denigrate victims of TFVA.
Nevertheless, technology has much to offer victims of gender violence. Victims/survivors may use digital means to respond to and prevent VAWG, and the benefits of using technology include providing victims access to vital online and in-person resources (e.g., educational materials, service provider support), warning others of predatory behavior, reducing feelings of isolation through social connections, and giving a voice to those often silenced. More intersectional research, however, is necessary to examine the effectiveness of technology in preventing technology-facilitated VAWG, in both the short and long term, as different groups have varied experiences online (Felmlee, Rodis, & Francisco, 2018; Powell et al., 2020).
Inclusive and culturally appropriate solutions for victims/survivors are necessary. Warranted measures include technological innovations that improve digital device security mechanisms and more accurately assess user patterns to ensure the primary owner is the only permitted user, such as enhanced authentication methods (Freed et al., 2018), as well as further engagement with online providers to curb the harassing activities of their users (Fascendini & Fialova, 2011; Dragiewicz et al., 2018). We should not tell victims to simply stop using social media and electronics. Instead, we should create spaces that are safe and that empower them to make decisions that promote healing, recovery, and solidarity. Gender-based hate, misogyny, and other discriminatory practices in this digital era are increasingly recognized and confronted, yet we lag in reactive and proactive solutions. Until we target violence at its source (i.e., perpetrators), technology-facilitated VAWG will continue.

Conclusion
Just as vulnerable communities need a continuum of care that may include mentors, social and recreational activities, and therapy, so too do vulnerable tech users. WG worldwide face real threats to their health and well-being from strangers, classmates, coworkers, intimate partners, and family members, both offline and online. VAWG is a global phenomenon in dire need of attention. Rather than normalizing and dismissing these experiences, we need to collaborate to address perpetration, challenge problematic norms, and offer support to those most marginalized and harmed. Social and structural conditions contribute to technology-facilitated VAWG and allow perpetrators to thrive, perpetuating systemic injustice and maintaining inequalities. By educating and sensitizing various persons to technology-facilitated VAWG, providing safeguards and assistance in digital spaces, and specifying sanctions for law-breaking behavior (Tandon & Pritchard, 2015), toxic behaviors and norms can be diminished. Correcting institutional and systemic responses to address misinformation and propagate promising solutions will shift norms, policies, and practices in informed and responsible ways that exact positive cultural change.
The Anti-Defamation League's Center on Extremism (2018) also articulated specific recommendations for addressing VAWG, including: building understanding among law enforcement officers and organizations about the nature of misogynist hate; taking legal and policy actions that recognize modern transgressions as harmful and promote gender equality; incorporating gender-based hate into antidiscrimination educational programming; securing support from federal and state agencies/leaders for research and services on victims, perpetrators, and communities; forming partnerships between the tech industry and the public; improving and enforcing Terms of Service; filtering out obscene and offensive content; and taking punitive actions. These proposed measures are an excellent start. However, alternative solutions that address the root causes of these behaviors and work in reintegrative ways are also needed.
Understanding the nature of contemporary technology-facilitated VAWG, along with the unique struggles victims/survivors face, can help create more compassionate, user-friendly digital spaces (Gjika & Marganski, 2020). By recognizing diverse needs and implementing victim/survivor-centered approaches, larger structures (e.g., media and social media platforms) can help those harmed gain a sense of control and empowerment that allows them to act in their best interests. For too long, victims have been the forgotten persons in our justice systems, and we are witnessing the same happen in the digital world. True justice requires that we develop solutions alongside those who have been injured, while confronting the social forces and norms that permit the continued perpetration of harm. This requires a systemic understanding of how violence and its regulation affect the tech milieu as a whole. By prioritizing the rights, needs, and wishes of those who are most vulnerable to harm, we can make progress in digital spaces and social interactions.

Notes
1. See PEW (2017).
2. These programs aim to educate and engage community members such as students, faculty, and staff in bystander intervention strategies, so that individuals may recognize and respond to social problems in constructive ways that simultaneously help create positive cultural change.