Adoption of digital technologies amidst COVID-19 and privacy breach in India and Bangladesh

Abstract This article problematizes the institutional void caused by the lack of accountable digital regulation in India and Bangladesh regarding the adoption of public health-related digital technologies during the COVID-19 pandemic. Findings from a literature review and preliminary interviews illustrate a pattern emerging in these countries that intersects governmentality and materiality with an absence of oversight. The findings further indicate an absence of privacy laws that leaves citizens vulnerable to privacy breaches. As surveillance becomes a social norm, authorities appear to turn a blind eye toward human rights while the public remains unaware and uninformed. The article recommends consumer-centric governmentality to ensure the privacy and protection of consumers and citizens in India and Bangladesh.


Introduction
The COVID-19 pandemic emerged as an unparalleled event that changed the lives of billions of people throughout the world. Governments across the globe implemented "lockdown" and "self-isolation" regulations to prevent the spread of the highly contagious virus, which radically restricted people's mobility and deeply impacted their daily lives (WHO 2020). With the resulting widespread adoption of digital technologies, regular lifestyles moved online, ranging from commerce and social connections to business, industry, and, unfortunately, criminality (Lallie et al. 2021). As a result of the rapid proliferation of COVID-19 cases, the World Health Organization (WHO) urged countries to increase COVID-19 patient testing, isolation, and contact tracing to battle the pandemic (Hellewell et al. 2020). This prompted most national governments to launch COVID-19 contact tracing applications for smartphones (AarogyaSetu 2020; Aljazeera 2021). The point of departure for this article is the resulting governmentality and materiality that consequently led to the normalization of worldwide surveillance.

Methodology
Owing to the topical nature of the subject, this article mainly employs a documentary research approach to explore and gather information from a variety of scholarly sources and databases, including Elsevier, JSTOR, Springer, SAGE, Wiley, and Scopus. In addition, a manual search in reputed public policy journals and Google Scholar was conducted to encompass relevant publications in the study. As research in this area is at a very nascent stage, efforts were made to include factual and evidence-based information from newspaper articles and research organizations while attempting careful selection and assessment of the quality of information sources. A few reports from international organizations also guided our research and helped build strong arguments on this sensitive topic of interest. Overall, to maintain academic rigor and integrity, only reliable sources of information have been used, such as research papers, government information sources and policy documents, news articles and analysis, and publications and reports by global organizations. Given the limitation of scholarly information on the topic in Bangladesh, six semi-structured anonymous interviews were carried out using purposive sampling with interlocutors from the public administrative service and the private sector, in particular, decision makers associated with public health app development companies and/or service providers. Overall, the respondents were involved with varying tasks, ranging from collecting data from the public to providing various types of public services during the pandemic.

Exposure to privacy breach and cyber threats: cases of India and Bangladesh
While the widespread adoption of digital technologies to curb the pandemic was largely promoted and dictated by governments in India and Bangladesh, the loopholes in practice appear to expose numerous cyber threats involving privacy breaches, data sharing, and hacking. This claim is substantiated by the number of cyberattacks in India, which increased by about 300 percent during the outbreak, reaching 1,158,208 in 2020 compared to 394,499 in 2019 (Chauhan 2021). Similarly, in Bangladesh, hackers have been increasingly targeting not only individuals but also industries and sectors including banking and finance, healthcare, law agencies, education, energy, and nonprofits. In particular, a hacker group called "Hafnium" alone launched attacks on more than 200 organizations in the country (Dhaka Tribune 2021). These figures reveal that people are increasingly vulnerable to cyber-crimes and that their privacy can be compromised, which necessitates awareness and action.
In addition, to contain the virus spread, various COVID-19 contact tracing applications and promising technologies have been introduced to "test, track, and trace" the population (Iyengar et al. 2021). However, the specific intentions of such tracing apps are not clearly stated, which may result in data breaches and cyber threats and can compromise the privacy and rights of individuals. These apps, although launched by governments, attract widespread suspicion among personal users, as well as conscience-stricken bureaucrats and service providers, because such utilities request a slew of superfluous permissions and access rights before they can be used. In addition, quite often the purpose of collecting supplementary information and the data storage arrangements for these apps remain unclear. What is alarming in this context is the absence of adequate institutional, state, or federal regulatory oversight. We highlight this critical gap and argue that, under the umbrella of public health imperatives, unclear intentions can expose citizens to a multitude of threats, where users unknowingly conform to their surveillance and personal data extraction, which consequently can have serious repercussions in the long run. In the following (Tables 1 and 2, next pages), we share some of the key apps and initiatives that emerged in India and Bangladesh during the pandemic. It is worth mentioning here that although both countries introduced a plethora of such apps, the tables reflect only those whose information regarding privacy policies, permissions, and data collection was available in the public domain.
As summarized in Tables 1 and 2, government initiatives and public-private partnerships facilitated the launching of a number of apps in India and Bangladesh (O'Neill, Ryan-Mosley, and Johnson 2020). These apps track and trace infected and symptomatic users and also allow them to measure their own risk of contracting the disease by combining cutting-edge Bluetooth technology, algorithms, and artificial intelligence. However, the permissions and access requirements of these apps raise questions about whether superficial regulatory "self-monitoring" will be restricted to the pandemic or will emerge as a people surveillance mechanism. While the efforts in combating this deadly disease are highly appreciable, concerns have at the same time been raised regarding arbitrary public-private sector access in a variety of contexts, including excessive data collection and processing, unauthorized sharing of personal data, and unrestricted surveillance and tracing of people (Mahapatra 2021). Although most apps have a privacy policy, as depicted in Tables 1 and 2, crucial aspects, such as the purpose of data processing, data retention duration and objectives, as well as data sharing policies and terms, remain unaddressed in these policies (Chaturvedi, Kalyani, and Jain 2020). In addition, the excessive permissions sought by these apps for accessing various software components of a smartphone raise doubts and call the underlying intentions into question. For example, scholars such as Chaturvedi, Kalyani, and Jain (2020) discover that some apps developed to inform and send alerts request access and permissions for location, photographs, storage, and camera that are unjustified and enhance the risks of citizen surveillance. Such concerns have also been raised for the AarogyaSetu app, which claims to act as "a shield of protection" for Indian citizens (Dhar 2020; O'Neill 2020).
Similarly, as identified by the interviewees, Bangladeshi apps such as Prava Health, Corona Tracer BD, and Surokkha, among others, raise similar concerns.
According to the MIT Technology Review, India's AarogyaSetu app poses considerable threats to users' privacy in comparison to similar apps in other nations. The journal's investigators reviewed government-backed health apps from several nations based on five criteria: (a) voluntary: whether participation in the app is voluntary or mandatory; (b) limited: whether there are restrictions on how the collected data can be used; (c) data destruction: whether the collected data is deleted automatically after some time; (d) minimized: whether only sufficient information is collected; and (e) transparent: whether the app has open-source code and is built on publicly available policies (O'Neill, Ryan-Mosley, and Johnson 2020). As depicted in Table 3, most developed nations, such as Australia, Canada, France, Italy, and Japan, satisfied all five criteria, which illustrates their ethical commitment toward citizens. However, developing countries such as India and Bangladesh managed to meet only two of the five criteria (Table 3, next page). Scholars such as O'Neill (2020) state that "India has no national data privacy law and it is unclear who has access to data from the app and in what scenarios." Interestingly, the same was found for Bangladesh, though a few of the interviewees who function in the country's high-level bureaucracy acknowledged that the government has stored the data collected from different apps in the National Data Center. However, they were unable to say how the Bangladeshi government is going to secure the privacy of those data. These findings emphasize the need for such laws and substantiate our argument that the lack of data privacy laws and clear policy guidelines in India and Bangladesh raises suspicion and makes citizens vulnerable to threats and breaches.
The possibility that these public health apps could be used for purposes other than those for which they were designed has sparked the interest of various stakeholders. For example, the data collected in the Singapore app "TraceTogether" was used by police as evidence in a murder case. This erodes public trust in these apps and raises concerns surrounding data privacy and usage (Jasper and Bismonte 2021). The majority of COVID-19 apps around the world, in particular in Bangladesh and India, do not distinctly mention their scope, use, and purpose. The Internet Freedom Foundation (IFF), a digital rights organization, describes the AarogyaSetu app as a "privacy minefield," as it leaves one staring through a fog of uncertainty due to the lack of established norms and insufficient regulatory frameworks on data protection and privacy (Saini and Mehrotra 2020). Furthermore, Robert Baptiste, a French cyber security analyst, also claims that AarogyaSetu has severe privacy and security flaws (Dhindsa and Kaushik 2020; Dhar 2020). This is supported by the fact that the app was hacked by a Bengaluru-based software engineer and its defenses were cracked in less than four hours (Dhar 2020).
Similarly, in Bangladesh, interview findings reveal insufficient regulatory frameworks on data protection and privacy. The interviewees admitted receiving inadequate training and information regarding ensuring the data privacy of citizens. For example, a key respondent, an Upazila Nirbahi Officer (UNO), the chief executive officer of an upazila or subdistrict and a mid-level civil servant in the Bangladesh Government's administrative service, mentioned that she had gathered information in an Excel sheet designed for data collection from the general public in her constituency and later uploaded those documents into software specifically designed for the purpose. The respondent acknowledged she had overheard that the data collected from citizens were stored in the National Data Center. However, overall, the respondent was unable to provide information about how the National Data Center or any other government institution ensures the privacy and safety of data provided by the citizens. The authors observed a similar response regarding the storage and maintenance of public data from other interlocutors, including seasoned public servants and private sector executives. The role and function of the National Data Center appeared like a black box to these interlocutors.
These interviewees further shared that the situation during the pandemic was very challenging for the Bangladesh Government. Therefore, all the instructions and concerns from the government were about how to reasonably distribute assistance, in particular food, money, medicine, and medical services, to people living in the rural parts of the country. The respondents generally acknowledged that the Government of Bangladesh was not in a position during the pandemic to give priority to the issue of citizens' data safety. The interlocutors confessed that the concept of citizens' data privacy was rather an unimportant topic to the data-collecting administrators and their users. Although each respondent, from a senior bureaucrat to a private sector representative, claimed that the collected data would not be misused, none of the interlocutors could shed any specific light on the country's central server, backup databases, and their authorized access by different stakeholders. The respondents acknowledged that the data collected from citizens were stored in the National Data Center; however, none of the interviewees was comfortable speaking of the center's security measures. Overall, the respondents were unable to provide information about how the National Data Center, or any other associated government institution and vendors, ensured the safety and privacy of data provided by the citizens.
Interestingly, examples drawn from developed countries suggest that even countries with robust digital infrastructure can make mistakes in protecting citizen information. For example, in South Korea the authorities sent text messages to users containing their vital personal information, and these texts were used to trace the movement of coronavirus-positive people (Kim 2020; Saini and Mehrotra 2020). Since these messages are not protected by strong cyber defenses, they may result in easy data leaks and privacy breaches. These examples solidify our argument that governmental health monitoring apps and initiatives can have significant flaws. The embedded regulatory gap paves the way for increased threats and breaches of privacy through cyber-attacks or even surveillance under the shadow of the pandemic and beyond.
Scholars such as Dhindsa and Kaushik (2020) and Deb (2020a) raise concerns about the terms and conditions of not only the apps but also the governance measures. For instance, the Government of India made the installation of AarogyaSetu mandatory for obtaining an e-pass for public transportation, which left residents with no choice but to install the app on their devices (Dhindsa and Kaushik 2020). In addition, the app's privacy policy does not mention which government departments will have access to its data. Also, the app's stated aim is ambiguous enough that the government may repurpose it or broaden its reach to utilize the data for other purposes (Deb 2020a). For example, law enforcement agencies may use sensitive personal data obtained for contact tracing for punitive purposes (Dhindsa and Kaushik 2020). Thus, although smartphone applications for managing COVID-19 have been widely in use, issues such as privacy, safety, security, and data protection remain major concerns. To combat these issues and flaws, a few steps have been undertaken in countries and states such as Singapore, Italy, and Arizona in the United States, whose respective apps TraceTogether, Immuni, and COVID Watch amended their privacy rules to ensure that no personally identifiable information is collected and that explicit policies on data usage and disposal are in place (Alanzi 2021). In line with this, the Government of India recently revised the AarogyaSetu app's policies, introduced new protocols, and made the app's source code available to the public to ensure greater safety, transparency, accountability, and scrutiny (Dhar 2020). The introduction of privacy policies, terms of service, policy revisions, and public awareness regarding the same is necessary, as the loopholes of these apps not only expose users to cyber threats and compromise their privacy but can also cause an infringement of human rights, as discussed further.
It is worth noting that although the Personal Data Protection Bill 2019 was formulated in India for protecting personal data and privacy, it has long been pending and remains in draft form. Thus, no data protection law exists to date in India. The same holds true for Bangladesh, where the government has recently undertaken the initiative to draft a data protection law for its citizens (Mahmood 2021).

Threat to human rights
While various governments and researchers appreciate the contribution of innovative technologies such as global positioning systems, mobile phone apps, and facial recognition to track and contain the spread of SARS-CoV-2, scholars such as Ferretti et al. (2020) and Sekalala et al. (2020) argue that these technologies pose a direct threat to individual rights and privacy because they seek access to the real-time location of citizens and record data regarding their digital footprint and mobility. This causes infringement of basic human rights and freedom of movement, as any mishap can turn deleterious, especially for females, members of religious and ethnic minorities, and individuals with different political perspectives (Davis 2017). For example, dozens of Africans in Guangzhou, China, reported being evicted from their homes and subjected to other forms of discrimination as a result of a misunderstanding about COVID-19 transmission, because they were identified through COVID tracing apps (Marsh, Deng, and Gan 2020). Thus, various such cases, where public health surveillance and monitoring mechanisms were irrationally used for discrimination, highlight the need to codify individual rights protection during surveillance, where nondiscrimination and equality are considered crucial aspects of international human rights legislation (Sekalala 2020).
In light of COVID-19's global spread, the WHO indicates that the primary surveillance objectives during the pandemic are to "monitor trends in COVID-19 disease at national and global levels, rapidly detect new cases, monitor cases, provide epidemiological information to conduct risk assessments at the national, regional and global level, and provide epidemiological information to guide preparedness and response measures" (WHO 2020). The WHO also states that digital technology can help with timely reporting, contact tracing, and data management during the pandemic, and emphasizes the importance of public health surveillance for preventing and controlling disease spread. However, the public health data and records maintained by these digital surveillance mechanisms are frequently personally identifiable and sensitive, and may expose facts about a person's contacts, lifestyle, behavior, preferences, and location, which poses a threat to an individual's right to privacy (Ferretti et al. 2020). As a result, the emergence of such surveillance technologies accompanies concerns regarding individual privacy, human rights, data breach, data abuse, how the data is deployed, and who has explicit authorized access to data.
The scope and complexity of digital health surveillance pose three major human rights concerns, which threaten to undermine the efficacy of the public health surveillance system, as public trust erodes when surveillance technologies curb human rights and freedoms. To begin with, the effectiveness of digital technologies for global health surveillance is debatable, as many of these systems are still in their initial phases of development (Gasser et al. 2020). Although the huge potential of digital tools, which includes swiftly identifying infected persons, tracing their contacts, and examining their travel patterns, cannot be ignored, it is worth noting that the COVID-19 apps were developed hastily using technologies that had not been previously tested; therefore, evaluating them in terms of privacy, accuracy, and effectiveness is of utmost importance (Budd et al. 2020). In line with this, MIT's assessment of the contact tracing apps developed by several countries on the basis of five parameters (Table 3) revealed that various apps were not doing ethical justice to citizens and did not meet even these basic parameters. Thus, the hurried development of these apps without following adequate procedures and norms calls their operationalization, transparency, and effectiveness into question.
Second, the fact that data is created, used, and stored by third parties raises issues of accountability. It has been observed that the abrupt emergence of new digital surveillance techniques involves a significant number of third-party private actors with access to the personal data of citizens, which could be exploited in ways that irrevocably harm public trust (Calvo, Deterding, and Ryan 2020). Third, in addition to the infringement of the human rights to privacy and safety, the right to equal and fair treatment is also impacted, as increased digital health surveillance may intensify specific harms to minority groups such as LGBTQ people and immigrants. It has been witnessed that minorities are vulnerable to privacy violations, especially when their personal identification and demographic details are leaked. One such case emerged in South Korea, where the data collected and exposed by the contact tracing apps during the COVID-19 outbreak resulted in homophobic abuse directed at the South Korean LGBTQ community (Sternlicht 2020). Due to a prolonged history of discrimination, minority groups remain reluctant to seek health care and share their personal details in the COVID-19 apps, which are monitored by governments and their agencies (Davis 2017).
In addition, making commercial and public services dependent on the download of digital monitoring tools (such as contact tracing applications) could jeopardize personal autonomy while also discriminating against existing marginalized groups. Employers may require employees to download apps before being permitted to work, citizens may require these applications to access public services (such as health care or public transportation), and landlords may require people to download apps and show their health status before renting a property. Such mandatory app downloads and usage are not only coercive but also perpetuate inequality by barring poor and vulnerable people, such as migrants, from accessing and using the apps. For instance, in India, AarogyaSetu has been made mandatory for all employees, rendering the concept of meaningful consent obsolete (Phartiyal 2020). Similarly, employers in Singapore are urged to encourage all employees to download the TraceTogether app, and it is made mandatory for specific categories of migrant workers, who are particularly vulnerable because they frequently have fewer rights than other residents (Sekalala et al. 2020). Additionally, some population groups, such as the elderly, the disabled, and those of lower socio-economic position who may not have proper internet access, are disproportionately affected and inaccurately assessed by digital monitoring techniques (ITU 2019).
For example, Qatar's contact tracing app "Ehteraz" (meaning precaution) requires users to have compatible cell phones, which is especially problematic for people who are impoverished and cannot easily afford smartphones for app installation. In addition, failure to download the app invites imprisonment and a hefty fine; such illegitimate rules regarding app downloads disproportionately harm Qatar's 88 percent migrant population, further aggravating the social inequities already present (Sekalala et al. 2020). Thus, data misuse, privacy violations, and discrimination resulting from the uneven coverage and consequences of digital health surveillance measures may not only erode public trust but also result in the infringement of human rights, which is a major cause for concern.
The Siracusa Principles recognize that public health might be used as a justification for restricting certain rights (such as the right to privacy) to allow a state to respond to a substantial threat to public health. The Principles state that human rights derogations must meet three essential criteria: legality, necessity, and proportionality (Siracusa 1984). The first criterion, legality, advocates that all privacy constraints must be based on law and must not be arbitrary (European Convention on Human Rights, European Treaty Series No. 5, 1950). It also states that the personal data of individuals must be processed in a transparent manner in order to comply with the law. Thus, transparency is considered crucial with regard to human rights because it allows people to give meaningful consent, monitor how data is used, and seek recourse when they believe their rights have been violated (Bustro and Doebbler 2020). However, several governments have been accused of a lack of transparency. For example, the Government of India has been chastised for allowing any government department to access data acquired via its contact tracing app for purposes other than healthcare (Deb 2020b). Similarly, in Israel, the High Court of Justice ruled that digital surveillance during the COVID-19 pandemic, which used national security legal authority for the Ministry of Health to implement digital tracking of individuals, was illegal because it was carried out under an executive order and lacked the scrutiny that would have been present had it been carried out through legislative means (Bandel 2020). Unfortunately, we could not find any report from Bangladesh in this category. Critically commenting, this absence of findings can be interpreted through the lens of a lack of democratic governance and fear in the public sphere.
The second criterion, necessity, underscores that states must demonstrate that the restrictions are "strictly essential," in the sense that they must emerge from a pressing public or societal need, where human rights are limited or curbed through the use of digital technology for the greater good of society (Siracusa 1984). While there is compelling evidence that digital outbreak response tools are more effective at providing epidemiological data for disease diagnosis, there is no definitive evidence that digital surveillance tools for contact tracing are helpful in containing the spread (Budd et al. 2020). In such a situation, where there is a lack of evidence regarding the effectiveness of surveillance mechanisms, the criterion of necessity is not met, and therefore states must avoid unnecessary restrictions on people that curb their human rights, such as the right to privacy, the right to safety, the right to nondiscrimination, and the right to freedom of movement.
The third criterion, proportionality, advocates that the limitation of human rights must be proportionate to the goal. Thus, in the case of the coronavirus crisis, the measures adopted must be time-bound and purpose-limited, aiming to prevent the spread of the disease. In this light, any digital health surveillance that goes beyond what is required for public health would fail the proportionality test. It must be noted that this criterion is not met by most COVID-19 apps in India, Bangladesh, and abroad, where there is a lack of transparency and policy guidelines, the data collection is not time-bound, and the purpose of data collection and its authorized access by stakeholders is largely unknown. In such a scenario, no valid explanation or reasoning can be extended by governments for the infringement of human rights by massive human surveillance mechanisms and the arbitrary restrictions imposed on the general public during the pandemic crisis. Thus, for greater acceptance of digital surveillance techniques, these must be evidence-based, need-based, nondiscriminatory, time-bound, purpose-restricted, and have mechanisms to ensure greater transparency, scrutiny, and accountability.

Governmentality and the "alternatives": preventing privacy breach and rights violation
We analyze governmentality through Foucault, as interpreted by Heath (2018), who unpacks governmentality as a type of power, in particular the overarching superiority of governmental power, to analyze a state's governmentalization process. Paraphrasing Foucault's work, the researcher identifies three categories of governance model: (i) self-governance, associated with morality; (ii) family governance, associated with economy; and (iii) state governance, a political process (Foucault 2007, 94). Overall, they encompass what Foucault terms governmentality: a form of power ascendant in Europe beginning in the sixteenth century (though with much older antecedents), governmentality emerged in apparatuses that combined sovereign, disciplinary, and governmental power, each of which in isolation has its own ends: for sovereignty, submission to the law; for discipline, to normalize the behaviour of individuals; and for government, to employ tactics that alter individual behaviour in order to manage populations (Foucault 2007, 98-99; Heath 2018, 1-2). Foucault helps us to understand that government entails "any attempt to shape with some degree of deliberation aspects of our behavior according to [a] particular sets of norms and for a variety of ends" (Berquier and Gibassier 2019, 377). Therefore, governmentality structures embed the sphere of actions of others. In particular, Dean (2013) underlines that government is about "deliberately directing human conduct, which is conceived as something that can be regulated, shaped and controlled." However, Broto (2017) argues that the governmentality process should rather be unpacked as "mechanisms of orchestration, which sometimes requires domination, but most times works upon mechanisms of seduction and inducement." Dean (2013) further explains that governmentality processes and practices are designed and executed with pre-determined intentions and facilitate the achievement of normative objectives and strategies. The precursor for the execution of governmentality encompasses "specific forms of knowledge to be able to shape what constitutes good, virtuous, appropriate, responsible conduct" (Dean 2013, 12).
In addition, Kooy and Bakker (2008, 377) interpret that Foucauldian theories of governmentality critically deconstruct the knowledge construction process of governing "subjects," who are periodically reshaped to influence social relations. Indian post-colonial critic Chatterjee (1995) adds that Foucauldian analysis also includes wide-ranging topics, such as colonial and indigenous discourses and texts that are usually overlooked by traditional western scholarship. In addition, other post-colonial scholars, such as Agrawal (2005) and Legg (2006), assess governmentality beyond the parameters of western, liberal, and democratic states, and analyze how power relations materialize, sustain, and alter within physical space and through material functions, emphasizing the patterns, overlaps, and contradictions. The works of these post-colonial scholars help us to understand the production and execution of governmentality in developing countries, such as India and Bangladesh, by overcoming the limitations of western views in interpreting the culture- and tradition-embedded governmentality of the Global South.
Building upon Foucault's (1991) critiques, state-employed disciplinary apparatuses function to identify, punish, and correct deviants, thus making state-advocated agendas social norms. In line with this, the state-introduced surveillance mechanisms for tracing citizens during the pandemic must be diligently evaluated and must not be allowed to become a permanent people-tracking social norm. In particular, Foucault's biopower construct describes states' control over a population by relying not only on punitive techniques but also on efforts at forging "a form of activity aiming to shape, guide or affect the conduct of some person or persons" (Gordon 1991). Accordingly, unlike in the context of sovereign power, in the biopower setting, state-advocated social disciplines are diffused through societies with the aim of convincing a population to view these disciplines as social norms (Li, He, and Liu 2021). In the case of contact tracing applications and the associated governance norms propagated by states, it is observed that, due to fear of the disease amidst the pandemic and owing to unawareness, citizens accept the nontransparent terms and conditions of the apps, which may harm their privacy and rights. Thus, states in a biopower setting may aim to use this crisis as an opportunity to dictate surveillance norms to citizens, as they can expect wider acceptance by the public because of the fear and uncertainty surrounding the situation.
The Foucauldian conceptualization of normalization as a process, and the roles of biopower apparatuses within it, has crucial implications for contemporary consumer society. We therefore interpret governmentality through alternative perspectives: Assemblage, Actor Network, and Institutional theories. Such an alternative approach pays equal attention to: (i) state and non-state apparatuses; (ii) materiality and immateriality; (iii) ontological relations between human and non-human, and between individuals and institutions; and (iv) political and apolitical studies. These approaches are more useful in a democracy because they consider all stakeholders' perspectives rather than only the state's, and they posit a network of which all living and non-living actors are a part and in which all exert influence. They do not advocate the rigid governmentality and biopower setting observed in the surveillance mechanisms adopted by governments in different countries.
Scholars including Hardy and Thomas (2015) and Dean (2010) argue that Foucault's governmentality rests on a strong emphasis on materiality. His conceptualization of power/knowledge contains material and immaterial content, as well as all objects, individuals, organizations, and symbolic resources deemed useful for establishing the governance of a population. In his words, "Nothing is more material, physical, corporeal than the exercise of power" (1991). Building upon that, Li et al. (2021) argue that modern governmentality is essentially big-data-enabled manipulation of materialities.
For example, in the context of the COVID-19 global pandemic, a material analysis of governmentality amid a public health crisis illustrates that significant health information is collected through mobile phones and computers, a process in which states take the lead, mobilize, empower, synthesize, manipulate, enforce, and rationalize. As Esposito (2008) adds, "biopolitics plants the destruction of humankind into the global or national political scene to justify, without grounds, the most brutal and/or absurd countermeasures." Therefore, scholars argue that states should be the focus of the study of governmentality, at least amid a public health crisis. In addition, we argue that governments should self-monitor. In this vacuum, it is imperative to start critical discussions on consumer privacy and human rights.
As numerous digital technologies and trends emerge across the globe, cybercrime, too, is progressing at an incredibly fast pace. This is affirmed by data shared by INTERPOL, which reveals that between January and April 24, 2020, 907,000 spam messages, 737 malware attacks, and 48,000 malicious URLs, all related to COVID-19, were detected (INTERPOL 2020). These figures reflect that cybercriminals have enhanced their social engineering strategies by using COVID-19 as a base for their attacks, taking advantage of the economic crisis, the rapid expansion of virtual networks lacking cyber security arrangements, governments' helplessness, and peoples' anxiety during the outbreak. Along with the privacy concerns that have arisen from the introduction of various COVID-19 apps that lack clear policy guidelines, there is a strong possibility of potential attacks being mounted against these apps. For example, Bluesnarfing is a security attack that uses a Bluetooth-enabled device to force a connection in order to access sensitive data such as photos, videos, emails, contact lists, calendars, and the International Mobile Equipment Identity (IMEI) stored in memory. Because most tracing apps, such as TraceTogether in Singapore, COVIDSafe in Australia, NZ COVID Tracer in New Zealand, and AarogyaSetu in India, use Bluetooth technology to generate a random ID for each device and identify the user and their close contacts (Sowmiya et al. 2021), sensitive information could be stolen from mobile devices without the user's knowledge using Bluesnarfing. Similar arguments can be made regarding Corona Tracer BD in Bangladesh (Corona Tracer BD 2021). Further, a significant increase has been observed in the number of Distributed Denial of Service (DDoS) attacks (Khan, Brohi, and Janjhi 2020).
In such attacks, the attacker spoofs the user's system to send service requests to internet-exposed servers and, in return, injects bogus encounter messages into the contact tracing environment. When a user tests positive for COVID-19, the app may upload sensitive and personal information to these servers, which poses a direct threat to the user's privacy and security. A DDoS attack on the United States Department of Health and Human Services (HHS), whose servers were flooded with millions of requests, is a recent example (Stein and Jacobs 2020).
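The random-ID scheme used by Bluetooth-based tracing apps can be sketched as a rolling identifier derived from a device-local key. The sketch below is a simplified illustration of the general approach, not the actual implementation of AarogyaSetu, TraceTogether, or any other named app; the rotation interval, key size, and function names are assumptions.

```python
import hashlib
import hmac
import os

ROTATION_SECONDS = 15 * 60  # hypothetical rotation interval

def make_daily_key() -> bytes:
    """Generate a fresh random key for the day; it never leaves the device."""
    return os.urandom(16)

def rolling_id(daily_key: bytes, interval_index: int) -> bytes:
    """Derive the ephemeral ID broadcast over Bluetooth for one interval.

    Because each ID is an HMAC of the interval index under a secret key,
    observers cannot link successive broadcasts back to one device
    without that key.
    """
    msg = interval_index.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

# A device broadcasts a different 16-byte ID in every interval:
key = make_daily_key()
ids = [rolling_id(key, i) for i in range(3)]
assert len(set(ids)) == 3  # IDs are unlinkable without the key
```

The privacy of this scheme depends entirely on the key staying on the device; attacks such as Bluesnarfing matter precisely because they can exfiltrate locally stored material of this kind.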
Apart from the abovementioned cyber-attacks, various malicious domain names containing the key phrases "COVID" or "Corona" have emerged, and COVID-19-themed phishing emails have been used to solicit user credentials and passwords in a substantial number of incidents reported to law enforcement authorities. Domains such as "@nic.in" and "@gov.in" are also potential cyber vulnerabilities because they are being exploited by "adversaries" to send malicious messages to users (Chandra 2021). Additionally, a large number of fake websites claiming to provide COVID-19 updates, as well as spurious tracking systems, have been used for a range of nefarious actions throughout the pandemic (Khan, Brohi, and Janjhi 2020). Recent cyber-attacks on government-owned corporations such as Air India, where the data of lakhs of passengers was exposed (Singh 2021), highlight the need and urgency of building strong cyber security defenses for all government systems.
A study by Shah (2016) in India revealed that, out of 100 users aged 17-35, 11% were unfamiliar with the term "cybercrime," 47% were somewhat familiar with it, and only 36% were well acquainted. Surprisingly, 75% of the respondents thought that cybercrime is a "politically motivated attack on computer systems" of major organizations, carried out to intentionally create disruptions. As in India, owing to a lack of public awareness, hackers have been increasingly targeting individuals and industries in Bangladesh, and the government is yet to publish any official report on the matter (Dhaka Tribune 2021). In contrast, more than 75% of people worldwide are fully aware of cyber bullying and cybercrime (Vojinovic 2021). These numbers clearly demonstrate a dearth of awareness in India and Bangladesh, which has contributed to a surge in cyber vulnerabilities as internet use has escalated. This lack of awareness is directly linked to the absence of a national cyber security regulatory policy (Shah 2016). Despite the fact that threats from cyberspace are well known, both countries still lack a national policy that lays down a framework for preventing and dealing with cyber-attacks. Because data is a national resource and cyberspace is where most data are traded, data protection laws exist in most regions whose governments and residents rely on cyberspace for routine operations; for example, the General Data Protection Regulation (GDPR) governs the European Union and the California Consumer Privacy Act (CCPA) governs California in the United States. Despite the fact that many Indians and Bangladeshis have lost data on several occasions, no national law exists in this regard in either country (Dhaka Tribune 2021).
It must be noted that the Data Protection Bill introduced in the Indian Parliament in 2019 is still awaiting a decision (Relia 2021), and Bangladeshi lawmakers are presently working on the first version of a similar draft (Mahmood 2021). We argue that, in order to prevent such cyber-attacks and data breaches, developing countries such as India and Bangladesh must develop and implement a collaborative strategy, consulting cyber experts to achieve stability and security, and build a robust framework for protecting public data and human rights. Countries such as France, Belgium, the Netherlands, and Italy proactively consulted data protection authorities prior to the development of their contact-tracing apps (European Fundamental Rights Agency 2020), which resulted in remarkable changes to app design and significantly lowered the risk of data breaches. These apps must be built on a robust policy design that ensures a strong framework of operationalization, transparency, stakeholder commitment, ethical justice, fixed agendas, structured implementation, and adequate monitoring. Such initiatives on the part of governments in developing countries are lacking, which makes them more vulnerable to attacks and compromises the rights of citizens.

Recommendations
Considering the deleterious impacts of digital people governance and contact tracing apps, we propose that such digital surveillance techniques and mechanisms must be evidence-based, contribute to a comprehensive public health monitoring system, include privacy policies and terms of service, be nondiscriminatory, and include methods for enhanced transparency and accountability in order to protect human rights and individual privacy. In the case of most COVID-19 apps in India and Bangladesh, it was found that, under the umbrella of the emergency, most apps tracked and traced people without appropriate policies and terms of service, which hindered transparency and privacy.
Concisely, we propose the following three measures for improved and justified people governance.
a. Evidence-based measures: States should insist on robust pilot studies and risk assessments to ensure precise, evidence-based decision-making that meets the criterion of necessity (Gasser et al. 2020). Although the WHO and regional bodies such as the European Union have begun to provide technical guidance on digital surveillance tools, they have so far focused primarily on contact tracing apps (European Commission 2020). Moving forward, the wide range of additional digital surveillance techniques and monitoring mechanisms that states may use to monitor and control people needs to be considered and justified on the basis of evidence. Furthermore, a higher dependence on proof would oblige states to demonstrate that "less restrictive tools," such as decentralized data inside contact tracing apps or non-technological measures, cannot fulfill the purpose of limiting the spread of COVID-19 (Sekalala et al. 2020). As different countries' views are divided on this, evidence will help build consensus and establish equity across the globe. Some governments use centralized contact tracing methods, in which data is stored on a central server managed by the body that processes the data. When a person comes into contact with an infected person under this paradigm, the state is alerted and has the authority to impose quarantines and punishments; for example, Taiwan employs smartphone location tracking to detect and sanction quarantine violations. Other countries, such as India and Bangladesh, are taking a decentralized strategy in which most data are stored locally on a user's phone, giving users more discretion over how their data is shared with authorities (Cohen, Gostin, and Weitzner 2020). As organizations such as Apple and Google are collaborating with governments to promote the use of the decentralized method (Munir et al. 2021), evidence will help ensure better acceptance by the public.
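The decentralized strategy described above can be illustrated with a minimal sketch: encounter identifiers observed over Bluetooth remain on the device, and only the identifiers of users who test positive are published by the health authority for local matching. The function names and data shapes here are hypothetical, not any app's actual API.

```python
def local_exposure_check(observed_ids, published_positive_ids):
    """Return True if any locally stored encounter matches a published
    positive ID. The encounter log never leaves the phone; only the
    (small) positive list is downloaded for matching on-device."""
    return not set(observed_ids).isdisjoint(published_positive_ids)

# Example: the device has logged three encounters, one of which later
# appears on the published positive list.
observed = {b"id-1", b"id-2", b"id-3"}
published = {b"id-3", b"id-9"}
assert local_exposure_check(observed, published) is True
assert local_exposure_check(observed, {b"id-7"}) is False
```

The design choice is the direction of data flow: in the centralized model the encounter log is uploaded to a state-run server, whereas here the server publishes data and the phone decides, which is why the decentralized model gives users more discretion.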
Also, the cases of discrimination experienced by citizens as a result of their data sharing in COVID-19 apps, as highlighted by Marsh, Deng, and Gan (2020), show that data is a sensitive matter and that its collection needs to be supported by evidence and a defined purpose. Such evidence can also be used to frame national data privacy laws, a concern raised by O'Neill (2020), which are lacking in both India and Bangladesh (Mahmood 2021). Data privacy laws will not only eliminate the possibility of such discrimination but will also garner public trust.
b. Temporality: Considering the possibility that a larger people-monitoring mechanism may emerge in the shade of a public health crisis, we strongly advocate that states take measures to prevent digital health surveillance from becoming the norm. Given the threats to privacy, states must include a sunset provision in laws that allow digital public health monitoring, specifying what data will be collected, how long it will be collected, and when the authority to collect it will expire. This will ensure that data collection by states is not extended beyond the crisis and is not used for purposes other than those for which it was primarily undertaken, addressing the concerns raised by Deb (2020a) and Dhindsa and Kaushik (2020). Although a few countries include such provisions, for example Macedonia, which gives users the option of deleting their data after 14 days, and Australia, where the contact information held on a device is automatically destroyed after 21 days, most others do not include them in their tracing apps. For example, the policies of the prominent apps of India and Bangladesh, AarogyaSetu and Corona Tracer BD respectively, do not explicitly mention the type of data collected, the duration of its storage, or the data processing authority.
Further, the respective governments of the two countries have not maintained any sunset provisions stating when such public health surveillance will end. This gives stakeholders an opportunity to exploit loopholes and extend this emergency health monitoring into a broader people-surveillance mechanism. This erodes public trust, and it is therefore strongly recommended that states focus on temporality and resist any permanent change in governance.
c. Transparency and accountability: States that rely on public health surveillance must pursue a rights-based approach to openness and accountability procedures in their digital public health monitoring. This will serve as a check on the concerns raised by Mahapatra (2021) and Chaturvedi, Kalyani, and Jain (2020), which include unauthorized sharing of personal data, unrestricted surveillance, excessive data collection, and undisclosed purposes for data processing and retention. Such transparency ensures that public trust in government machinery remains intact. In Italy, for example, the Ministry of Health is in charge of all data from public health surveillance instruments, and the government has pledged that data will not be transferred or used for commercial purposes. Such an affirmation is missing in the case of the Indian and Bangladeshi governments, where it has not been made public who has access to the data and the authority to process it, which may lead to violations of privacy and human rights.
Where the use of contact tracing apps fails to fulfill the obligations of international human rights legislation, transparency through strong multilateral and multi-stakeholder review regimes is required to hold governments responsible. Formal frameworks, such as national human rights action plans, which provide an organized and practical approach to promoting the realization of human rights through public policy, can help to enhance accountability for how data is used at the national level.
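The sunset provisions discussed under temporality need not be complex to implement: at the device level they amount to an automatic purge of encounter records older than a fixed retention window. The sketch below assumes a 21-day window in the style of Australia's COVIDSafe; the data layout and function name are illustrative, not any app's actual code.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=21)  # assumed retention window, per policy

def purge_expired(encounters, now):
    """Keep only encounter records inside the retention window.

    `encounters` is a list of (timestamp, ephemeral_id) tuples stored
    locally on the device; everything older is dropped permanently.
    """
    return [(ts, eid) for ts, eid in encounters if now - ts <= RETENTION]

now = datetime(2021, 6, 1)
log = [
    (datetime(2021, 5, 25), "id-a"),  # 7 days old: kept
    (datetime(2021, 4, 1), "id-b"),   # 61 days old: purged
]
assert purge_expired(log, now) == [(datetime(2021, 5, 25), "id-a")]
```

Encoding the window as an explicit constant makes the retention policy auditable, which is exactly the kind of verifiable commitment the transparency and accountability measures above call for.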

Conclusion
In this study, we build upon the recent resurgence of interest in policy design to better understand how specific combinations of policy tools arise and shape policy outcomes, and how this understanding has been hindered by under-theorization and a lack of conceptual work on the subject and in policy practice. Our findings then illustrate how the adoption of unregulated digital apps during the pandemic can contribute to an institutional void even in cases of strong state governance, as in India and Bangladesh. Such a void exposes an intersectional issue connecting governmentality and materiality. In particular, the further marginalization of already marginalized social groups solidifies an emerging argument that public policies and digital governmentality beset by transparency issues fail to generate equitable social outcomes. The paper not only illustrates the capability of new technologies but also raises a number of concerns related to consumer privacy, cyber threats, and corporate responsibility, as the outcomes can breach citizen and consumer rights. Private corporations and public policy makers therefore need to identify a mutually accepted mechanism to ensure consumers' rights and national safety and security. Our study also holds relevance for consumer advocates, for whom responsible data integration policies and strategies are needed to ensure better future use of citizen and consumer data in the age of the digital economy. Admittedly, the state teams up with private-sector service providers to develop these technologies in order to address urgent needs concerning public security, safety, and health. However, we call for future monitoring of the state's subsequent efforts to make digital data collection a social norm. In addition, we raise an alarm about the greater need for scholarly discussion on governance merits in the era of digitalization.

Disclosure statement
No potential conflict of interest was reported by the author(s).