Introduction
These days, businesses have access to vast amounts of their customers’ data; this creates tremendous opportunities for analysis and forecasting but also introduces the risk of personal data breaches. Evolving public opinion and privacy legislation have generated pressure to regulate the ethical use of personal data. In the United States, there is no systematic regulation of the right to private information protection at the national level. The European approach is expressed most precisely in the General Data Protection Regulation (GDPR). The European concept of regulating personal data management is based on preventing excessive control over a person’s private life, whether by state institutions or by corporations. In this Case Analysis, I will argue that the ethical matrix shows us that the United States should not follow Europe’s lead because doing so might harm the country’s economic health, which is largely built on collecting, processing, and using vast amounts of data; to solve the problem of privacy, the United States government instead needs to develop its own approach, one that considers all stakeholders’ interests.
Zimmer Concept
The most significant challenge with open data is privacy. According to Zimmer (2018), publicly available data can still be private. The issue of information sharing came to a head in 2016, when a Danish researcher released a dataset extracted from 70,000 user profiles on the OkCupid dating site (Sharma, 2019). The details provided made it possible to establish individuals’ identities (Sharma, 2019). The data included potentially sensitive information such as username, age, gender, geographic location, personality traits, and responses to thousands of questions from the site’s survey (Sharma, 2019). Because the data was publicly available, it was treated as entirely free to use and share.
Zimmer was one of several privacy and ethics scholars who criticized this position. Zimmer’s concept is that having “one’s personal information stripped from the intended sphere of the social networking profile and amassed into a database for external review becomes an affront to the subjects’ human dignity and their ability to control the flow of their personal information” (Zimmer, 2018, p. 6). Zimmer discussed in detail how deanonymization became possible and what risks arise from such situations. The key risks are violations of privacy, inappropriate handling of private information, unfair storage, and dataset errors (Zimmer, 2018). His questions stem from values traditional to Western culture at the beginning of the 21st century: privacy and a distinction between the personal and the public (Zimmer, 2018). This matters because the differentiation between the private and the public underpins both democratic governance and social norms. After this case, many social media platforms began to restrict researchers’ access to their data, and research communities reviewed and revised their ethical codes.
In the United States, freedom of information, freedom of commerce, and public safety are fundamental values. In adhering to these values, the state allows far broader access to citizens’ data than Europe does. On the one hand, the US has a robust journalistic community and non-profit sector; on the other, there is significant freedom in how data can be used. A recent example of new privacy law is the California Consumer Privacy Act (CCPA), which applies across different sectors of the economy (Pardau, 2018). It provides substantial data subject rights, including the right to delete information, and introduces new responsibilities for companies (Pardau, 2018). Similar to the European GDPR, the CCPA applies to a wide variety of companies, not just California-based ones (Pardau, 2018). It also imposes restrictions on the processing, use, and disclosure of private data.
The ethical matrix helps to evaluate this approach through principles such as autonomy, fairness, and well-being for each interest group. In practice, however, regulation is complicated because it is difficult to strike a balance between overregulation, which reduces the quality of the services provided, and insufficient protection of citizens’ rights (Pardau, 2018). It is difficult for governments to control IT companies as long as the benefits of violating the law far exceed the damage from potential lawsuits (Pardau, 2018). For instance, the US Federal Trade Commission forced Facebook to change its privacy policy, but this did not affect the operation of its existing surveillance business model (Pardau, 2018). Widespread surveillance has become possible in part precisely because the United States has a hands-off policy.
Buchanan Concept
The global collection of available data is constrained by local context and by individuals’ understanding of the boundary between the private and the public on the Internet. According to Markham and Buchanan (2017), people understand how public the information they post is, but using it in different circumstances can lead to adverse outcomes. According to Samuel and Buchanan (2020), all ethical principles, regardless of discipline, attempt to maximize benefits and minimize negative effects. Therefore, research ethics requires that all participants give open, informed consent.
All big-data research conducted by IT companies should address questions that benefit humanity. Each potential study participant should know why the research is being undertaken, how long it will last, and what methods will be used (Samuel & Buchanan, 2020). Moreover, it is essential to state clearly whether users have the right to refuse to join or to leave the research at any moment (Samuel & Buchanan, 2020). Furthermore, details such as the possible risks and benefits and the limits of confidentiality must also be disclosed.
The company should clearly explain what user data it collects, why, and how it will be managed. According to Samuel and Buchanan (2020), the company should only collect data that is consistent with its stated goals. If the purposes of data collection have changed but the data continues to be used, this can constitute a violation. However, Markham and Buchanan (2017) emphasize that “researchers must balance subjects’ rights as authors, as research participants, as people with the social benefits of research and researchers’ rights to conduct research” (p. 207). Ethical issues may arise and require discussion at every stage of the research process, from preparation through the conduct of the study to publication and dissemination (Markham & Buchanan, 2017). Still, the study of individuals’ private information does not necessarily involve manipulation; it can become a useful tool for understanding user behavior and improving digital products and services.
Concerning new privacy law, the current situation demands increased responsibility from United States IT companies. The business and technology community should pay more attention to user data privacy and to compliance with the requirements of American and European regulators and authorities (Solove & Schwartz, 2020). These days, representatives of IT companies declare their willingness to cooperate with lawmakers to achieve a balance of interests (Solove & Schwartz, 2020). They express their readiness to rethink service security approaches and to make technical changes to their products to strengthen users’ protection.
Regarding the ethical matrix, such an agreement between the provider and the individual benefits both parties. The former can still process, aggregate, and analyze personal data in the interest of business development. In return, customers should have the right to know how, to whom, and why their personal information is provided (Solove & Schwartz, 2020). For instance, the Business Software Alliance (BSA) has proposed a framework addressing the interests of all stakeholders, built on the principles of transparency, purpose specification, informed choice, data quality, consumer control, and security (Rider, 2018). It also aims to facilitate the use of data for legitimate business interests while accounting for accountability, compliance with the law, and international interoperability (Rider, 2018). Companies are obliged to disclose the third parties who use personal data and to give users the opportunity to correct it or request its deletion.
Conclusion
The ethics of privacy law belongs to the broader field of computer and information ethics. It covers topics such as participants’ knowledge and consent, data integrity and protection, the confidentiality and accuracy of information, and intellectual property. Personal data legislation structures the relationships between several interest groups, such as individuals, companies, and intermediaries. The underlying concepts are similar in almost all countries, as they are based on citizens’ rights to anonymity and the inviolability of private life. In the United States, companies cannot anticipate and overcome all ethical risks. Nevertheless, they need to improve their privacy policies and treat responsibility for data control and digital services as a priority for the future. If the US Congress ensures that developing national legislation on online privacy does not obstruct innovation, the new privacy policy can be beneficial. Thus, at this stage of legislating the protection of personal data, the government should focus on adopting rules that account for all interest groups.
References
Markham, A., & Buchanan, E. (2017). Research ethics in context: Decision-making in digital research. In M. T. Schäfer & K. van Es (Eds.), The datafied society: Studying culture through data (pp. 201–209). Amsterdam University Press.
Pardau, S. L. (2018). The California Consumer Privacy Act: Towards a European-style privacy regime in the United States. Journal of Technology Law & Policy, 23(68), 69-113.
Samuel, G., & Buchanan, E. (2020). Guest editorial: Ethical issues in social media research. Journal of Empirical Research on Human Research Ethics: JERHRE, 15(1-2), 3-11.
Sharma, S. (2019). Data privacy and GDPR handbook. John Wiley & Sons.
Solove, D. J., & Schwartz, P. M. (2020). Privacy, law enforcement, and national security. Aspen Publishers.
Zimmer, M. (2018). Addressing conceptual gaps in big data research ethics: An application of contextual integrity. Social Media + Society, 4(2), 1-11.