Introduction
The widespread use of technology in today’s world accelerates the digitalization of many activities. Innovative technologies play a crucial role in people’s lives, improving performance and offering comfort. When personal information is collected, transparency encourages trust: one should be able to determine the company’s dedication to privacy protection, what it plans to do with the information, and under what conditions it may release it. However, there is another side to the issue. Extreme degrees of digitalization rob people of their privacy and personal space, increasingly so because of the widespread use of social networks, which host gigabytes of data that users share with their friends and close family. This report evaluates the Cambridge Analytica scandal and its connections to Facebook and the Trump campaign, analyzes the actions that should have been taken before and after the incident, and offers recommendations derived from that analysis.
From a public relations perspective, the costs of losing the ability to use data for advertising and behavioral analysis may outweigh the benefits of changing the amount and type of data companies collect. Touting a company’s strong privacy management team and promoting its partnership with a privacy-focused nonprofit are also effective ways to build trust. Privacy notifications should be made public and delivered directly to the user’s home page.
On the other hand, consumers can choose to keep these files private from anyone outside their social circle. As a result, cybersecurity, data theft, ethics, and privacy deserve the highest emphasis, because the data that social networks hold can be exploited by criminals or other businesses for various purposes. To gain a deeper understanding of the issue, this article evaluates the Cambridge Analytica incident, examines its peculiarities, and applies fundamental principles of data security, information sharing, risk, and ethics.
Trump and Cambridge Analytica
The modern Internet has made it possible to observe individual user behavior, depending on the tool used. During his campaign, Trump employed the British consultancy Cambridge Analytica, which helped him win the 2016 United States presidential election (She & Michelon, 2019, p. 60). However, the firm illegally acquired the personal information of Facebook users, causing a massive scandal. Using a quiz application on Facebook’s platform to gather users’ personal details, the firm created psychological profiles of 87 million users (Ernala et al., 2020, p. 10). Cambridge Analytica then used the collected data to provide analytical support to the Trump campaign; in my opinion, the analytical support itself was not prohibited. Cambridge Analytica was also accused of interfering in the Brexit referendum, but that claim was dismissed after a lengthy investigation.
Cambridge Analytica asked Facebook users to take a paid survey, ostensibly for academic purposes. Guardian reporter Harry Davies was the first to report on the alleged breach by Cambridge Analytica (Berghel, 2018). He revealed that Senator Ted Cruz’s campaign was using data taken from millions of Facebook users without their permission. Aleksandr Kogan was permitted to collect user data for academic purposes but had no right to sell that data to third parties. Cambridge Analytica CEO Alexander Nix continued the unethical behavior by releasing private emails to support Donald Trump, the Republican presidential candidate.
The company collapsed amid the scandal investigation, and Nix was banned from serving as a company director for seven years. Regulators also fined Facebook for failing to protect its users’ data from political harvesting. Cambridge Analytica had used politicians, public opinion traps, and social media to discredit opponents in more than 200 elections worldwide (Tuttle, 2018, p. 8). During the investigation, Facebook announced that Cambridge Analytica had been removed from the platform, adding that it had ordered the destruction of the data; it later discovered that Cambridge Analytica had not wholly deleted it. Despite the scandal and its misconduct, Cambridge Analytica’s intellectual resources still gave it a strong chance of succeeding.
What Could Have Been Done Better Before the Scandal
Cambridge Analytica used micro-targeting, aggregated databases, and behavioral targeting to fulfill its commitments to clients. Although these methods are not illegal, they were applied unethically, with little regard for the choices of the audience. There are no general guidelines for data mining, but Cambridge Analytica needed to ensure that Kogan had the right to sell users’ data. The record shows that he did not, because Facebook had, at most, allowed him to use the data for analysis.
Before the scandal, Nix should have acquired only legally authorized data, followed a code of conduct, allowed users to opt in or out, and made the firm’s data-extraction practices transparent. Nix could have handled the situation more responsibly through business practices that increased the brand’s value to customers and the public. Moreover, Nix’s actions following the public disclosure should have demonstrated cooperation and a willingness to spread the word about the privacy breach (González et al., 2019, p. 800). Following a code of conduct would have spared the organization unnecessary criticism for misleading users. Therefore, Cambridge Analytica should have sought express permission from users before accessing their data, in addition to securing its legal rights (Rider & Murakami Wood, 2019, p. 640). Gaining approval lowers the threat of opposition by meeting customer expectations and obtaining legal consent to use data.
What Could Have Been Done Better After the Scandal
After the scandal, Nix should have deployed the public relations and legal departments to respond honestly and promptly to citizens through all media. He needed to assure stakeholders that the organization took the matter seriously and would address the issues that caused the scandal, with regular updates on progress. Going forward, the organization needed to develop privacy measures to restore the confidence of its partners (TED, 2019), and Nix needed to focus his attention on restoring the trust of the relevant stakeholders (Laterza, 2018). While there is no guarantee this would have repaired the company’s tarnished reputation, it would have been a step in the right direction, and it illustrates some of the methods Nix could have used to mitigate the fallout from the scandal. Facebook, for its part, needed a strategy to convince customers that it would protect user privacy. It had to change its internal privacy policies and product development, and it should also have considered limiting data collection, shortening data retention periods, and restricting data transfers to third parties.
Work and Services Cambridge Analytica Provided
Cambridge Analytica developed a questionnaire that was given to respondents and scored on personality traits such as agreeableness and openness (Berghel, 2018, p. 86). After combining these results with data from polls, online activity, and voter records, Nix’s team created a model of voter personality. Trump’s 2016 campaign used the gathered data to develop psychographic profiles identifying the characteristics of Facebook users based on their online activity (Confessore, 2018). Using this information as part of its micro-targeting strategy, the campaign personalized Trump-related content for different voters across various digital media. Advertisements and outreach were divided into categories based on whether a user was a Trump supporter.
Nix sent Trump supporters pictures of a victorious Trump and information about polling places, while Hillary Clinton’s voters received negative photographs or discouraging information. Trump, for example, was promoted as the most attractive candidate, while fake ads targeted Clinton to portray her as corrupt. In addition, Trump’s campaign office welcomed several Cambridge Analytica workers, led by chief product officer Matt Oczkowski, who had worked at a super PAC organized for Trump. The group was later expanded to 13 people, with Brad Parscale, the campaign’s digital director, as their immediate supervisor (Wu et al., 2019, p. 334). The team analyzed American voters’ data for the presidential election.
Cambridge Analytica’s Strategy and Impact of the Scandal
Cambridge Analytica claims to use psychographic marketing but denies using Facebook data. The New York Times (2018) claims to have seen evidence suggesting that the company still holds part of the Facebook data, although Cambridge Analytica claims to have erased it; the firm disputes that it has any information derived from or collected through Facebook. However, the business would not necessarily need Facebook likes to create personality models. Advertisers can examine other facets of a person’s digital footprint, such as Twitter feeds, browsing histories, and phone call patterns, for correlations with personality, and this information can predict character, albeit with varying degrees of accuracy.
As a result of disclosures that the data-strategy firm Cambridge Analytica improperly acquired data on tens of millions of Facebook users, users’ trust in the company has fallen sharply. According to a survey by the Ponemon Institute (2018), an independent research organization specializing in privacy and data protection, only 28% of Facebook users believe the company is dedicated to privacy, down from a high of 79% before the scandal. Most social media users know their information is being gathered, shared, or sold; that is how Facebook operates.
My Perspective on Cambridge Analytica
The unauthorized use of user data and breaches of privacy regulations can damage a business’s reputation and cause financial trouble. Despite the controversy surrounding the company’s intelligence-gathering approach, Cambridge Analytica’s initial promise to sway voter behavior by understanding voters’ personalities seems legitimate and ethical (Afriat et al., 2020, p. 120). The decision to collect information about Facebook users without their prior consent was wrong. However, Nix aimed to develop a unique value proposition for Cambridge Analytica among politicians and candidates by offering something new (Bailey et al., 2018, p. 259). Nix wanted to influence voters psychologically, which required access to citizens’ behavioral and psychological data.
Data companies provide psychological profiles; however, such data was insufficient for Cambridge Analytica, as it is of little use in determining the psychological characteristics needed to design personalized, persuasive political messages. Instead, the firm used customers’ purchase histories to predict their political beliefs (Shiau et al., 2018, p. 60). Facebook’s claim that the Cambridge Analytica scandal was not the result of a security breach may be correct. Despite the scandal that put Cambridge Analytica out of business, no one has reportedly faced charges, and Facebook remains the subject of intense scrutiny.
Public License to Operate and Its Relation to Data-driven Technology
The term license to operate often denotes the behavioral limits companies must meet to be recognized and accepted in their environment. In relation to data-driven technology companies, it can be defined as a license to engage in trade or business activities, regulated or supervised by a licensing authority. Many established businesses aim to become data-driven firms, making analytics and data an integral part of corporate systems, culture, processes, and strategies. Moreover, analysis is embraced at every level of the organization and serves as the single source of truth (Heawood, 2018, p. 429).
This fundamental business asset makes processes better and more efficient: business can be done more cheaply and quickly, and data is rapidly changing the world (Cadwalladr, 2018). Data-driven companies focus on collecting data on every aspect of their business. When workers in all departments can access the correct data for a specific task at the right time, enterprises gain a competitive advantage, because data drives the bottom line. More data-driven businesses create a more open environment for business and marketing. Still, the Cambridge Analytica scandal and other episodes of monopoly, market manipulation, and privacy violations have dominated the news.
Challenges Facing Facebook
Since the 2016 Cambridge Analytica scandal, Facebook has been working to show people how effective its process of removing problematic applications, accounts, and other content has become. In a recent blog post, the company reported deleting 3.2 billion fraudulent accounts (Richterich, 2018, p. 540). It also announced its first removal of millions of pieces of content depicting self-harm and child abuse, distinguishing its moderation efforts on Instagram from those on Facebook (Weisbaum, 2018). Improved content moderation is positive but also expensive: Facebook’s security, safety, and content moderation efforts cost about $5 billion yearly, and that figure is unlikely to drop soon (Larson & Vieregger, 2019, p. 100). Artificial intelligence cannot mature quickly enough to reduce these costs, and the company needs more capital to revise its terms and to pay the engineers who review content.
Facebook faces longstanding complaints about privacy violations, political advertisements, content moderation, and civil rights. In two separate congressional hearings, one with CEO Mark Zuckerberg and the second with Libra head David Marcus, legislators revealed their dissatisfaction with the firm’s impact on society (Lotrea, 2018, p. 20). It is unknown whether Libra will launch in 2020 as planned. If developed, it will probably be a diminished version of Facebook’s initial concept of a world of digital money: its extensive web of backers withdrew after members of Congress warned that they could be held responsible for Libra’s problems. Beyond Libra, regulators in the US and elsewhere are watching Facebook closely. Nearly all state attorneys general are likely to focus on Facebook’s data practices, acquisitions, and product development, making it one of several technology giants also facing intense pressure in Europe, including potentially tough new penalties and speech controls (Setiawan et al., 2021, p. 20). In the coming years, the company will need to reassure investors that regulatory pressure will not hurt its long-term prospects.
Fake news and hate speech are common challenges facing Facebook. After the 2016 US presidential election, many people worried about the influence of fraudulent news spread primarily through digital media. According to Lischka (2019, p. 200), based on web-browsing data, figures from fact-checking channels, and the results of a recent online survey, 14% of people in American cities say social media is a crucial source of election news. On Facebook, fake news stories were shared 30 million times in the 90 days before the general election, and pro-Trump fake news received 8 million shares (Mena, 2020, p. 170). In the months leading up to the presidential election, the average American adult saw one or more fake stories, and most of those who remembered seeing the stories believed them. People are especially likely to believe stories favoring their preferred candidate when their social networks are ideologically segregated.
With Facebook present in most countries, the question is how to create guidelines that protect users from abuse while promoting freedom of expression across all religions and norms. In the first three months of 2019, Facebook had 2.4 billion monthly active users (Feng et al., 2019, p. 44). The company faces unprecedented challenges in handling the daily flood of posts, videos, and photos it receives, and these challenges broaden the conversation about artificial intelligence.
Future Implications
In my opinion, even if significant changes come in the future, Facebook will be careful about any changes it makes so that it can continue to provide value to advertisers. At the same time, Facebook advertisers should be wary of any development in Facebook’s advertising settings (McCarthy, 2018, p. 1). In my view, operations will be more successful for those who can quickly understand and adapt to these changes. Rather than focusing on the company’s shortcomings, understanding these issues provides better context: most of Facebook’s solutions address broader technical problems. Using new technologies like artificial intelligence is always tricky, and one cannot simply ask for help on Stack Overflow. Everything depends on users’ trust in Facebook’s corporate integrity, not merely on Facebook’s formal responsibility for user privacy.
Conclusion
In conclusion, transparency promotes trust when personal data is requested: one should see the company’s commitment to protecting privacy, what it intends to do with the data, and under what circumstances it may share it. It is easy to think of data privacy as merely complying with laws and regulations. However, the Cambridge Analytica breach should teach us that individuals view their private data differently, and handling it must be both ethical and legal. Many users do not know how data outsourcing can expand their digital footprint or what businesses can do with that data; they assume that every company will honor their privacy. However, some businesses may not recognize ethical issues, provided their actions are legal. Cambridge Analytica reveals a potential gap between consumer expectations and business practices. As a user, before giving a company your information, read its terms and conditions to see what kind of guardian it intends to be. Remember that while values may figure prominently in these terms and conditions, and even in vision and mission statements, they may not accurately reflect what happens with the data.
An organization that draws inspiration from the extraordinary success of the Internet economy and provides a flexible regulatory and enforcement mechanism is needed. Internet companies must drive innovation, but not to the extent that people fear being drawn into political or commercial experiments. Social media platforms and the internet economy must be run in the public interest as they become increasingly important to billions of people. It is the government’s responsibility to define the rights of citizens in the media, and Congress must protect individuals and their ability to interact online freely.
Reference List
Afriat, H., Dvir-Gvirsman, S., Tsuriel, K. & Ivan, L. (2020) “‘This is capitalism. It is not illegal’: Users’ attitudes toward institutional privacy following the Cambridge Analytica scandal.” The Information Society, 37(2), pp. 115-127.
Bailey, M., Cao, R., Kuchler, T., Stroebel, J. & Wong, A. (2018) “Social connectedness: Measurement, determinants, and effects.” Journal of Economic Perspectives, 32(3), pp. 259-280.
Berghel, H. (2018) “Malice Domestic: The Cambridge Analytica dystopia.” Computer, 51(5), pp. 84-89.
Confessore, N. (2018) “Cambridge Analytica and Facebook: The Scandal and the Fallout so Far.” The New York Times, Web.
Ernala, S.K., Burke, M., Leavitt, A. & Ellison, N.B. (2020) “How well do people report time spent on Facebook? An evaluation of established survey questions with recommendations.” In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1-14.
Feng, S., Wong, Y.K., Wong, L.Y. & Hossain, L. (2019) “The Internet and Facebook usage an academic distraction for college students.” Computers & Education, 134, pp. 41-49.
González, F., Yu, Y., Figueroa, A., López, C. & Aragon, C. (2019) “Global reactions to the Cambridge Analytica scandal: A cross-language social media study.” In Companion Proceedings of the 2019 World Wide Web Conference, pp. 799-806.
Heawood, J. (2018) “Pseudo-public political speech: Democratic implications of the Cambridge Analytica scandal.” Information Polity, 23(4), pp. 429-434.
Larson, E. & Vieregger, C. (2019) “Strategic actions in a platform context: what should Facebook do next?” Journal of Information Systems Education, 30(2), pp. 97-105.
Laterza, V. (2018) “Cambridge Analytica, independent research and the national interest.” Anthropology Today, 34(3), pp. 1-2.
Lischka, J.A. (2019) “Strategic communication as discursive, institutional work: A critical discourse analysis of Mark Zuckerberg’s legitimacy talk at the European Parliament.” International Journal of Strategic Communication, 13(3), pp. 197-213.
Lotrea, C. (2018) “Mr. Zuckerberg and the Internet. An essay on power relations and privacy negotiation.” Journal of Comparative Research in Anthropology and Sociology, 9(1), pp. 19-24.
McCarthy, S. (2018) “Zuckerberg tells staff to use Android after Apple feud.” UWIRE Text, p. 1.
Mena, P. (2020) “Cleaning up social media: The effect of warning labels on the likelihood of sharing false news on Facebook.” Policy & Internet, 12(2), pp. 165-183.
Richterich, A. (2018) “How data-driven research fuelled the Cambridge Analytica controversy.” Partecipazione e Conflitto, 11(2), pp. 528-543.
Rider, K. & Murakami Wood, D. (2019) “Condemned to connection? Network communitarianism in Mark Zuckerberg’s ‘Facebook manifesto’.” New Media & Society, 21(3), pp. 639-654.
Setiawan, R., Ponnam, V. S., Sengan, S., Anam, M., Subbiah, C., Phasinam, K., Vairaven, M. & Ponnusamy, S. (2021) “Certain Investigation of Fake News Detection from Facebook and Twitter Using Artificial Intelligence Approach.” Wireless Personal Communications, pp. 1-26.
She, C. & Michelon, G. (2019) “Managing stakeholder perceptions: Organized hypocrisy in CSR disclosures on Facebook.” Critical Perspectives on Accounting, 61, pp. 54-76.
Shiau, W. L., Dwivedi, Y. K. & Lai, H. H. (2018) “Examining the core knowledge on Facebook.” International Journal of Information Management, 43, pp. 52-63.
Tuttle, H. (2018) “The Facebook scandal raises data privacy concerns.” Risk Management, 65(5), pp. 6-9.
TED. (2019) Facebook’s role in Brexit — and the threat to democracy | Carole Cadwalladr [Video]. YouTube. Web.
Weisbaum, H. (2018) “Trust in Facebook Has Dropped by 66 Percent since the Cambridge Analytica Scandal.” NBC News, Web.
Wu, C.J., Brooks, D., Chen, K., Chen, D., Choudhury, S., Dukhan, M., Hazelwood, K., Isaac, E., Jia, Y., Jia, B. & Leyvand, T. (2019) “Machine learning at Facebook: Understanding inference at the edge.” In 2019 IEEE international symposium on high-performance computer architecture (HPCA), pp. 331-344.