Research Methods & Professional Practice, Week 5, E-Portfolio Activity, Reflective Activity 2, Case Study, Inappropriate Use of Surveys
Case Study: Inappropriate Use of Surveys
The 2018 Cambridge Analytica scandal exposed the unethical use of personal data collected through an online survey: millions of Facebook users’ data had been harvested without their knowledge or consent. At the heart of the issue was Cambridge Analytica (CA), which, in partnership with Cambridge researcher Aleksandr Kogan, harvested data from millions of Facebook profiles. Kogan had developed an application called “thisisyourdigitallife”, which featured a personality quiz, and CA paid people to take it. The app recorded the results of each quiz and collected data from quiz takers’ Facebook accounts, including personal information and Facebook activity (e.g., what content was “liked”), as well as data from their Facebook friends, leading to the harvesting of about 87 million Facebook profiles (Rehman, 2019).
The lack of transparency about how data was collected and used is one of the central ethical issues in this case. The ethical breaches center on the absence of informed consent: users were unaware that their data would be used for political purposes, which undermines trust in digital platforms. Transparent data collection practices are essential for ensuring that users understand how their personal information will be used (Zuboff, 2019).
Legally, the Cambridge Analytica case exposed weaknesses in data protection laws. In response, the European Union’s General Data Protection Regulation (GDPR) came into force in 2018, providing stricter controls over how personal data is collected and used (European Commission, 2023). The case was a catalyst for legal reforms aimed at enhancing privacy protections in the digital age.
The Cambridge Analytica case had a significant impact on the data analytics industry. It raised awareness of the importance of ethical data practices and the need for professionals to adhere to stricter ethical guidelines. Data scientists and marketers must now take greater responsibility for ensuring that data collection practices are transparent and comply with privacy laws (Sweeney, 2013).
Similar activities were evident during the 2016 U.S. presidential election, when political campaigns on both sides made widespread use of big data and micro-targeting. With its capabilities in data-driven political campaigning, Cambridge Analytica, a now-defunct political consultancy and data analytics firm, became a key figure in Trump’s campaign. The firm was known for its expertise in using data for ‘election management strategies’ and ‘messaging and information operations’, the latter also known in the military as ‘psyops’ (psychological operations), or mass propaganda that plays on people’s emotions. Its unique value proposition was a twist on the concept of micro-targeting: analyzing big data to understand not only what people do (their personal and professional actions and interactions) but also who they are (their emotions and preferences) (Langworthy, 2019).
References:
Rehman, I.U. (2019). Facebook-Cambridge Analytica data harvesting: What you need to know. [online] Available from: https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=5833&context=libphilprac. [Accessed 28 February 2025].
Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. [online] Internet Archive. Available from: https://archive.org/details/zuboff-shoshana.-the-age-of-surveillance-capitalism.-2019. [Accessed 28 February 2025].
European Commission (2023). Data protection. [online] commission.europa.eu. Available from: https://commission.europa.eu/law/law-topic/data-protection_en. [Accessed 28 February 2025].
Sweeney, L. (2013). Discrimination in online ad delivery. Communications of the ACM, 56(5), pp.44–54. doi: https://doi.org/10.1145/2447976.2447990. [Accessed 28 February 2025].
Langworthy, S. (2019). Cambridge Analytica and the 2016 U.S. presidential election. In: Power Dynamics in an Era of Big Data. LSE IDEAS, pp.8–10. doi: https://doi.org/10.2307/resrep45170.5. [Accessed 28 February 2025].