Technology Tap: CompTIA Study Guide
This podcast will help you pass your CompTIA exams. We also sprinkle in different technology topics.
Inside the Cambridge Analytica Scandal: Technology Ethics and Data Privacy
In this episode of Technology Tap: CompTIA Study Guide, my students dive into the notorious Cambridge Analytica scandal and its profound impact on data privacy and technology ethics. Our students break down how seemingly harmless personality quizzes exploited Facebook data, creating psychological profiles that influenced elections worldwide. This discussion not only explores real-world technology applications but also enhances your understanding of data security—an essential topic for IT skills development and CompTIA exam prep. Tune in to expand your knowledge of technology education and the critical role of informed consent in today's digital landscape.
We walk through the mechanics: the Open Graph loophole, the “This Is Your Digital Life” app, and the shift from demographic targeting to OCEAN-based psychographics that amplified fear, duty, or curiosity depending on your traits. The conversation connects the dots from early experiments with Ted Cruz to huge ad impression volumes tied to the 2016 cycle, explores coordination concerns with super PACs, and examines why these tactics made public debate harder and disinformation easier to spread. Along the way, our students highlight the whistleblowers who surfaced the practice and the global footprint that reached Brexit, the Caribbean, and beyond.
The fallout mattered. Facebook faced FTC, SEC, and UK ICO actions; Cambridge Analytica went bankrupt; and Meta tightened API access to cut off friend data collection. We also dig into the privacy wave that followed—GDPR in Europe, CCPA in California—and what those laws do and don’t fix. The core takeaway is clear: ethical data practices and transparent advertising aren’t nice-to-haves; they’re the guardrails for a healthy digital public square. If personal data can be turned into political power, then consent, purpose limits, and accountability must be visible and enforceable.
Listen for a clear, step-by-step breakdown, plain-language answers to tough questions, and practical context you can use to evaluate political ads and platform policies. If this conversation sharpened your thinking, subscribe, share the show with a friend, and leave a review telling us how you protect your data online.
Art By Sarah/Desmond
Music by Joakim Karud
Little Chacha Productions
Juan Rodriguez can be reached at
TikTok @ProfessorJrod
ProfessorJRod@gmail.com
@Prof_JRod
Instagram ProfessorJRod
And welcome to Technology Tap. I'm Professor J. Rod. And this episode, well, it's a special episode: it's my students doing their own podcast. Let's tap in. And welcome to Technology Tap. Hi, I'm Professor J. Rod. For those of you who don't know me, I'm a professor of cybersecurity, and I love helping my students pass their A+, Network+, and Security+. I also consider myself an amateur historian, so sometimes I drop episodes on the history of modern technology, the origins of different technologies. But this episode is a little different. Usually around this time, and during the summer, I have to take a break. So, not to leave you out cold, because I know how much you fans love hearing the sound of my voice, but unfortunately, this time you won't. I usually have students do a podcast during this time of the year, and I post it. How I do it is I offer it as an alternative to an assignment: either write an essay or do a podcast off the presentation they did earlier in the semester. Well, it seems like as the years have gone by, and I think this is maybe the second or third year that I'm doing this, I'm getting fewer and fewer podcasts. I think a lot of it has to do with AI, so they'd rather write the essay. So I'm gonna have to change that next semester. Those of you who are listening and who are my students, just know that changes are gonna be made next semester. But I'm grateful to have Aaron, Antonio, Mark, and Joel put together this episode on the Cambridge Analytica Facebook scandal. I hope you enjoy it. They worked really hard on it.
SPEAKER_04:Antonio Garcia.
SPEAKER_03:Saya.
SPEAKER_04:Joelle Torres and Mark Diakitos.
SPEAKER_03:And in today's episode, we are talking about the Cambridge Analytica Facebook data scandal. Now, we're gonna have Joel start off.
SPEAKER_02:Alright, thank you, Aaron. So we're gonna be defining the data scandal, the Cambridge Analytica scandal. Just hearing the name, it sounds so bad. But what exactly happened here? At its core, this was a massive breach of trust. Personal data from 87 million Facebook users was collected by a political consulting firm called Cambridge Analytica, and it was all done without proper consent. These weren't just random data points; these were intimate details about people's personalities, preferences, political leanings, and behaviors. Cambridge Analytica used the data for political advertising, attempting to influence voter behavior. They built sophisticated psychographic profiles, essentially psychological maps of millions of voters, to target them with tailored political messages. The two primary players here were Facebook, the social media giant that housed all this data, and Cambridge Analytica, the consulting firm that exploited it. Now, here's what makes this particularly disturbing: this wasn't a quick hack that happened overnight. This data collection occurred quietly between 2013 and 2016. For three years, millions of people's data was being harvested and weaponized without their knowledge. It only became public in March 2018, when whistleblowers came forward. Their goal? Cambridge Analytica built these psychographic profiles specifically to influence elections, including the 2016 US presidential campaign and Brexit.
SPEAKER_04:What exactly did they do again? And why was this such a problem?
SPEAKER_02:Well, Cambridge Analytica misused data from about 87 million Facebook users by using it for political advertising instead of the purpose users believed the data was for. Alright, next we're gonna be talking about the data harvesting mechanism. So how did they actually pull this off? Cambridge Analytica started with a seemingly innocent personality quiz app called This Is Your Digital Life, created by a researcher named Alexander Kogan. Now, personality quizzes were everywhere on Facebook back then. You remember them, right? Like, which Disney character are you?
SPEAKER_03:Or yeah, like BuzzFeed quizzes. That's what I was gonna say.
SPEAKER_02:They seemed harmless and fun. But this quiz was different. When users agreed to take the quiz, they consented to data collection for what they thought was academic research. The consent form mentioned academic use, which sounds legitimate enough. But here's a crucial detail most people miss: by taking this quiz, users weren't just sharing their own data. They were unknowingly exposing their friends' data too. This brings us to the consent loophole, and this is where Facebook's own policies became the weapon. Users consented to share their data, but they had no idea they were also handing over access to their entire friend network's information. Imagine taking a quiz and accidentally giving someone access to the personal data of everyone you know. That's essentially what happened. Cambridge Analytica exploited Facebook's Open Graph platform, which before 2015 allowed apps to access extensive friend information. This wasn't technically hacking in the traditional sense. It was using Facebook's own features in a way that clearly violated the spirit, if not the letter, of user privacy. And at scale, each user who took the quiz gave access to approximately 160 friends' profiles. Do the math on that, and you quickly reach 87 million users globally. That's more than the population of the United Kingdom. These profiles contained treasure troves of information: interests, location data, relationship status, political views, everything needed to build the psychographic profiles I mentioned earlier.
SPEAKER_00:So I have a question for this slide: who were the main organizations involved in the Cambridge Analytica scandal, and what role did each one play according to this slide?
SPEAKER_02:The main organizations involved were, of course, Facebook, which provided the platform where the data was collected, and Cambridge Analytica, which used that data for political consulting, basically scamming the users.
SPEAKER_03:So they got around 87 million users out of, what was it you said? Yeah, you got 87 million people's data just from the people that did the quiz and then taking their friends lists. That's crazy. But exactly. Now that we've talked about how they got the data, let's talk about what they actually did with it, because they didn't just use it for regular, standard advertising. The whistleblowers described it as psychological warfare, which I'll explain in a bit. So on the corporate side, this was a joint venture between SCL Elections and Robert Mercer, a billionaire who put millions of dollars into the project, with Steve Bannon, a key political strategist in this whole thing, as one of the board members. But their main secret came down to what Joel said, the psychographic profiling. Most advertisers usually target you based on common demographics like your age or where you live, but Cambridge Analytica claimed they could target your personality, who you are. They used what was called the OCEAN model, which scored people on openness, conscientiousness, extroversion, agreeableness, and neuroticism. Pretty much by analyzing your Facebook likes, they could predict these traits with crazy accuracy. But going back to what I said about psychological warfare: with this system, they could know your deepest fears and insecurities, and then use that to target you with content designed to trigger an emotional reaction from you personally. Their next goal became to show the scale and impact of the tool in the real world, and in this case, that was elections. It actually started with the Ted Cruz campaign, where Cambridge Analytica was paid around six million dollars to build these psychological profiles and test whether they could actually influence voters. And once the technology was made better, they moved on to the Donald Trump campaign, where it went from six million dollars to something like 1.5 billion ad impressions. Just crazy. That pretty much made it impossible for the media or political opponents to fact-check them at the time. Their strategy was now split in two: finding supporters of Trump and riling them up, and finding people likely to vote against Trump and targeting them with negative content to discourage them from voting at all. Which is crazy. And part of the reason that's so important is that it pushed politics and elections away from public, open discussions toward private, personalized attacks on social media. Does anyone have any questions on that slide?
SPEAKER_02:Yeah, sorry. I got a question real quick. How did the ads work again?
SPEAKER_03:Okay, yeah. So the dark posts were pretty much ads made specifically for one user, so that only they could see them, and once they actually saw and opened one, it vanished right afterwards. It just disappeared. That made it difficult for people to keep track of these ads, and harder to pin them on anyone, if that makes sense. So they got away with it for a long time.
SPEAKER_02:I see, I see.
SPEAKER_04:Um, the whistleblower, right? Christopher Wylie. He was a former Cambridge Analytica employee who basically revealed everything about the data harvesting. He had proof, and there was a bunch of evidence confirming over fifty million Facebook users were harvested, on which they spent over a million dollars. And then both Facebook and Cambridge Analytica denied everything. They denied Wylie's evidence, even though he showed it in court. And Facebook actually knew back in 2015, but they didn't tell anyone about it; they only told everyone once people found out. This also doesn't apply to just the United States. It also happened with Brexit in the UK, in Trinidad and Tobago with the 2010 campaign to impact young voters, and even with the Russian Federation, basically interfering in multiple elections, and Wylie testified that Cambridge Analytica had talks with Russia.
SPEAKER_00:We're actually gonna describe and explain the different ethical and moral failures. We're going to talk about the accountability deficit, data minimization failures, the lack of transparency, and violations of informed consent. Based on these failures, this covers the ethical breakdowns in the Cambridge Analytica scandal. Facebook failed to enforce proper data governance, allowing data collection beyond legitimate needs. Users were unaware of the extent of the data harvested, and informed consent was violated because data was used for political purposes without permission. The scandal underscores the importance of transparency and accountability in digital platforms. Does anyone have any questions on Facebook?
SPEAKER_03:Well, actually, I do. So how did Facebook contribute to the accountability deficit?
SPEAKER_00:Okay. So in terms of contributing, Facebook itself had inadequate data governance, and it also failed to enforce its own policies effectively.
SPEAKER_04:Yeah, I also have a question. What was the main ethical issue in the Cambridge Analytica scandal? Because I'm having a little trouble understanding.
SPEAKER_00:Okay. I can totally explain that for you. The central violation was primarily of informed consent. Users never agreed to their data being used for political advertising, so that's where the whole violation came into place. All right. So we have some regulatory and financial consequences, and we'll be talking about the different types of penalties. We have the FTC penalty; the UK ICO fine, which is from the Information Commissioner's Office; the SEC penalties, which were basically fines for misleading disclosures about the misuse of user data; and then the Meta settlements, in which the directors agreed to settle over the mishandling of the scandal. So this slide covers the aftermath in terms of penalties and corporate fallout.
SPEAKER_03:Wait, actually, I have a question about the last thing you said. What actually happened to Cambridge Analytica after the scandal?
SPEAKER_00:For the people that don't know, just in case: after the scandal, they filed for bankruptcy back in May 2018. I believe that's when that happened.
SPEAKER_03:Oh wow. So after that they actually filed for Chapter 7 bankruptcy then?
SPEAKER_00:Yeah, yeah, that's right. Yep. So as I said, this covers the aftermath, which dealt with penalties and corporate fallout. Facebook themselves faced record-breaking fines from the FTC, SEC, and UK ICO, as I explained earlier, for privacy violations and misleading disclosures. Now, Meta agreed to settlements, and Cambridge Analytica filed for bankruptcy in May 2018, which I explained earlier. These consequences demonstrated how regulatory bodies are stepping up enforcement, and the financial risks of poor data practices.
SPEAKER_03:Actually, I have one more question. Do you know what major privacy laws emerged after the scandal?
SPEAKER_00:Privacy laws? So the major privacy laws would be the GDPR back in Europe, and in California there would be something called the CCPA. Yeah, that took place in California, of course.
SPEAKER_02:Yeah, I also have a question, to be honest. How did Meta respond to the backlash again?
SPEAKER_00:So after they dealt with the backlash, which was pretty crazy at the time, they had to restrict third-party access to user data through API changes.
SPEAKER_04:So basically, through what we all discussed today, the Cambridge Analytica scandal showed just how powerful and dangerous personal data can be when it's misused. What happened wasn't a data breach in the traditional sense; it revealed how personal information could be weaponized to influence elections and distort public opinion. It also exposed major failures in transparency, accountability, and data governance, especially on the part of Facebook and Cambridge Analytica. Because of this entire thing, there was massive public backlash, including movements such as, as someone said, Delete Facebook, which pushed people to question how their data was being used. And in other countries, this helped spark stronger privacy laws such as GDPR in Europe and CCPA in California, along with stricter platform policies about data access. But the one true takeaway from everything is that ethical data practices and user trust are essential for democracy in the digital age. So if companies don't protect our data, the consequences can affect not just us, but everyone.
SPEAKER_03:I mean, honestly, all of us would recommend checking out the documentary The Great Hack on Netflix. I don't know if everyone fully saw it or not, but I did, and it was really good, honestly. We've thrown a lot of numbers and dates at you guys today, but this documentary actually follows the real people that were involved in this whole scandal, including the whistleblowers that we mentioned earlier and the actual professors and other people who fought to get their data back. Fun little fact: it actually features Brittany Kaiser, who was a former director at Cambridge Analytica, and shows her journey from inside the company to actually becoming a whistleblower. So it's really interesting. And it also really helps you visualize how that invisible data, those strings of code, can actually turn into something way bigger, like real-world political problems and chaos. But I don't know if anyone has any more questions that they would like to ask before we wrap this up.
SPEAKER_02:Yeah, I got a question real quick. You know how I mentioned psychographic profiles earlier? How is that actually different from regular advertising?
SPEAKER_03:Yeah, so we actually both mentioned it a bit, but it's still a great question, because we didn't answer it enough, honestly. Regular advertising usually targets demographics, like your age, gender, where you live and stuff, but psychographics target your personality. So, you know, Cambridge Analytica categorized people based on traits like neuroticism and openness, which I mentioned a bit with the OCEAN model. If the data showed that you were a fearful person, that you get scared easily, they would purposely show you something designed to try to scare you. So it wasn't just about selling a potential candidate; it was also about triggering an emotional reaction out of people. I don't know if anyone has any questions.
SPEAKER_00:Yeah, I do have a question. So earlier, I don't remember if you mentioned any super PAC concerns, but why does it even matter if they worked for both the campaign and the PAC?
SPEAKER_03:So glad you referenced it, because it's actually super important. It's a major legal issue, especially in the US, where official campaigns and super PACs, which can raise pretty much unlimited money, are strictly forbidden from coordinating their strategies with each other. Since Cambridge Analytica was working for both the Trump campaign and the Make America Number 1 super PAC simultaneously, there were huge concerns that they were acting as a bridge to illegally share data and strategies, which essentially bypasses campaign finance laws.
SPEAKER_04:Wait a minute. How did they even get all this info? And did all those people really take the quiz?
SPEAKER_03:Yeah, I don't remember if Joel mentioned it, I think he did, but pretty much, no. That's honestly the scary part. I don't know if Joel wants to add to it or not, but only about 270,000 people actually took the quiz and downloaded the app. Those were the users who consented, and the app also scraped the data of all their Facebook friends without them knowing. So that was the no-consent part. That whole friend-of-a-friend loophole they ended up using was how they managed to go from the 270,000 people that took the quiz to 87 million profiles, from such a small group in comparison. But I don't know if anyone else has any questions.
SPEAKER_02:I mean, yeah, one person who took the quiz could spread it to like 160 friends, basically. Yeah, and that's how crazy it spreads.
SPEAKER_04:Yeah, so it's like a sickness: it spreads the more you come in contact, basically.
SPEAKER_02:Exactly. Yeah, once one person takes the quiz, the 160 friends around them basically get their information stolen.
SPEAKER_03:Yeah, so if you take it and you have, like Joel said, up to 160 friends added on your profile, they could just take it, and they did. They just straight up took their friends' profiles and their data as well. So they used the users who took the quiz not only for their data, but also to take everyone else's data too, which is crazy. But I think that's it, honestly. I just want to say thank you, and I'll see you guys for another episode sometime soon. I don't know if anyone has anything else to say.
SPEAKER_00:Yeah, I do want to give my final thoughts. I appreciate being in this kind of group and on this podcast, you know. It was definitely a lot of fun.
SPEAKER_03:Yeah, I'm really glad we were selected together. It was a lot of fun. It was very nice.
SPEAKER_00:Yeah.
SPEAKER_03:Alright, guys, we will see you for another episode sometime soon. Have a great day, guys, and goodbye.
SPEAKER_01:Alright, I hope you enjoyed that episode. I know the gentlemen worked really, really hard on it. I'll see you back after the holidays for original episodes of Technology Tap. I think I have one more of these that I'll release, and then I'm gonna have to go into the archive and see if I can find some from older students, or maybe I'll just replay some of the earlier ones from a couple of years back. But I want to wish everybody safe and happy holidays and a fabulous new year, and as always, keep tapping into technology. This has been a presentation of Little Chacha Productions, art by Sarah, music by Joakim Karud. We are now part of the PodMatch Network. You can follow me on TikTok at ProfessorJRod, that's J-R-O-D, or you can email me at ProfessorJRod, J-R-O-D, at gmail dot com.