How the Ancient Greeks Could Have Saved Facebook from Scandal

What will it take to fix Facebook’s ethical blind spot? The talk is regulation, but the long-term fix may lie in education.

By Jordan DeLong, Ph.D.
April 13, 2018

Twenty-five years ago, the film Jurassic Park was the scariest thing I’d ever seen. I still remember sitting in the theater with my grandmother, hands over my face, watching Tim and Lex run around a kitchen trying to evade vicious velociraptors.

It took me years to realize the threat Jurassic Park warned us about wasn’t dinosaurs, but the effects of corporate-fueled technological innovation unchecked by ethical reasoning.

I’ve reached a place in my career where I identify most with Jeff Goldblum’s character, Dr. Ian Malcolm, the eccentric mathematician who proves an unexpected critic of the enterprise. During an incredible four-minute debate scene, he accuses the park owner of irresponsibly wielding technical power, arguing that “[his] scientists were so preoccupied with whether they could that they didn’t stop to think if they should.”

Recently, news about the relationship between Facebook and Cambridge Analytica brought that scene back to mind. Reporting revealed that Cambridge Analytica, a political data firm, exploited information from 87 million Facebook users to further the political goals of its billionaire patron, Robert Mercer, by developing what a whistleblower has described as a tool for “psychological warfare”1. As reporters dug deeper into the scandal, however, it became apparent that while Cambridge Analytica may seem like the frightening T-Rex of the story, the more dangerous threat is Facebook’s ethical shortcomings.

One fact that unnerves me above all others is that Facebook wasn’t hacked – it simply gave users’ information away under the terms of an agreement it had no capability or desire to enforce2. Perhaps courts will eventually conclude that Cambridge Analytica flagrantly violated that agreement, but Facebook was so feckless in its protection of user data that it borders on complicity. The oversight also reveals a deep naiveté about how much damage unprotected data can cause.

While I was doing my graduate work at Cornell, Facebook’s News Feed team conducted a study that attempted to alter users’ moods by changing what appeared in their feeds. Users were not asked for explicit consent, debriefed, or even notified they were part of the study. (At Research Narrative, we also wondered how many minors, who cannot legally consent to being studied, had slipped into that dataset.) Facebook handed the results of the experiment to researchers in the Communication and Information Science department, who published them3 without a full ethics review4.

We might have hoped that Facebook would learn from its mistakes, but the company continues to turn a blind eye towards ethical issues related to data and research. Reporting from ProPublica has shown that Facebook has profited by remaining ignorant of unethical (and possibly illegal) practices, such as blocking older users from seeing certain job postings5 or allowing real estate agents to advertise only to people of a certain race6.

Facebook’s behavior exemplifies what happens when a company exhibits advanced technical skills but a distinct lack of ethical awareness. In an industry brimming with intelligence and talent, it’s startling to find such a pronounced blind spot. Even if Mark Zuckerberg follows through on promises to implement changes, what faith should we have in Facebook’s ability to overcome its shortcomings?

Kumail Nanjiani, one of the stars of HBO’s Silicon Valley, thoughtfully described this blind spot after witnessing a demonstration of new tech he personally found “scary”:

“And we’ll bring up our concerns to them. We are realizing that ZERO consideration seems to be given to the ethical implications of tech. They don’t even have a pat rehearsed answer. They are shocked at being asked. Which means nobody is asking those questions.”

What can be done about this ethical blind spot? Three words: Liberal Arts Education.

I’ll be going over what constitutes a Liberal Arts Education in future articles, but a rough definition is that Liberal Arts is an educational philosophy that dates back to Ancient Greece. It attempts to create well-rounded, democratically minded, self-reliant people. The word ‘liberal’ in Liberal Arts is related to liberation – not being politically liberal, but rather, thinking freely.

The process of becoming a free thinker requires hard work and discipline – learning how to code is more straightforward than asking yourself what ethical responsibility you have for the code you create. Philosophy courses are centered around asking these hard questions but are often unfairly maligned because they don’t provide cheap and easy answers. For example, students in Ethics courses aren’t told what’s right and wrong – they are confronted with situations that force them to take new perspectives and think critically about their own assumptions. Students in the History department aren’t just tasked with remembering names and dates, but are confronted with conflicting narratives, cultural biases, and the difficult task of distilling a vast amount of information into something readable.

The Liberal Arts education doesn’t exist to prepare you for a specific job, but rather to prepare you for a number of possible careers by giving you a basic level of understanding in a variety of fields.

Unfortunately, this well-rounded education is being sidelined in favor of a checklist-fueled, skills-based approach that is more suited to churning out worker bees than thoughtful citizens. It would have taken only one courageous, free-thinking person to ask the tough ethical questions that could have averted the Cambridge Analytica scandal, but that person either wasn’t present, didn’t speak up, or was ignored. I worry that by allowing students to focus so narrowly on “job skills” we are creating workers who are more technically adept yet ethically, socially, and morally unaware.

The recent indiscretions of Facebook are a harbinger of things to come if we continue to value technical skills at the expense of a well-rounded education. If we continue to pump out narrowly educated programmers, we might avoid the next hiring shortage only to crash headlong into a humanitarian catastrophe. Other companies and industries aren’t off the hook either, but they can look to Facebook to catch a glimpse of their own future.

Since the scandal, Facebook’s valuation has dropped 75 billion dollars7. Mark Zuckerberg was called to Washington D.C. to apologize and show deference towards government regulation that could cause aftershocks across the tech sector – possibly at the expense of Facebook’s competitors. Even Apple CEO Tim Cook has pointed out that “Facebook should have regulated itself, but it’s too late for that now.”8 I have my doubts that this Congress will be able to successfully craft, pass, and implement meaningful regulation for an ever-changing tech sector. Doubtless, we would all be better off if Facebook had hired more well-rounded employees, particularly at the senior level – team members who might have spent more time asking if they should, rather than if they could. I’m not sure you can regulate ethical competence, but I’m certain you can teach it.

It may not be new, sexy, or disruptive, but Liberal Arts is a necessary starting point as we start to map out the future of education. Ignoring the importance of a well-rounded education puts students, companies, and the general public at risk.



1. Cadwalladr, C. ‘I made Steve Bannon’s psychological warfare tool’: meet the data war whistleblower. The Guardian (2018).
2. Wagner, K. Here’s how Facebook allowed Cambridge Analytica to get data for 50 million users. Recode (2018). 
3. Kramer, A. D. I., Guillory, J. E. & Hancock, J. T. Experimental evidence of massive-scale emotional contagion through social networks. Proc. Natl. Acad. Sci. U. S. A. 111, 8788–90 (2014).
4. Sullivan, G. Cornell ethics board did not pre-approve Facebook mood manipulation study. The Washington Post (2014).
5. Angwin, J., Scheiber, N. & Tobin, A. Dozens of Companies Are Using Facebook to Exclude Older Workers From Job Ads. ProPublica (2017). 
6. Angwin, J. & Parris, T., Jr. Facebook Lets Advertisers Exclude Users by Race. ProPublica (2016).
7. Cherney, M. A. Facebook valuation drops $75 billion in week after Cambridge Analytica scandal. MarketWatch (2018).
8. Kafka, P. Tim Cook says Facebook should have regulated itself, but it’s too late for that now. Recode (2018).