
Written by Maisha Razzaque & Hafsa Memon

Edited by Andrew Neff

July 2019

The Need for Ethics in Tech

Keeping innovation from getting too evil; ethics classes


Tech is an extension of our daily lives, for some of us more than others. According to a 2017 survey conducted by ReportLinker, a tech analyst company, 46 percent of Americans surveyed say they use their phones before even getting out of bed in the morning. That doesn’t surprise us, because it’s not a stretch to imagine waiting a few minutes for our bodies to warm up while scrolling through Twitter, checking our email, or reading the news. Beyond the clockwork-like instinct to start the day with our phones, we use tech for a myriad of things: shopping, entertainment, tracking our sleep and exercise habits, or even just looking up the most trivial information while passing the time. The age of technology has even transformed certain workplaces. From cloud-based scheduling systems adopted for efficiency to life-changing robotic-arm surgeries, the reach of tech is inescapable and immense. And it’s impressive! We are at the dawn of an era of progress and innovation that has the potential to yield great benefits for humanity. Of course, if you’re keeping up with the news or reading (or watching) any kind of Bradbury-esque science fiction, you know there is a flip side to this. No device user wants to be used by their device. This raises the question: Is tech the champion tool for progress, or are we setting ourselves up for a very Matrix-like future?


Noam Chomsky, the famous linguist and cognitive scientist (also known in these parts as the Cognitive Science Daddy), once said: “Technology is basically neutral.” He likens it to a hammer: “The hammer doesn't care whether you use it to build a house, or whether a torturer uses it to crush somebody's skull.” And he’s right; there’s no inherent good or evil behind tech. Like most invaluable tools, it falls to the human factors behind it to assign a moral value. Therefore, a certain burden of responsibility falls on the creators of this so-called hammer. After all, tech can be creepy as hell sometimes. But we also love it. Every time a new iPhone comes out, a tiny but persistent part of you wants to sell an organ on the black market so you can afford to animate your face into a barn animal. But what happens when our affair with technology goes south, and who answers for it?


For example, an article in The Verge emerged in December 2017 about Caryn Vainio, a woman whose friend posted a status about being in the hospital shortly before his unexpected death. Vainio and several of her friends did not see the hospital update until weeks later because Facebook’s sorting algorithm simply didn’t consider it content worthy of displaying on their feeds, possibly (as Facebook’s own guides to the algorithm suggest) because her friend didn’t post that much. Consequently, Vainio and her friends were unable to visit their friend in the hospital, and can’t shake the thought that he might have died feeling isolated, unwanted, or unloved. This isn’t a singular incident, as Facebook is regularly in hot water for its data experiments. In 2012, a group of Facebook data scientists manipulated the feeds of nearly 690,000 users and reported on the phenomenon of “emotional contagion.” They found that pushing exclusively negative content onto feeds caused users to create more negative posts, and vice versa. This means that if you were subjected to weeks of “random” videos about orphans, abused animals, and other similarly morose content, you were unknowingly being psychologically manipulated by Facebook data science researchers. When the study was made public, it drew a lot of criticism. In response to Facebook’s dismissive attitude toward the outrage, a 2014 article on The Conversation called out Facebook’s flimsy claim of informed consent. It argued that while the user agreement informed users that their data would be used for internal research, they were never told how that would happen. Though the topic was not discussed further, the flippant attitude toward a possible violation of Institutional Review Board guidelines on informed consent (the IRB being the body that oversees research ethics) is an alarming indication of the corporate take on protecting consumers.
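
To make the mechanism concrete, here is a minimal sketch, in Python, of how a frequency-weighted feed ranker can bury an urgent post from a quiet friend. To be clear, the scoring function, weights, and cutoff below are invented for illustration; Facebook’s real ranking model is proprietary and vastly more complicated.

```python
# Hypothetical feed scorer. The features, weights, and cutoff are
# invented for illustration; Facebook's actual model is proprietary.

def feed_score(engagement: float, author_posts_per_week: float) -> float:
    """Score a post for the feed, boosting authors who post often.

    An urgent update from someone who rarely posts can land below
    the cutoff and simply never be shown to their friends.
    """
    frequency_boost = min(author_posts_per_week / 10.0, 1.0)
    return engagement * (0.5 + 0.5 * frequency_boost)

SHOW_CUTOFF = 0.4  # invented threshold: posts scoring below this are never shown

# Identical engagement, very different posting habits:
meme_score = feed_score(engagement=0.7, author_posts_per_week=20)     # 0.70
hospital_score = feed_score(engagement=0.7, author_posts_per_week=1)  # 0.385

print(meme_score >= SHOW_CUTOFF)      # True: the chatty friend's meme shows up
print(hospital_score >= SHOW_CUTOFF)  # False: the hospital update is buried
```

Notice that nobody in this sketch decided the hospital post didn’t matter; a bland frequency heuristic decided it for them, and that is precisely the problem.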


You might be thinking that these aren’t exactly red-pill situations. Maybe this stuff only matters to people who are on Facebook long enough to be psychologically jerked around by data scientist nerds. You want a larger-scale example of tech gone wrong? You got it. Last year’s short-lived buzz was Jeff Bezos and Co.’s brainchild, Amazon Go. Basically, they took the concept of grocery shopping and turned it into a design problem: making the task a “charming experience” with an emphasis on eliminating checkout lines. John Blackledge, a financial analyst at the Cowen Group, told the Huffington Post that the grocery store initiative is one of the company’s biggest potential revenue streams. The “Just Walk Out” technology handles payment transactions via app, establishing a self-running grocery store. Since its conception, Amazon Go has stayed out of the news, but one can’t help but wonder if this model sets a precedent that ends with retail workers, cashiers specifically, losing their jobs to an app.
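
For a sense of how “Just Walk Out” collapses checkout into software, here is a toy sketch of the app-side flow. Everything in it, from the VirtualCart class to the sensor hooks and prices, is hypothetical on our part; Amazon has not published its actual architecture, which fuses cameras and shelf sensors in ways this sketch ignores.

```python
# Toy "checkout-free" shopping flow. All names, prices, and the charge
# step are hypothetical; Amazon's real system is far more involved.

from dataclasses import dataclass, field

@dataclass
class VirtualCart:
    shopper_id: str
    items: dict = field(default_factory=dict)  # item name -> price

    def sensor_pickup(self, item: str, price: float) -> None:
        """Cameras/shelf sensors detect a pickup; add it to the cart."""
        self.items[item] = price

    def sensor_putback(self, item: str) -> None:
        """Item goes back on the shelf; drop it from the cart."""
        self.items.pop(item, None)

    def walk_out(self) -> float:
        """Leaving the store triggers the charge. No cashier anywhere."""
        total = sum(self.items.values())
        print(f"Charging {self.shopper_id}: ${total:.2f}")
        return total

cart = VirtualCart(shopper_id="shopper-42")
cart.sensor_pickup("sandwich", 6.50)
cart.sensor_pickup("kombucha", 4.00)
cart.sensor_putback("kombucha")  # changed your mind in aisle three
cart.walk_out()                  # Charging shopper-42: $6.50
```

The unsettling part is how little code stands between “grab a sandwich” and “get billed”: the entire cashier role reduces to one method call.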


The ongoing obsession with ramping up productivity, coupled with applauded technological advancements, may threaten job growth for millions, but not to worry; our human bodies are still assets in the dystopian machine! Dutch scientists at the Institute of Human Obsolescence (yes, that’s a real thing!) have figured out how to harness human body heat to mine cryptocurrency. Although the process hasn’t yet been used to mine Bitcoin specifically, several trials have been conducted to mine other cryptocurrencies such as Vertcoin and Startcoin. The process is as terrifying as it sounds. Basically, wearable units of thermoelectric generators capture excess heat from a human body at rest. The generators convert the body heat into electric energy, which powers the computers tasked with mining cryptocurrency. Motherboard reports that a single Bitcoin transaction requires as much energy as 10 households use in a week, and that a single bitcoin could be mined with the body heat harvested from 44,000 people. Manuel Beltrán, the CEO of the Institute of Human Obsolescence, tells Motherboard that this method of maintaining the Bitcoin blockchain is a step toward ecological responsibility, given the blockchain’s sheer energy consumption. But how do we, as a society, feel about biological labor? When and where do we draw the line on these “innovations”?
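
That 44,000-person figure is easy to sanity-check with back-of-envelope Python. Fair warning: the wattage and efficiency numbers below are our own rough assumptions, not the Institute’s published measurements.

```python
# Back-of-envelope check on body-heat mining. All figures are rough
# assumptions for illustration, not measured values from the Institute.

RESTING_BODY_HEAT_W = 100  # a resting human gives off roughly 100 W of heat
TEG_EFFICIENCY = 0.01      # thermoelectric generators capture only a few
                           # percent at small body-to-room temperature gaps

watts_per_person = RESTING_BODY_HEAT_W * TEG_EFFICIENCY  # ~1 W per person
people = 44_000
total_kw = people * watts_per_person / 1000

print(f"{people:,} people -> roughly {total_kw:.0f} kW of usable power")
# 44,000 people -> roughly 44 kW: a trickle next to an industrial mining
# farm, which is why harvesting it takes a literal crowd of human bodies.
```

In other words, the math only works at the scale of a small stadium’s worth of lounging volunteers, which is exactly what makes the project feel as much like dystopian performance art as an energy plan.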


The arguments against regulation aren’t new. Regulation has long been capitalism’s biggest adversary, but at the same time, a source of consumer protection. Measuring success by eyeing profit margins is the norm in our consumer-driven society, so naturally, tech innovators want to see efficiency and rising profits from their products. Goal-oriented creators may not prioritize the moral obligations that come with holding such immense power in their hands. Technology, like a great number of things we depend on, requires a degree of regulation. The DEA won’t let you buy an excessive amount of cough medicine, just in case you start making meth. The FDA won’t allow peanut butter manufacturers to sell cans of peanut butter with more than four rodent hairs. In the same vein, shouldn’t data mining via your Facebook account be limited to a reasonable amount? In fact, the USA Freedom Act of 2015 (which, for those who aren’t familiar, renewed and reined in expired portions of the infamous Patriot Act of 2001, including its Section 215) even bars the National Security Agency from continuing its mass phone data collection; instead, the NSA can only obtain phone records from phone companies with permission from federal courts. It’s almost laughable that we can’t expect an equal degree of restraint from Facebook. But we aren’t naive. Obviously, we have to acknowledge that the government works slowly. Any piece of policy on a hot topic like this requires cutting through multiple levels of red tape. On top of that, which of your congressmen are well-versed enough in data mining and sorting algorithms to have an appropriate plan of action for addressing ethics in technology?


Maybe it’s in our best interest to act quickly and find a solution that works. It’s time to address the values held by our baby tech geniuses before they grab their degrees and jump into The Real World™. We’re talking about ethics classes, to be specific. Too extreme? Actually, it’s more plausible than you might think. MIT, Rutgers, Northeastern University, and Texas Tech have all started offering courses addressing the ethics of engineering. They discuss topics such as the responsibilities civil engineers have to the native population when constructing buildings in a developing country. Close behind are universities like the University at Buffalo and the University of North Carolina; both offer courses on ethics in computer science. The University of Texas at Dallas offers a class that fits this particular profile: “Professional Responsibility in Computer Science and Software Engineering.” The CourseBook description touts discussion topics like “social implications of the Internet,” “interaction between human values and technical decisions,” and “intellectual property” (hence all the quotation marks). Requiring CS majors to take these classes isn’t a jab at tech creators or their moral leanings. In fact, it’s an attempt to foster growth and innovation that don’t toy with moral gray areas. It’s high time we offer a healthy, collaborative space for tech nerds to discuss with other tech nerds how to safely wield the power they will one day hold. Our values may still align with the enthusiasm for rapid progress, but requiring classes that put ethics in the context of corporate applications could be our first line of defense against the fear of Big Bad Data.


It’s not like our artificial intelligence overlords will force us into slavery by 2020, right? Probably not. We get it: You still feel like this trivial stuff isn’t worth the sweat. Technology is cute! Technology is fun! You just traded in your pet cat for a low-maintenance pet Roomba last week, and that’s fine! The issue, however, is one of precedent. If we don’t draw a line now, will we have the power to pull technology back and draw that line later? Engineers, computer scientists, data analysts, and the like all need a firm reminder that the ultimate goal of tech is to work for people. Everything they work on should, theoretically, be for the betterment of society. We can’t afford to have people in these professions who feel comfortable tossing their morals into a glass bowl when they walk into work, then picking them back up on the way home. It’s time to start having the sticky conversations about the dubious morality behind the direction our tech might be heading. So before we jump down the throats of creators and demand legislation to hold them accountable, we need to equip them with the capability to be better. We need to start requiring that these baby geniuses take an ethics class specifically geared toward their STEM field sometime during their undergrad years, because the best time to teach them how to use this proverbial hammer for good and not evil is when they are learning to use it for the first time.
