The Ethical Minefields of Technology

As we embrace technological innovation, we must also grapple with its implications

This article was published in Scientific American’s former blog network and reflects the views of the author, not necessarily those of Scientific American.


Almost all businesses today rely on some type of software, and few people would dispute the productivity gains that accompany new technology. But the embrace of advanced technology has a flip side, one with significant implications.

The issue has emerged relatively recently. Just ten years ago, the introduction of new software features didn’t really affect the real world; after all, nobody ever joined the ranks of the downtrodden after upgrading their spell checker. Today it’s different: the design of sophisticated software is affecting people in alarming and unexpected ways.

For example, consider the highly automated world of online recruitment advertising. A study undertaken at Carnegie Mellon two years ago found that women were significantly less likely than men to be shown advertisements for high-paying jobs.


That’s unsettling, but perhaps not as life-changing as the impact of the software used in some American states to assist judges in sentencing. The investigative journalism organization ProPublica studied one of these systems and found a massive bias in the algorithm it used to predict reoffending. Comparing the risk scores of 7,000 people arrested in Florida in 2013 and 2014, ProPublica found that black defendants were 77 percent more likely than white defendants to be flagged as at risk of committing a future violent crime.
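
To make that statistic concrete: a relative disparity of this kind is simple arithmetic over flagging rates. The short Python sketch below uses invented counts chosen purely for illustration; ProPublica’s actual dataset and methodology were considerably more involved.

    # Illustrative only: these counts are invented, not ProPublica's data.
    flagged_black, total_black = 805, 1750   # defendants flagged as high risk
    flagged_white, total_white = 455, 1750

    rate_black = flagged_black / total_black          # 0.46
    rate_white = flagged_white / total_white          # 0.26
    relative_increase = rate_black / rate_white - 1   # ~0.77

    print(f"Flagged {relative_increase:.0%} more often")   # Flagged 77% more often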

As examples like these accumulate, it’s no longer possible to point the finger at a couple of isolated cases and dismiss the issue. The impact of a lack of ethics in software development now extends to a wide range of applications, including the world’s most commonly used software: Facebook.

While very few Facebook users actually understand the software, many of them understand the level of attention it demands. The word “demands” is deliberate: like many other social media companies, Facebook has constructed its software to be the equivalent of an addictive casino in your pocket.

Tristan Harris spent three years as a Design Ethicist at Google and is an expert in how social media controls your attention. He points out that software designers have created apps that exploit psychological tricks to keep you checking your phone, and Facebook is no exception.

In an interview last year, the prominent media commentator Douglas Rushkoff stated that the increasing use of these techniques could have significant consequences: “Instead of looking at technologies programmed to enable human beings to better navigate the world I see technologies optimized to help corporations better navigate and manipulate human behaviour.”

Companies like Facebook manipulate human behavior because they profit from selling advertisements, and the more frequently people use their software, the more advertisers they can attract.

To put this in perspective, a study at Nottingham University in 2015 found that people used their phones up to 80 times per day. Hardly any of this usage was for making telephone calls.

The never-ending competition for your attention by software on smartphones deserves scrutiny, because studies have shown that the more you use software like Facebook, the unhappier you are.

Jean Twenge is a Professor of Psychology at San Diego State University and the author of the book “iGen: Why Today's Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy.” She has studied generational differences for 25 years and has seen teenage loneliness and depression spike dramatically since 2012, shortly after the arrival of the smartphone. Twenge points out that research shows 13-year-olds who are heavy users of social media increase their risk of depression by 27 percent. She also notes that, among other things, social media use is linked to significantly increased rates of suicide and decreased levels of sleep among teenagers.

While this is concerning and needs attention, it’s what’s coming next that really worries me.

Technology development is like evolution: eventually the dominant lifeform must evolve or die. Many technology observers note that the smartphone is already at the peak of its evolutionary cycle, and the next generation of personal computing is emerging rapidly. The leading contender to supersede smartphones is likely to be mixed reality environments, including virtual reality and augmented reality.

The heady, immersive nature of these technologies has the potential to make today’s addictive social media software look like wooden kindergarten blocks. If you’re concerned about teenagers shutting themselves in their rooms to spend time on social media, just wait until software arrives whose feedback mechanisms are so addictive that kids would rather stay in the virtual world than return to the real one.

So what’s the best way to start addressing these issues?

From a corporate perspective, there needs to be considerably more diversity in the programming workforce. The technology giant with perhaps the most active diversity program is Apple, but even then its statistics are woeful: Apple’s most recent figures show that 68 percent of its workforce is male and 56 percent is white. A much more diverse technology workforce would be one way to start addressing some of the gender and race biases appearing in software.

Laying the blame entirely on programmers isn’t fair, as software can also provide ways of ensuring biases are detected before they start to affect people. One such tool is Themis, a free piece of software that lets programmers probe their creations with thousands of fake users, each with a different profile, to check for discrimination. Clearly this is a process that should be applied regularly to any software with real-world consequences, such as financial services and criminal justice.
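
To give a flavor of how such testing works, here is a minimal sketch of the general idea in Python. It is not Themis’s actual interface; the decision function, attribute names, and planted bias are all hypothetical stand-ins for whatever black-box system is under audit.

    import random

    # Hypothetical stand-in for the system under test; a real audit would
    # call the deployed model or service here instead.
    def loan_decision(profile):
        score = profile["income"] / 1000 + profile["years_employed"] * 2
        if profile["gender"] == "female":   # a deliberately planted bias
            score -= 5
        return score > 60

    def causal_discrimination_rate(decide, attribute, values, trials=10000):
        """Fraction of random profiles whose decision changes when only
        the protected attribute is altered."""
        changed = 0
        for _ in range(trials):
            base = {"income": random.randint(20000, 120000),
                    "years_employed": random.randint(0, 30)}
            outcomes = {decide({**base, attribute: v}) for v in values}
            if len(outcomes) > 1:
                changed += 1
        return changed / trials

    rate = causal_discrimination_rate(loan_decision, "gender", ["male", "female"])
    print(f"Gender alone changed the decision in {rate:.1%} of profiles")

Tools like Themis automate and generalize this kind of probing; the point is that discrimination can be measured empirically, without access to the software’s internals.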

Regulators also need to move with some urgency, but this is problematic. James Duncan is the co-founder of the independent consultancy Stance Global and served in senior technology advisory roles in the UK Cabinet Office until the end of last year. He has observed the policy issues firsthand, and notes that the building blocks of our world are being replaced by the foundations of a digital one.

“Society keeps up because the technology needs to be able to land somewhere,” says Duncan. “The same is not true of our governments, and to fix it will require effort and thoughtfulness that is not currently on display.”

Fixing these issues requires technology leaders to take a much wider view of the world. The industry doesn’t need more programmers; it urgently needs more women, ethicists, and philosophers.