Facebook recently announced it had acquired CTRL-Labs, a U.S. start-up working on wearable tech that allows people to control digital devices with their brains. The social media company is only the latest in a long string of firms investing in what has come to be termed “neurotechnology.” Earlier this year Neuralink, a company backed by Elon Musk, announced that it hopes to begin human trials for computerized brain implants.

These projects may seem like science fiction, but the drive to get more out of our brains is nothing new. From tea, caffeine and nicotine to amphetamines and the narcolepsy drug modafinil, humans have long turned to drugs in rudimentary attempts at cognitive enhancement. And in our tech-driven world, the urge to cognitively enhance is stronger than ever, leading us to explore new and untested methods.

In today’s hypercompetitive world, everyone is looking for an edge. Improved memory, sharper focus or simply the ability to work longer hours are all keys to getting ahead, and a drug exists that claims to deliver each of them. In 2017, 30 percent of Americans said they had used “smart drug” supplements, known as nootropics, at least once that year, even though studies repeatedly demonstrate that they have a negligible effect on intellect.

For some, however, nootropics are not enough, and so they turn to medical-grade stimulants. The most famous of these is Adderall, which boosts focus and productivity far more than commercial nootropics. A well-established black market thrives on university campuses and in financial centers, supplying these drugs to people desperate to gain a competitive edge.

Despite its popularity, recent research suggests that Adderall’s benefits could be illusory. The drug alters the balance of brain chemicals tied to our emotions, in this case dopamine, which governs our feelings of reward and pleasure. It makes people feel more motivated or upbeat, and therefore content to work intensely for longer. But our brains crave the rewarding “high” of elevated dopamine, so Adderall users eventually feel compelled to maintain this intoxicating state.

Yet, despite a wealth of evidence of its damaging effects, its popularity endures. Adderall and its sibling drugs have been embraced by the marketplace as a short-term means to an end, reflecting a working culture that rewards those who push themselves to the limit. Their use in academic settings is now endemic: around 30 percent of American students say they have taken Adderall or Adderall-like stimulants at least once. As university degrees and jobs become ever more competitive, these numbers seem likely to grow, regardless of the risk.

Human nature makes us perpetually dissatisfied with our present condition. We want to improve the most important thing in our lives: ourselves. Our very neurochemical composition makes us want to perfect ourselves—which, perversely, has led us, through Adderall and many other drugs, to try to change that neurochemical make-up. But short-term stimulants may be only the beginning.

Artificially changing brain chemistry, for example through stimulants, is a process known as neuromodulation. This is a delicate process; you cannot change only one part of the brain. For example, boosting levels of dopamine can cause a loss of empathy, impulse control and caution. Now imagine what a wanton cocktail of chemical changes could do to the mind.

The possible benefits of neuromodulation are obvious. Mental health treatment would be revolutionized if happiness chemicals like serotonin or oxytocin could be increased with precision. But our experience with stimulants to combat ADHD and narcolepsy suggests neuromodulation will not be limited to health care.

How far can we “enhance” our minds before we lose our sense of identity and authenticity? Eliminating negative feelings like fear and anxiety might seem like a no-brainer, but it could leave us emotionally and morally stunted. The U.S. Defense Advanced Research Projects Agency (DARPA) has for years sought “biomedical” tools to enable “stress resistance” and “accelerated learning” among soldiers. Would the “perfect” soldier retain the aspects of their personality that made them human?

There are also numerous ethical and societal implications regarding fairness, meritocracy and authenticity. Increasingly prevalent and effective enhancers risk deepening inequality, since the wealthier in society will have early access to the most advanced drugs and neurotechnology, gaining even more of an advantage. Already-fragile notions of meritocracy would be undermined, and education systems would struggle to fairly reward high achievers. And even if smart drugs became universally accessible, it seems likely that the overworked would simply be pushed harder still, with the “unenhanced” left behind.

Of course, it is not all doom and gloom. Innovation in cognitive enhancement, as anywhere else, can be a force for much good and should be encouraged. In any case, these problems are only hypothetical, and current use of potent enhancers is limited to relatively small sections of society. It is important that we initiate a dialogue about the risks as well as the benefits of preempting evolution.

Innovation currently moves faster than culture or even laws can handle, and we will soon lose the luxury of time to debate. Today we are dealing with stimulants whose benefits are modest and short-lived, but with companies like Facebook and Musk’s Neuralink investing in neurotechnology, what will happen when a drug or device comes along that truly works? If a response comes only once the next generation of enhancers is readily available, it will be too little too late.