
It’s like an Analogy

Creative communication can help, or hurt, our attempts to bridge the divide between technically or emotionally disparate audiences

This article was published in Scientific American’s former blog network and reflects the views of the author, not necessarily those of Scientific American





Communication is seldom straightforward. It can be like the childhood game of telephone, the initial message getting warped and mangled with each new telling. Selecting arguments can be as fraught as clipping wires to disarm a bomb. Or it can all flow as easily as the marriage of wind and sails on a perfect lake day. And, of course, communication can be self-serving, like a series of nested analogies all crammed into an opening paragraph to make a point.

Whatever your preferred analogy (or metaphor), it’s clear that effective communication can often feel much harder than it needs to be. After all, people talk all the time! They read and write and listen! But somehow our most well-intentioned messages still end up getting missed or misunderstood. This feels especially true in contexts that bring together technical and non-technical groups around a common problem. And sometimes the tools we try to use to overcome that confusion end up having unintended consequences of their own. Here we explore this landscape through three anecdotes: the analogy that did its job too well, the simplification that didn’t do its job well enough, and the extended metaphor that we hope will be just right. Once upon a time…

The analogy that did its job too well: In an otherwise mundane meeting, a group of colleagues took a moment to celebrate the rollout of a long-awaited, behind-the-scenes feature. There was even a whoop and a smattering of applause in the meeting room. Our colleague from marketing – who typically languished through the development update portion of the agenda – perked up. She asked if this was something we could capitalize on. Announce? Get a bit of fanfare? She was met with silence and a few horrified stares. People around the table tried to explain that the change wasn’t something to brag about. She was either unconvinced or so deeply starved for things to promote from our typically unglamorous unit that she got up to bring in the CEO. “No,” I blurted. “It would be like saying, ‘Your favorite cereal – now without syphilis!’ We don’t want to draw attention to the fact that this new process has only gone into place now.” It was my turn to be met with silent horror. The point, however, had been made. She said, “I see,” and sat back down. Had I made my point? Yes. Had I also become “that woman who said that thing about syphilis”? Also, yes. It was not the best.

The simplification that didn’t do its job well enough: Another interdepartmental impasse formed around the word “complex.” Users requested that a new type of data be integrated into an existing model. The technical team didn’t want to get into the details of the code’s infrastructure – how such an addition would have to touch many places, be broken into many parts, and be maintained separately across the model. They were also hesitant to say outright that the design of the model – what the users had asked for – was precisely what prevented the inclusion of the kind of data those same users were now asking for. The modeling team decided that the conversation would be uncomfortable and probably over the heads of the users. It was better to simplify.

So, they said that adding the feature would be too “complex.” This did not solve the problem. The users came back with what they thought was a clarification. No, no, they explained, we don’t need it to be included in a sophisticated way. We just need a basic idea of any potential effects. So, just add a “simple version,” okay? No need to take the high-level, complex approach. This went on for weeks. Eventually the modeling team had to break down for the users that including any data of that type – even a version so simple as to be meaningless – would be so computationally demanding as to grind operations to a halt. They hadn’t been asked to build the system that way, and so it hadn’t been built that way. Period. Time was spent, words were said, and it was also not the best.

The extended metaphor that we hope will be just right: The final group also suffered a vocabulary-based breakdown, with word choice expanding from a disagreement into an instrument for mutual gaslighting. The word in this case was “system.” The system in question was designed to monitor materials and communication through the duration of a review process. While largely automated, some parts of the system required manual entry – and errors at those points were bogging down the rest of the system. The users cried out that the system was broken. The technical team said that the system was working exactly as designed, and that the data-entry errors were to blame. To the users, the “system” referred to everything from top to bottom, including all human and non-human components. The developers insisted that the system included only the automated components, and that human errors were not theirs to prevent or solve. Rational conversation became impossible.
One side used increasingly insulting language to describe how poorly designed the system was to allow for such debilitating errors, and the other stood behind a metaphorical shield of “the code is working exactly as intended.” Intervention, and metaphor, eventually had to come from above. Instead of talking about the system itself, the meeting with management was about a car. They imagined a car designed to take people from point A to point B that kept breaking down on the way. The mechanic discovered that the issue was that someone had put diesel in the gas engine. And, after rogue diesel left them stranded on the highway dozens of times, the users had developed severe anxiety about ever setting foot in the car again. With the blame turned into a metaphorical notion, some progress could be made. The mechanic was right to insist that the car worked properly. But the riders were also right to insist that it was insane to expect them to keep using a car given the unexplained, diesel-induced breakdowns. Rather than blaming anyone, the question became what we could do to create a metaphorical nozzle that kept diesel from ever being put in the engine again. Who could build and regulate such a thing, metaphorically speaking? Tempers are still high, but at least fingers aren’t pointing quite so hard. And while everyone agrees on what a solution should look like, only time will tell how it turns out. We are hoping for the best.

Amanda Baker is a science communicator and outreach advocate. She has a geoscience PhD from Cornell University and has managed open-access, academic journals as well as the outreach journal Frontiers for Young Minds. She is currently writing and editing science content for kids, from curriculum materials to magazines like Smore. She has served as a Science Olympiad national event supervisor and taught a first-year writing seminar on sustainable earth systems while at Cornell.
