
Neural Networks for Artists

From hallucinogenic-like DeepDream composites to mesmerizing style-transfer videos, visuals provide an engaging entry point to the world of machine learning

This article was published in Scientific American’s former blog network and reflects the views of the author, not necessarily those of Scientific American



Original photo by Jen Christiansen. Transformation via Deep Dream Generator, based on open source code.

Remember last summer’s influx of convolutional neural network art, which took the form of hallucinogenic-like DeepDream images, like the one above? Prompted by a blog post and code release by a team of Google engineers, haunting composite generation—also known as inceptionism, as a nod to the movie-related internet meme “we need to go deeper”—became the poster child for artificial neural networks.
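For the curious, the core trick behind DeepDream is surprisingly compact: feed an image through a trained convolutional network, pick a layer, and nudge the pixels in whatever direction makes that layer's activations stronger, over and over, until the network's favorite patterns bloom out of the photo. Below is a minimal sketch of that idea in PyTorch; the pretrained VGG-16 network, the layer index, the step size and the filename are illustrative assumptions, not Google's original release (which was built on Caffe).

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Any pretrained convolutional network works; VGG-16's feature layers are used here.
cnn = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).features.to(device).eval()
for p in cnn.parameters():
    p.requires_grad_(False)

preprocess = T.Compose([
    T.Resize(512),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

img = preprocess(Image.open("photo.jpg")).unsqueeze(0).to(device)  # your own photo
img.requires_grad_(True)

LAYER = 20  # which feature layer to "dream" on (an illustrative choice)
for step in range(30):
    x = img
    for i, module in enumerate(cnn):
        x = module(x)
        if i == LAYER:
            break
    loss = x.norm()  # amplify whatever patterns this layer responds to
    loss.backward()
    with torch.no_grad():
        # normalized gradient-ascent step on the pixels themselves
        img += 0.01 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.grad.zero_()
```

Run long enough, and eyes, arches and dog faces start surfacing wherever the chosen layer "sees" a hint of them.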

Perhaps you’re also familiar with last summer’s style transfer examples. In “A Neural Algorithm of Artistic Style,” Leon Gatys, Alexander Ecker and Matthias Bethge describe a system that “uses neural representations to separate and recombine content and style of arbitrary images, providing a neural algorithm for the creation of artistic images.” Imagine your vacation photos rendered in the style of Pablo Picasso, or Leonardo da Vinci’s Mona Lisa painted in the style of Vincent van Gogh’s Starry Night. You can see that example directly in “Machines and Metaphors,” a blog post by artist and programmer Gene Kogan.
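In code, “separating content from style” comes down to two loss terms computed from a pretrained network’s activations: a content term that compares feature maps directly, and a style term that compares their Gram matrices (channel-to-channel correlations that capture texture and color statistics while discarding layout). The sketch below is a simplified PyTorch rendering of that idea; the VGG-16 network, layer indices and weights are illustrative assumptions rather than the authors’ exact configuration (the paper uses VGG-19).

```python
import torch
import torch.nn.functional as F
import torchvision.models as models

vgg = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

CONTENT_LAYERS = {15}               # a deeper layer: roughly "what" is in the image
STYLE_LAYERS = {1, 6, 11, 18, 25}   # several layers: textures and color statistics

def features(x):
    """Collect activations at the chosen layers in one forward pass."""
    content, style = {}, {}
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in CONTENT_LAYERS:
            content[i] = x
        if i in STYLE_LAYERS:
            style[i] = x
    return content, style

def gram(feat):
    """Gram matrix: correlations between channels, the paper's stand-in for 'style'."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def style_transfer_loss(generated, content_img, style_img, alpha=1.0, beta=1e4):
    """Total objective: stay close to the photo's content, match the painting's style."""
    gen_c, gen_s = features(generated)
    ref_c, _ = features(content_img)
    _, ref_s = features(style_img)
    content_loss = sum(F.mse_loss(gen_c[i], ref_c[i]) for i in CONTENT_LAYERS)
    style_loss = sum(F.mse_loss(gram(gen_s[i]), gram(ref_s[i])) for i in STYLE_LAYERS)
    return alpha * content_loss + beta * style_loss
```

Minimizing this loss with respect to the pixels of the generated image (starting from the photo or from noise) is what produces the vacation-photo-as-a-Picasso effect.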


Last month, in his talk “The Neural Aesthetic” at Eyeo Festival in Minneapolis, Kogan reviewed the basic concepts and processes behind inceptionism, style transfer, and other machine learning approaches that involve visual output. And then he upped the ante—and blew my mind—with a series of magical moving pictures. Based on the approach and code described in "Artistic style transfer for videos," by Manuel Ruder, Alexey Dosovitskiy and Thomas Brox, Kogan developed a series of style transfer videos rooted in footage he took out the window of New York’s J train. Here’s a peek at a few excerpts. The first renders the video in the style of van Gogh’s Starry Night. The second renders it with Google Maps imagery.
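The wrinkle with video is that stylizing every frame independently makes the “brushstrokes” flicker from one frame to the next. Ruder, Dosovitskiy and Brox address this by adding a temporal consistency term that penalizes the stylized frame for deviating from the previous stylized frame warped along the optical flow between frames. The toy sketch below (building on the style-transfer loss above) illustrates only the basic idea with a plain frame-to-frame penalty; the optical-flow warping and occlusion handling of the actual method are omitted.

```python
import torch
import torch.nn.functional as F

def temporal_consistency_loss(stylized_t, stylized_prev, weight=1e2):
    """Penalize frame-to-frame changes in the stylized output to reduce flicker.
    (The real method first warps stylized_prev along the optical flow between frames.)"""
    return weight * F.mse_loss(stylized_t, stylized_prev)

# During optimization of frame t, the per-frame objective gains one extra term:
#   loss = style_transfer_loss(stylized_t, frame_t, style_img) \
#        + temporal_consistency_loss(stylized_t, stylized_prev)
```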

Kogan notes that he's "fascinated by style transfer in a more general sense—applying a similar technique in the audio domain, on text, or other kinds of media. Style is such an abstract concept to us, which is what makes algorithms inferring it so interesting, even when it's not aesthetically pleasing. If they can capture style, what other kinds of intangible things can they understand?" For more examples of both static and video style transfer, as well as a near real-time style transfer “mirror” (!), see Kogan’s website.

For a few visual explanations of how deep learning works, check out “Unveiling the Hidden Layers of Deep Learning.” Too basic for you? Then move on to Machine Learning for Artists (an in-progress resource from Kogan and Francis Tseng) as well as Kogan’s NYU class notes and videos. For more from the pages of Scientific American, see “Springtime for AI: The Rise of Deep Learning” by Yoshua Bengio. It’s even better when paired with Bengio’s “Large Scale Machine Learning” video.

Jen Christiansen is author of the book Building Science Graphics: An Illustrated Guide to Communicating Science through Diagrams and Visualizations (CRC Press) and senior graphics editor at Scientific American, where she art directs and produces illustrated explanatory diagrams and data visualizations. In 1996 she began her publishing career in New York City at Scientific American. Subsequently she moved to Washington, D.C., to join the staff of National Geographic (first as an assistant art director–researcher hybrid and then as a designer), spent four years as a freelance science communicator and returned to Scientific American in 2007. Christiansen presents and writes on topics ranging from reconciling her love for art and science to her quest to learn more about the pulsar chart on the cover of Joy Division's album Unknown Pleasures. She holds a graduate certificate in science communication from the University of California, Santa Cruz, and a B.A. in geology and studio art from Smith College. Follow Christiansen on X (formerly Twitter) @ChristiansenJen
