Solar at Home

The trials, tribulations and rewards of going solar

How Optical Illusions Can Build a Better Bulb

The views expressed are those of the author and are not necessarily those of Scientific American.





At the SciFoo conference last weekend, brain scientist and illusionmeister Steve Macknik elevated a basic principle of energy conservation—turn off the lights when you don’t need them—to a whole new level. He showed how you can turn off the lights in a way that no one will even notice.

Right now, an AC light bulb turns on and off 50 to 60 times a second, or even faster for a modern ballasted bulb. If you could stretch out the off intervals, you’d save electricity. And if you could do it cleverly, taking the response of the human eye into account, the flickering would not only be imperceptible, but the quality of light might actually improve. “We’re taking advantage of the dynamics of the visual system,” Macknik says.

So far, most efforts to create greener bulbs have focused on the spectrum of light. Incandescent bulbs have long spectral tails, so even at their best, only about a tenth of their energy goes into useful illumination; most gets squandered as infrared radiation. (You can use this handy online calculator to see how much energy a glowing filament emits over different wavelength ranges.) The three main alternatives—fluorescents, LEDs, and a new technology called electron-stimulated luminescence—all emit light by causing a phosphor to glow at certain wavelengths, lopping off the tail. The downside is that spectral engineering takes a toll on color quality. It took me a long process of trial and error to find a good mix of bulbs for our house; I have a big box in my basement with all the CFLs and LEDs I’ve screwed in and back out again. New York Times columnist Bob Tedeschi recently offered his tips on how to match bulb to room.
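
The blackbody arithmetic behind that "about a tenth" figure is easy to check yourself. Here is a rough sketch in Python, assuming an idealized blackbody at a typical incandescent filament temperature of about 2700 K (real tungsten emissivity would shift the numbers somewhat):

```python
import math

H = 6.626e-34      # Planck constant, J s
C = 2.998e8        # speed of light, m/s
K = 1.381e-23      # Boltzmann constant, J/K
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def planck_exitance(lam, T):
    """Spectral exitance of a blackbody at wavelength lam (m), W m^-2 per m."""
    return (2 * math.pi * H * C**2 / lam**5) / math.expm1(H * C / (lam * K * T))

def visible_fraction(T, lo=380e-9, hi=750e-9, n=2000):
    """Fraction of total radiated power in the visible band (trapezoid rule)."""
    step = (hi - lo) / n
    vals = [planck_exitance(lo + i * step, T) for i in range(n + 1)]
    band = step * (sum(vals) - 0.5 * (vals[0] + vals[-1]))
    return band / (SIGMA * T**4)   # total power from Stefan-Boltzmann law

print(f"{visible_fraction(2700):.3f}")   # only a small fraction is visible light
```

Running this gives a visible fraction in the high single digits of a percent for a 2700 K source, consistent with the "about a tenth" figure; the rest is mostly infrared.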

Macknik and his colleague Susana Martinez-Conde, who work at the Barrow Neurological Institute in Phoenix and write a column for Scientific American Mind magazine, propose to explore not the spectral but the temporal dimension. The two of them have made a name for themselves coming up with visual illusions, and they think bulbs could exploit these tricks to shed more light with less power.

They start with the famous illusion that two identical gray dots or squares look different when you put them on different backgrounds. They find they can accentuate the difference by flickering the illumination at a certain rate. “You can make something brighter or dimmer if you change the temporal dynamics appropriately,” Macknik says. This result seems simple enough, but he says visual studies are prone to a lot of subtle biases; volunteers’ perception of light and dark can be skewed by the order in which the experimenters present their test scenes. It took years to come up with a controlled experiment that could tease out the optimal flicker rate.

This illusion can work even if you don’t realize the light is flickering. When a light turns off and back on within 20 milliseconds, it looks as if it never went off; retinal cells detect the change, but the stimuli get blurred as the brain processes them—an effect known as flicker fusion. Yet the gray patches still look brighter or dimmer.

To achieve the highest contrast, Macknik says a bulb would turn on for 70 milliseconds, off for 10, and repeat—an 87.5-percent duty cycle. Straight away, this would use 12.5 percent less electricity. You can wring out another 10 percent or so because the enhanced contrast would let you get away with a lower-wattage bulb.
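
The arithmetic behind those two numbers, as a quick sketch (the 10-percent wattage figure is the article's rough estimate, not a measured value):

```python
ON_MS, OFF_MS = 70, 10                  # Macknik's proposed cycle
duty = ON_MS / (ON_MS + OFF_MS)         # fraction of each cycle the bulb is lit
duty_saving = 1 - duty                  # saved simply by being off part-time
WATTAGE_SAVING = 0.10                   # rough "another 10 percent" from a dimmer bulb
total_saving = 1 - duty * (1 - WATTAGE_SAVING)

print(duty, duty_saving)                # 0.875 0.125
print(f"{total_saving:.4f}")            # about 21 percent combined
```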

The 12.5-hertz power cycle that Macknik proposes flouts conventional wisdom, which holds that such a low frequency would cause an unbearable amount of flickering. No one wants their bedside lamp to become a dance-club strobe light, which is one reason why power companies chose an AC frequency of 50 or 60 hertz to begin with. (If you still have a CRT computer monitor, try changing the refresh rate to see how low a rate you can stomach.)

Ah, but the AC current is a sine wave. Macknik’s claim is that a different waveform eliminates visible flickering even at very low frequencies. Such a waveform could be readily programmed into LED bulbs, which convert AC power to DC and could modulate the DC output as desired.
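
The article gives no driver design, but the waveform it describes is essentially pulse-width modulation at 12.5 Hz with an 87.5-percent duty cycle. A toy sampler in Python (hypothetical parameters, one sample per millisecond) shows the shape such a driver would produce:

```python
def pwm_waveform(period_ms=80, on_ms=70, n_periods=2):
    """One sample per millisecond: 1 while the LED is driven, 0 while dark."""
    return [1 if t % period_ms < on_ms else 0
            for t in range(period_ms * n_periods)]

wave = pwm_waveform()          # two 80 ms cycles: 70 ms on, 10 ms off each
print(len(wave), sum(wave))    # 160 140
```

A real LED driver would implement this in its DC output stage rather than in software, but the on/off pattern is the same.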

A 20-percent savings isn’t all that much, compared to the factor of four you get when changing from an incandescent to a CFL or LED. For me, the bigger lesson is that engineers can draw on the findings of neuroscience and behavior science to fine-tune technology to our needs.

Optical illusion courtesy of Edward H. Adelson

Demonstration of how flickering makes a gray circle look brighter or dimmer. (c) Stephen L. Macknik & Susana Martinez-Conde, All Rights Reserved 1998, 2011

About the Author: George Musser is a contributing editor at Scientific American. He focuses on space science and fundamental physics, ranging from particles to planets to parallel universes. He is the author of The Complete Idiot's Guide to String Theory. Musser has won numerous awards in his career, including the 2011 American Institute of Physics Science Writing Award. Follow him on Twitter @gmusser.


Comments (12)
  1. jtdwyer 10:20 pm 08/18/2011

    The article states:
    “Right now, an AC light bulb turns on and off 50 to 60 times a second, or even faster for a modern ballasted bulb. If you could stretch out the off intervals, you’d save electricity. And if you could do it cleverly, taking the response of the human eye into account, the flickering would not only be imperceptible, but the quality of light might actually improve.”

    What advantage would this approach have over a dimmer switch?

  2. gmusser 8:00 am 08/19/2011

    @jtdwyer: A standard light dimmer changes the voltage, not the AC waveform. What Macknik proposes is closer to a PWM dimmer.

  3. BBHY 10:17 am 08/19/2011

    @gmusser,

    No, a modern light dimmer works by extending the off time. They use either SCRs or thyristors to switch the current on after a variable delay in each AC cycle. Only very old light dimmers work by simply changing the voltage.

    To get the same light output while having an extended off time, the peak power output of the lamp would have to be higher. The overall power consumed for the same light output is then about the same, and likely a bit worse for the dimmed version, since some power will be wasted in the control circuit.

    This seems like a faulty idea as far as I can tell. Maybe there is something that was left out of the explanation.

  4. jtdwyer 11:13 am 08/19/2011

    gmusser: wouldn’t simply reducing the voltage also ‘save electricity’? What real advantage would there be in reducing cycle time?

  5. davyduck 11:14 am 08/19/2011

    @gmusser,

    Your article is misleading; the savings for INCANDESCENTS are illusory. Incandescent filaments glow visibly because they are hot. The frequency of the power input would have to be very low for the filament temperature to vary appreciably, so low that it would be quite noticeable to the human eye. Worse still, if you’d allow the filament to cool off, the “inrush” current increases considerably, so much so that you would actually DECREASE the efficiency. Your scheme might be effective for LEDs, but it would be insignificant compared with the basic efficiency advantage LEDs have over incandescents.

    BTW, @BBHY is quite right about modern dimmers being PWM.

  6. gmusser 12:14 pm 08/19/2011

    @BBHY @davyduck: Well, I did distinguish “standard” dimmers from PWM, but I take your point that the latter has become more common.

    Even incandescents flicker discernibly at low line frequencies, hence the choice of the current AC standards. But you’re right that Macknik is really thinking in terms of solid-state lighting.

  7. Dannyoz 3:57 am 08/21/2011

    You are all right, guys, about the PWM (otherwise the dimmer would get very hot…) and the persistence of the filament.
    I would like to point out another mistake – the AC lamps turn on and off 100 or 120 times a second, twice in each cycle!

  8. Joseph C Moore, Cpo USN Ret 7:44 am 08/21/2011

    I googled sci foo to find out that foo stands for friends of O’Reilly. Then I googled O’Reilly Media to find out that it is owned by that Neo-Con, Bill O’Reilly. Democrat sponsor or Neo-Con sponsor – I am NOT thrilled with either. Both parties are disingenuous and forcing our Republic into Socialism.

  9. Bora Zivkovic 3:05 pm 08/21/2011

    Nope. Tim O’Reilly. Not Bill.

  10. gmusser 8:27 pm 08/21/2011

    I’m just baffled that people accuse my post of these errors. Is no one actually reading what I’ve written? It stated explicitly that this principle makes most sense for LED lighting!

    What is more, filament persistence notwithstanding, the simple empirical fact is that incandescent light flicker is noticeable for AC line frequencies below about 40 Hz. The filament doesn’t have to cool appreciably for the flicker to be seen.

  11. odyssoma 8:51 pm 08/29/2011

    Filament flicker is noticeable at 40 Hz. I can remember when some Canadian power was delivered at 40 Hz, and we US folks could see the flicker. (Never did ask a Canadian. Any of you out there who can remember this?)

  12. macknik 5:52 pm 09/6/2011

    Hi all,
    I thought I would chime in to clear the air, as there are factual errors in most comments thus far. The main thing to keep in mind is that we reported that human visual perception peaks at a specific range of stimulus flash duration, which means that at durations that are higher, power is unnecessarily spent in maintaining the light. Our report at Sci Foo stated that each pulse of light should have an optimal duration (whether powered by AC or DC is irrelevant), and in the case of continuously flickering light sources, each cycle should have an off-time that promotes flicker fusion (> 17 msec) and which further saves power. To put it another way: light looks brighter if you tune it to the temporal dynamics of the brain, so we should do that (nobody has ever done that). This is about human perception: not electrical engineering.

    1) @jtdwyer: See above –> the technology allows you to turn down the voltage while maintaining the same brightness (so long as the flicker rate and duty-cycle are optimal)… thus you save money because the voltage is lower for the same perception.

    2) @BBHY: If the on-time is optimized, then having the longest off-time possible per cycle (while still maintaining flicker fusion) will result in increased energy savings because ***you do not have to increase the voltage, since the perceptual effect in the brain increases the brightness for you for free***.

    3) @davyduck: You are factually incorrect. The visual system does indeed follow the flicker of incandescent (or fluorescent) lighting, as evidenced by physiological recordings in the brain. Vision scientists must use special DC-driven light bulbs in the lab when recording from visual neurons, or they follow the flicker readily. Edison further showed that the perceptual flicker fusion threshold is between 24-48 Hz, depending on the lighting conditions and duty-cycle. Our technology shows how to create flicker-free low-rate flicker that optimizes visual perception (and, as a by-product, saves energy).

    4) @Dannyoz: correct, incandescent and fluorescent photon production is full-wave rectified with respect to the power grid. The brain can still perceive the flicker.

    5) @gmusser: Actually, technically, PWM LED lighting is AC lighting. Thus the technology suggests that all lighting should be made AC with a specific waveform that promotes enhanced brightness in the brain.

    6) @odyssoma: thank you. You are absolutely correct. Edison was the first to really work out a flicker-fusion threshold of about 48 Hz in movies.

    I cannot get into more detail as the paper has not yet been published in a peer-reviewed journal (I must stay within the bounds of what we reported at Sci Foo or violate embargo rules at the journals). But thank you all for the interest, and I’ll be happy to discuss this further after the paper is published!


