Guest Blog

Commentary invited by editors of Scientific American

Researchers Discover Hacker-Ready Computer Chips

The views expressed are those of the author and are not necessarily those of Scientific American.





Computer Chip X-Ray

A pair of security researchers in the U.K. have released a paper [PDF] documenting what they describe as the “first real world detection of a backdoor” in a microchip—an opening that could allow a malicious actor to monitor or change the information on the chip. The researchers, Sergei Skorobogatov of the University of Cambridge and Christopher Woods of Quo Vadis Labs, concluded that the vulnerability made it possible to reprogram the contents of supposedly secure memory and obtain information regarding the internal logic of the chip. I discussed the possibility of this type of hardware vulnerability in the August 2010 Scientific American article “The Hacker in Your Hardware.”

The security breach is a particular concern because of the type of chip involved. The affected chip, a ProASIC3 A3P250, is a field programmable gate array (FPGA). These chips are used in an enormous variety of applications, including communications and networking systems, the financial markets, industrial control systems, and a long list of military systems. Each customer configures an FPGA to implement a unique—and often highly proprietary—set of logical operations. For example, a customer in the financial markets might configure an FPGA to make high speed trading decisions. A customer in aviation might use an FPGA to help perform flight control. Any mechanism that could allow unauthorized access to the internal configuration of an FPGA creates the risk of intellectual property theft. In addition, the computations and data in the chip could be maliciously altered.
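To make the idea of “configuring” an FPGA concrete, here is a toy software model (a sketch only; real FPGAs are configured from hardware description languages through vendor toolchains, not Python). Loosely, an FPGA is a fabric of small lookup tables, and the configuration bitstream is simply the contents of those tables plus the wiring between them:

    # Toy model of FPGA configuration: a lookup table (LUT) is a tiny
    # programmable truth table; the "bitstream" is simply its contents.
    # Illustrative only -- not how any real device or toolchain works.

    class LUT2:
        """A 2-input lookup table configured by four truth-table bits."""
        def __init__(self, truth_bits):
            assert len(truth_bits) == 4
            self.table = list(truth_bits)

        def __call__(self, a, b):
            # Index the table with the input pair (a, b), each 0 or 1.
            return self.table[(a << 1) | b]

    # "Configuring" the chip = loading truth-table bits into the LUTs.
    xor_lut = LUT2([0, 1, 1, 0])   # output = a XOR b
    and_lut = LUT2([0, 0, 0, 1])   # output = a AND b

    def half_adder(a, b):
        """Two configured LUTs wired together: the sum and carry of a + b."""
        return xor_lut(a, b), and_lut(a, b)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, half_adder(a, b))

A backdoor that exposes the configuration would, in this analogy, let an outsider read out or rewrite those truth tables (the customer’s proprietary design) after the product has shipped.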

Assuming that the researchers’ claims stand up to scrutiny, at least two important questions immediately arise. First, how did this vulnerability end up there in the first place? Second, what does it mean?

Regarding the source of the backdoor, some people are hinting that Chinese sources may be to blame. But, as Robert Graham of Errata Security explains in a blog post titled “Bogus story: no Chinese backdoor in military chip,” it’s premature to point fingers:

. . . it’s important to note that while the researchers did indeed discover a backdoor, they offer only speculation, but no evidence, as to the source of the backdoor.

And, as Graham also observed, the term “military chip” can be misleading, since these chips are used in a wide variety of applications, many of them unrelated to the military.

It’s possible that this vulnerability was inserted at the behest of a nation state. But it’s also possible that the backdoor is due to carelessness, not malice. Someone in the design process could have inserted the backdoor to enable testing, without realizing that it would later be discovered, publicized, and potentially exploited.

Regardless of the source of the vulnerability, its presence should serve as a wake-up call about the importance of hardware security. Cybersecurity, of course, is a well-recognized concern. Yet the overwhelming majority of cybersecurity vulnerabilities identified to date have involved software, the set of instructions that tells a chip or system how to perform a task. Software can be replaced, updated, altered, and downloaded from the Internet. By contrast, a hardware vulnerability is built into the actual circuitry of a chip. As a result, it can be very difficult to address without replacing the chip itself.

This certainly won’t be the last time a hardware security vulnerability is identified. As chips continue to grow more complex, hardware security flaws—whether malicious or accidental—will increasingly become part of the cybersecurity landscape. We should put pre-emptive measures in place to minimize the risks they might pose.

 

Photo by tjmartens on Flickr

About the Author: John Villasenor is a nonresident senior fellow at the Brookings Institution and a professor of electrical engineering at UCLA.







22 Comments

  1. HansPL 7:06 pm 05/29/2012

    “… there are at least two important questions that immediately get raised. First, how did this vulnerability end up there in the first place?”

    Wouldn’t it have been so terrible if the author (John Villasenor) had told us the origin of the chip, or at least that the researchers did not reveal the origin of the chip? What a darn strange article this is! Like “There is a product out there that is dangerous for your children, but we can’t tell you which product it is.”

    And where the heck is “Quo Vadis Labs”? Would it have hurt to tell us that?

    I, for one, really, really miss the editors that read through all articles and alerted the authors to inconsistencies and errors.

    Hans Leander
    Cleveland, Ohio, USA

  2. stallagmite 7:28 pm 05/29/2012

    Easy Sir. This we fight for the last 12 years. Imagine the whole portfolio of advantages. Somebody wants that. Switch off remotely your PC ventilator, and the whole device is immediately gone. The backup can be done remotely as well. All I need is slantwise eyes and Bangalore friends.
    Ontario

  3. GG 2:33 am 05/30/2012

    Since the author did not bother to say…these chips are manufactured by Actel, a Chinese manufacturer in the city of Dongguan, in the province of Guangdong, communist China. Actel is partially owned (you can only own a maximum of 49% in China) by Microsemi, which is an American company listed on the Nasdaq:MSCC.

  4. jtdwyer 7:57 am 05/30/2012

    While the significance of this finding for the computer and electronics industry cannot be dismissed, it’s extremely difficult for a casual reader to assess this report’s general significance for electronic devices.

    As I understand it, these chips are generally not used as the instruction processor on a personal computing device. They are field-programmed to provide very specific support processing, such as the processor on a graphics display card. They are not typically used to run Windows, for example. These devices are typically programmed by a component manufacturer, installed in some electronic component, and forgotten.

    Access to the discovered back door would typically have to occur during the component manufacturing process. These types of devices are not typically reprogrammed in the field, which is one of the problems mentioned in the study – there is generally no way to correct the programming of the chip once it has been distributed – which also generally implies that there’s no way to access the back door in a field installed unit. Anyone, please correct me if I’m wrong here…

    Please see http://en.wikipedia.org/wiki/Fpga

  5. Michael Moyer 8:56 am 05/30/2012

    Thanks for your comment, Hans. That’s my fault: The company name was removed during editing. The manufacturer of the chip is the Actel Corporation, which was acquired by the Microsemi Corporation in 2010. I have added a link to the specific product on the Microsemi site. Quo Vadis Labs is a corporate spinoff of research at the University of Cambridge. http://www.quovadislabs.com/

  6. Michael Moyer 9:05 am 05/30/2012

    I’m not sure where you’re getting your information, but Actel was founded in Silicon Valley in 1985 by Dr. Amr Mohsen, a graduate of Caltech who had worked at Bell Laboratories and Intel. Actel was headquartered in San Jose, Calif., until it was acquired by Microsemi in 2010.

  7. John-sensei 9:56 am 05/30/2012

    jtdwyer,

    “Access to the discovered back door would typically have to occur during the component manufacturing process. These types of devices are not typically reprogrammed in the field, which is one of the problems mentioned in the study – there is generally no way to correct the programming of the chip once it has been distributed – which also generally implies that there’s no way to access the back door in a field installed unit. Anyone, please correct me if I’m wrong here…”

    You’re wrong. All you need to access this back door in the field is the ability to poke the JTAG programming pins. The board manufacturer may even provide the interface to do this remotely.
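    Roughly, “poking the JTAG pins” means driving the chip’s test-access-port signals (TCK, TMS, TDI) and reading TDO. A toy bit-banged sketch of the idea, assuming a hypothetical GPIO interface and made-up pin numbers (the gpio_write/gpio_read stubs stand in for whatever access a real board provides):

        # Toy illustration of driving a JTAG test access port by hand.
        # Pin numbers and the gpio_* functions are hypothetical stand-ins.

        TCK, TMS, TDI, TDO = 1, 2, 3, 4      # made-up pin assignments

        def gpio_write(pin, value):          # stub: drive a pin high or low
            pass

        def gpio_read(pin):                  # stub: sample a pin
            return 0

        def clock(tms, tdi=0):
            """One JTAG clock: set TMS/TDI, pulse TCK, sample TDO."""
            gpio_write(TMS, tms)
            gpio_write(TDI, tdi)
            gpio_write(TCK, 1)
            bit = gpio_read(TDO)
            gpio_write(TCK, 0)
            return bit

        def read_idcode():
            """Reset the TAP, walk to Shift-DR, shift out the 32-bit IDCODE."""
            for _ in range(5):               # five TMS=1 clocks force Test-Logic-Reset
                clock(1)
            for tms in (0, 1, 0, 0):         # Idle -> Select-DR -> Capture-DR -> Shift-DR
                clock(tms)
            idcode = 0
            for i in range(32):              # shift 32 bits out, least significant first
                idcode |= clock(0) << i
            return idcode

        print(hex(read_idcode()))

    Reading out an ID code is harmless in itself; the point is that the same electrical access, combined with knowledge of a backdoor, could be used to read or rewrite what is supposed to be protected configuration.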

  8. GG 10:32 am 05/30/2012

    AcTel Electronic (Dong Guan) Co., Ltd.
    No.10-11, (Hu Pan Rd) 138 Industrial District, Tang Xia Town, Dong Guan City, Guang Dong Province, China
    Zip Code: 523710
    Tel: 86-769-8791-3815
    Fax: 86-769-8772-8464

    Is this the company in question?

  9. Michael Moyer 11:13 am 05/30/2012

    No, I believe that is a manufacturing site for AcBel Polytech Inc. (Note the “B”) http://www.acbel.com/eng/About_AcBel.aspx?Group=42&&sd=g4

  10. Windontree 11:15 am 05/30/2012

    It might not be rational, but I have stayed away from Chinese manufacturers like Lenovo (after they bought out IBM’s PC division) just on the abnormal fear of compromised equipment. China has this historical/ingrained disdain and distrust for the West that overrides all the monetary rewards they get from dealing with it. Looks like I am not so paranoid after all.

  11. villasenor 11:16 am 05/30/2012

    GG – Thanks for your comments. At the time the Microsemi agreement to acquire Actel was announced (October 2010), Actel was a NASDAQ listed company headquartered in San Jose, CA. Please see this pdf announcing the agreement:

    http://www.actel.com/company/press/files/microsemi_to_acquire_actel.pdf

  12. jtdwyer 12:15 pm 05/30/2012

    John-sensei – thanks for clarifying. However, if the board does not provide direct support for remote access to the chip, wouldn’t the back door be effectively inaccessible?

    Even if the board provided remote communications support, for a graphics display card, for example, wouldn’t the user generally be required to provide physical connection to a network, unless the board manufacturer provided for some RF connection? For most applications (except for perhaps auto/aerospace platform wireless network configurations), wouldn’t this be an otherwise unnecessary expense for the manufacturer?

    I would think that display, sound and disk controller manufacturers would not usually provide such direct communications capabilities…

  13. John-sensei 12:33 pm 05/30/2012

    “John-sensei – thanks for clarifying. However, if the board does not provide direct support for remote access to the chip, wouldn’t the back door be effectively inaccessible?”

    The biggest reason for the security features of the chip is to prevent someone with physical access from reading the firmware. The firmware may contain encryption keys or other secrets.

    “Even if the board provided remote communications support, for a graphics display card, for example, wouldn’t the user generally be required to provide physical connection to a network, unless the board manufacturer provided for some RF connection? For most applications (except for perhaps auto/aerospace platform wireless network configurations), wouldn’t this be an otherwise unnecessary expense for the manufacturer?”

    For some products, the manufacturer provides firmware updates or upgrades via remote connection. They therefore provide some data path to the JTAG pins.

  14. greerite 1:24 pm 05/30/2012

    SORRY….

    But I think you guys are missing the point here.

    If this chip is made this way, it means other chips can be too.

    Would you like to discover this fact after you fired a missile or smart bomb at a Chinese target only to see that device turn around and fly back to where it was launched?

  15. jtdwyer 1:32 pm 05/30/2012

    “The biggest reason for the security features of the chip is to prevent someone with physical access from reading the firmware. The firmware may contain encryption keys or other secrets.”

    Pardon my persistence, but I do not understand how such capability to access storage encryption keys could be leveraged by some malicious foreign agent with direct physical access to the chip to accomplish widespread havoc. A demonstration of actual harmful effects would be much more convincing.

    It does seem to me that there is an enormous potential for offering hardware security services for naive customers to guard against some unseen threat. Don’t get me wrong, the potential risks of hardware ‘backdoor’ access points exist, but I fail to see the suggested widespread potential impact of such openings without significant malicious software access to take advantage of such exposures. I suspect the identified risk exposure is being overstated – if not, then why not demonstrate the potential impact?

  16. John-sensei 3:09 pm 05/30/2012

    “Pardon my persistence, but I do not understand how such capability to access storage encryption keys could be leveraged by some malicious foreign agent with direct physical access to the chip to accomplish widespread havoc. A demonstration of actual harmful effects would be much more convincing.”

    If the secret is common among a group of physical chips, then if you can coax the secret out of one in your possession, you may then be able to use the secret in other ways. It might be the key to an encrypted communication link. Or it might encode heuristics for the behavior of a military device, and knowledge of those could lead to countermeasures. Or the firmware may be a trade secret. Maybe you want to create counterfeit hardware. Lots of possibilities: that’s why the chips are supposed to have the firmware securely locked up.
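    As a toy illustration of why one extracted secret can matter fleet-wide (standard-library Python; the key, command, and scenario are all made up): if every unit authenticates commands with the same HMAC key, then recovering that key from a single device lets an attacker forge commands that every other device will accept.

        import hashlib
        import hmac

        # Hypothetical scenario: every fielded unit ships with the same key
        # baked into its firmware and accepts any command with a valid tag.
        SHARED_KEY = b"key-baked-into-every-unit"   # made-up value

        def sign(key, command):
            """Compute the authentication tag a device expects."""
            return hmac.new(key, command, hashlib.sha256).hexdigest()

        def device_accepts(command, tag):
            """What each unit in the fleet does when a command arrives."""
            expected = sign(SHARED_KEY, command)
            return hmac.compare_digest(expected, tag)

        # An attacker who coaxes the key out of ONE unit in their possession
        # can forge commands that EVERY unit sharing that key will accept.
        extracted_key = SHARED_KEY               # say, read out via the backdoor
        forged_command = b"disable-safety-interlock"
        forged_tag = sign(extracted_key, forged_command)
        print(device_accepts(forged_command, forged_tag))   # prints True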

  17. r0b3m4n 4:25 pm 05/30/2012

    @ Greer

    “Would you like to discover this fact after you fired a missile or smart bomb at a Chinese target only to see that device turn around and fly back to where it was launched?”

    In the US this is not a problem, since all military supply sources must be made in America; that goes for components for a subs-sub even. If the gov’t didn’t require this, there would be no more printed circuit board shops in America; they would all be made in China. This is one of the reasons why our war machine is so expensive – we only buy American. On the plus side, we have the best security and quality on the planet. You don’t see our nuclear plants getting hacked into…

  18. Momus 11:36 pm 05/30/2012

    @jtdwyer > I do not understand

    Correct. You don’t understand. Your false claims have been responded to, and yet you persist in speaking from ignorance. And you comment like this on almost every SciAm article. If you want to learn, ask questions. Don’t keep contradicting, from a position of ignorance, those who know much more than you do.

    John-sensei explained to you that if the back door is a way, undocumented by the manufacturer, to read the chip’s program or to change its configuration, then whoever uses such a chip in their system may unknowingly leave the door open and unmonitored. It doesn’t mean that every system with such a chip can be broken into or maliciously used. It means that some can… That is bad enough.

  19. jtdwyer 4:51 am 05/31/2012

    Momus –
    This article states:
    “The security breach is a particular concern because of the type of chip involved. The affected chip, a ProASIC3 A3P250, is a field programmable gate array (FPGA). These chips are used in an enormous variety of applications, including communications and networking systems, the financial markets, industrial control systems, and a long list of military systems. Each customer configures an FPGA to implement a unique—and often highly proprietary—set of logical operations. For example, a customer in the financial markets might configure an FPGA to make high speed trading decisions. A customer in aviation might use an FPGA to help perform flight control. Any mechanism that could allow unauthorized access to the internal configuration of an FPGA creates the risk of intellectual property theft. In addition, the computations and data in the chip could be maliciously altered.”

    None of the above could be achieved without complicit software to perform complex manipulations using “secret” information obtained through the rudimentary hardware back door. A back door alone cannot produce any damage.

    In my distant past I have been a principal designer in the development of a commercial system security software product, and have extensive experience as an operating systems programmer. I’ve also consulted for many years with large computer manufacturers on technical hardware and system software product requirements. While I do not have any experience with FPGA devices, I may understand more than you might be able to recognize, from a different perspective from those who do have such experience.

    While the source of the back door circuitry found on this chip hasn’t been identified, disregarding John-sensei’s conjecture, it’s implied here that this is not an intentional maintenance interface included by the manufacturer that ‘might be unknowingly left open’ by a component manufacturer during the configuration process. IMO, there are some important considerations involved in this issue that you have misunderstood. Thanks for your advice, I’m sure.

  20. greerite 7:34 am 06/2/2012

    SORRY …

    But the military could be the primary target for these chips; see the attached link.

    http://articles.businessinsider.com/2011-06-27/news/30048253_1_microchips-missiles-foreign-chip-makers

  21. jtdwyer 8:52 am 06/2/2012

    greerite – The 2011 story about a 2010 incident has little to do with this article. Could be anything…

  22. staudenmaier 3:03 pm 07/24/2012

    Begging your pardon, but the term “backdoor” is always used to denote an INTENTIONAL security flaw.

