Ernest Worthman and Ed Sperling of Semiconductor Engineering recently co-authored a fascinating article that explores the surprising origins of back doors in the technology sector.
“One of the first open references [to back doors] was in the 1983 movie WarGames,” the two explained. “In reality, modern back doors predate Hollywood’s discovery by about 20 years, starting in 196[4] with a time-sharing OS called Multics.”
According to Wikipedia, Multics implemented a single-level store for data access, discarding the clear distinction between files (aka segments) and process memory. More specifically, the memory of a process consisted solely of segments that were mapped into its address space. To read or write to them, the process used normal CPU instructions, while the operating system ensured that all modifications were saved to disk.
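For a rough feel of what this looks like in practice, modern memory-mapped file I/O is the closest everyday analogue to a single-level store: a file is read and written with ordinary memory operations, and the OS persists the changes. The sketch below (illustrative only, not Multics code; the file name is invented) uses Python's mmap module:

```python
import mmap
import os
import tempfile

# Create a small file standing in for a Multics-style "segment."
path = os.path.join(tempfile.mkdtemp(), "segment.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)  # 4 KiB of zeroed storage

# Map the file into the process's address space and modify it
# with plain memory writes -- no explicit read/write file I/O.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 0) as segment:
        segment[0:5] = b"hello"  # ordinary slice assignment
        segment.flush()          # OS writes the change back to disk

# The modification is now visible through the normal file interface.
with open(path, "rb") as f:
    print(f.read(5))  # b'hello'
```

The key idea, as in Multics, is that the program never issues an explicit file read or write: the mapping makes persistent storage look like memory.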
As Rambus Cryptography Research Fellow Pankaj Rohatgi notes, the U.S. Air Force “discovered” the concept of creating back doors during an evaluation of the Multics operating system. To be sure, Paul Karger and Roger Schell inserted trap doors as part of the review process, using them for system testing and to probe for potential security vulnerabilities.
“[This] sequence of code when properly invoked provides the penetrator with the needed tools to subvert the system,” Karger and Schell wrote in a USAF paper. “Such a trap door must be well hidden to avoid accidental discovery by the system maintenance personnel.”
According to Rohatgi, there are now many types of back doors.
“The most obvious is some extraneous process that is running or some extra code that has been inserted that isn’t part of the normal programming,” Rohatgi told Semiconductor Engineering. “The more sophisticated it is, the harder it is to detect. The best back doors are in the binary. That way it will never show up in the source code. There is also the concept of putting back doors in as a crypto implementation.”
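As a purely hypothetical illustration of the “extra code that isn’t part of the normal programming” Rohatgi describes, consider a login routine with a hidden master password (all names, credentials, and logic here are invented for the sketch):

```python
import hmac

# Illustrative credential store -- not a real system.
USERS = {"alice": "s3cret"}

def check_login(user: str, password: str) -> bool:
    """Authenticate a user against the credential store."""
    # Back door: a hard-coded master password grants access for ANY user.
    # This single extra line is exactly the kind of inserted code a source
    # audit must catch -- and, as Rohatgi notes, if it were patched into
    # the compiled binary instead, the source would look clean.
    if hmac.compare_digest(password, "joshua"):  # a nod to WarGames
        return True
    expected = USERS.get(user)
    return expected is not None and hmac.compare_digest(password, expected)

print(check_login("alice", "s3cret"))    # legitimate login succeeds
print(check_login("mallory", "joshua"))  # back door grants access
print(check_login("mallory", "guess"))   # normal failure path
```

The comparison uses `hmac.compare_digest` rather than `==` simply to model constant-time credential checks; the point of the sketch is how small and innocuous-looking a source-level back door can be.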
Although back doors can be used to compromise a device’s firmware or security mechanisms, not all were designed with devious intent.
“In fact, many so-called [hardware] back doors are design flaws, which can then be used to compromise a chip’s security,” Worthman and Sperling stated. “[However], given the sheer complexity of large SoCs, not to mention the increased use of third-party IP in many designs, it’s also possible to build in extra circuitry that can allow an outsider to take control of a device.”
According to Worthman and Sperling, the practice of inserting back doors into chips has clearly become a highly controversial “science” since the Cold War days of Multics.
Interested in learning more? The full text of “Back Doors Are Everywhere” by Ernest Worthman and Ed Sperling can be read here.