First appeared on 23 June 2015
“I have this microphone sitting on top of my monitor. It looks just like a mini-camera to me. It has an ‘EYE’ right in the middle, with two speakers on either side. I am wondering if this so-called ‘microphone’ is actually looking at me and the room in the background? Could it be a combined camera and microphone? Is it secretly spying on me in the privacy of my room, watching every expression on my face and recording every whisper? This is scary! I’m getting worried!”
— Paranoid lady to her psychiatrist.
We documented earlier today that – if you are near your smartphone – the NSA or private parties could remotely activate your microphone and camera and spy on you.
This post shows that the same is true for our computers.
As far back as 1999, the NSA had built backdoors into the world’s most popular software program – Microsoft Windows.
A top expert in the ‘microprocessors’ or ‘chips’ inside every computer – having helped start two semiconductor companies and a supercomputer firm – says:
He would be “surprised” if the US National Security Agency was not embedding “back doors” inside chips produced by Intel and AMD, two of the world’s largest semiconductor firms, giving them the possibility to access and control machines.
And a government expert told the Washington Post that the government “quite literally can watch your ideas form as you type” (confirmed). Even that is just “the tip of the iceberg”, according to a congress member briefed on the NSA’s spying program.
The New York Times reported in 2011 that German police were using spyware to turn on the webcam and microphone on people’s computers:
A group that calls itself the Chaos Computer Club prompted a public outcry here recently when it discovered that German state investigators were using spying software capable of turning a computer’s webcam and microphone into a sophisticated surveillance device.
The club … announced last Saturday it had analyzed the hard drives of people who had been investigated and discovered that they were infected with a Trojan horse program that gave the police the ability to log keystrokes, capture screenshots and activate cameras and microphones.
Reuters documented last year that the U.S. and Israeli governments can remotely turn on a computer’s microphone:
Evidence suggests that the virus, dubbed Flame, may have been built on behalf of the same nation or nations that commissioned the Stuxnet worm that attacked Iran’s nuclear program in 2010 [i.e. the U.S. and Israel], according to Kaspersky Lab, the Russian cyber security software maker that took credit for discovering the infections.
Kaspersky researchers said they have yet to determine whether Flame had a specific mission like Stuxnet, and declined to say who they think built it.
Cyber security experts said the discovery publicly demonstrates what experts privy to classified information have long known: that nations have been using pieces of malicious computer code as weapons to promote their security interests for several years.
The virus contains about 20 times as much code as Stuxnet, which caused centrifuges to fail at the Iranian enrichment facility it attacked. It has about 100 times as much code as a typical virus designed to steal financial information, said Kaspersky Lab senior researcher Roel Schouwenberg.
Flame can gather data files, remotely change settings on computers, turn on PC microphones to record conversations, take screen shots and log instant messaging chats.
Kaspersky Lab said Flame and Stuxnet appear to infect machines by exploiting the same flaw in the Windows operating system and that both viruses employ a similar way of spreading.
“The scary thing for me is: if this is what they were capable of five years ago, I can only think what they are developing now,” said Mohan Koo, managing director of Dtex Systems, a British-based cyber security company.
PC Magazine tech columnist John Dvorak writes:
From what we know, the NSA has back door access into Apple, Microsoft, and Google. What kind of access we don’t know, but let us assume it is similar to what they did about 7 years ago to AT&T. They had a secret room at Folsom St. in San Francisco, and the AT&T engineers had no control over – and no access to – a room full of NSA equipment that had direct access to everything AT&T could do.
Microsoft is the source of the operating system for Windows and Windows cell phones. Apple controls the OS for Macs, iPhones, and iPads. Google controls the Chrome OS, Chrome Browser, and Android cell phones. The companies push operating system upgrades and security updates to users on a regular basis.
Imagine, however, that the NSA has access to these updates at the source and has the ability to alter these updates in order to install some sort of spyware on your phone, tablet, or computer. The software could turn on your camera or microphone remotely, read all your private data, or erase everything and brick your phone or computer.
The Wall Street Journal notes:
The FBI develops some hacking tools internally and purchases others from the private sector. With such technology, the bureau can remotely activate the microphones in phones running Google Inc.’s Android software to record conversations, one former U.S. official said. It can do the same to microphones in laptops without the user knowing, the person said.
A former high-level NSA insider confirmed to us that any computer’s microphone can be remotely accessed.
Moreover – as documented by Microsoft, Ars Technica, CNET, the Register, Sydney Morning Herald, AP and many other sources – private parties can turn on your computer’s microphone and camera as well.
Cracked noted in 2010:
All sorts of programs are available to let you remotely commandeer a webcam, and many of them are free. Simple versions will just take photos or videos when they detect movement, but more complex software will send you an e-mail when the computer you’ve installed the program on is in use, so you can immediately login and control the webcam without the hassle of having to stare at an empty room until the person you’re stalking shows up.
The bottom line is that – as with your phone, OnStar-type system or other car microphone, Xbox, and other digital recording devices – you shouldn’t say or do anything near your computer that you don’t want shared with the world.
Postscript: You could obviously try to cover your webcam and microphone when you don’t want to use them.
But if you really want privacy, take a lesson from spy movies: Go swimming with the person you want to speak with … since electronics can’t operate in water.
by Rick Falkvinge
Yesterday, news broke that Google has been stealth downloading audio listeners onto every computer that runs Chrome, and transmits audio data back to Google. Effectively, this means that Google had taken itself the right to listen to every conversation in every room that runs Chrome somewhere, without any kind of consent from the people eavesdropped on. In official statements, Google shrugged off the practice with what amounts to “we can do that”.
It looked like just another bug report. “When I start Chromium, it downloads something.” Followed by strange status information that notably included the lines “Microphone: Yes” and “Audio Capture Allowed: Yes”.
Without consent, Google’s code had downloaded a black box of code that – according to itself – had turned on the microphone and was actively listening to your room.
A brief explanation of the Open-source / Free-software philosophy is needed here.
When you install a version of GNU/Linux like Debian or Ubuntu onto a fresh computer, thousands of really smart people have already analyzed every line of human-readable source code before that operating system was built into computer-executable binary code, making it common and open knowledge what the machine actually does, instead of requiring trust in corporate statements about what it’s supposed to be doing. Therefore, you don’t install black boxes onto a Debian or Ubuntu system; you use software repositories that have gone through this source-code audit-then-build process. Maintainers of operating systems like Debian and Ubuntu use many so-called “upstreams” of source code to build the final product.
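The mechanical core of this audit-then-build trust model can be sketched simply: a repository publishes (and cryptographically signs) the checksum of each binary it built from audited source, and your package manager refuses anything that doesn’t match. The sketch below is a minimal, hypothetical illustration of the checksum step only – the file name and contents are invented stand-ins, and real package managers (e.g. apt) additionally verify GPG signatures on the checksum lists themselves.

```python
import hashlib

def verify_package(path, expected_sha256):
    """Return True iff the file's SHA-256 digest matches the published value.

    This is the basic check a package manager performs before installing:
    a binary that doesn't match the digest of the audited build is rejected.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256

# Demo with a stand-in "package" file:
with open("demo.deb", "wb") as f:
    f.write(b"audited build output")

published = hashlib.sha256(b"audited build output").hexdigest()
print(verify_package("demo.deb", published))   # True: matches the audited build
print(verify_package("demo.deb", "0" * 64))    # False: tampered or unknown binary
```

A black box downloaded outside this pipeline – as in the Chromium case below – simply never passes through any such check.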
Chromium, the open-source version of Google Chrome, had abused its position as trusted upstream to insert lines of source code that bypassed this audit-then-build process, and which downloaded and installed a black box of unverifiable executable code directly onto computers, essentially rendering them compromised. We don’t know and can’t know what this black box does. But we see reports that the microphone has been activated, and that Chromium considers audio capture permitted.
This was supposedly to enable the “Ok, Google” behavior – that when you say certain words, a search function is activated. Certainly a useful feature. Certainly something that enables eavesdropping on every conversation in the entire room, too.
Obviously, your own computer isn’t the one to analyze the actual search command. Google’s servers do. Which means that your computer had been stealth configured to send what was being said in your room to somebody else, to a private company in another country, without your consent or knowledge, an audio transmission triggered by… an unknown and unverifiable set of conditions.
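The structure being described can be sketched abstractly. In the sketch below, everything is an invented stand-in (Chromium’s actual module is precisely the unverifiable black box at issue, and real hotword detection works on audio, not text); the structural point it illustrates is that to catch the trigger phrase at all, the detector must process every moment of audio locally, and whatever follows the trigger is shipped off the machine.

```python
TRIGGER = "ok google"

def hotword_gate(frames, send):
    """Hypothetical sketch of trigger-word gating.

    'frames' stands in for a continuous stream of captured audio;
    'send' stands in for transmission to a remote server.
    """
    listening = False
    for frame in frames:
        if listening:
            send(frame)               # post-trigger audio leaves the machine
        elif TRIGGER in frame.lower():
            listening = True          # the detector had to hear every frame to get here

sent = []
hotword_gate(
    ["idle chatter", "more chatter", "OK Google", "weather tomorrow"],
    sent.append,
)
print(sent)   # ['weather tomorrow'] – yet all four frames were processed locally
```

Only the post-trigger frame is “transmitted”, but the gate necessarily examined everything said in the room to find the trigger – which is exactly the objection raised below.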
Google had two responses to this. The first was to introduce a practically-undocumented switch to opt out of this behavior, which is not a fix: the default install will still wiretap your room without your consent, unless you opt out, and more importantly, know that you need to opt out – which is in no way a reasonable requirement. But the second was more of an official statement following technical discussions on Hacker News and other places. That official statement amounted to three parts (paraphrased, of course):
1) Yes, we’re downloading and installing a wiretapping black-box to your computer. But we’re not actually activating it. We did take advantage of our position as trusted upstream to stealth-insert code into open-source software that installed this black box onto millions of computers, but we would never abuse the same trust in the same way to insert code that activates the eavesdropping-blackbox we already downloaded and installed onto your computer without your consent or knowledge. You can look at the code as it looks right now to see that the code doesn’t do this right now.
2) Yes, Chromium is bypassing the entire source code auditing process by downloading a pre-built black box onto people’s computers. But that’s not something we care about, really. We’re concerned with building Google Chrome, the product from Google. As part of that, we provide the source code for others to package if they like. Anybody who uses our code for their own purpose takes responsibility for it. When this happens in a Debian installation, it is not Google Chrome’s behavior, this is Debian Chromium’s behavior. It’s Debian’s responsibility entirely.
3) Yes, we deliberately hid this listening module from the users, but that’s because we consider this behavior to be part of the basic Google Chrome experience. We don’t want to show all modules that we install ourselves.
If you think this is an excusable and responsible statement, raise your hand now.
Now, it should be noted that this was Chromium, the open-source version of Chrome. If you download the Google product Google Chrome – the prepackaged binary – you don’t even get a theoretical choice. You’re already downloading a black box from a vendor. In Google Chrome, this is all included from the start.
This episode highlights the need for hard, not soft, switches on all devices that can be used for surveillance – webcams, microphones. A software on/off switch for a webcam is no longer enough; a hard shield in front of the lens is required. A software on/off switch for a microphone is no longer enough; a physical switch that breaks its electrical connection is required. That’s how you defend against this in depth.
Of course, people were quick to downplay the alarm. “It only listens when you say ‘Ok, Google’.” (Ok, so how does it know to start listening just before I’m about to say ‘Ok, Google?’) “It’s no big deal.”
A company stealth installs an audio listener that listens to every room in the world it can, and transmits audio data to the mothership when it encounters an unknown, possibly individually tailored, list of keywords – and it’s no big deal!?
“You can opt out. It’s in the Terms of Service.” (No. Just no. This is not something that is the slightest amount of permissible just because it’s hidden in legalese.) “It’s opt-in. It won’t really listen unless you check that box.” (Perhaps. We don’t know; Google just downloaded a black box onto my computer. And it may not be the same black box as was downloaded onto yours.)
Early last decade, privacy activists practically yelled and screamed that the NSA’s taps of various points of the Internet and telecom networks had the technical potential for enormous abuse against privacy. Everybody else dismissed those points as basically “tinfoilhattery” – until the Snowden files came out, and it was revealed that precisely everybody involved had abused their technical capability for invasion of privacy as far as was possible.
Perhaps it would be wise to not repeat that exact mistake. Nobody, and I really mean nobody, is to be trusted with a technical capability to listen to every room in the world. Privacy remains your own responsibility.