If you think of your personal computer as almost an extension of yourself, a recent federal court ruling in Colorado sounds a little disturbing. The court has ordered a woman to decrypt files on her laptop so prosecutors can use them against her. The woman, who is being tried for mortgage fraud, argued that this violates her Fifth Amendment right not to testify against herself, but the court sees the matter differently. Timothy Lee’s explanation of the problem at Ars Technica gets to the heart of it:
In previous cases, judges have drawn a distinction between forcing a defendant to reveal her password and forcing her to decrypt encrypted data without disclosing the password. The courts have held that the former forces the defendant to reveal the contents of her mind, which raises Fifth Amendment issues. But Judge Robert Blackburn has now ruled that forcing a defendant to decrypt a laptop so that its contents can be inspected is little different from producing any other kind of document.
For some, being forced to decrypt your computer might not seem that different from handing over your password so investigators can decrypt it themselves; what’s hidden by your password might well feel as much a part of your mind as the password itself. But when you consider the precedent a ruling in the other direction might set, things get cloudier. The Department of Justice argues that if encryption were enough to keep documents out of the hands of the courts, then potential child pornographers, drug smugglers, and others could refuse to hand over evidence simply because it’s encrypted. Hmmm.
Another case from this week shows the difficulty of aligning the modern sense of privacy with the law. The Supreme Court ruled that sticking a GPS device on a suspect’s car to track his whereabouts without a warrant is unconstitutional. But the court was divided as to why, on a very important point.
Last year, Google raised the ire of many when it confessed that its city-mapping Street View vehicles unintentionally gathered unencrypted Wi-Fi data as they rolled past people’s abodes. To fix its image and to fend off lawsuits, the company soon tightened its privacy policies and ensured that its Street View cars stopped collecting that information. But the controversies just won’t stop. Google is now trying to convince privacy-conscious Swiss officials to drop the country’s tight Street View restrictions, while security-conscious Israeli officials are concerned that the technology will help terrorists.
Twenty-seven countries have been partially mapped via Street View, a Google product that provides 360-degree panoramic views from ground level. The company creates these images by sending groups of camera-studded vehicles to various parts of the world to snap pictures as they drive.
Although Switzerland is home to one of Google’s largest offices outside the United States, the country has strict privacy laws that have prevented Google from loading new Street View images of Switzerland for the past year. On Thursday, Google petitioned a Swiss court to lift this ban. The search engine company told Switzerland’s Federal Administrative Court that its technology automatically conceals the identity of faces and license plates, and that it is no different from rival services.
“Don’t track me, bro!”
If you’ve long been a fan of the Federal Trade Commission’s “Do Not Call” registry, which lets people opt out of telemarketing campaigns, the good news is that the FTC has taken the first steps toward a similar setup for the Internet. Jon Leibowitz, the FTC’s chairman, pitched in a report this week (pdf) the idea of implementing some kind of “do not track” option that would let people easily say no to having their online behavior tracked and used for purposes like behavior-based advertising. The bad news is that, both legally and conceptually, it would be a more challenging idea to implement than “Do Not Call.”
Rather than submitting their names on a centrally maintained list, consumers would use a tool on their Web browsers to signal that they do not wish to be tracked or to receive targeted advertising. Leibowitz said Google, Microsoft and Mozilla have all experimented with do-not-track technology on their browsers. [Washington Post]
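The browser-based mechanism described above eventually took shape as a simple HTTP request header. As a rough illustration (not the FTC’s proposal, just a minimal sketch of how such a signal could work), a browser attaches a header to each request and a cooperating server checks for it before doing any behavioral tracking; the function and header names here are illustrative:

```python
# Sketch of a "do not track" signal carried as an HTTP request header,
# in the spirit of the "DNT: 1" header browsers later experimented with.
# A cooperating server would check the header before tracking the user.

def wants_tracking(headers: dict) -> bool:
    """Return False when the client has signaled 'do not track'."""
    # HTTP header names are case-insensitive; normalize before checking.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("dnt") != "1"

# A browser with the option enabled would attach the header to every
# outgoing request, alongside its usual headers:
request_headers = {"User-Agent": "ExampleBrowser/1.0", "DNT": "1"}

if wants_tracking(request_headers):
    pass  # set ad-targeting cookies, log browsing behavior, etc.
else:
    pass  # serve untargeted ads and skip behavioral logging
```

Note that, unlike “Do Not Call,” nothing in this scheme forces a server to honor the flag; the header is purely a request, which is part of why the idea is harder to enforce than a centrally maintained registry.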
As the backlash continues against the TSA’s full-body scanning and increasingly aggressive pat-downs of those who opt out, the agency has bent a little in one area. The head of the TSA today questioned the need to apply the added security to pilots. The pilots’ organization had already told its members to opt out of the scans to avoid extra radiation exposure. Now the TSA says that as of 2011, pilots will only need to have their airline-issued IDs checked by computer.
“This one seemed to jump out as a common-sense issue,” Transportation Security Administration (TSA) chief John Pistole told Bloomberg News on Friday. “Why don’t we trust pilots who are literally in charge of the aircraft?” That’s exactly the point commercial airline pilots have been making for years. [Christian Science Monitor]
What Pistole did not do, however, was back off the policy of using the scanners on the rest of us. And yesterday on its blog, the TSA tried to launch a PR counter-offensive against the tidal wave of bad press this week. (Though you might not be terribly satisfied with its answer to the question of whether pat-downs are invasive, about which Ars Technica quips, “Nowhere in the ‘Fact’ response does the TSA directly answer the allegation of invasiveness, probably because the pat-downs are invasive.”)
Since the TSA appears disinclined to change its mind about scanning or getting touchy-feely with the general public, lawmakers are beginning to make some noise. In New York, councilman David Greenfield proposed rules to bar the TSA from using the X-ray scanners in the city’s airports.
John Tyner missed his flight to South Dakota for a pheasant hunting trip with his father-in-law. He wasn’t late to the airport, and he didn’t get lost in the terminal. He never made it into the terminal at all, because he wouldn’t submit to either a whole-body scan or a physical pat-down of his genitals.
After arriving at the airport, Tyner was pulled aside to go through a “whole body scan,” a radiation-based machine that takes an image of your body under your clothes. He “opted out” of the scan, only to find that the alternative was just as bad. He asked the TSA officer who was patting him down not to touch his privates. Actually, he said: “If you touch my junk, I’ll have you arrested.” The matter quickly escalated, according to his blog post about the incident:
She described to me that because I had opted out of the backscatter screening, I would now be patted down, and that involved running hands up the inside of my legs until they felt my groin. I stated that I would not allow myself to be subject to a molestation as a condition of getting on my flight. The supervisor informed me that it was a standard administrative security check and that they were authorized to do it. I repeated that I felt what they were doing was a sexual assault, and that if they were anyone but the government, the act would be illegal. [John Tyner's blog post]
After the incident, Tyner was escorted from the area; he got a refund on his ticket and was eventually allowed to leave the airport, though not before being threatened with a $10,000 fine for leaving without completing the screening procedure. At his blog you can read his post about the event and watch his videos (he apparently had his smartphone recording video throughout much of the incident).
Those Street View cameras aren’t just collecting pictures of streets and buildings to make Google Maps better, they’re also scooping up email addresses and passwords, Google admitted Friday.
Back in May the company announced that its Street View cars had mistakenly been collecting data from unencrypted wireless networks; now it has acknowledged that this data included emails, URLs, and passwords from people who were sending that information over open (non-password-protected) networks when a Google car passed by.
We are mortified by what happened, but confident that these changes to our processes and structure will significantly improve our internal privacy and security practices for the benefit of all our users. [Official Google Blog]
The data-collecting code was part of the software running on Google’s Street View cars, which have so far mapped over 30 countries and have established a presence on every continent, including Antarctica. The software was meant to collect only basic data about the presence of Wi-Fi networks as the car-mounted cameras snapped pictures.
The software tool called Haystack was supposed to protect dissidents in Iran who wanted to use the Internet free of the government’s censorship. If third-party software testers are correct, though, flaws in the system meant to help those dissidents could have led authorities right to them. The Censorship Research Center, the San Francisco-based organization that created Haystack, has now pulled it back and asked users to destroy the existing copies.
“We have halted ongoing testing of Haystack in Iran pending a security review,” HaystackNetwork.com said in a brief statement. “If you have a copy of the test program, please refrain from using it.” [AFP]
Jacob Appelbaum, a security expert who volunteers with WikiLeaks, sounded the alarm.
Is it medicine, or is it not?
In May, the University of California, Berkeley unveiled its “Bring Your Genes To Cal” program. The idea was that Berkeley’s 5,500 or so incoming freshmen would have the option of having their DNA tested for three particular traits: their metabolism of folate, tolerance of lactose, and metabolism of alcohol. Though the program was limited, it raised privacy hackles. And now the State of California has ruled: This is a medical test, and Cal can’t do it unless it’s in a clinical setting.
Mark Schlissel, UC Berkeley’s dean of biological sciences and an architect of the DNA program, said he disagreed with the state Department of Public Health’s ruling that the genetic testing required advance approval from physicians and should be done only by specially licensed clinical labs, not by university technicians. The campus could not find labs willing to do the work and probably could not afford it anyway, Schlissel said. He also contended that the project deserved an exemption from those rules because it was an educational exercise. [Los Angeles Times]