02 Aug 11

New Tool Keeps Censors in the Dark


A new approach to overcoming state-level Internet censorship relies, ironically enough, on a technique that security experts have frequently associated with government surveillance.

Current anti-censorship technologies, including the services Tor and Dynaweb, direct connections to restricted websites through a network of encrypted proxy servers, with the aim of hiding who’s visiting such sites from censors. But the censors are constantly searching for and blocking these proxies. A new scheme, called Telex, makes it harder for censors to block communications by disguising traffic destined for restricted sites as traffic meant for popular, uncensored websites. It does this by employing the same method of analyzing packets of data that censors often use.

“To route around state-level Internet censorship, people have relied on proxy servers outside of the country doing the censorship,” says J. Alex Halderman, assistant professor of electrical engineering and computer science at the University of Michigan. “The difficulty there is, you have to communicate to those people where the proxies are, and it’s very hard to do that without also letting the government censors figure out where the proxies are.”

The Telex system has two major components: “stations” at dozens of Internet service providers (ISPs)—the stations connect traffic from inside nations that censor to the rest of the Internet—and the Telex client software program that runs on the computers of people who want to avoid censorship.
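At a very high level, the trick the paper describes is that the client marks an ordinary-looking HTTPS connection with a cryptographic "tag" that only a Telex station can recognize. The sketch below is a toy illustration of that public-key tagging idea, not the actual Telex protocol: real Telex hides the tag inside the TLS ClientHello nonce using elliptic-curve techniques so the tag is indistinguishable from random bytes, while this toy version uses plain Diffie-Hellman and makes no attempt to hide the ephemeral public key. All function names here are invented for illustration.

```python
# Toy sketch of public-key tagging (NOT the real Telex protocol).
# The station publishes a public key; the client derives a shared
# secret from it and sends (ephemeral_pub, tag). Only the station,
# holding the private key, can tell a tagged flow from a random one.
import hashlib
import secrets

# A well-known large prime (the 1536-bit MODP group from RFC 3526);
# real systems use elliptic curves with careful encodings instead.
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74"
    "020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F1437"
    "4FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED"
    "EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3DC2007CB8A163BF05"
    "98DA48361C55D39A69163FA8FD24CF5F83655D23DCA3AD961C62F356208552BB"
    "9ED529077096966D670C354E4ABC9804F1746C08CA237327FFFFFFFFFFFFFFFF",
    16,
)
G = 2

def station_keypair():
    """Station side: generate a long-term (private, public) key pair."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def make_tagged_nonce(station_pub):
    """Client side: derive a shared secret from the station's public
    key and emit (ephemeral_pub, tag) to embed in the connection."""
    eph = secrets.randbelow(P - 2) + 1
    eph_pub = pow(G, eph, P)
    shared = pow(station_pub, eph, P)
    tag = hashlib.sha256(shared.to_bytes(192, "big")).digest()[:16]
    return eph_pub, tag

def station_detects(priv, eph_pub, tag):
    """Station side: recompute the shared secret; a match means the
    flow is tagged and should be diverted past the censor."""
    shared = pow(eph_pub, priv, P)
    expect = hashlib.sha256(shared.to_bytes(192, "big")).digest()[:16]
    return secrets.compare_digest(expect, tag)
```

Without the station's private key, the tag is just 16 bytes a censor cannot distinguish from randomness, which is the property that lets tagged traffic hide among ordinary HTTPS flows.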

This is an excerpt from a piece I wrote that was published today in MIT Technology Review. Read the full story here.


10 comments

  1. Nice piece.

  2. How do they get any message to be interpretable by any ISP without having the censoring governments set up their own nodes to direct the users to spoofed target sites?

  3. Great write-up, Brian, keep up the good work. :)

  4. Nice article,

    sadly, the Telex concept is flawed at its roots.
    The question is: why should ISPs care?
    That's not a system you can make money with.
    It creates costs and will most likely have a negative impact on their business with censoring nations.
    On the other hand, they get money for “monitoring services,” and being nice to the government can yield some nice government contracts for them.

    It's a good technical approach, but not for the real world.

  5. I am not convinced that the visionaries behind the idea are taking into consideration the seriousness with which State agencies will likely respond. If any State agency chooses to block traffic, the pressure from global businesses on the friendly ISPs to take out the stations will be immense.

    Additionally, although the traffic is difficult to analyze, differences in it will likely form an identifiable pattern that lets the State agency determine which ISPs host Telex stations. If not all ISPs take part, it will be possible to selectively block the ones that do. All of this is noted by the creators, along with the caveat that if stations are not placed on paths that user traffic actually passes through, the system will not prove effective.

    Additionally, I am wondering whether there is a potential countermeasure using outbound load balancing, where the State agency could purposely take any identified HTTPS traffic and change its route as it leaves the State agency. In a true TLS session, the path between client and server does not matter, but depending upon which ISPs implement the stations and where they sit in the path between the user and the ‘nonblocked.com’ site, altering that path could disrupt the communication. Obviously, stations closer to ‘nonblocked.com’ would be preferred, as it would be harder for the State agency to affect that routing.


  6. From page 2 of the article:

    “We’ve gotten a lot of comment from people who don’t understand the system, who are pointing out ways they believe the system could be defeated, but in almost every case, it’s something we’ve thought about and addressed in the paper,” said Halderman.

    • It's a common method to mark critics as “people who don’t understand shit,” and stating that you have already thought about every possible flaw is very, very bold.
      Looks like there are a lot of Sheldon Coopers at work :D

      I don't want to diminish any work toward a censorship-free Internet, but this is too much PR and too little solid proof.
      So far you can only test the client in a completely synthetic environment:
      - no server software
      - no real-world examples

      But I would be pleased to be surprised :D

    • From page 12 of their paper, available at https://telex.cc/pub/telex-usenixsec11.pdf, they state that, although difficult, there are potential traffic-shaping improvements that would make the stations harder to detect. Page 12 also raises a concern over station placement.

      My comments are based upon the information available in the technical paper on the proposed technology. I brought the potential flaws in the technology forward for those who are not going to read the entire 15-page technical paper.

      Also if it were possible to make a bulletproof system, we’d have no security analysts or hackers.

  7. Can’t stuff like this also work in the bad guys’ favor? A new tool for malware writers to keep C&C servers in contact? Funnel loads of info out of networks they’ve hacked? Sounds like a double-edged sword to me. Hope they’re looking at it from both angles.

  8. Unplugging from the Internet isn’t all that uncommon. It happened in Libya, Egypt and China (in some regions when riots broke out). When anti-censor technologies advance to such a degree that a government believes it’s no longer possible to censor effectively, it would definitely seek alternatives.