Tag: internet

The New York Times metered access plan

(Photo of the New York Times building by Alex Torrenegra)

Summary of my view: It’s a great idea, but executing it properly will be extremely difficult.

If you didn’t hear – The New York Times is going to “meter” access to their site. Readers will be able to view a certain number of articles per month for free, after which they’ll have to pay.

I didn’t even know about the Financial Times meter until last week, when I first read rumors that the NYT was taking the same approach. I occasionally read articles at FT, and have occasionally linked to articles there. Their meter gives me no trouble.

That unobtrusiveness may come at a price. It took me only one Google search to find a way to circumvent their meter – this Greasemonkey script. Apparently, they just use cookies to determine the number of articles you’ve viewed. I’m not sure how many of FT’s paying customers are going to go through the trouble of installing Firefox extensions or manually deleting cookies to get access to the site, but I’d guess it would be more of a problem for the NYT’s larger and more general audience.
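
To make concrete why a cookie-based counter is so easy to defeat, here’s a minimal sketch of how such a meter might work. The cookie name, article limit, and Flask handler are my own assumptions for illustration – not how FT or the NYT actually implement it:

```python
# Minimal sketch of a cookie-based article meter (assumed design, for illustration only).
# The view count lives entirely in the reader's browser, so clearing cookies resets it.
from flask import Flask, request, make_response

app = Flask(__name__)
FREE_ARTICLES_PER_MONTH = 20  # hypothetical limit

@app.route("/article/<slug>")
def article(slug):
    views = int(request.cookies.get("articles_viewed", 0))
    if views >= FREE_ARTICLES_PER_MONTH:
        return "You've hit this month's free-article limit. Please subscribe.", 402
    resp = make_response(f"Full text of {slug} goes here.")
    # The incremented count is stored client-side -- the only record the meter has.
    resp.set_cookie("articles_viewed", str(views + 1), max_age=30 * 24 * 3600)
    return resp

if __name__ == "__main__":
    app.run()
```

Delete the articles_viewed cookie (or run a userscript that does it for you) and the count starts over.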

So that’s where execution gets tricky. Start making the meter more effective, less easy to route around, and you’re likely to end up making it a lot more intrusive to casual readers. There’s already something of a blogger backlash against the plan, and if the meter ends up being cumbersome, the Times could find their casual readership dropping off (and their advertising revenues declining).

And that’s to say nothing of people outright pirating their articles through copy and paste. If they start trying to implement means to keep people from copying and pasting the text from their articles, they risk alienating their customers even more.

So yes, it will be tricky to pull off. With a sufficiently generous meter (20 articles a month seems reasonable), affordable access rates (it’d be great if they also offered some metered plans – say, 50 articles a month for $5 – instead of requiring readers to buy unlimited access), unobtrusive technology, and, of course, high quality content, they could have a winning business model on their hands. (I’d also encourage them to offer free unlimited access to libraries, schools, charities, etc., as well as to visitors from developing nations.) But it will be a hard balance to strike, especially if NYT bigwigs push for tight security and restrictions.

See also: PaidContent has a good look at the ins and outs of it.

Freenet, darknets, and the “deep web”

Installing the software takes barely a couple of minutes and requires minimal computer skills. You find the Freenet website, read a few terse instructions, and answer a few questions (“How much security do you need?” … “NORMAL: I live in a relatively free country” or “MAXIMUM: I intend to access information that could get me arrested, imprisoned, or worse”). Then you enter a previously hidden online world. In utilitarian type and bald capsule descriptions, an official Freenet index lists the hundreds of “freesites” available: “Iran News”, “Horny Kate”, “The Terrorist’s Handbook: A practical guide to explosives and other things of interests to terrorists”, “How To Spot A Pedophile [sic]”, “Freenet Warez Portal: The source for pirate copies of books, games, movies, music, software, TV series and more”, “Arson Around With Auntie: A how-to guide on arson attacks for animal rights activists”. There is material written in Russian, Spanish, Dutch, Polish and Italian. There is English-language material from America and Thailand, from Argentina and Japan. There are disconcerting blogs (“Welcome to my first Freenet site. I’m not here because of kiddie porn … [but] I might post some images of naked women”) and legally dubious political revelations. There is all the teeming life of the everyday internet, but rendered a little stranger and more intense. One of the Freenet bloggers sums up the difference: “If you’re reading this now, then you’re on the darkweb.”

Guardian: The dark side of the internet

(via Atom Jack)

I haven’t looked at Freenet in years, but it’s certainly relevant to the discussion here about darknets.

How Robber Barons hijacked the “Victorian Internet”

In many ways this story is far afield from our contemporary debates about network management, file sharing, and the perils of protocol discrimination. But the main questions seem to remain the same—to what degree will we let Western Union then and ISPs now pick winners and losers on our communications backbone? And when do government regulations grow so onerous that they discourage network investment and innovation?

These are tough questions, but the horrific problems of the “Victorian Internet” suggest that government overreach isn’t the only thing to fear. In 1876, laissez-faire “freedom for all” meant (in practice) the freedom for Henry Nash Smith to read your telegrams if he didn’t like who you supported for President. It meant freedom for Associated Press to block criticism of Western Union, and even to put potential critics and competitors out of business. And it meant freedom for a scoundrel to hijack the system at his leisure.

Sure enough, the technologies and debates are different. Still, one wonders what Charles A. Sumner would say today if told that net neutrality is a “solution to a problem that hasn’t happened yet.”

Ars Technica: How Robber Barons hijacked the “Victorian Internet”

(via Social Physicist)

What are your favorite blogs (besides Mutate)?

What are some of your favorite blogs?

AP Study: computer viruses download child pornography

Innocent people have been branded as child abusers after malware infected their PCs, an AP investigation has discovered.

Technically sophisticated abusers sometimes store images of child abuse on PCs infected by Trojans that grant them illicit access to compromised machines.

The plight of those framed in this way is all the worse because paedophiles commonly use supposed malware infections of their PCs to explain the presence of images of child abuse. Because of this the “Trojan did it” defence is understandably met with scepticism from law enforcement professionals.

“It’s an example of the old ‘dog ate my homework’ excuse,” Phil Malone, director of the Cyberlaw Clinic at Harvard’s Berkman Center for Internet & Society, told AP. “The problem is, sometimes the dog does eat your homework.”

Register: How malware frames the innocent for child abuse

The death of robots.txt?

Last night I linked to an interview with Rupert Murdoch in which he says that News Corp will probably de-index their sites from Google.

I figured it was all bluster. Search engine traffic is more valuable than Murdoch suggests, and there are probably plenty of people in high places at News Corp who know it.

But Cory Doctorow suggests:

So here’s what I think is going on. Murdoch has no intention of shutting down search-engine traffic to his sites, but he’s still having lurid fantasies inspired by the momentary insanity that caused Google to pay him for the exclusive right to index MySpace (thus momentarily rendering MySpace a visionary business-move instead of a ten-minutes-behind-the-curve cash-dump).

So what he’s hoping is that a second-tier search engine like Bing or Ask (or, better yet, some search tool you’ve never heard of that just got $50MM in venture capital) will give him half a year’s operating budget in exchange for a competitive advantage over Google.

Jason Calacanis has suggested this approach as a means to “kill Google.”

But it may actually be neither the death of Google, nor the death of News Corp if they are so foolish as to carry out this plan. It could be the death of the robots exclusion standard. I would guess News Corp would use robots.txt to de-index their sites. But it’s a “purely advisory” protocol that Google is under no obligation to honor. They could continue indexing News Corp’s sites if they chose to. So could every other search engine, big or small. And I’d guess they would if big content providers started going exclusive with search engines.
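
To make the “purely advisory” point concrete, here’s a small sketch using Python’s standard urllib.robotparser. The robots.txt rules and URL below are hypothetical; the key thing is that they only take effect if a crawler volunteers to consult them:

```python
# Sketch: robots.txt is only enforced by crawlers that volunteer to check it.
# The site URL and rules below are hypothetical, for illustration only.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /

User-agent: *
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

url = "http://news-site.example.com/some-article"
print(parser.can_fetch("Googlebot", url))     # False -- but only advisory
print(parser.can_fetch("SomeOtherBot", url))  # True

# A polite crawler gates each request on can_fetch(); a crawler that decides
# to ignore robots.txt simply never asks. Nothing in the protocol stops the fetch.
```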

If News Corp puts all its content behind a paywall, the robots.txt point is moot – Google and other search engines won’t be able to index it anyway, and robots.txt will be fine. But it’s something to think about.

(Hat tips to Jay Rosen for the TimesSelect link and Chris Arkenberg for the Jason Calacanis video)

Google unveils protocol for an interplanetary internet

Vint Cerf, Google’s internet evangelist, has unveiled a new protocol intended to power an interplanetary internet.

The Delay-Tolerant Networking (DTN) protocol emerged from work first started in 1998 in partnership with Nasa’s Jet Propulsion Laboratory. The initial goal was to modify the ubiquitous Transmission Control Protocol (TCP) to facilitate robust communications between celestial bodies and satellites. […]

The core issue is that TCP assumes a continuous (and fairly reliable) connection. DTN makes no such assumptions, requiring each node to buffer all of its packets until a stable connection can be established. Whereas TCP will repeatedly attempt to send packets until they are successfully acknowledged, DTN will automatically find a destination node with a reliable connection, and then send its payload just once. Given the latency of space communications and the minimal power restrictions placed upon satellites, DTN’s approach seems prudent.

However, most people don’t have a need for regular satellite communication (well, our columnist Warren Ellis has that death ray of his), but Cerf sees his robust protocol having more down-to-Earth applications. Mobile networks, for example, must regularly cope with long periods of delay or loss – a train tunnel rudely interrupting a YouTube stream, for example. Perhaps looking to gain an edge on its competitors, Google has already integrated DTN into Android’s networking stack.

Wired UK: Google unveils protocol for an interplanetary internet

(via Wade)
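
As a rough way to picture the store-and-forward behavior the article describes, here’s a toy sketch – my own simplification, not the actual Bundle Protocol implementation. A node queues bundles while no contact is available and hands each one off exactly once when a link comes up:

```python
# Toy sketch of delay-tolerant, store-and-forward behavior (illustrative only;
# real DTN is specified by the Bundle Protocol, not by this simplification).
from collections import deque

class DtnNode:
    def __init__(self, name):
        self.name = name
        self.buffer = deque()   # bundles held until a contact is available
        self.link_up = False    # is there currently a usable contact?

    def send(self, bundle):
        """Queue a bundle; it is forwarded only when a contact exists."""
        self.buffer.append(bundle)
        self.flush()

    def contact_opened(self):
        """Called when a connection to the next hop becomes available."""
        self.link_up = True
        self.flush()

    def contact_closed(self):
        self.link_up = False

    def flush(self):
        # Unlike TCP, we don't keep retrying against a dead link -- bundles
        # wait in the buffer until a contact window opens, then each is
        # handed off once.
        while self.link_up and self.buffer:
            bundle = self.buffer.popleft()
            print(f"{self.name} -> next hop: {bundle}")

node = DtnNode("orbiter")
node.send("telemetry frame 1")  # buffered: no contact yet
node.send("telemetry frame 2")  # still buffered
node.contact_opened()           # contact opens; both frames forwarded once
```

Real DTN layers custody transfer, routing, and bundle expiry on top of this, but the core departure from TCP is the same: hold the data rather than retry against a dead link.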

Fast Internet access becomes a legal right in Finland

Finland has become the first country in the world to declare broadband Internet access a legal right.

Starting in July, telecommunication companies in the northern European nation will be required to provide all 5.2 million citizens with Internet connection that runs at speeds of at least 1 megabit per second. […]

In June, France’s highest court declared such access a human right. But Finland goes a step further by legally mandating speed.

CNN: Fast Internet access becomes a legal right in Finland

(via Disinfo)

It’s unclear to me – does this mean that telecom companies are required to provide this service for free, or does it mean they have to make it available to everyone (including people in remote areas)?

How the Net aids dictatorships

Evgeny Morozov makes a case similar to the one I made in my essay “Birthers and the Democratization of Media”, only looking at dictatorships abroad (specifically Iran and China).

Bullies Worse than Predators On Social Networks

Contrary to the often cited statistic that one out of five minors is sexually solicited online, a controversial report released this week indicates that cyberbullies are a more prevalent problem than predators on social networking sites like MySpace and Facebook, and that in the case of predators, “the image presented by the media of an older male deceiving and preying on a young child does not paint an accurate picture of the nature of the majority of sexual solicitations.”

About half of minors who report receiving sexual solicitations online say the advances come from other minors, the report says.

Where sexual interactions do occur between adults and minors online, they rarely progress to physical encounters offline and, when they do, they usually involve post-pubescent minors between the ages of 14 and 17, who are aware before the encounter that the person they are planning to meet is an adult.

The researchers found that the minors who are most at risk from inappropriate content and encounters online often engage in risky behaviors or come from environments that make them more susceptible to risks, such as environments where there is little adult supervision or where there is drug abuse or physical and mental abuse.

“Those who are most at risk often engage in risky behaviors and have difficulties in other parts of their lives. The psychosocial makeup of and family dynamics surrounding particular minors are better predictors of risk than the use of specific media or technologies,” the report says.

The report also says that although cyberbullying is a greater problem than predators, there is no evidence that bullying has increased because of social networking sites and that bullying still occurs more often offline than online, although social networking sites have created another avenue for expressing it.

Full Story: Threat Level
