Mar 2nd 2011, 21:01 by The Economist online | SAN FRANCISCO
WHEN the iPad was launched last year, it was dubbed “the Jesus tablet” because of the quasi-religious fervour with which it was greeted by consumers worldwide, who have since snapped up more than 15m of them. Now Apple wants to create even more converts. On March 2nd Steve Jobs, its boss, returned briefly from sick leave to introduce the iPad 2, a revamped version that will compete with a host of rivals now coming to market.
Among these are devices such as Motorola’s Xoom that are based on a new version of Google’s Android operating system designed specifically for tablets. Android-based smartphones have rapidly eroded the market share of Apple’s popular iPhone. But when it comes to tablets, the iPad’s lead should prove more durable.
For a start, Apple has had the tablet field to itself for a year, allowing it to refine its offering and raise the bar for rivals. The iPad 2 is considerably thinner, lighter and faster than its predecessor and offers videoconferencing and other capabilities whose absence from the first iPad was widely criticised.
Another reason to bet Apple will maintain its lead is that rivals with similar capabilities have turned out more expensive, whereas the new iPad, despite its extra features, will cost the same as the old one. In America the Xoom costs $800 without a wireless contract and $600 with a two-year one from Verizon. The cheapest iPad 2 will cost $499 without a contract. Sarah Rotman Epps of Forrester, a research firm, reckons high prices will prove fatal for these rivals. Apple has other advantages too, such as an online store full of software programs, or apps, designed for iPads, as well as content that can be downloaded to them.
Yet the closed nature of such stores also makes some people hesitate to buy tablets. In a recent survey by the Boston Consulting Group (BCG), more than 80% of American respondents said being able to access content from anywhere would be an important factor in their choice of e-reader or tablet. John Rose of BCG reckons Apple’s iTunes music store succeeded because it had to strike deals with only the handful of firms that dominate the record business; it will be far harder to reach exclusive agreements with the diverse owners of the many other types of content tablet users might buy.
That is unlikely to stop Apple from trying, though. Mr Jobs is a notorious control freak. He is also a tech visionary whose notion of tablet computing has delivered yet another smash hit for Apple. The father of the Jesus tablet is no doubt already planning his next miracle.
Mar 2nd 2011, 15:10 by G.F. | SEATTLE
TO MANY of his constituents, Charles Schumer, a Democratic senator from New York, appeared to spout gibberish on Sunday. "Major web sites [should] switch to secure HTTPS web addresses instead of the less secure HTTP protocol," he told Reuters in a Manhattan coffee shop. Mr Schumer's statement, however, constitutes perfectly sensible advice—he was well briefed by his staff. Such a move would prevent theft of casual digital identities and personal information in public places—and hinder politically motivated interception by repressive (or democratically elected) governments.
HTTPS is the secured or encrypted form of HTTP (Hypertext Transfer Protocol), a communications language that directs the way in which web browsers and web servers interact to request and retrieve pages, images and other files. HTTPS layers encryption on top of plain HTTP using SSL/TLS (Secure Sockets Layer/Transport Layer Security). These are the old and current names for web-page securing technology that dates back to the world wide web's juvenile days, not long after Netscape alerted the masses to its existence.
Websites that offer SSL/TLS security allow connections via a URL that starts with "https" in the location field or link. First, the browser silently requests security credentials that the server provides. Next, it validates this information independently using either its own built-in data or those included in the operating system. If it passes muster, the browser and server exchange an encryption key, unique to each session, which is then used to guard the data that passes between them. Any whiff of interception or rerouting is enough to alert the user. Because of the way browsers and operating systems validate SSL/TLS certificates, an interloping party (the so-called "man in the middle") cannot pretend to be a secured server (to a browser) or a secured browser (to a server) without provoking such warnings.
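The exchange described above can be sketched with Python's standard-library ssl module; the host name passed in is merely illustrative, and any certificate that fails validation raises an error, the programmatic equivalent of the browser warning.

```python
import socket
import ssl

def tls_probe(host, port=443):
    """Perform the handshake described above and return the negotiated
    protocol version. A certificate that fails validation raises
    ssl.SSLError instead of silently connecting."""
    # The default context loads the operating system's built-in
    # certificate data, as the browser analogy in the text describes.
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as raw:
        # wrap_socket runs the SSL/TLS handshake: the server presents
        # its credentials, the client validates them, and a unique
        # per-session encryption key is negotiated.
        with context.wrap_socket(raw, server_hostname=host) as tls:
            return tls.version()

# Example (requires a network connection): tls_probe("example.com")
```

Note that the default context refuses unvalidated certificates outright, which is precisely the "man in the middle" protection the text describes.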
Flaws in earlier versions of SSL/TLS were patched up years ago and it is generally regarded as foolproof—and vital. The risk of not using it was readily demonstrated in the early stages of Tunisia's recent upheaval. The government allegedly intercepted connections between citizens and the unencrypted version of Facebook's local site, as Alexis Madrigal explained on January 24th in the Atlantic. The government could then intercept traffic by pretending to be Facebook; users, unaware, would blithely bung in their credentials, handing over access to their account and their entire social network. (To its credit, Facebook decided to flip on SSL/TLS for all of Tunisia and, later, made it available as an account preference worldwide. The internet company has offered HTTPS for some time but users outside Tunisia still have to opt in.)
Mar 1st 2011, 21:12 by G.F. | SEATTLE
PASSWORD selection typically lacks sex appeal. The longer it is, and the more exotic the characters (punctuation marks, say), the less likely a brute-force effort to crack it is to succeed. But coming up with complex concatenations of alphanumeric symbols tends to be tedious and off-putting, so relatively few people bother, plumping instead for simple, and easily guessable, words. Now, a firm from Cape Town, in South Africa, has released a free web tool designed to make strong password selection a little more tantalising.
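The arithmetic behind the advice is straightforward: each extra character, and each enlargement of the alphabet, multiplies the work a brute-force attacker must do. A short Python sketch (the 16-character length is an arbitrary choice, not a recommendation from the firm):

```python
import math
import secrets
import string

def strong_password(length=16):
    """Pick characters uniformly from letters, digits and punctuation
    using a cryptographically secure random source."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def entropy_bits(length, alphabet_size):
    """Bits of entropy for a uniformly random password:
    length * log2(alphabet size)."""
    return length * math.log2(alphabet_size)

# A 16-character password over the 94 printable ASCII characters has
# roughly 105 bits of entropy; eight lower-case letters manage only
# about 38, which is why short dictionary words fall so quickly.
print(round(entropy_bits(16, 94)), round(entropy_bits(8, 26)))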
Feb 26th 2011, 12:50 by M.G. | SAN FRANCISCO
FOR some time, Google has been threatening to make life harder for so-called "content farms", which produce mountains of low-grade articles stuffed with popular keywords, to ensure that they appear high up in search results. Google users grumble at having to wade through reams of such articles to find ones that are really informative. Now the search giant has revealed that it has tweaked the secret formula that it uses to rank web pages in ways that it hopes will make life harder for the purveyors of such spam.
The changes to Google's algorithm, which will affect some 12% of the queries that its search engine handles, come in response to a growing chorus of criticism from some Google users. A number have publicly accused the company of being slow to crack down on content farms such as Demand Media and Associated Content because Google benefits from the revenue generated by the ads served up alongside the anodyne content they churn out. "Google has become a jungle: a tropical paradise for spammers and marketers," lamented Vivek Wadhwa, an entrepreneur-turned-academic, in a recent blog post.
Other critics, such as Paul Kedrosky, a blogger, have given warning that people may look for new ways to find relevant information on the web if Google cannot get to grips with its spam headache. And search-engine rivals have been doing their best to profit from the company's discomfort. Blekko, a start-up that allows people to hunt for information online in narrow vertical categories such as "health" and "finance"—and to highlight and block any rubbish that creeps into its results—has even launched a "spam clock" that underlines the volume of spam being produced on the web every day.
Some observers caution that criticism of Google may be overdone, pointing out that users are not in fact deserting the search engine in droves. Google itself claims that its ability to serve up relevant content fast has never been greater. That may be true, but the search giant clearly feels the need to clean house now rather than risk seeing more of its results pages turned into Augean stables of spam. It has also added functionality to its Chrome browser that lets people block specific sites from search results. The battle with the content farms has begun.
Feb 22nd 2011, 12:26 by S.D. | NEW YORK
THE monster 18-wheel trucks that hurtle along America’s highways carry with them most of the nation’s freight. On long-haul routes there are reckoned to be some 1.3m of these “semi-trailers”, as the combination of a tractor unit and trailer is known. Such vehicles are called articulated lorries in Britain, although these tend to be a bit puny compared with American rigs weighing 32,000kg (70,000lb) or more. Not surprisingly the big semi-trailers take skill to handle—and they consume a lot of diesel. But a new development could reduce fuel consumption and give truckers one less thing to worry about when on the open road.
The work involves fitting wind-deflecting devices under the trailer of a semi to make the rig more aerodynamically efficient. The devices direct oncoming air around the trailer in such a way that it increases pressure in the area of the slipstream immediately behind the vehicle. Ordinarily, this is a low-pressure area which has the effect of sucking the truck backwards, something that adds to the rig’s fuel consumption.
The low-pressure area in the slipstream of a moving object is exploited in some sports such as cycle racing, speed skating and motor racing. In a technique known as “drafting”, a competitor gets close behind the person or vehicle in front. The low-pressure area reduces wind resistance and hence the amount of drag, which means less energy is needed to maintain the same speed as the leader. Some car and lorry drivers try to exploit this aerodynamic effect by tailgating big trucks in order to reduce their own fuel consumption. But it is exceedingly dangerous, especially if there is sudden braking.
The wind deflectors for the semis were developed by BMI, a small company based in South Carolina. They were inspired by the aerospace industry, says Mike Henderson, its chief executive. Before he started the firm, Mr Henderson ran a Boeing research unit that investigated aircraft aerodynamics using sophisticated computer models. He has now set up a new company called SmartTruck to market the technology as UnderTray.
The UnderTray looks simple enough: plastic and metal structures which direct oncoming air towards the rear in such a way as to raise the air pressure. But the aerodynamics involved are extremely complex and they required a supercomputer to crack. For this, the company won a grant from America’s Department of Energy to run simulations on Jaguar, a Cray XT-5 supercomputer at the Oak Ridge National Laboratory in Tennessee. Jaguar is capable of 2.3 quadrillion mathematical operations every second (which is about 100,000 times faster than a typical laptop). Even then the process took 18 months, although Mr Henderson reckons it shaved about two years from the time it would otherwise have taken to turn the concept into a final design.
The company claims its UnderTray can improve fuel efficiency in a semi-trailer by as much as 12%. The Department of Energy estimates that if all the semis in America had such devices installed it would produce fuel savings of 1.5 billion gallons of diesel a year. At current prices that would add up to about $5 billion a year. A typical semi-trailer travels about 240,000km (150,000 miles) a year, and at $3 a gallon for diesel BMI estimates that its system would pay for itself in 12 to 18 months. That should be an appealing proposition for truckers, especially if the increased air pressure behind them means they no longer have to worry about tailgaters as well.
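As a back-of-the-envelope check of those payback figures, one can combine the numbers quoted above with an assumed baseline fuel economy of six miles per gallon, a typical figure for a loaded semi but not one given in the article:

```python
# All dollar and mileage figures below are the article's own;
# MPG_BASELINE is an outside assumption.
MILES_PER_YEAR = 150_000   # typical annual mileage for a semi-trailer
MPG_BASELINE = 6.0         # assumed fuel economy, miles per gallon
FUEL_SAVING = 0.12         # "as much as 12%"
DIESEL_PRICE = 3.0         # dollars per gallon

gallons_per_year = MILES_PER_YEAR / MPG_BASELINE        # 25,000 gallons
dollars_saved = gallons_per_year * FUEL_SAVING * DIESEL_PRICE
print(round(dollars_saved))  # -> 9000 dollars saved a year
```

On those assumptions a device costing somewhere around $9,000-13,500 would indeed pay for itself in 12 to 18 months, consistent with BMI's claim.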
Feb 21st 2011, 18:03 by J.P. | WASHINGTON, DC
THE American Association for the Advancement of Science (AAAS), one of the world's leading bodies tasked with supporting and propagating all manner of boffinry, and the publisher of Science, a pre-eminent scientific journal, knows how to throw a party. Its Annual Meeting invariably draws hordes of researchers, both bow-tied doyens and dishevelled upstarts, political movers and shakers, besuited public-relations types, press officers, students, geeky high-school teens and legions of even geekier hacks. On February 17th-21st they descended in their thousands on Washington, DC, to mingle, schmooze and, hopefully, find out about some interesting research.
One morning Babbage, never one to miss a good intellectual sortie, took advantage of the beautiful, unseasonably warm weather to saunter from the Washington Convention Centre, which the event has nearly monopolised, towards Capitol Hill. Every person he encountered seemed to be sporting an AAAS badge, fluttering in the blustery wind on a signature blue lanyard. True, the American capital was uncannily empty on Presidents Day weekend, but that only reinforced the feeling of science's ubiquitous presence.
At this point, a waggish remark would usually follow to illustrate the plethora of topics covered in the innumerable lectures, seminars, symposia, poster sessions and the like. But no quip along the lines of "topics range from the inner ear of the golden mole to synthetic magnetism in ultracold atoms" (to mention papers presented on Friday morning in just two of the dozen or so concurrent sessions) could possibly do justice to the programme's awe-inspiring breadth. (Incredulous readers can see for themselves.) Although The Economist has sent two correspondents to Washington, giving us the enviable ability to be in two places at once, many interesting-sounding sessions had to be left out of the itinerary, jam-packed though it was.
Some of the speakers whom Babbage and his colleague did get to listen to made a valiant effort to link their talks to the meeting's predictably anodyne theme, "Science Without Borders". But most were simply excited to present the fruits of their intellectual labour. The Economist will report on some of these in this week's print edition. Watch this space.
Feb 18th 2011, 22:13 by A.M.
THE European Union’s authority on environmental issues such as air pollution has grown greatly since the 1980s. So too has Eurosceptic carping about meddling foreign bureaucrats sticking their noses into domestic affairs. But Brussels politicking is looking increasingly puny compared with the recent exploits of some local politicians, choking under the threat of EU sanctions.
Alberto Ruiz-Gallardón, the mayor of Madrid, was recently accused by state prosecutors of having air-pollution monitoring stations moved in secret from busy roadside locations to the verdant tranquillity of the city’s parks. The unsurprising consequent drop in the recorded level of pollutants had been proclaimed as evidence of a great victory for the mayor in the capital’s battle against its noxious urban atmosphere, leading Mr Ruiz-Gallardón to boast that air quality in Madrid had never been better.
Readings from such sensors are used by the European Commission to enforce mandatory limits on dangerous airborne pollutants. The smaller sorts of airborne particles are alone thought to cause 380,000 premature deaths in Europe every year. A recent study by London’s Queen Mary University linked particulate matter exposure to pneumonia, particularly among children. It found that Londoners breathe in air that is more harmful than that in Accra, the capital of Ghana. Although London's air is no more polluted than Accra's overall, wood smoke, prevalent in poorer countries, is less toxic to the airway cells than the diesel exhausts that waft through the rich world's urban centres.
Such findings have not stopped 20 of the 27 EU member states brazenly flouting air-quality targets. And Madrid is not the only city accused of underhand tactics in fending off financial sanctions. Later this month the Commission will deliver its verdict on Britain's plea for more time to meet pollution-reduction targets. If it is rejected, British taxpayers may have to pay through the nose for the dubious pleasure of inhaling their capital's air, where pollution has exceeded legal limits every year since 2005 and, as a result, risks burning a €300m hole in the cash-strapped government's pocket.
Predictably, green campaigners remain unconvinced by the government's argument that a single monitoring station on London’s Marylebone Road is located in an area with the highest airborne particle concentrations in London. They note that this claim is based on a computer simulation that shows that part of London to be particularly vulnerable to such pollution, while ignoring several other sensors in different parts of the city which also record levels well above EU limits. The government insists their location, too, is inauspicious, though it has done little to explain why it thinks so.
In fact, officials in both Madrid and London are free to select sites as they please. Janez Potočnik, the EU's environment commissioner, told a recent meeting in London that he could only consider the data presented to him, and hope that governments are not being disingenuous. This need not be an entirely vain hope. A Commission spokesman explained to Babbage that measuring air-pollution levels in parks or green areas, while smacking of political expediency, may accurately reflect long-term exposure levels, provided that the sites are situated close to residential areas, and additional spots are included to reflect the exposure of the whole urban population.
None of this has prevented Spain from being referred to the European Court of Justice, along with Italy, Portugal, Sweden, Slovenia and Cyprus. It remains to be seen whether the judges, like the greens and many European city dwellers, smell something amiss.
Feb 18th 2011, 10:07 by N.V. | LOS ANGELES
IT WAS not quite a foregone conclusion, but all the smart money was on the machine. Since the first rehearsal over a year ago, it had become apparent that Watson—a supercomputer built by IBM to decode tricky questions posed in English and answer them correctly within seconds—would trounce the smartest of human challengers. And so it did earlier this week, following a three-day contest against the two most successful human champions of all time on “Jeopardy!”, a popular quiz game aired on American television. By the end of the contest, Watson had accumulated over $77,000 in winnings, compared with $24,000 and $21,600 for the two human champions. IBM donated the $1m in special prize money to charity, while the two human contestants gave half their runner-up awards away.
IBM has a long tradition of setting “grand challenges” for itself—as a way of driving internal research and innovation as well as demonstrating its technical smarts to the outside world. A previous challenge was the chess match staged in 1997 between IBM’s Deep Blue supercomputer and the then world champion, Garry Kasparov. As shocking as it seemed at the time, a computer capable of beating the best chess-player in the world proved only that the machine had enough computational horsepower to perform the rapid logical analysis needed to cope with the combinatorial explosion of moves and counter-moves. In no way did it demonstrate that Deep Blue was doing something even vaguely intelligent.
Even so, defeating a grandmaster at chess was child’s play compared with challenging a quiz show famous for offering clues laden with ambiguity, irony, wit and double meaning as well as riddles and puns—things that humans find tricky enough to fathom, let alone answer. Getting a mere number-cruncher to do so had long been thought impossible. The ability to parse the nested structure of language to extract context and meaning, and then use such concepts to create other linguistic structures, is what human intelligence is supposed to be all about.
Four years in the making, Watson is the brainchild of David Ferrucci, head of the DeepQA project at IBM’s research centre in Yorktown Heights, New York. Dr Ferrucci and his team have been using search, semantics and natural-language processing technologies to improve the way computers handle questions and answers in plain English. That is easier said than done. In parsing a question, a computer has to decide what is the verb, the subject, the object, the preposition as well as the object of the preposition. It must disambiguate words with multiple meanings, by taking into account any context it can recognise. When people talk among themselves, they bring so much contextual awareness to the conversation that answers become obvious. “The computer struggles with that,” says Dr Ferrucci.
Another problem for the computer is copying the facility the human brain has for using experience-based short-cuts (heuristics) to perform tasks. Computers have to do this using lengthy step-by-step procedures (algorithms). According to Dr Ferrucci, it would take two hours for one of the fastest processors to answer a simple natural-language question. To stand any chance of winning, contestants on “Jeopardy!” have to hit the buzzer with a correct answer within three seconds. For that reason, Watson was endowed with no fewer than 2,880 POWER7 processor cores spread over 90 Power 750 servers. Flat out, the machine can perform 80 trillion calculations a second. For comparison’s sake, a modern PC can manage around 100 billion calculations a second.
For the contest, Watson had to rely entirely on its own resources. That meant no searching the internet for answers or asking humans for help. Instead, it used more than 100 different algorithms to parse the natural-language questions and interrogate the 15 trillion bytes of trivia stored in its memory banks—equivalent to 200m pages of text. In most cases, Watson could dredge up answers quicker than either of its two human rivals. When it was not sure of the answer, the computer simply shut up rather than risk losing the bet. That way, it avoided impulsive behaviour that cost its opponents points.
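Watson's decision to buzz or to keep quiet can be caricatured as a thresholding rule over the confidence scores that its many algorithms assign to candidate answers. The merging rule, the threshold and the sample data below are all illustrative inventions, not IBM's actual method:

```python
def should_buzz(candidate_scores, threshold=0.5):
    """Average each candidate answer's scores across the algorithms
    that proposed it; buzz with the best candidate only if its merged
    confidence clears the threshold, otherwise stay silent (None)."""
    merged = {ans: sum(s) / len(s) for ans, s in candidate_scores.items()}
    best = max(merged, key=merged.get)
    return best if merged[best] >= threshold else None

# Two algorithms score two candidate answers; "Chicago" wins with a
# merged confidence of 0.75, comfortably above the 0.5 threshold.
scores = {"Toronto": [0.2, 0.3], "Chicago": [0.7, 0.8]}
print(should_buzz(scores))                    # -> Chicago

# With only weak candidates the machine declines to answer, the
# behaviour described in the paragraph above.
print(should_buzz({"Toronto": [0.2, 0.1]}))   # -> None
```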
Feb 17th 2011, 16:52 by M.G. | SAN FRANCISCO
GETTING consumers to cough up for electronic newspapers, magazines, videos and other content has become much easier thanks to the spread of smartphones, tablet computers and other such revolutionary gadgets. But content producers and tech companies are still haggling over how to share the spoils—in terms of hard cash and valuable information about customers—that such sales generate. This week Apple and Google unveiled competing digital-subscription services that differ from one another in several significant and controversial respects.
One is the amount of money that ends up in publishers’ coffers. Apple’s new subscription service for digital media, based on its iTunes platform, gives 70% of the revenue generated from sales to publishers, whereas Google’s new “One Pass” system lets them keep 90%. Apple also insists that publishers charge the same price (or less) for their in-app offerings as they do for those offered via other channels. Google, on the other hand, says publishers will be free to set their own prices and terms in One Pass. (Some in the publishing industry speculate that Apple’s determination to influence pricing of subscriptions outside the app store, as well as within it, could attract the attention of anti-trust watchdogs.)
Google’s system also appears to differ from Apple’s in another important respect. When people purchase content via One Pass, certain information about them such as their names, zip codes and e-mail addresses will be shared with publishers, unless they request that it be kept private. Apple, on the other hand, has said such data will only be shared with publishers if users explicitly agree to this when purchasing a subscription. That means less information is likely to come publishers’ way in Apple’s ecosystem than in Google’s.
All this has prompted plenty of debate. In a blog post, James McQuivey, an analyst at Forrester, a research firm, argues that Apple is shooting itself in the foot by siphoning off 30% of the revenue from subscriptions generated via its app store. This hefty tax on app-generated transactions, he argues, will destroy the profitability of many subscription models, forcing publishers to switch to rival platforms instead. It may also encourage them to provide enticing content elsewhere that isn’t available to Apple app users, in a bid to get customers to sign up via other channels that offer publishers more flexibility.
Apple’s approach will certainly raise the hackles of rivals such as Amazon, which have been using free apps to drive purchases of content outside the Apple ecosystem. In future they will have to offer an in-app purchasing opportunity, sharing the revenue generated with Apple according to its formula. Rhapsody, a music-streaming service, has already kicked up a fuss about this, complaining that Apple’s approach is “economically untenable” because services such as its own already operate on tight margins. Handing 30% to Apple will wipe out those margins. Worse, Apple's rule that in-app prices must be the same as or lower than those elsewhere means that firms offering e-books, music streaming or video services cannot raise their prices without also penalising customers who access their services via non-Apple devices.
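The arithmetic behind Rhapsody's complaint is easy to illustrate with invented numbers; none of the figures below comes from the company, but they show how a 30% platform fee can swallow a thin margin whole:

```python
# Hypothetical monthly economics of a thin-margin streaming service.
price = 10.00          # monthly subscription charged to the user
licensing = 6.50       # assumed payments to rights holders
operations = 2.50      # assumed infrastructure and support costs

margin = price - licensing - operations          # $1.00, a 10% margin
apple_cut = price * 30 / 100                     # $3.00 to Apple in-app

margin_in_app = price - licensing - operations - apple_cut
print(margin, margin_in_app)  # the fee turns a $1 profit into a $2 loss
```

Because the price rule forbids recouping the fee by charging more inside the app alone, the service's only options are to raise prices everywhere or to lose money on every in-app subscriber.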
Other folk worry that Apple’s decision to give consumers a choice about whether or not to share personal information with publishers could prove even more damaging if it limits publishers’ ability to market their products efficiently. John Squires, a publishing industry consultant, has given warning
that if they lose their direct access to customers, content companies “risk a final digital Armageddon”. Apple's defenders say that consumers will benefit, because subscribing and unsubscribing will suddenly be much easier.
Apple is betting that any qualms publishers may have about this will be more than assuaged by their eagerness to reach its vast user base. It must also be hoping that they will be less impressed with Google’s One Pass, which one critic has dismissed as little more than a “warmed-over content paywall”. Some magazines such as Elle and Popular Science are betting that they will be able to extract information from readers after they have bought subscriptions via Apple’s app store. (The Economist is also available on the iPhone and iPad via an app.) Others are biding their time to see if Apple will relent and offer more attractive terms to publishers as the market for digital subscriptions grows. They shouldn’t hold their breath.
Feb 16th 2011, 20:22 by G.F. | SEATTLE
SURELY no self-respecting hack would argue that a moment of insight or analytical expertise that lies at the heart of solid journalism can be reduced to a series of simple, easily reproduced tasks? There is, after all, no way that the spark of inspiration ignited by the nuanced and intangible intercourse of analysis and synthesis can be clasped, not to mention crammed into the rigid corset of algorithmic rules. Or can it? Jim Giles, a writer who has contributed to this newspaper, fellow journalist MacGregor Campbell and a team of researchers led by Niki Kittur, from Carnegie Mellon University in America, decided to check.
Under the rubric "My Boss Is a Robot" they are testing whether it is possible to draw on the sort of distributed creativity that the internet has made possible—and faddish—to perform the equivalent of journalistic piecework. To start with, the group has chosen to bash out the kind of article with which Babbage is all too familiar: a write-up of a newly released scientific research paper. Rather than assign the task as a whole to a single person, their system will try to tease apart and outsource different elements of analysis and production.
The effort will not embrace a wiki-like approach, in which drafts are successively (and sometimes simultaneously) revised by unrelated parties who may or may not bring particular expertise to the table, and who can all see the current state of work. Instead, the group will atomise the process of writing an article into multiple steps which can be accomplished in isolation. (Part of the project is to see how reproducible—or not—such tasks really are.) Tasks might include writing a headline, summarising a chart, or providing a conclusion for a subsection of text. Each component will be assigned to multiple people without allowing them to see what the others have come up with. The collected products will then be sent out again for examination by another batch of eyes, again unable to compare notes. "You need redundancy for quality," explains Mr Giles. This competitive culling process is designed to judge which contributors excel, as well as reduce the need for editorial oversight by crowdsourcing part of that function.
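The redundancy-and-culling step just described can be sketched as a simple vote among the second batch of reviewers over the first batch's submissions. The function, names and sample data below are all invented for illustration, not taken from the project:

```python
from collections import Counter

def best_candidate(submissions, votes):
    """Return the submission endorsed by the most reviewers.
    `votes` holds each reviewer's preferred submission; votes for
    strings not among the submissions are ignored."""
    tally = Counter(v for v in votes if v in submissions)
    winner, _count = tally.most_common(1)[0]
    return winner

# Three workers independently wrote a headline; three reviewers,
# also working in isolation, each picked their favourite.
headlines = [
    "Robot bosses put journalism to the test",
    "Crowdsourcing the science write-up",
    "My boss is a robot",
]
reviewer_votes = [
    "My boss is a robot",
    "Crowdsourcing the science write-up",
    "My boss is a robot",
]
print(best_candidate(headlines, reviewer_votes))  # -> My boss is a robot
```

Running every component past several independent workers, then several independent judges, is what lets the system trade editorial oversight for redundancy.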
The team finds participants in its experiment via Mechanical Turk, an automated task-jobbing service built by Amazon.com, an online retailer, as part of its cloud and computation division. Amazon uses it for some of its data gathering and processing. Mechanical Turk allows anyone to post tightly defined jobs, dubbed Human Intelligence Tasks (HITs). Those assigning HITs can set tests that prospective workers must pass to qualify for the job, or limit employment to workers with a good track record from previously completed jobs for which the assigner agreed to pay. Each HIT has a price tag or range of fees attached, often exceedingly low by developed-world standards. One current HIT, for instance, consists of collecting museum and art gallery entrance fees, and pays $0.12 for each gathered item. A few firms, such as CrowdFlower, offer a layer on top of Mechanical Turk to help companies identify high-quality workers. (The service's name refers to the Turk, a chess-playing automaton built in the late 18th century that was later determined to be a fake.)
Feb 16th 2011, 9:33 by T.Y. AND K.N.C. | TOKYO
THE iPhone has overturned not only the cellphone industry but also the portable-games market. Its popularity has forced today's giants, Sony and Nintendo, to change in order to survive.
In the case of Sony, that means combining its portable PlayStation console, called PSP, with a cellphone. The result, announced on February 13th by Sony Ericsson, was the Xperia Play. It will hit American stores first, in March. Sony will provide a platform for users to download and play selected PlayStation games by the end of the year. Sony will also release a new portable gadget called NGP (Next Generation Portable), with 3G wireless service, this year.
The company may argue that it planned to do these things anyway—and surely it did have such plans in a filing cabinet somewhere. But the fire under its workbenches was Apple's runaway success, particularly with the iPad, which poses an even greater threat to makers of gaming devices than the iPhone's small screen does.
Nintendo has been scrambling to catch up too. When the iPad was released Nintendo's boss, Satoru Iwata, dismissed it as a direct rival. And he maintained his nonchalance last month when he welcomed Sony's NGP as a catalyst to spur the market. The worry, however, is that the market may enjoy those rivals' devices more than his. Nintendo's response, to be launched on February 26th, is the Nintendo 3DS. It provides 3D images visible to the naked eye, avoiding the need to wear cheesy glasses.
Moreover, a gaggle of other competitors is eyeing the field, including Samsung, with its Galaxy tablet and smartphones, and even Panasonic, which is testing an online portable gaming device in America. Together, all this means today's gaming powerhouses are facing enemies on all sides. But they certainly have experience in getting past hostile environments to reach the next round.
Feb 15th 2011, 22:21 by G.F. | SEATTLE
I printed the New Yorker article about Paul Haggis and Scientology and now there aren't any trees left in the world. Sorry, everybody. (@johnmoe)
@johnmoe I saw how long the New Yorker's Paul Haggis article was and @instapaper'd it to my Kindle. No trees and fewer watts needed. (@kawika)
CONTENT sites (other than this newspaper's, of course) often appear to view readers' attention as a commodity to be diced up, then bartered or sold. Part is flogged to advertisers. Part is handed off to editors who try to ensure visitors stick around the site for a bit—like flies in honey—rather than buzzing off elsewhere. Part is given up to navigational gubbins meant to make the site stickier still. Only a small portion is devoted to the reason people come to a website in the first place: the unique article, blog post, or other entry found there.
Two years ago Arc90, a technology-strategy firm, created Readability. This free piece of software let readers reclaim 100% of their attention by stripping cluttered websites of all the superfluous bells and whistles. It began life as a bit of browser code encapsulated in a bookmarklet, a self-contained mini-program you plop into a browser's bookmark toolbar or bookmark menu. Tap Readability on a page with a block of continuous words and, lo and behold, everything but the text disappears. The software re-renders the text in reasonably sized black type, and unfurls it over a white background that resembles a book page.
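The stripping trick described above can be sketched crudely: scan the page's markup and keep only the longest run of continuous text. This is a toy illustration, not Arc90's actual algorithm, which weighs many more signals (link density, element names and so on).

```python
from html.parser import HTMLParser

# Toy version of the readability heuristic: track runs of text between
# block-level tags and remember the longest one. Everything else is the
# "clutter" that gets discarded.
class LongestTextBlock(HTMLParser):
    def __init__(self):
        super().__init__()
        self.best = ""      # longest run of text seen so far
        self.current = ""   # text accumulated since the last block tag

    def handle_data(self, data):
        self.current += data

    def handle_starttag(self, tag, attrs):
        # A new block-level element ends the current run of text.
        if tag in ("p", "div", "nav", "aside", "script"):
            if len(self.current.strip()) > len(self.best):
                self.best = self.current.strip()
            self.current = ""

def extract_main_text(html):
    parser = LongestTextBlock()
    parser.feed(html)
    parser.close()
    # The page may end mid-run, so check the trailing text too.
    if len(parser.current.strip()) > len(parser.best):
        parser.best = parser.current.strip()
    return parser.best
```

Fed a page with a navigation bar, an article body and an advertising block, this keeps only the article body, which is roughly the experience the bookmarklet delivers.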
The firm then released Readability's underlying code under a broad free licence. Apple picked it up for a reading-mode feature in its Safari browser, and it was built into plug-ins for other browsers. Soon it became a common noun, at least in some milieus. Readability was inspired by Instapaper, a service developed by Marco Arment, a former lead programmer at Tumblr. Mr Arment, too, based his free service on a bookmarklet. Click Read Later, and a stripped-down version of a page is stored in your account on his server, along with a link back to the original page.
Readability has since metamorphosed into a standalone company of the same name. It is now a fee-based web service with many elements in common with Instapaper, including archiving stories in your account, but aimed at providing a cash stream from readers to publishers. An Apple iOS app is around the corner. As a result, Readability competes with donation-based payment systems aimed at publishers, such as Flattr
, Kachingle or Sprinklepenny. Like them, Readability collects a few dollars each month from every subscriber (though higher contributions are possible) and distributes about 70% of the revenue to participating publishers (its competitors distribute 80% to 90%).
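As a rough illustration of the split, and nothing more: the $5 fee and the per-publisher read counts below are invented; only the roughly 70% publisher share comes from the article.

```python
# Illustrative arithmetic only: the fee level and the idea of splitting
# by read counts are assumptions, not Readability's actual accounting.
def monthly_payouts(fee, reads_by_publisher, publisher_share=0.70):
    """Split a subscriber's monthly fee among publishers in proportion
    to how much of that subscriber's reading each one received."""
    pool = fee * publisher_share          # ~70% goes to publishers
    total_reads = sum(reads_by_publisher.values())
    return {pub: pool * n / total_reads
            for pub, n in reads_by_publisher.items()}

# A $5 subscriber who read ten pieces in a month, across three sites
split = monthly_payouts(5.00, {"blog_a": 6, "paper_b": 3, "zine_c": 1})
```

On these made-up numbers, $3.50 of the $5 reaches publishers and the most-read site collects the largest slice.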
But it differs in important ways, too. First and foremost, the other three pitch themselves to users along the lines of public radio and television, aiming to cast publishers—whether humble bloggers or media conglomerates—in the role of put-upon content providers pleading for spare change. On the face of it, then, they are looking to ride on readers' guilt or selflessness.
Richard Ziade, Readability's boss, prefers to rely on his customers' simple, unadulterated self-interest: I want to read this now, or I want to read this later, but I want to read it without all the irritating frippery. In this respect, Readability is more akin to crowdfunding operations, where a bonus is offered to entice a small-time donor to become a fully fledged patron. These extras range from a dinner with the creator of whatever it is that is being crowdfunded (often artistic projects
) to what is, in effect, an exclusive advance purchase. In Readability's case, the bonus is its core archiving and text-conversion function.
Another difference is that unlike Flattr or Kachingle (but not Sprinklepenny), Readability does not require a content site to put a badge on every page for readers to click on in order to indicate that a micropayment has been made. Any such badge or widget means introducing new code in the website's innards. It also means readers cannot donate to unaffiliated websites.
Although Readability does offer publishers the option of including "Read Now" and "Read Later" widgets on their content pages, which let users dispense with a bookmarklet or a browser plug-in, it does not require that they do. Instead, the software tracks the pages it converts and archives, promising to hold the fees for all sites visited by its subscribers in the form of self-administered escrow. Any site that decides to adopt the widget will receive its share of fees collected from the service's inception.
However, the firm faces the same problems as any micropayment system. It must reach a critical mass of subscribers to make the revenue drip appealing to the biggest publishers who will then, hopefully, reel in more subscribers. The rub is that big publishers' sites typically brim with precisely the sort of unwanted clutter Readability targets. Removing it may annoy such clutter's main purveyors, ie, advertisers, who also frequently happen to be such sites' principal sources of revenue. True, for Readability to work its pruning magic, a page must load in full first; only once a visitor has beheld it in all its glory may he choose to pare it down. But that decision can be made rapidly—that is, after all, the whole point. Giving readers full command of their attention implies having to wrest some of it from the ad men. In this regard, at least, surfing the web remains a zero-sum game.
Feb 15th 2011, 12:39 by E.B. | NEW YORK
ALGAE have been infesting the ether of late. One television commercial features a buff faux boffin strolling around a virtual lab, waxing eloquent about their promise for greener energy. "Fuel", a documentary, hails them as a biotech alternative to ethanol and traditional fuel sources. Even the American Navy has been in the news for commissioning Solazyme, a producer of algae-based biofuels, to provide the juice for some of its ships.
The company initially focused on algae-based biodiesel but has since diversified into jet fuel and cosmetics—and food. In 2009 its chemists began a series of experiments to see how algal derivatives fare in the kitchen. Not bad, it turns out. Algae flour—something of a misnomer given its mushy, moist texture—can be used to make cookies, honey mustard and even a milk substitute, replacing butter, vegetable oils and eggs and removing the need for processed sugar and preservatives. Nutritionally, this translates into fewer calories, a healthy dose of the ever-popular antioxidants and some salubrious micronutrients, as well as oodles of dietary fibre, all in a form that is easier for the body to absorb.
Crucially, cookies still taste like cookies and not, as some may fret, sushi—at least to Babbage's tastebuds. (He does not know whether the sentiment was shared by Barbara Boxer, a Democratic senator pictured right cagily sampling an algal-flour cookie.) Little wonder, then, that in November 2010 Solazyme teamed up with Roquette, the second-largest starch supplier in Europe and the fourth-largest in the world, to bring this new superfood to market.
This may yet prove a tough sell. For a start, consumers will need some convincing before they nibble on the stuff that many associate with the muck on the walls of fish tanks. Pitching algae as an alternative cooking ingredient or health food will doubtless appeal to some foodie fashionistas, for a while at least. But this strategy may deter more discerning types who dismiss any such notions as passing fancies.
In fact, algae might be easier to deploy at an earlier stage of the food-production cycle. Nick Baily of the Belgrave Trust, a New York-based firm that helps individuals and companies to offset their carbon footprints, estimates that as much as 40% of food-related footprints (which account for about a fifth of the total in Britain) is attributable to the petrochemical fertilisers used in producing foodstuffs. Replacing such dirty fertilisers with the lighter-footed algae-derived sort may make a sizeable dent in carbon emissions, and spur environmentally conscious consumers to plump for foods grown using the technology. The cost of such algal alternatives will need to drop before they become widely used in agriculture. But at least farmers are more inured to dealing with muck than are food faddists.
Feb 15th 2011, 9:05 by M.K.
IN THE early 20th century a horse named Clever Hans was believed capable of counting and other impressive mental tasks. After years of celebrated performances, psychologists put the ruse to rest by demonstrating that though Hans was certainly clever, he was not clever in the way that everyone expected. The horse was cunningly picking up on tiny, unintentional bodily and facial cues given out not only by his trainer, but also by the audience. Aware of the “Clever Hans” effect, Lisa Lit of the University of California, Davis, and her colleagues wondered whether the beliefs of professional dog handlers might similarly affect the outcomes of searches for drugs and explosives. Remarkably, Dr Lit found, they do.
Dr Lit asked 18 professional dog handlers and their mutts to complete two sets of four brief searches. Thirteen of those who participated worked in drug detection, three in explosives detection, and two worked in both. The dogs had been trained to use one of two signals to indicate to their handlers that they had detected something. Some would bark, others would sit.
The experimental searches took place in the rooms of a church, and each team of dog and human had five minutes allocated to each of the eight searches. Before the searches, the handlers were informed that some of the search areas might contain up to three target scents, and also that in two cases those scents would be marked by pieces of red paper.
What the handlers were not told was that two of the search areas contained decoy scents, in the form of unwrapped, hidden sausages, to encourage the dogs’ interest in a false location. Moreover, none of the search areas contained the scents of either drugs or explosives. Any “detections” made by the teams thus had to be false. Recorders, who were blind to the purpose of the study, noted where handlers indicated that their dogs had raised alerts.
The findings, which Dr Lit reports in Animal Cognition, reveal that of 144 searches, only 21 were clean (no alerts). All the others raised one alert or more. In total, the teams raised 225 alerts, all of them false. While the sheer number of false alerts struck Dr Lit as fascinating, it was where they took place that was of greatest interest.
When handlers could see a red piece of paper, allegedly marking a location of interest, they were much more likely to say that their dogs had signalled an alert. Indeed, in the two rooms where red paper was present and sausages were not, 32 of a possible 36 alerts were raised. In the two where both red paper and sausages were present, that figure was 30, not significantly different. By contrast, in search areas where a sausage was hidden but no red piece of paper was there for handlers to see, it was only 17.
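The raw counts above are easier to compare as rates; the ceiling of 36 possible alerts per condition (18 handler-and-dog teams, two rooms each) is inferred from the text rather than stated in it.

```python
# False-alert counts per condition, as reported; the denominator of 36
# (18 teams x 2 rooms) is an inference from the study's description.
counts = {
    "red paper only":      32,
    "red paper + sausage": 30,
    "sausage only":        17,
}
MAX_ALERTS = 36
rates = {cond: n / MAX_ALERTS for cond, n in counts.items()}
for cond, r in rates.items():
    print(f"{cond}: {r:.0%}")
```

The human-directed cue (red paper) pushes the alert rate close to 90%, while the dog-directed cue (sausage) alone manages under 50%, which is the asymmetry the next paragraph draws out.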
The dogs, in other words, were distracted only about half the time by the stimulus aimed at them. The human handlers were not only distracted on almost every occasion by the stimulus aimed at them, but also transmitted that distraction to their animals, which responded accordingly. To mix metaphors, the dogs were crying “wolf” at the unconscious behest of their handlers.
How much that matters in the real world is unclear. But it might. If a handler, for example, unconsciously “profiled” people being sniffed by a drug- or explosive-detecting dog at an airport, false positives could abound. That is not only bad for innocent travellers, but might distract the team from catching the guilty. Handlers’ expectations may be stopping sniffer dogs doing their jobs properly.
Feb 13th 2011, 19:31 by L.S.
LESS than a year ago Babbage wrote an article in The Economist, entitled “Clouds under the hammer”, discussing whether computing capacity would become a tradable commodity. In the conclusion he sided with the sceptics, arguing that “virtual machines”—the main unit of measurement in cloud computing—would increasingly move about, but mostly within clouds controlled by a single company (“private clouds”) or trusted federations of “public clouds” (where anyone can buy capacity).
One of the people quoted in the article, Reuven Cohen, the founder of Enomaly, a maker of software that allows firms to build public clouds, was more optimistic. He has now been proved right: today Enomaly will launch SpotCloud, the world’s first spot market for cloud computing.
Fundamentally, SpotCloud works like other spot markets. Firms with excess computing capacity—operators of data centres, cloud providers, hosting firms—put it up for sale. Others, who have a short-term need for some number-crunching, can bid for it. Enomaly takes a cut of between 10% and 30%, depending on the size of the deal. But there is an important difference: SpotCloud is what Enomaly calls an “opaque market”, meaning that the firms offering capacity do not have to reveal their identity. Thus selling computing services cheaply on SpotCloud does not cannibalise their regular offerings.
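A sliding cut of that sort can be sketched as follows; the article gives only the 10-30% range, so the deal-size thresholds below are pure invention for illustration.

```python
# Hypothetical sketch of a size-dependent marketplace cut. Only the
# 10-30% range comes from the article; the dollar thresholds are made up.
def enomaly_cut(deal_value):
    """Return (marketplace_cut, seller_proceeds) for a deal.
    Larger deals are assumed to pay a smaller percentage."""
    if deal_value >= 10_000:      # assumed threshold for big deals
        rate = 0.10
    elif deal_value >= 1_000:     # assumed threshold for mid-size deals
        rate = 0.20
    else:
        rate = 0.30
    cut = deal_value * rate
    return cut, deal_value - cut
```

On these assumptions, a $500 deal cedes $150 to the marketplace while a $20,000 deal cedes only $2,000, a tenth of its value.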
Technically, too, the service is not entirely what one would expect. Enomaly did not build a big central infrastructure—because the bandwidth demands “would have killed us”, in the words of Mr Cohen: files of virtual machines are very large. Instead, the firm works with Google App Engine, itself a cloud-computing provider, which gives Enomaly access to a decentralised global system. Buyers’ virtual-machine files are parked on App Engine before being sent to a seller’s servers. Buyers can also define in which country or even city they want their virtual machines to run.
SpotCloud has been up and running since November in a closed trial—and has already attracted a lot of supply. It often comes from unexpected corners of the computing universe. For instance, an entertainment company has offered capacity on 4,000 servers that would otherwise sit unused (probably in a lull between making animated movies). In other cases, says Mr Cohen, firms offer the capacity of old servers which would otherwise be thrown away at some point.
The big question is whether there is also enough demand. Again, Mr Cohen is optimistic, and sees many ways in which Enomaly’s marketplace could be used: getting non-critical computing tasks done quickly and cheaply, testing new websites, and quickly adding computing capacity in certain regions. But he also warns that SpotCloud is not for those who want long-term computing needs satisfied with guaranteed service levels. Instead, it is for those who need capacity quickly and do not much care if the computing task has to be restarted when something goes wrong.
Feb 11th 2011, 11:50 by N.V. | LOS ANGELES
FOR decades, your correspondent has watched, with more than casual interest, every new twist and turn in the quest for an “artificial leaf”. His hope has been that industry might one day replicate the photosynthetic process used by plants, and thus create forests of artificial trees for making hydrocarbon fuel direct from sunlight. Apart from helping offset the emission of carbon dioxide caused by burning fossil fuels, such man-made leaves could provide an endless supply of energy for transport. Finally, it seems, something is stirring in the forest.
In his recent State of the Union address, President Obama drew special attention to the $122m research programme on artificial photosynthesis that is underway in laboratories across California. “They’re developing a way to turn sunlight and water into fuel for our cars,” said the 44th president. He might have added that the Joint Centre for Artificial Photosynthesis (JCAP), involving some 200 scientists and engineers from universities and research laboratories around the state, was seeking to make liquid hydrocarbons not from solar-powered electrolysis, biomass, micro-organisms or other round-about routes, but direct from sunlight—just as the chlorophyll in a leaf does.
Sunlight is the world’s largest source of carbon-neutral power. In one hour, more energy from the sun strikes the Earth than all the energy consumed by humans in a year. Yet, solar energy, in the form of sustainable biomass, provides less than 1.5% of humanity’s energy needs. Meanwhile, solar panels contribute a mere 0.1% of electricity consumption.
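The one-hour claim is easy to sanity-check with round numbers. The figures below are approximations (the solar constant, the Earth's radius, and a rough estimate of annual world energy demand around 2010), not numbers from the article.

```python
import math

# Back-of-the-envelope check: does one hour of intercepted sunlight
# really exceed a year of human energy consumption? All inputs are
# round approximations.
SOLAR_CONSTANT = 1.36e3        # watts per square metre, top of atmosphere
EARTH_RADIUS = 6.371e6         # metres
WORLD_DEMAND = 5e20            # joules per year, roughly (c. 2010)

# The Earth intercepts sunlight over its cross-sectional disc, pi * r^2.
intercepted_power = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2  # watts
one_hour_joules = intercepted_power * 3600
ratio = one_hour_joules / WORLD_DEMAND
```

On these figures the Earth intercepts roughly 1.7e17 watts, so a single hour delivers around 6e20 joules, comfortably more than a year of human consumption.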
The problem is that the sun does not shine all the time. Night intervenes. So do clouds. If people are to rely on the sun for more of their energy, then a reliable form of storage is required. And the best way to store solar energy is to convert it into chemical fuel. That is what nature has been doing for millions of years.
Unfortunately, artificial photosynthesis is still in its infancy. Researchers reckon that, at least in the laboratory, they can make fuel direct from sunlight far more efficiently than can the fastest-growing plants. But no-one can yet do so at a cost that would make the process economic. Nor can they make it robust enough to work continuously, year in and year out, under the full glare of the sun. And they are years away from integrating the various steps—from capturing the sunlight in the first place to producing the finished fuel—into working prototypes, let alone commercial-sized factories capable of producing something resembling petrol.
Nevertheless, chlorophyll—the stuff of life—is as good a place as any to start. This large organic molecule has a magnesium ion at its core, surrounded by a ring of porphyrin. In nature, porphyrins are a group of organic pigments that give plants, corals and even animal skins their colours. One of the most common porphyrins is heme, the pigment in red blood cells. The porphyrin in chlorophyll absorbs strongly in the red and blue-violet parts of the visible spectrum, but not in the green. By reflecting such wavelengths, chlorophyll gives plants their colour.
It would be better, of course, if chlorophyll could absorb light across the whole of the visible spectrum. But plants take what they have been given. As such, chlorophyll’s job is to absorb all the energy it can from sunlight, and use it to transform carbon dioxide from the atmosphere and water from the soil into carbohydrates and oxygen. The energy stored this way is what makes it possible for practically all living things to survive and thrive.
What makes chlorophyll so good at capturing sunlight is the way its ring-like structure can lose and gain electrons easily. When a leaf absorbs photons from sunlight, electrons in the chlorophyll molecules get excited from lower energy states into higher ones, allowing them to migrate to other molecules. That forms the starting point for chains of electron transfers that end with electrons being "donated" to molecules of carbon dioxide. Meanwhile, the chlorophyll molecules that gave up electrons in the first place accept electrons from elsewhere. These form the end points of transfer processes that start with the removal of electrons from water.
In this way, chlorophyll acts as a catalyst that drives the oxidation-reduction reaction between carbon dioxide and water to produce carbohydrates and oxygen. In the pursuit of the artificial leaf, then, the main task is to find catalysts that can mimic the intricate dance of electron transfers that chlorophyll makes possible.
Feb 11th 2011, 9:40 by G.F. | SEATTLE
Mr Babbage stated that upon the first occasion he was disturbed by the noise of the defendant's organ, and he went out and requested him to cease playing, and to go away...The people in his neighbourhood encouraged the organ-men. He could not, he said, walk along the streets now without being insulted by persons living in the neighbourhood...Mr Babbage was engaged on works of great scientific importance, and of a nature which his persecutors could not understand.—Street Music in the Metropolis
CHARLES Babbage, this blog's namesake, disliked the street noise of London prevalent in the 1860s, a couple of decades after this newspaper was founded. He was joined in public campaigns to squelch the discord that kept knowledge workers from maintaining focus by Charles Dickens, Thomas Carlyle and Richard Doyle, among others. (Walter Bagehot's opinions on noise are unknown.) His antipathy was a watchword in his day. And now it is immortalised in cartoon form, along with alternative history versions of several of his inventions.
The present Babbage came across 2D Goggles some months ago: a rousing episodic comic of Ada King, Lady Lovelace (née Byron) and the eponymous Mr Babbage fighting the aural depredations of "The Organist", a foul besmircher of mental clarity. Lady Lovelace and Mr Babbage did collaborate, but their work together unfortunately fell short of incorporating sewers, monkeys and difference engines the size of buildings. "The Organist" has a backbone of facts over which the artist, Sydney Padua, has built her fancy. This adventure recently concluded, with Lady Lovelace and Mr Babbage coming out smelling like roses (not monkeys). "It really is true that Babbage had this obsessive battle with the street musicians," says Ms Padua. "There are tons of documents. It was a byword. Any time anybody wrote a popular article about music in the 1860s, they had to mention Babbage."
Ms Padua, a Canadian animator who lives in London, created 2D Goggles for Ada Lovelace Day, an annual event to celebrate and encourage women in subjects dear to this blog: science, technology, engineering and maths. As Lady Lovelace died aged 36, Ms Padua says, "I just did the comic as a joke, and the punchline of the joke is that she didn't die. They built a difference engine and fought crime." That first sequence was vivid enough to cause readers to ask for more, expecting an entire tale.
Ms Padua has read Mr Babbage extensively, examining work contemporary to his time available through Google Books, which is also the source of the book quoted at the start of this post. She notes that, alongside a project to build Mr Babbage's Analytical Engine, a more advanced computer than his Difference Engine, an effort is underway to digitise Mr Babbage's personal papers. (A donation drive for the effort, Plan 28, ended without reaching its goal on February 1st. John Graham-Cumming, the man behind the programme, plans to proceed regardless. You may recognise Mr Graham-Cumming's name from his successful petition to rehabilitate Alan Turing, which resulted in a public apology by the then prime minister, Gordon Brown.)
Ms Padua's portrayal of Lady Lovelace's obsession with and expertise in maths is likewise rooted in history. Lady Lovelace was the only legitimate child of George Gordon, Lord Byron, though she was raised apart from him and he died when she was nine. Her mother had her tutored rigorously in maths in an effort to keep rationality to the fore, and her father's madness in abeyance. Lady Lovelace's genius in maths had many admirers. "You couldn't make up this thing about Lovelace being brought up with this mathematical regimen because otherwise she'd go mad," Ms Padua says. Lady Lovelace is often credited with having written the first credible computer program.
2D Goggles is a side project for Ms Padua, who works on both computer- and hand-animated films for her living. Her Babbage/Lovelace comics use a rough drawing style intentionally. She employs both paper and computer tablet to create them. She says that the heavy subcultural interest in steampunk—the expression of computational functions in the design and mechanical aesthetics of the 19th century—derives from the same motivation. "The thing about steampunk that's really attractive is that technology is so abstract now. You push a button, and it goes into the box, and it doesn't make a noise. It's very abstract. It doesn't satisfy the monkey brain."
Ms Padua has so far written several stories of derring-do in addition to "The Organist", including a visit from Her Majesty (Victoria, that is), who was interested in the mechanical device to which she had given patronage. More is to come. "If I did all the stories I would like to do, it would probably take me about 15 years," says Ms Padua.
The artist is happy to foster additional awareness of both of the stars of her comic. Common knowledge of them is limited, despite their crucial roles on the path to functional computers. "You can draw them, and in a sense bring them back to life," she says. She notes of Mr Babbage that "he should be much more famous than he is. He needs better PR." We are trying...
Feb 10th 2011, 12:38 by G.F. | SEATTLE
IN THE beginning, the city had a name, Urm, a smell (strong), and scarce anything else. Its outlines have emerged. The process of telling is the process of making. Townspeople fish the river with singing harpoons. The sky and people (as a mating ritual) are bioluminescent. An uncompleted dome towers over the city. If you fail to pay dues to a union that oversees maps, you may simply disappear, along with "buildings, streets; even whole districts".
Urm is a construct arising from many minds, a crowdsourced city being built a few words at a time on top of Twitter with a hash mark serving as mortar between the bricks of the story. Novelist Nick Harkaway laid the first stone. He wrote on Twitter:
From out of the city of Urm, which is imaginary, there arose in that year a great stink. #Urmcollab
In Twitter parlance, "#Urmcollab" is a hashtag, which allows easy following of a single theme as it develops. No one owns a tag, nor is there membership involved. Mr Harkaway has put the hashtag to use in a simple experiment in unconstrained collaborative narrative construction with no intent and no planned end. This approach also prevents someone from "trying to grab the narrative and take control of it", he says. "If you write three tweets, and you never come back to the story again", that's fine, he says. "There's no obligation inherent in it." (This Babbage was tipped to the project by Yoz Grahame, a community creator responsible for the Starship Titanic Employee Forum, discussed here earlier.)
Mr Harkaway's first novel, "The Gone-Away World" (2008), is troubled with an absence of tangible fact, as nations peppered each other with reality bombs that peeled away the information content of matter. The world of the novel sees thoughts turned into reality—a bit akin to Stanisław Lem's Solaris—with a thin strip of sanity running around the planet's middle. His narrator is unreliable and possibly semi-material. Mr Harkaway's construct of Urm has a similarly hazy quality: what is stated becomes real within the story. He likened this quality to Jorge Luis Borges's "Tlön, Uqbar, Orbis Tertius"—a favourite of Babbage's—a story in which the discovery of a volume of an encyclopaedia about an Earth with different properties than the familiar one seems to change the world. Keen readers may see an echo, too, in China Miéville's "The City and the City", a tale of two cities that may or may not be invisible to one another.
"Literally, I looked on this: wouldn't it be cool if people wrote 140-character statements about an imaginary city", Mr Harkaway says from London. He decided to "throw three descriptive tweets into the world with the same hashtag in them; this is here, go play." So far, a handful of people have added a hundred or so tweets, including the fellow speculative-fiction writer William Gibson, most recently the author of "Zero History". "People are putting it together into a little string of narrative. If the conventional narrative is a road, this has the potential to turn into a town or city." However, Mr Harkaway has no pretensions about its development. "Not everybody is 100% brilliant at condensing a literary idea into 140 characters," he says, although he adds that it doesn't matter. He has already been taken aback by the beauty of some ideas, such as "some fantastic stuff about fields upon fields upon fields of bladed weapons growing out of the ground".
Mr Harkaway says, "Even if it stops dead right now, it says to me there is a ridiculously cool creative possibility inherent in things like Twitter that I would really never have credited." He says that he has a great interest in collaborative writing, but that the requirements and timing necessary to pull off projects make them difficult to assemble. Mr Harkaway has also been waiting for new forms of story to evolve from new media, such as deep non-linear narratives. "Nobody has yet kind of done the kind of thing where it's native to the web or native to social media", he says. Imagine, he suggests, "You are reading your iPhone and the FaceTime camera tracks where your eye is going, and that tracks your interest and determines what you see on the next page."
The experiment has just begun, however, and Mr Harkaway has no suspicion of how it will end. "At a guess right now, the vast likelihood is that it will tail off, rather than become a big thing." But, he says, "If it's going to survive at all, it's going to survive without me. It doesn't need me to tell them what it is. The great thing about it is that I feel completely out of control."
Feb 10th 2011, 12:06 by J.B.
EVERY journalist loves a scoop. Lawyers are starting to love them too. They are hoping to use a 1918 decision from America's Supreme Court, stemming from an argument over war coverage between William Randolph Hearst (pictured above), a press baron, and the Associated Press, to protect the business models of traditional publishers against internet-based rivals. The case gives legal protection to "hot news", ostensibly to encourage news gathering. Although the facts of the original case concern telegraphs, the issues go to the heart of today's internet news business, the efficiency of markets and freedom of speech.
During the closing stages of the first world war Mr Hearst's International News Service (INS) was banned from Allied-controlled telegraph lines for what was seen as over-excited reporting of British losses ("Zeppelins set London ablaze!"). Unable to transmit its own news back from the front, INS started rewriting Associated Press (AP) dispatches instead. AP sued, and INS responded that AP had no case under copyright because it had rewritten the dispatches. Facts are not protected by copyright—only the expression of them in a specific piece of text.
Eventually a majority of the Supreme Court came to agree with AP. Justice Mahlon Pitney, writing the majority opinion, rejected the idea that there might be any intellectual-property protection for news itself. "It is not to be supposed that the framers of the Constitution...intended to confer upon one who might happen to be the first to report a historic event the exclusive right for any period to spread the knowledge of it," he argued. But he could not let go of his conviction that INS wasn't playing fair, and that the rules of fair competition needed to be upheld. So he argued that the gathering of news did create a "quasi property right", and although that right should not constrain the newspaper-buying public, it should prevent a commercial competitor from using AP's news for its own gain.
In dissent, Justice Louis Brandeis argued that Justice Pitney should have just swallowed his moral indignation. "To appropriate and use for profit, knowledge and ideas produced by other men, without making compensation or even acknowledgment, may be inconsistent with a finer sense of propriety; but, with the exceptions indicated above, the law has heretofore sanctioned the practice," he argued. To create a new property right would just make a mess unless it could be clearly defined and enforced—and Justice Pitney's quasi-right for news seemed to do neither. Nearly a hundred years later, the courts are starting to test Justice Brandeis's point.
The case now reviving interest in this antique dispute concerns a tiny website called Theflyonthewall.com. Its business was republishing share tips made by investment banks. The banks sued, and in the summer of 2010 they won. A court ruled that Theflyonthewall was unfairly appropriating some of the value created by the banks in researching and writing the reports, and required the site to wait at least two hours before republishing them. Theflyonthewall, in turn, has had its case taken up by some of the internet's big guns—including Google, Twitter and the Electronic Frontier Foundation (EFF), an internet lobby group. They supported the website's appeal to the Second Circuit Court and made arguments on its behalf. Now they are anxiously awaiting the result of that appeal, which could appear any time in the next few weeks.
The Second Circuit is authoritative in intellectual property. It set the stage for the suit against Theflyonthewall in a 1997 decision which overturned an injunction granted to the National Basketball Association, preventing Motorola, a maker of telecoms gear, from broadcasting game scores over its pagers. In that case the Second Circuit ruled that specific criteria had to be met for the hot-news doctrine to apply. The information in question must be expensive to gather and time sensitive. One party must be free-riding directly on another's expenditure, and the two must be in direct competition. Finally, the free-riding must significantly reduce the incentive to gather information, and so threaten its quality or existence. The NBA failed those tests, but Theflyonthewall passed.
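The five criteria form a conjunctive test: all must be satisfied before the doctrine applies. As a toy illustration only (the field names, and the reduction of judicial balancing to booleans, are our own simplification, not anything in the court's opinion), the checklist might be sketched like this:

```python
# A sketch of the Second Circuit's hot-news test from NBA v. Motorola (1997),
# rendered as a conjunctive checklist. Real courts weigh these factors with
# far more nuance than booleans allow; this is illustrative only.
from dataclasses import dataclass

@dataclass
class HotNewsClaim:
    costly_to_gather: bool      # plaintiff invests in generating the information
    time_sensitive: bool        # the information's value decays quickly
    free_riding: bool           # defendant piggybacks on plaintiff's expenditure
    direct_competition: bool    # the parties compete in the same market
    threatens_incentive: bool   # free-riding undermines the incentive to gather it

def hot_news_applies(claim: HotNewsClaim) -> bool:
    """The doctrine applies only if every criterion is met."""
    return all((claim.costly_to_gather, claim.time_sensitive,
                claim.free_riding, claim.direct_competition,
                claim.threatens_incentive))

# The NBA failed the test; the district court found that
# Theflyonthewall satisfied every prong.
nba = HotNewsClaim(True, True, True, False, False)
fly = HotNewsClaim(True, True, True, True, True)
print(hot_news_applies(nba))  # False
print(hot_news_applies(fly))  # True
```

The conjunctive structure matters: weakening any single prong, as Theflyonthewall's appeal attempts with the direct-competition criterion, defeats the whole claim.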
At the appeal, the Second Circuit was asked to reconsider whether Theflyonthewall should have triggered the hot-news rules, and in particular whether an information website could really be considered to be in direct competition with an investment bank. But, more interestingly, it was also asked to look again at the criteria for applying the hot-news doctrine in light of the new questions thrown up by the internet. Are bloggers, for example, in direct competition with newspapers? Does Google free-ride, or does it provide free marketing and distribution? How do you define the value of information, and what restrictions, if any, would enable a supplier to capture that value? Is there a public interest in the news being as widely disseminated as possible, and would that interest differ between, say, a share recommendation and the news that Port-au-Prince has been flattened by an earthquake?
Even in the realm of more-or-less traditional news outlets, a great deal of information-sharing would have to be carefully parsed if the hot-news doctrine were to be applied consistently. Jonathan Stray, an Associated Press journalist working with Harvard's Nieman Foundation, analysed the 121 unique articles thrown up by Google News concerning the Chinese school linked to the hackers who penetrated Google, for example. His best reckoning was that only 13 contained original reporting.
Google's representative at the Theflyonthewall appeal argued that the internet makes the whole concept of hot news outdated. Breaking news goes from being unknown to widely known so quickly that there is no longer any chance to misappropriate its value. The EFF argued that more and faster information flow was generally a good thing. In the numerous collisions between intellectual-property law and the First Amendment since the hot-news doctrine was created, free speech has usually trumped property. So the EFF particularly urges the Second Circuit to test the hot-news criteria against this evolved body of First Amendment law.
At the end of the day the real test of the hot-news doctrine will be that posed sceptically by Justice Brandeis. Whatever the Second Circuit decides in its forthcoming ruling, can it be enforced simply and efficiently?
Feb 9th 2011, 12:32 by I.M.
VISITORS to South Korea cannot fail to be impressed by the speed of the country's online connections. While even basic broadband access is unobtainable across parts of the developed world, most South Koreans can enjoy high-speed fibre-optic services for just $30 a month. Closing this broadband gap has become a priority for some governments. In April 2009 Australia unveiled a hugely ambitious plan to bring superfast broadband connections to more than 90% of the population by 2018
, at a cost to the public purse now estimated at around A$27 billion ($27 billion). The British government also wants to use taxpayers' money to plug the broadband holes in its rural communities.
Why do governments feel such a need for speed? Many private-sector companies insist there is no commercial case for investment in high-speed networks. Although internet usage is soaring, network operators earn no more from traffic on bandwidth-gobbling sites like YouTube, which functions better over faster connections, than from customers accessing simpler web pages. Yet authorities increasingly see broadband as integral to economic prosperity, with the energy, education and health-care sectors among those set to benefit from the roll-out of improved infrastructure. That means broadband is bound up with governments' political fortunes, too.
The question is whether such vast public-funding commitments as Australia's are desirable, or even effective. Critics say taxpayers' money would be better spent elsewhere and that broadband development is best left to the private sector. Others argue for less heavy-handed public-sector involvement. East Asia, notably, appears to have built its broadband lead by encouraging companies to stump up the huge investments required.
A new study from the Economist Intelligence Unit, our sister company, makes some judgments about what governments should and should not be doing. While broadband rankings typically measure factors such as the speed, availability and retail prices of existing services, a measure created for the study, the government broadband index (gBBi), looks instead at the components of the highest-profile public-sector plans. Besides targets for speed and population coverage, these include the cost to the taxpayer as a percentage of annual government revenues and the deadlines for universal access. The gBBi also considers the regulatory aspects of the various plans. Here are the results:
Feb 8th 2011, 17:46 by G.F. | SAN FRANCISCO
NEW friends are hard to come by. This Babbage is married, has passed 40 and finds small children clustered at his bedside each morning. Marriage and parenthood create new social circles, but also bind one more closely into them. Until the children are grown, a parent has little spare time to pursue new activities or to nurture the sort of acquaintance that deepens into friendship. Indeed, even old friends may feel neglected along the way.
Yet Babbage has new chums, to his surprise, arising from his obsession with Twitter. Your correspondent's online career stretches back to 1980 and the CompuServe dial-up service, where a live forum known as the CB (Citizens Band) Simulator
allowed real-time chat at 110 bits per second. The notion was that it resembled the kind of conversation possible over CB radios, which were popular at the time.
Of the same vintage were bulletin board systems (BBSes), which users dialled into to post messages in discussion groups and to download software. The most sophisticated had multiple phone lines available at once, allowing chat among those simultaneously connected. Babbage participated in all of that, and in each subsequent development in live and asynchronous online community, from Bitnet Relay via The Well, Internet Relay Chat and instant messaging to the current belle of the ball, Twitter.
And yet, he writes abashedly, he never made what one might call true friends solely online during those three decades of chatter. Acquaintances, yes. Well wishers, colleagues, enemies, boors and even slightly disturbing fans. And such means have strengthened or maintained ties formed in the world of handshakes, hugs and tears. But there was never a connection that started with electrons and led to the consumption of fermented and distilled beverages, and that interchange of pure nonsense, deep thoughts, shared experience and common loss which underpins a strong tie. Never, at least, until Twitter.
Despite—and perhaps because of—the trivial and by necessity shortened discourse on Twitter, several friendships have blossomed, recently confirmed in person. That description may sound a bit bloodless, but Babbage turns his steely eye upon his own soul and motives, not just those of others.
Twitter enables the sort of chitchat impossible among strangers on email or instant messaging, and outside the scope of Facebook. Facebook's boss, Mark Zuckerberg, often describes his service as a way to connect with friends you already have. Facebook creates circles upon circles of acquaintance, but most conversation is among those already known to each other.
Twitter, however, is a different beast. The asymmetry of follower and followee creates a different rhythm, allowing the possibility of falling into conversation with an unknown someone without invading his or her space. It is a simple matter to ignore or block those who you find uninteresting. And people you know and trust outside the electronic realm lead you to their friends, colleagues and family. Likewise, you may be on the receiving end of tendrils of acquaintance. The shared set of relationships and communication among those you know vets new people for you and you for them.
Tweaks made to Twitter a few months ago that make a real-time stream generally available—instead of the retrieval of periodic updates—have blurred the lines between synchronous communications like chat and asynchronous methods such as email and forum commenting. Direct messaging on Twitter, a one-to-one method that may take place only between two parties who mutually follow each other, provides a trimmed-down email analogue without the weightiness and effort associated with handling yet more messages. Continuous speech is more or less possible without the overhead and commitment of a chat room or an instant-message session.
Babbage tends towards gregariousness, online or off, and struck up Twitter conversations with several friends of friends in the past year. He found himself communicating at first through "mentions" on Twitter—the @name convention that allows you to point a reply or message in the public stream at someone—but then moved into direct messages, instant messaging and email. Those he grew to know best also blog and use Facebook. Over the course of months, he learned quite a bit about his newfound mates, cheering on their successes, sympathising over their straits and introducing them to new people. With a little trepidation, he finally met a few such people in person.
While Tweet-ups, a mashup of "tweet" and "meet-up", are common, they typically involve like-minded groups of people who have already met, or who are continuing conversations begun online. In Babbage's case, the meetings were at Macworld 2011 in San Francisco—the annual convocation of the Apple community and of the reporters who cover it. One hates to ruin a good thing. Would Babbage fail to live up to the expectations of his new frequent correspondents, or they to his? Does Twitter's distilling process remove the chaff and leave so much wheat as to provide a mistaken impression of personality and mutual interest?
Babbage happily reports the results of his accidental experiment: those who are genuine in 140 characters are equally so over meals and laughter. Twitter is not a guarantee of friendly compatibility, but your correspondent found it an awfully close match.
Feb 4th 2011, 23:17 by J.P. | MEYRIN AND SAINT GENIS POUILLY
PUTTING aside unfounded fears of stockpiled weapons-grade antimatter or of mini black holes that will gobble up Earth in a trice, there are at least three less paranoid misconceptions about CERN. One is to equate it with the Large Hadron Collider (LHC), admittedly its fanciest bit of kit. Another is to assume that the LHC's brief is to find the Higgs boson, period. A third is to liken experimental particle physics to hunting—a trope which, to be fair, physicists themselves blithely perpetuate.
Start with the last. What goes on at CERN has precious little to do with the romantic (to some at least) notion of tweed-clad gentlemen sniping at game. If anything, it is more akin to fishing with explosives, where throwing a heftier charge into a smaller pond shortens the odds of seeing a bigger fish float belly up. So, too, in particle accelerators like the LHC.
Here, protons are sped up to a smidgen below the speed of light, the equivalent of lighting a sizeable stick of dynamite. Next, as they enter the LHC detectors, they are squeezed into a beam just 16 microns across, a fraction of the width of a human hair—a very small pond indeed. However, because the individual particles are so minuscule, even a compressed beam contains plenty of empty space, and head-on collisions—the sort that generate the most energy and thus, by dint of Albert Einstein's famous equation, E=mc², the heavy particles of most interest to physicists—are expected only extremely rarely.
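The dynamite comparison can be sanity-checked with a line or two of arithmetic. The beam parameters below are the LHC's nominal design values, assumed here for illustration (they are not quoted in the article, and in 2010 the machine actually ran at half the design energy):

```python
# Rough check of the "stick of dynamite" comparison, using nominal LHC
# design parameters (assumed for illustration, not taken from the article).
E_PER_PROTON_TEV = 7.0        # nominal design energy per proton
TEV_TO_JOULES = 1.602e-7      # 1 TeV in joules
PROTONS_PER_BUNCH = 1.15e11   # nominal bunch population
BUNCHES_PER_BEAM = 2808       # nominal number of bunches

energy_per_proton_j = E_PER_PROTON_TEV * TEV_TO_JOULES
beam_energy_j = energy_per_proton_j * PROTONS_PER_BUNCH * BUNCHES_PER_BEAM
print(f"stored energy per beam: {beam_energy_j / 1e6:.0f} MJ")

# TNT releases about 4.2 MJ per kilogram, so a full beam carries
# the energy of several tens of kilograms of TNT.
tnt_kg = beam_energy_j / 4.2e6
print(f"TNT equivalent: {tnt_kg:.0f} kg")
```

At design energy each beam stores a few hundred megajoules, which is rather more than a single stick of dynamite; the article's comparison refers to the energy scale of the accelerated protons, and the sums above show why the beams must be handled with such care.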
One such big fish is the Higgs boson, sometimes dubbed the "God particle", though the moniker makes most physicists cringe. It is the particle associated with the hypothetical "Higgs field" which is thought to pervade all space and whose interactions with other elementary particles give them their mass. This explains how they clumped together into galaxies, planets and people, rather than whizzing around eternally at the speed of light, as massless photons do.
Many LHC scientists see netting the Higgs as a done deal, especially if its mass lies at the lower end of the range predicted by theory. A less massive Higgs means less energy would be needed to produce it, increasing the likelihood of doing so when protons merely graze each other. Since, statistically, this happens much more often than head-on collisions, several Higgses—or, strictly speaking, signatures left by the less fleeting particles into which the Higgs is thought almost instantly to decay—may already be buried in the haul of data from last year. (The obverse is that a lighter Higgs would be harder to tell apart from all the other particles created in the collisions than a heavier one; though a heavy Higgs is only expected to crop up extremely rarely, as a result of direct proton-proton impact, it would leave a more unmistakeable trace.)
Out with the old
This is all very exciting, of course, but only as the known unknowns of "old physics" go. The Higgs is the last unobserved piece of the Standard Model, a 40-year-old mathematical framework which links all the known particles and all of the fundamental forces of nature except for gravity. The researchers your correspondent spoke to gave the impression of being far more aflutter when talking about the unknown unknowns of what they refer to as "new physics".
Feb 4th 2011, 16:49 by N.V. | LOS ANGELES
IT IS remarkable how risk-conscious people have become, especially on the road. Sure, some motoring maniacs will always push their luck, causing mayhem for themselves and others—and everyone makes mistakes from time to time, gets distracted, becomes impatient and is, perhaps, not as mindful of other road users as he ought to be. Nevertheless, the statistics for traffic accidents, at least in developed parts of the world, reveal a heartening downward trend.
In the United States, for instance, the latest figures from the National Highway Traffic Safety Administration (NHTSA) show that 33,808 people died on American roads in 2009—the lowest level since 1950. That is still far too many personal tragedies. Even so, it represents a 9.7% decline from the figure for 2008, which was itself 9.7% lower than 2007's. The absolute number of fatalities may grab the headlines, but the more relevant statistic—the fatality rate per 100m vehicle-miles travelled—has also been inching steadily down over the past half-century. In 2009 the American rate fell to 1.13 deaths per 100m vehicle-miles. Only Britain, Denmark, Japan, the Netherlands and Sweden fared better. For that, traffic authorities everywhere can thank the wholesale introduction of safety-belts and air-bags, as well as tougher drunk-driving laws.
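The two statistics can be cross-checked against each other. A quick sketch (the implied total mileage and the implied 2008 toll are our own back-of-the-envelope derivations, not figures quoted above):

```python
# Cross-checking the NHTSA figures: the absolute death toll and the rate
# per 100m vehicle-miles together imply how many miles Americans drove.
deaths_2009 = 33_808
rate_per_100m_miles = 1.13        # deaths per 100m vehicle-miles

implied_miles = deaths_2009 / rate_per_100m_miles * 100e6
print(f"implied vehicle-miles in 2009: {implied_miles:.2e}")  # roughly 3 trillion

# The quoted 9.7% year-on-year decline implies the 2008 toll:
deaths_2008 = deaths_2009 / (1 - 0.097)
print(f"implied 2008 toll: about {deaths_2008:,.0f}")
```

The exercise shows why the rate, not the raw toll, is the better measure: with Americans driving on the order of three trillion vehicle-miles a year, small changes in driving habits swamp the headline number.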
As might be expected, the recession has played its part in reducing the death toll on the roads—especially among the most vulnerable group, 16- to 24-year-olds. They have suffered most from unemployment and hence have been exposed to fewer hazards on the road. The worry is that fatalities could rebound once the recovery gets seriously under way and the young resume their reckless driving habits.
While horrifying, traffic accidents are far from being mankind’s greatest scourge. Around the world, they account for 1.2m deaths a year, compared with the 35m people who die from non-communicable illnesses such as cardiovascular disease, cancer and diabetes (5.4m of which are caused by smoking alone). According to the World Health Organisation, some 25m people all told have died in road accidents since horseless carriages took to the streets (the first such fatal accident occurred in London in 1896). That is the same as the number of people who have died over the past 30 years from AIDS.
The irony is that, while the roads are safer than ever, motorists have become more safety conscious. Back in the early 1970s, when your correspondent built a car for himself, he considered its backbone frame—made of pressed-steel sections braced with steel tubing—as state-of-the-art as far as crashworthiness was concerned. With the engine and transmission amidships, the front third of the vehicle was effectively a dedicated crumple zone. Likewise, the rear had strategically placed structural members designed to collapse on impact and mop up excess kinetic energy if shunted from behind. An added virtue was that, being a mere 1,450lb (660kg), the car had very little inertia to overcome relative to most other vehicles on the road, and thus tended to be shovelled down the highway intact when hit from behind (as has happened twice) rather than being crumpled on impact.
Today, though, he considers his beloved 39-year-old car a death trap, and won’t allow his wife or daughter to drive it or ride with him. The reason is not that he thinks it dangerous to drive. Over the decades he has upgraded—on a machine that was inherently safe to start with—the brakes, the tyres and the suspension, and made the frame torsionally even stiffer. As a result, the vehicle now has far more primary safety (the agility, stability and stopping power needed to avoid accidents) than the vast majority of modern cars.
The problem is the vehicle's secondary safety—the ability to save occupants' lives if the car, despite all its primary safety, is actually involved in a crash. While the car's original seat belts have been replaced with four-point harnesses, it still has no air-bags, nor any side-intrusion protection. Viewed from the side, its occupants sit within a fragile eggshell of fibreglass. T-boned at a crossing, they would be instant spam in a can.
Feb 3rd 2011, 17:49 by G.F. | SEATTLE
FAR OUT in the uncharted backwaters of the unfashionable end of the Starship Titanic website sat a specious message board with posts by the senior crew of the fictional ship. The site was launched in 1997 to support Douglas Adams's CD-ROM game of the same name, offering enhancements, misleading ideas, technical support and, of course, nonsense. The board provided clues and additional amusement to players and potential players of the game.
This is not its story.
Rather, it is the story of the Employee Forum, a thriving hidden society buried deep within the site, where lost travellers wandering down several dead ends inadvertently ended up. And they did not even need to pass a door labelled "Beware of the Leopard" leading to a disused lavatory. But the site's developers ensured the ride was not wholly smooth. As Yoz Grahame recently explained at MetaFilter, in "The Post That Cannot Possibly Go Wrong":
...one day, folks got a mail from the intranet admin, "Chris Stevedave", giving folks the link to the intranet and the current password, which was hurriedly followed by a second mail apologising for the accidental mail leakage and urging customers not to click the link, then a third email noting that Chris Stevedave had been demoted to Bilge Emptier Third-Class.
(Stress and nervous tension are now serious social-networking problems in all parts of the internet. In order not to exacerbate the condition, Babbage will disclose in advance that the Employee Forum is alive and well.)
Starship Titanic was an epic video game based on a story by Douglas Adams, which he was also supposed to turn into a book. But—as was invariably the case with every story involving Mr Adams, deadlines, promises to keep said deadlines, promises following failure to keep promises to keep said deadlines, and so on—he did not. Monty Python's Terry Jones, a collaborator, put in what is widely regarded as one of the hardest three weeks' work in history to complete the book in time. Mr Adams focused his efforts on the CD-ROM game and, to a vastly lesser extent, the accompanying website. Commendably, the game's programmers wound up only a lot behind schedule. (The well-loved Mr Adams was also incapable of writing an introduction to his own site, either before its launch or at any point before his sudden death in 2001.)
Feb 2nd 2011, 18:33 by J.P. | MEYRIN
IT IS easy to equate CERN with the Large Hadron Collider (LHC), the multi-billion-dollar device that sits in a 27km loop beneath the Franco-Swiss countryside and has come to symbolise mankind's scientific and technological prowess. But CERN is not just the LHC. There is plenty of unrelated, often quirky physics going on there, sometimes relying on surprisingly frugal methods. Finally, there are the scientists themselves. True, they spend most of their waking hours immersed in work. Yet, despite rumours to the contrary, they are, in fact, human. Below, a series of photographs capturing some of the lesser-known aspects of life at Europe's main particle-physics laboratory.
PS For more in-depth reporting from CERN, read here and here.
In this blog, our correspondents report on the intersections between science, technology, culture and policy. Follow Babbage on Twitter »