Size matters: Yahoo claims 2-petabyte database is world’s biggest, busiest

Feature
May 22, 2008 | 4 mins
Business Intelligence | Data Center | Database Administration

Year-old database processes 24 billion events a day

The petabyte is the new petaflop.

Interest in raw computational speed waned (sorry, IBM) after data center managers began turning away from super-expensive supercomputers and toward massive grids composed of cheap PC servers.

Meanwhile, the rise of business intelligence and its even more technical cousin, business analytics, has spurred interest in super-large data warehouses that boost profits by crunching the behavior patterns of millions of consumers at a time.

Take Yahoo Inc.’s 2-petabyte, specially built data warehouse, which it uses to analyze the behavior of its half-billion Web visitors per month. The Sunnyvale, Calif.-based company claims the warehouse is not only the world’s largest single database but also its busiest.

Based on a heavily modified PostgreSQL engine, the year-old database processes 24 billion events a day, according to Waqar Hasan, vice president of engineering in Yahoo’s data group.
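
For scale, the quick arithmetic below unpacks what that reported figure means as a sustained rate.

```python
# Back-of-the-envelope check on the reported figure: 24 billion events a day.
events_per_day = 24_000_000_000
seconds_per_day = 24 * 60 * 60  # 86,400

# Roughly 277,778 events per second, sustained around the clock.
print(events_per_day / seconds_per_day)
```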

And the data, all of it constantly accessed and all of it stored in a structured, ready-to-crunch form, is expected to grow into the multiple tens of petabytes by next year.

By comparison, large enterprise databases typically grow no larger than the tens of terabytes. Large databases about which much is publicly known include the Internal Revenue Service’s data warehouse, which weighs in at a svelte 150TB.

EBay Inc. reportedly operates databases that process 10 billion records per day and are also able to do deep business analysis. They collectively store more than 6 petabytes of data, though the single largest system is estimated at 1.4 petabytes or more.

Even larger than the databases of Yahoo and eBay are those of the National Energy Research Scientific Computing Center in Oakland, Calif., whose archives include 3.5 petabytes of atomic energy research data, and the World Data Centre for Climate in Hamburg, Germany, which keeps 220TB of data in its Linux database but more than 6 petabytes archived on magnetic tape.

But Hasan noted that archived data is far different from live, constantly accessed data.

“It’s one thing to have data entombed; it’s another to have it readily accessible for your queries,” he said. He also pointed out that other large databases store unstructured data such as video and sound files. Those can bulk up a database’s size without providing easily analyzable data.

Hasan joined Yahoo more than three years ago. At the time, Yahoo already had huge non-SQL databases storing hundreds of terabytes of data. Problem was, the data was in the form of large collections of compressed files that could be accessed only by writing programs in a language such as C++, rather than more easily and quickly via SQL commands, he said.
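
To make that contrast concrete, here is a hypothetical sketch (not Yahoo’s actual code; the file name and record layout are invented): answering even a simple question against compressed files means writing and running a custom scan, where a SQL engine reduces it to a single statement.

```python
# Hypothetical sketch of the old access model: every question about the data
# meant writing a program to scan the compressed files. The file name and
# tab-separated record layout are invented for this example.
import gzip

clicks = 0
with gzip.open("events-2008-05-21.log.gz", "rt") as f:
    for line in f:
        user_id, event_type, url = line.rstrip("\n").split("\t")
        if event_type == "click":
            clicks += 1
print(clicks)

# The same question, once the data sits behind a SQL engine, is one statement:
#   SELECT COUNT(*) FROM page_events WHERE event_type = 'click';
```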

One of Hasan’s first moves was to buy a Seattle database start-up called Mahat Technologies, which had tweaked the open-source PostgreSQL engine to run as a column-based database rather than a conventional row-based one. Rotating tables 90 degrees this way slows writes to disk but greatly accelerates reads.
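
A minimal sketch of that idea, using invented data: a row store keeps each record’s fields together, while a column store keeps each field’s values together, so an analytic scan over a single field reads far less data.

```python
# Toy illustration of row-based vs. column-based layout (invented data).

# Row-oriented: each record's fields are stored together.
rows = [
    ("u1", "search", 120),
    ("u2", "click", 45),
    ("u3", "search", 80),
]

# Column-oriented: each field's values are stored together
# (the table "rotated 90 degrees").
columns = {
    "user": ["u1", "u2", "u3"],
    "event": ["search", "click", "search"],
    "latency_ms": [120, 45, 80],
}

# Averaging one field in the row layout touches every full record...
avg_from_rows = sum(r[2] for r in rows) / len(rows)

# ...while the column layout scans only the one field the query needs,
# which is why reads get faster.
avg_from_columns = sum(columns["latency_ms"]) / len(columns["latency_ms"])
assert avg_from_rows == avg_from_columns

# Writes pay the price: inserting one record means appending to every
# column list instead of writing a single contiguous row.
for key, value in zip(("user", "event", "latency_ms"), ("u4", "click", 60)):
    columns[key].append(value)
```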

Yahoo brought the database in-house and continued to enhance it, adding tighter data compression, more parallel data processing and further query optimizations. The top layer remains PostgreSQL, however, so that Yahoo can use the many off-the-shelf tools available for it.
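
Because the top layer speaks standard PostgreSQL, an ordinary client library can query the warehouse unchanged. The sketch below uses Python’s psycopg2 driver; the host, table and column names are invented for the example.

```python
# Hedged sketch: standard PostgreSQL tooling against the warehouse's top
# layer. Connection details, table and column names are assumptions.
import psycopg2

conn = psycopg2.connect("host=warehouse.example.com dbname=events user=analyst")
cur = conn.cursor()

# An ordinary analytic aggregate; underneath, a column-oriented engine
# scans only the columns the query actually references.
cur.execute(
    """
    SELECT event_type, COUNT(*)
    FROM page_events
    WHERE event_date = %s
    GROUP BY event_type
    """,
    ("2008-05-21",),
)
for event_type, total in cur.fetchall():
    print(event_type, total)

cur.close()
conn.close()
```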

The largest tables in the database already comprise “multiple trillions of rows,” said Hasan, who helped develop database technology at Informix, Hewlett-Packard and IBM before coming to the user side.

The huge table sizes enable Yahoo to run broader, more complicated analyses, so it can better understand how to make its banner and search ads more effective and thus reap more money from advertisers. The analyses also help the company improve its Web sites for users by, for instance, making search results more relevant, Hasan said. But loading the data takes several hours, so Yahoo does its real-time analysis with a different data warehouse.

The database requires fewer than 1,000 PC servers hosted at several data centers, said Hasan, who declined to reveal the exact number. He did claim that it needs only one-tenth to one-twentieth the number of servers that a conventional database such as Oracle, IBM’s DB2 or NCR’s Teradata would require.

Despite the success of Amazon.com Inc.’s EC2 cloud-based application hosting service, Yahoo has no plans right now to rent access to its database as a Web-based utility, nor to sell licenses of the technology to enterprises that want to install it on their own premises, Hasan said.