Claiborne Pell died at age 90 on January 1, 2009. In the weeks that followed, the former Democratic senator from Rhode Island was lauded for his many achievements, but one stood out: The first sentence of Pell’s obituary in The New York Times cited "the college grant program that bears his name." Pell Grants are the quintessential progressive policy, dedicated to helping low-income students cross into the promised land of opportunity and higher education. "That is a legacy," said Joe Biden, "that will live on for generations to come."

What the encomiums to Pell failed to mention is that his grants have been, in all the ways that matter most, a failure. As any parent can tell you, colleges are increasingly unaffordable. Students are borrowing at record levels and loan default rates are rising. More and more low-income students are getting priced out of higher education altogether. The numbers are stark: When Pell Grants were named for the senator in 1980, a typical public four-year university cost $2,551 annually. Pell Grants provided $1,750, almost 70 percent of the total. Even private colleges cost only about $5,600 back then. Low-income students could matriculate with little fear of financial hardship, as Pell intended. Over the next three decades, Congress poured vast sums into the program, increasing annual funding from $2 billion to nearly $20 billion. Yet today, Pell Grants cover only 33 percent of the cost of attending a public university. Why? Because prices have increased nearly 500 percent since 1980. Average private college costs, meanwhile, rose to over $34,000 per year.

But the biggest problem with American higher education isn’t that too many students can’t afford to enroll. It’s that too many of the students who do enroll aren’t learning very much and aren’t earning degrees. For the average student, college isn’t nearly as good a deal as colleges would have us believe.

Pell Grants do nothing to address that problem. Low-income students are increasingly forced to attend inexpensive but under-resourced, non-selective universities and community colleges, where student results are often astoundingly bad. The average graduation rate at four-year colleges in the bottom half of the Barron’s taxonomy of admissions selectivity is only 45 percent. And that’s just the average–at scores of colleges, graduation rates are below 30 percent, and wide disparities persist for students of color. Along with community colleges, where only one in three students earns a degree, these low-performing institutions educate the large majority of Pell Grant recipients. Less than 40 percent of low-income students who start college get a degree of any kind within six years.

Some might argue that colleges are just enforcing academic standards by refusing to graduate unprepared students. But the evidence suggests otherwise. A 2006 study from the American Institutes for Research found that only 31 percent of adults with bachelor’s degrees are proficient in "prose literacy"–being able to compare and contrast two newspaper editorials, for example. More than a quarter have math skills so feeble that they can’t calculate the cost of ordering supplies from a catalogue.

Why is the quality question so obscure, when the cost question is so well-known? In part because it has been masked by the American higher education system’s unchallenged reputation as the best in the world. Unfortunately for the average collegian, this notion is entirely driven by the top 10 percent of institutions and the students who attend them–Harvard, Stanford, MIT, and the like. Much of the rest is a sea of mediocrity, or worse.

But the biggest culprit is the lack of objective, publicly available information about how well colleges teach and how much college students learn. Nobody knows which colleges really do the best job of taking the students they enroll and helping them learn over the course of four years. After decades of inaction, some recent efforts have been undertaken to collect that information: It now exists, but colleges and their powerful (and virtually unknown) lobbies will not permit the public to see it. As a result, colleges are far less focused on student learning than they should be, and consumers haven’t a clue what to do and have come to believe, mistakenly, that the most expensive colleges are also the best.

In their myopic attention to student financial aid, in their total indifference to price and quality, Pell Grants symbolize the larger failure of progressive higher education policy. Pell’s heart was in the right place. But by focusing only on helping the needy–the worthiest of instincts–progressives have ignored the larger issues that are driving runaway price increases and rampant neglect of student learning.

There’s a solution to these problems, but it won’t come from more tinkering with student aid programs. The key to giving students a better, more affordable education turns out to be focusing less on college financial aid and more on college itself. We must fundamentally change the relationship between the federal government and higher education, forcing institutions that receive vast amounts of public funding to provide a modicum of useful information in return. The time has come to rip open the veil of secrecy that has shrouded higher education for as long as students have walked next to ivy-covered walls, and to use that information to build far more effective, more egalitarian, and more student-focused colleges than we have today.

The Secular Church

Colleges are often lumped in with other non-profit entities like charities and hospitals in the public mind. But they actually most resemble the institution from which many of the oldest and most renowned colleges sprang: organized religion. Like the church, colleges have roots that pre-date the founding of the republic. They see themselves as occupying an exalted place in human society, for which they are owed deference and gratitude. They cherish their priests and mysteries, and they are disinclined to subject either to public scrutiny.

This was a tolerable state of affairs for the first 150 years or so of the nation’s existence. Initially, the federal government left higher education to the states, which in turn liberally granted charters to sectarian colleges and universities. College was mostly reserved for the sons of privilege–at the turn of the twentieth century, only 240,000 collegians were enrolled nationwide. While that number had grown to 1.5 million by the eve of World War II, it was still only 1.1 percent of the total population.

But the postwar social reordering changed higher education profoundly. Returning soldiers flooded campuses using money from the G.I. Bill. Middle-class prosperity brought status anxiety and aspiration to match. The emancipation of women and minorities created new classes of students, while the erosion of well-paying manufacturing jobs drove more young people to earn degrees. Today there are 18 million college students, a sixteen-fold increase since 1900 relative to the population as a whole.

This created an obvious challenge: All those new students had to go somewhere. The existing network of elite institutions had little desire to open its doors to the masses. So policymakers, mostly at the state level, made a historic–and in retrospect, deeply flawed–decision. They stamped out hundreds of copies of the standard university model. Teachers colleges were converted into regional universities. Branch campuses opened up with compass-point suffixes describing their proximity to the mother ship. Big states like New York and California built whole university systems from scratch.

Even this wasn’t enough to handle the tide of new undergraduates. An additional layer of community colleges was built, many in Sun Belt states where in-migration and population growth were most acute. While these two-year public institutions didn’t offer advanced graduate programs in particle physics or produce bowl-winning football teams, they were still unmistakably colleges, firmly in the realm of higher education.

Each of these new institutions faced a question of identity. How should they act? Who should they be? Naturally, they looked to the original colleges that set the mold. And without exception, those institutions offered two core lessons. First, they taught that status in higher education is derived from wealth and selectivity–the most renowned institutions have gigantic piles of money and allow only the "best" students to attend. Second, they insisted that questions of quality, particularly as they relate to what students are taught and how much they learn, are nobody’s business but the institution’s own. This position was derived from the principle of academic freedom, a tremendously important and necessary idea when it comes to protecting controversial scholarship. Unfortunately, it was extended to the student side of the equation–colleges demanded the freedom to teach as well or as poorly as they pleased. Thus, any attempt by the government to inquire about academic matters was resisted at all costs.

The new colleges and universities took these values to heart. As a result, the customers, donors, and governments that finance America’s allegedly world-beating institutions know remarkably little about whether individual colleges and universities are any good at the single most important thing they do: helping students learn.