Forget the old comic book collection in the attic or the cabinet full of new-in-box Star Wars collectibles -- for middle-class Americans looking to build a secure, stable financial life, there are three traditional investments: a college degree, a house and a retirement account. For decades, these were sufficient to offer a reasonable shot at long-term prosperity, but over the past few years, that classic tripod has gone wobbly.
Roller-coastering financial markets have devastated retirement investments, plummeting real estate values have torched trillions of dollars in home equity, and rising tuitions combined with a weak job market have transformed shiny new bachelor's degrees from sound investments into risky gambles.
One effect of all this has been a change in the way many Americans view college. More would-be students -- and their parents -- are beginning to approach higher education like any other investment. They're considering statistics like graduation rates, post-grad employment rates, expected salaries, and cost-to-profit ratios when choosing schools. In short, they're working to become informed consumers.
Our Information Deficit
The trouble is, that data isn't always easy to find, and what's available isn't always trustworthy. Unlike buying a house or trading a stock -- highly regulated transactions in which sellers are legally required to disclose large amounts of information -- prospective college students often have a hard time digging up the vital statistics they need to make informed decisions about their educations.
The problem is beautifully illustrated by U.S. News & World Report, one of the accepted authorities when it comes to college rankings. The magazine relies largely on information supplied directly by the schools. Unfortunately, as various scandals have demonstrated, these rankings are relatively easy to game via policy changes on campus. Some schools don't even go to that much trouble: A few recent revelations tarnished the reputations of universities caught lying about their key statistics to make themselves look better. Unethical and potentially humiliating? Yes. But misreporting information to U.S. News is not actually a crime.
A Federal Report Card?
Thus, when it comes to making one of the most important (and expensive) purchases of their lives, students must rely on questionable information that is inconsistently reported. In the face of this, The Atlantic recently explored whether Washington should step in with a federally mandated report card program to help students choose their colleges.
The arguments in favor are compelling. After all, total student loan debt recently hit $1 trillion, outstripping credit card debt. On average, students who borrow money for college end up owing more than $25,500 by the time they leave school, and -- with jobs at a premium -- many may never be able to completely discharge their student loan debts.
What would students need to make an informed decision? Certainly four-year and five-year graduation rates are useful, as they can help students determine which universities delay graduation by failing to offer sufficient sections of vital core classes. Similarly, post-grad employment rate data can indicate which colleges -- and majors -- do the best job of getting their students into the work force, while average test scores can give a good indication of the level of students that a university is able to attract. And every parent needs a complete, honest listing of all the costs involved.
These are hardly revolutionary ideas; in fact, they form the basis of the higher education "shopping sheet" that President Obama called for in January. If the president's proposal were passed by Congress, it would require colleges and universities to compile post-graduate employment and earning information -- something they've never had to do before. What's more, it would compel schools to present their loan and grant information in a clear, understandable manner, so that students could do apples-to-apples comparisons between them.
Moving Beyond the Money
Some critics, including The Atlantic's Marty Nemko, argue that students need a much wider variety of information -- including statistics like average class size, university funding sources, alumni satisfaction, accreditation reports and even crime rates -- many of which wouldn't seem to directly affect the return on investment of a college diploma. In this vein, one rating website, What Will They Learn?, approaches the question from a more holistic perspective. Run by the American Council of Trustees and Alumni, it notes tuition costs and graduation rates, but focuses on seven key areas of academics: composition, literature, foreign language, U.S. government or history, economics, mathematics and science.
These seven academic areas represent only a fraction of the standard core curriculum, but as Michael Poliakoff, vice president of policy at ACTA and director of the website, notes, "These are the essential ones. If they aren't represented in a curriculum, then students are leaving college with gaps in their knowledge."
Then again, as anyone who has ever dealt with an academic knows, the question of which areas of study are "essential" is both loaded and subjective. With that in mind, Poliakoff and his team moved beyond academia in their search for standardized criteria: "Making this list, we drew on what employers and the business industry told us they wanted."
Based on that, ACTA's criteria make a lot of sense: In 2010, the American Association of Colleges and Universities commissioned a survey of executives at 302 companies. Among other things, it determined that 89% of employers wanted colleges to redouble their efforts to prepare students "to effectively communicate orally and in writing," 81% wanted students who were better prepared to use "critical thinking and analytical reasoning skills," and 79% wanted workers who were better able to "apply knowledge and skills to real-world settings through internships or other hands-on experiences."
The Ultimate Report Card?
Poliakoff notes that his site's ratings cover most of the key indicators of a profitable college education. But as the gaps between the ACTA approach, Nemko's perspective and President Obama's proposal demonstrate, the criteria for a good -- or even a cost-effective -- degree program are highly controversial. What's more, with factors from program cost to employment ratios to college reputation all potentially affecting the value of an education, students provided with every potentially relevant data point about a school could find themselves drowning in a sea of excess information.
Perhaps the real answer is to offer students not more information, but less. Rather than haggling over whether to provide crime rates or class sizes, loan default rates or core curricula, students and universities may benefit from a simpler solution: widespread publication of the ratio between overall educational price tag and the average student earnings ten years after graduation. By reducing the battle to a simple measurement that indicates the long-term profitability of a degree, colleges could highlight the value of their product while offering vital information about the actual impact that higher education has on earnings.
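For readers who want to see how such a metric would work in practice, here is a minimal sketch in Python. The formula simply divides total degree cost by average earnings ten years after graduation; all dollar figures below are hypothetical examples invented for illustration, not data from any real school.

```python
# Illustrative sketch of the proposed metric: total cost of a degree
# divided by average graduate earnings ten years after graduation.
# All figures are hypothetical, not real school data.

def cost_to_earnings_ratio(total_cost, avg_earnings_10yr):
    """Lower is better: fewer dollars spent per dollar of later earnings."""
    return total_cost / avg_earnings_10yr

# Hypothetical comparison of two schools
school_a = cost_to_earnings_ratio(120_000, 60_000)  # 2.0
school_b = cost_to_earnings_ratio(200_000, 80_000)  # 2.5

print(f"School A: {school_a:.1f}, School B: {school_b:.1f}")
```

Under these made-up numbers, School A's degree costs two years' worth of its graduates' eventual earnings, while School B's costs two and a half -- exactly the kind of single-number comparison the proposal envisions.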
Then again, given the shrinking value -- and rising cost -- of a college degree, it's not likely that educators will embrace this standard any time soon.
This article was originally published on March 1, 2012, on AOL's Daily Finance.