A Year of Wonder: What is the Future of Higher Ed?

It’s been the no-brainiest of no-brainers for as long as anyone can remember: If you’re a parent, and you have the means to do so, a mark of your commitment to your children is the amount of money you’re able to sock away for their college education.

But what if it’s no longer true?

What if, in the wake of impending seismic shifts in higher education, the parents of today’s Kindergartners (aka the Class of 2032) need to stop worrying about their 529s, and start preparing for a radically different landscape – one in which the very notion of “admissions” is an anachronism, the price tag is reasonable, and the experience doesn’t unfold over four years, but over one’s entire life?

Before you block me from your Twitter feed for inanity, hear me out.

First, let’s consider the present state of higher education:

  • Student debt is out of control. Whereas in 2004, total outstanding U.S. student loan debt stood at around $250 billion, by the time today’s Kindergartner was born, that figure had surpassed $1 trillion.
  • Nearly half the students who begin college don’t finish within six years.
  • It is now the norm for students in the bottom income bracket to borrow at least half their household income to attend a four-year college.
  • And in the last twenty years, the average amount owed by a typical student has more than doubled.

Meanwhile, the majority of our colleges and universities are not even set up to give students a reasonable return on their investment: after all, the central function of the “hybrid university” is research, not teaching.

So what does that leave us with? An increasing disconnect between the skills young people have, and the skills their prospective employers need. As Forbes put it in 2014: “With the large numbers of recent college graduates who can’t find employment that pays well enough to justify the costs of going to college, it appears that we have reached the final stages of a process that has driven costs up but value down.”

Even the U.S. Department of Education sees the writing on the wall: “Today,” they write, “college remains the greatest driver of socioeconomic mobility in America, but if we don’t do more to keep it within reach for middle-class families and those striving to get into the middle class, it could have the opposite effect – serving as a barrier, instead of as a ticket to the American Dream.”

That’s bad news for middle-class families, but it’s devastating news for low-income families, who are constantly told that the only way to change their fortunes is by making sure the next generation is college-bound – despite the fact that just 9% of the country’s poorest students actually graduate with a bachelor’s degree by age 24 (as opposed to 77% of the wealthiest).

This is all starting to sound as sham-tastic as Trump University.

To be clear, the promise of college as a pathway to prosperity is still true for many – just not as many as before. Making college more accessible and affordable helped create the modern American middle class, thanks to policies like the G.I. Bill and the Higher Education Act. In fact, Title IV of the law focused on ensuring equal access for students from low-income families. It created support programs to help those students enroll in and afford higher education. And the 1972 reauthorization of the bill saw the beginning of the modern framework for federal student aid.

The problem is that while Congress has built out its framework for federal aid, the overall costs of higher education have risen dramatically, resulting in lower purchasing power for Pell Grants and a greater reliance on federal student loans. Consequently, whereas middle-class kids can still get saddled with unwanted debt and a degree of limited utility, poor kids can get buried altogether.

This is not, therefore, merely a question of privilege. It’s a question of priorities – and of making sure that our institutions of higher education have the right ones.

The good news is there are more examples than you might think of colleges and universities that are changing their practices in ways that better serve the needs of kids and their families – and there are compelling reasons to think that by the time today’s five-year-olds graduate from high school, the landscape they face will be filled with options that are more personally relevant, affordable, and accessible.

If you’re looking for a future-oriented public university, for example, take a close look at Arizona State. As recently as a decade ago, ASU was little more than a party school with nice weather. Now, it’s the top-ranked university in the country for innovation – ahead of places like Stanford and M.I.T. Yet it’s still pretty affordable, and still pretty easy to get into.

And if you’d rather see how one of the world’s top universities is reimagining itself, check out Stanford 2025 – and see what you think of the creative ways they are redefining the structure and purpose of college (Goodbye Transcripts, Hello Skillprints). Or take note of Nike founder Phil Knight’s recent decision to give Stanford $400 million – not for its endowment or a new sports facility, but to create a new scholarship that recruits students from around the world to solve the world’s most intractable problems.

As Google’s Jamie Casap argues, this is precisely where education should be headed. Today’s students shouldn’t be thinking about what they want to be; they should be asking themselves what problems they want to solve.

Of course, that sort of shift requires a very different model of higher education – and a very different price point. But as Kevin Carey writes in The End of College, what schools like ASU and Stanford are showing us is where other universities will need to go if they want to survive. “To prosper,” he argues, “colleges need to become more like cathedrals. They need to build beautiful places, real and virtual, that learners return to throughout their lives. They need to create authentic human communities and form relationships with people based on the never-ending project of learning. They need to do it in ways that are affordable and meaningful for large numbers of people.”

For Carey and others, that means that many of the things we associate most strongly with “the college experience” may, once today’s Kindergartners reach university age, no longer exist. “The idea of ‘applying to’ or ‘graduating from’ colleges won’t make as much sense in the future,” Carey suggests. “People will join colleges and other learning organizations for as long or as little time as they need.”

Before that can happen, however, a vital monopoly must be broken: the sale of recognized credits and credentials. But Carey believes the exponential advances in information technology, coupled with the widening understanding of how people learn, will soon result in a digital marketplace of credentialing that reduces the diploma to its rightful, antiquated place. “The way the Internet allows people to connect with one another and share information creates new sources of authority that can be used to validate credentials,” he argues.

“They will allow people to control and display information about themselves in new and powerful ways, by assembling credible evidence of knowledge and skills gained in a variety of contexts – in college, in the workforce, in life.” And they will make each person’s educational identity “deep, discoverable, mobile, and secure.”

This is certainly the bet LinkedIn made when they spent $1.5 billion to acquire Lynda.com a year ago. It’s what new organizations like Accredible or Mozilla’s Open Badges platform are starting to develop. And it’s why some of the world’s top universities have combined to launch EdX and offer free online courses from top instructors.

So if all of these future-oriented efforts are already underway, why are they still at the margins of how we think about college?

Simply, because old habits die hard.

The comfort we get from continuing to imagine college as it has always been – as a symbol of acculturation and access, more than a vehicle for meaningful skills acquisition – will take a bit longer to collapse under the weight of its own untruth.

But make no mistake about it – we are already chasing chimeras. The unquestioned promise of college is, for too many, an illusion – and, worse still, an increasingly unaffordable and reckless one to pursue.

That means change is upon us. But perhaps by the time today’s five-year-old is graduating from high school, greater affordability, access, and relevance will be, too.

The $10,000 Education?

There’s a lot of excess noise in just about every contemporary debate about public education, which makes it hard for anyone to see clearly what’s happening, and what needs to happen, in order to pull our institutions of American schooling – from Kindergarten to College – out of the Industrial era and into the modern world.

One thing, however, seems clear at every level: we need to become a lot more efficient in how we spend our money (not to mention a lot smarter in how we use our degrees). Which is why I find it interesting that almost no one is talking about what Florida Governor Rick Scott proposed last week.

Scott challenged every community college in his state to create bachelor’s degree programs that cost students $10,000 or less. “Every business has to figure out how to make itself more efficient,” he said. “They’ve got to use technology, they’ve got to use the Internet, things like that. We can do the same thing with our state colleges.”

It’s too soon to tell how effective Scott’s challenge will be, and if Florida ultimately becomes a model of affordable higher education for the country. But his idea certainly comes at an interesting time – and one has to wonder when exactly the keepers of America’s colleges and universities will wake up and smell the MOOC.

On one hand, one could look at the numbers and wonder what all the hubbub is about: in worldwide rankings, more than half of the top 100 universities are American – including eight out of the top ten.

On the other hand, there is a growing sense that investing in higher education does not yield the path to prosperity it once did. Indeed, according to a recent article in The Economist, the cost of college has risen by almost five times the rate of inflation since 1983, and the amount of debt per student has doubled in the past 15 years. Two-thirds of graduates now take out loans. And those who graduated last year did so with an average of $26,000 in debt.

Making matters worse is this untidy stat: the odds of an American student completing a four-year degree within six years stand at no better than 57%. Yet all the while, universities are spending plenty more on administration and support services, while states are cutting back on financial aid.

In this sort of environment, it should be no surprise that new entities are stepping into the fray and threatening the way we think about higher education. In fact, 2011 may be remembered as the year of the MOOC, or “massive open online course.” These courses are free, they’re college-level, and they’re available to everyone. Entities like Western Governors University (WGU) now offer tuition of less than $6,000 a year, and students have complete freedom over deciding when they study and take their exams. And companies like StraighterLine are offering online courses for as low as $49.

So amidst all these tectonic shifts, what does the future hold? A few things seem clear: first, there will always be the Ivory Towers of the university, and the growth of online courses will never replace the value of real-time, relationship-driven learning. Second, we need to admit that we have arrived at a point at which the production of credentials (e.g., knowing how to graduate) has come to matter more than the cultivation of anything real (e.g., knowing how to think). And third, the most certain future path for all levels of American education is the one down which learning becomes more personalized and customized – and that, I say, is a good thing. “The best sort of democratic education,” says Shop Class as Soulcraft author Matthew Crawford, “is neither snobbish nor egalitarian. Rather, it accords a place of honor in our common life to whatever is best [for each individual].”

Amen. And onward march.

Don’t Believe the Hype (About College)

(This article also appeared in the Huffington Post.)

It’s not what you think.

I’m a proud graduate of the University of Wisconsin (and two graduate schools). I loved college. And it’s undeniable that the United States boasts some of the best universities in the world.

I’m also someone who flunked out my freshman year with a 0.6 GPA. In fact, I’d say it wasn’t until I flunked out that I had a chance of being successful. I simply wasn’t ready for what college was designed to give me (aside from the unsupervised social time).

Although my freshman-year GPA was surprisingly low, my freshman-year experience is unsurprisingly common. Too many young people simply aren’t ready for college, for a variety of reasons – meaning they either coast through four or five years and waste a ton of money along the way, or, if they’re lucky, they crash and burn so badly that they discover, for the first time, what it is they actually want to do with their lives – as opposed to what the adults in their lives have told them they should do.

I’ve been thinking about this a lot recently since reading Matthew Crawford’s bestselling book, Shop Class as Soulcraft. Crawford, as you may know, got his doctorate in political philosophy from the University of Chicago – and then left a cushy job at a DC think tank to open a motorcycle repair shop in Richmond, Virginia.

In this regard, Crawford is uniquely suited to comment on three inextricably linked aspects of modern society – our public education system, our modern economy, and our shared values. And, as Crawford puts it, the news ain’t good.

In some respects, the story starts in the 1990s, when shop class started to become a thing of the past, and educators started exclusively preparing students to be “knowledge workers” – and stopped valuing the ancient notion that our hands are what make us the most intelligent of animals. Yet the clearest starting point stretches back much farther, to the early 20th century, the rise of Industrialism, and the concerted effort to separate thinking from doing – and, in the process, to begin the degradation of “work” as we have come to know it.

Any historian is already familiar, for example, with Frederick Winslow Taylor and his 1911 book, Principles of Scientific Management. It was Taylor who wrote: “All possible brain work should be removed from the shop and centered in the planning department.” It was Taylor who suggested that the modern workplace “will not have been realized until almost all of the machines in the shop are run by men who are of smaller caliber and attainments, and who are therefore cheaper than those required under the old system.” And it was Taylor whose ideas led people like Ellwood Cubberley, a former head of Stanford University’s School of Education, to recommend in 1920 “giv[ing] up the exceedingly democratic idea that all are created equal. . . . Our schools are, in a sense, factories in which the raw products (children) are to be shaped and fashioned into products to meet the various demands of life.”

What has this legacy begotten? According to Crawford, it has given us a society where the production of credentials (e.g., knowing how to graduate) matters more than the cultivation of anything real (e.g., knowing how to think). It has led us to devalue the specific skills of the craftsman, and overvalue the general knowledge of the office worker. And it has engendered the gradual WALL-E-fication of our culture, in which the larger goal becomes the creation of passive consumers whose assembly-line work environments – be they the actual assembly line or the assembly-esque world of modern office work – can only be cured by the illusory freedom we exercise when we choose different products to purchase.

The bigger concern, and the one that relates to my own skepticism about whether everyone should go to college, has to do with the changing nature of the workforce. As Princeton economist Alan Blinder has written: “The critical divide in the future may instead be between those types of work that are easily deliverable through a wire with little or no diminution in quality and those that are not. And this unconventional divide does not correspond well to traditional distinctions between jobs that require high levels of education and jobs that do not.”

In other words, it’s easier to imagine outsourcing your need for legal advice than your need for an electrician. But the point is not that no one should go to law school and everyone should become an electrician – just that the goal of our schools, our economy, and our society should be to help people find work that engages their human capacities as fully as possible. And that’s not happening. And that’s a really big problem – and one that will never be solved if our knee-jerk reaction is to urge every young person to go to college.

“The best sort of democratic education,” says Crawford, “is neither snobbish nor egalitarian. Rather, it accords a place of honor in our common life to whatever is best [for each individual].” Amen, I say. So let’s stop pretending that college by itself is a cure-all for every person. Let’s start recalibrating our schools in ways that will help children discover their worth – and acquire the skills they’ll need to unleash their full potential on the world. And let’s keep searching for ways to help people understand, in the deepest, fullest sense, what it means to be free.