Saturday, August 8, 2015

Up - and Down


I skipped last Thursday’s Republican “debate”, instead watching a DVD of Up, my all-time favorite animated film. 

A quick plug:  If you haven’t seen Up, please do.  The first ten minutes tell, perhaps, the sweetest love story in all of film – animated, and almost without dialogue.

What follows is the story of a crusty, curmudgeonly widower (voiced by Ed Asner) who finds a way to reconnect with the love of his life by going on the adventure they had always promised each other.

There’s also a talking dog who is… well, you just have to meet “Dug”.

But back to the Republican “debate”.

I’ll probably watch if the networks ever figure out how to manage an actual debate.  Which they could, if they’d ask their sports divisions.

The truth is, a candidate can’t be measured in a ten-man joint press conference.  One-on-one debates would tell us a lot more. 

If I were in charge, I’d host a series of one-on-one debates, using a double-elimination bracket system like, say, the NCAA College World Series.

Seed the candidates, put them into brackets, and conduct a series of two-candidate debates – using viewer polls to pick the winners. 

After the first round, both candidates would go on – one to the “winner’s bracket”, the other to the “loser’s bracket”.  After each subsequent round, two-time losers would drop out.

If they ran the tournament between now and the New Hampshire primary, it would give each candidate – even those not well-known at present – two chances to show his or her stuff against a single opponent.
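For the mechanically inclined, the bracket logic is simple enough to sketch in a few lines of Python.  This is a loose sketch, not a broadcast-ready schedule: it pairs candidates in seeding order, gives odd candidates out a bye, and stands in for the viewer poll with a placeholder `pick_winner` function – all assumptions of mine, not anyone’s official format.

```python
def pair_off(group):
    """Pair adjacent candidates; with an odd count, the last one gets a bye."""
    pairs = [(group[i], group[i + 1]) for i in range(0, len(group) - 1, 2)]
    bye = group[-1] if len(group) % 2 else None
    return pairs, bye

def double_elimination(candidates, pick_winner):
    """Run a double-elimination tournament.

    candidates: list in seeding order.
    pick_winner(a, b): returns the debate's winner -- here, a stand-in
    for the viewer poll.  Anyone with two losses is out.
    """
    losses = {c: 0 for c in candidates}
    alive = list(candidates)
    while len(alive) > 1:
        if len(alive) == 2:        # grand final (with a rematch if the
            a, b = alive           # undefeated candidate loses once)
            w = pick_winner(a, b)
            loser = b if w == a else a
            losses[loser] += 1
            alive = [w] if losses[loser] == 2 else [w, loser]
            continue
        next_alive = []
        for bracket in (0, 1):     # winners' bracket, then losers' bracket
            group = [c for c in alive if losses[c] == bracket]
            pairs, bye = pair_off(group)
            if bye is not None:
                next_alive.append(bye)
            for a, b in pairs:
                w = pick_winner(a, b)
                loser = b if w == a else a
                losses[loser] += 1
                next_alive.append(w)
                if losses[loser] < 2:   # one loss: drop to losers' bracket
                    next_alive.append(loser)
        alive = next_alive
    return alive[0]
```

With ten seeded candidates, the last one standing has beaten single opponents all the way through – or clawed back through the loser’s bracket – which is rather the point.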

And that would tell us something.

And if you think that’s a lot of debates, remember that there were twenty Republican debates in 2012, most using the present absurd format – and they produced *sigh* Mitt Romney.   

A double-elimination tournament would help voters choose – and once the actual primaries started, the networks could stage a second series of debates featuring still-viable candidates.

Thursday’s ten-candidate format would only have made sense if we lived in an age of greatness.

Imagine a debate in 1789, with George Washington appearing on-stage with nine other candidates.  The nine would probably have fallen silent to let the Great Man speak – or, as he probably would have preferred, say a few words and simply stand there.

But then, Washington was a certifiably Great Man.  Foreign travelers familiar with the courts of Europe wrote privately that Washington was vastly more impressive than any ruling European monarch. 

Today, as we audition people for Washington’s job, we have nothing resembling a Great Man.  Most of the Republicans – and several of the Democrats – resemble nothing so much as a convention of funeral directors. 

Which might, indeed, be appropriate.

Because the present election – like most elections since I reached voting age – seems to represent another step towards the death of the Republic.

Apparently, our democratic process can no longer produce leaders able to do the hard work of governing in complex times.  Such men and women are probably out there, but we don’t vote for them.

As citizens and voters, we lack the conviction, intelligence, and – let’s face it – patriotism to vote for people with the wisdom to see what needs to be done; do it; and not care very much if we don’t agree. 

We prefer politicians who agree with us – which is curious, since most of us will freely admit that we don’t understand the complexities of today’s world.

Nonetheless, we prefer candidates who make us feel smart – which, of course, means they are lying to us.

In recent weeks, I’ve been reading a first-rate biography of Julius Caesar by Christian Meier, a German scholar of the post-War generation.  It’s not a book I’d recommend to everyone.  Like many German scholars, Meier takes a philosophical approach to history, seeking great patterns rather than simply telling a story.

Meier’s prose – at least in English translation – can be opaque.  Occasionally, I find myself wandering in the fog.  But the book’s overall impact is to confirm impressions I gained from earlier, more accessible, reading.

Some years ago, Colleen McCullough – the Australian novelist – wrote a fascinating, well-researched series of novels about the last hundred years of the Roman Republic.  The opening volume, The First Man in Rome, deals with Julius Caesar’s uncle, Gaius Marius, a military man who gained great power in the Republic.  Subsequent novels explore the careers of the dictator Sulla, Pompey the Great, and Julius Caesar. 

All in all, McCullough – like Meier – explores the factors which transformed Rome from a free, self-governing city-state to a vast, despotic empire. 

I’m increasingly fascinated by the history of the late Republic – not because I believe America will follow its exact path, but because there are sufficient parallels to set off alarm bells.

One of the consistent themes is the smallness of the men – mostly patrician Senators – who claimed the right to govern Rome.

And the silliness of the ordinary people who voted in Rome’s elections. 


Those parallels, at least, are much in evidence in 2015.

Tuesday, July 14, 2015

Dollars for Athens

Here’s a thought, for those who aren’t ready to throw Greece to the wolves.

Why don’t we send them a few hundred billion dollars?

Okay, don’t get all excited.  I’m not suggesting that we give them the dollars.  Just that we exchange those dollars, at the current rate, for their holdings in Euros.

In other words, offer the Greeks the option of making the dollar their currency – at least for the time being.

The process is called dollarization, and it’s neither novel nor particularly radical.  At present, ten states – besides the United States – use the dollar as their official currency.  (The largest is Ecuador.)  Another dozen or so use the dollar as one of their official currencies, or peg their currency to the dollar.

It costs us nothing for another country to decide to use dollars instead of a currency of their own.  Indeed, it’s a sort of compliment – not to mention a boon for American tourists.

But the real advantage of dollarization – for the Greeks – would be that it would offer them a way of leaving the Euro zone without having to plunge into the chaos of printing and distributing a new drachma at a time when its value would be very, very unstable.

It would also – and this is important – offer the Greek government a weapon with which to negotiate on slightly better terms with the European Union, the German government, and the bankers who run both.

Now, of course, there are those among us who want to see the Greek people, and their government, humbled by the European equivalent of our Wall Street “masters of the universe”.  The Greeks have been profligate, without question.  They’ve elected governments which have spent too freely, made commitments that far exceeded prudence, and done a miserable job of collecting taxes from the rich and powerful.

That said, Greek democracy is barely out of its infancy. 

Yes, that’s ironic, considering where democracy got its start.  But since Athens lost the Peloponnesian War, in 404 BC, the birthplace of democracy has been part of the Macedonian, Roman, Byzantine, and Ottoman empires – to say nothing of spending four years under the bootheels of the Third Reich.

Modern democracy in Greece dates from 1975 – not much time to learn to choose one’s leaders responsibly.  A nation with more than two centuries of self-government might be mature enough to balance its budgets and maintain a reasonable level of national debt – though you couldn’t prove it by the United States – but that’s a lot to expect of an adolescent democracy like Greece.

After all, we don’t allow teenagers to qualify for credit cards with high borrowing limits – and we’d prosecute anyone who offered them the option.

Yet that’s more or less what the European Union and its pet banks – not to mention some of the same Wall Street firms which got us into the recent recession – did to Greece.  They talked a bunch of inexperienced and corrupt politicians into borrowing more than they could possibly repay – and are now using that excuse to pillage Greece, privatizing its best assets at a fraction of their value.

The United States, of course, shouldn’t let this happen – but if we were too spineless to stand up to an enfeebled Russia over the Crimea, we’re not likely to stand up to our best allies over Greece.

But dollarization wouldn’t really require that.  The Greeks could do it on their own.

To be sure, if I were President, I’d go farther.  I’d front the Greeks a few billion to bridge the next few payments to the European banks.  I’d even fly in currency to make the switch to the dollar more convenient – flying home the Euros we took in exchange.

And then, I’d sit down with the Chancellor of Germany and let her know, politely but firmly, that we didn’t allow German troops to overrun Europe in World Wars I and II, and that we don’t intend to allow German bankers to take over sovereign states in the 21st century.

In doing so, I would note, we are not trying to harm our ally, Germany.  Nor are we trying to destroy the European Union.

Indeed, you could say we were trying to keep the European Union from destroying itself.

But whether we act thus boldly or not, there’s nothing to keep the Greek government from adopting the dollar on their own.


And it would serve the Eurocrats right if they did just that.  Or if we helped them.

Sunday, June 14, 2015

To the Graduates

Half a lifetime ago, I was invited to deliver the graduation address at Lloyd C. Bird High School.

As I recall, the event took place in broad daylight, on the football field, and it was absurdly hot and humid.  Moreover, I found my words coming back at me from stadium loudspeakers – on about a one-second delay – which was rather distracting. 

I’m not sure I made a good deal of sense.

At any rate, I’ve never been asked to deliver another graduation speech, which is probably just as well. 

When I spoke at Bird, I was the sort of person who might be invited to speak at a graduation – a young lawyer and politician who could be counted on to say nothing disturbing.

These days, I’m an increasingly curmudgeonly, sixty-something opinion writer who would be apt to say something uncomfortable if offered the opportunity.  Perhaps something along these lines:

First, if you’re planning to go to college next year, postpone those plans.

Almost any college will reserve your place while you take a “gap year”.  Indeed, the better the school, the more likely that it will recognize that a year, or even two, will make you a more mature, serious scholar when you eventually matriculate.

So take a year off.

Get an apartment with a friend, get a real job, and learn what it’s like to work for a living.

Get a passport, grab a rucksack, and bum your way around the world.  Try to avoid actual war-zones, or places where you’ll be hated just for being American.  (That will limit your options.)  Visit your planet.

Spend a year doing something of service to humanity.  Try to avoid “mission work”.  You’re too young to know anything of intellectual – much less, spiritual – value to people in need.  But you’re the perfect age to do hard, thankless work.

If you really want some perspective, get a job in a nursing home or assisted living facility.  Confront your mortality.

Or, if you’re really ready to grow up, join the military.  Spend two years serving your country and getting to know Americans who can’t afford to go to college without taking that route.

Second, while you’re taking that “gap year”, reconsider your college plans.  Almost no one your age has a clue about choosing a college.

Mostly, kids choose a school because some of their friends are going there.

Or because a parent, close relative, or favorite teacher went there.

Or because of some vaguely-perceived idea of its “reputation” – which probably has more to do with its graduate programs or the prowess of its athletic teams than the quality of its undergraduate education.

Here’s some news: 

Within a year, few of your high school friends will be nearly as close as they are now.  Their college preferences will turn out to be irrelevant.

While your parents’, relatives’, and teachers’ affection for their respective alma maters is doubtless passionate, you mustn’t confuse their decades-old memories with current reality.  Colleges change.

For example, I have three degrees from UVA, and I love the place.  But I’ve counseled decades of my own students to go elsewhere for their undergraduate study.

The best education available in this country is at small liberal arts colleges you’ve probably never heard of – schools where undergraduate courses are taught by professors, not underpaid adjuncts.  Schools with, at most, Division III athletic teams.

Third, when you eventually go to college, don’t choose your courses in order to prepare for a career.

Yes, in today’s economy, it’s a good idea to have a college degree when you start applying for jobs.  But, within five years of graduation, most young people end up working in careers that have nothing to do with their college majors.

So study things which will teach you to think, logically and critically.  Study things which will stimulate your imaginative side.

Most important, study things that will last.  Ten years from now, today’s fashionable major will end up being about as serviceable as today’s cellphone.

Keep this in mind.  You’ve just graduated from a public school system which is increasingly devoted – not to education – but to preparing students for standardized tests.

If you’d graduated twenty years ago, there’s a chance you’d know how to think.  In the era of SOLs, there’s almost no chance of that.

It’s not your fault, but your parents – and the politicians they have helped to elect – have ensured that you belong to the most ignorant generation in this country’s history.

College represents your best chance to correct that.

So study History, Philosophy, Literature and the Classics.  These enduring subjects will teach you to think.

Become fluent in a second language.  Your country isn’t doing too well, these days.  It’s good to have options.

And learn Accounting.  The big corporations – aided by your government – are shipping jobs overseas, or giving them to robots.  Chances are, if you want to get ahead, you’ll end up working for yourself.

It would be good if you could balance your books.

Finally, understand that this could all be bad advice.  You’ve spent the past eighteen years listening to adults who don’t know what they’re talking about.  There’s no reason to believe that I’ll turn out to be different.


Perhaps that’s the best reason to devote the next few years to learning to think for yourself.

Wednesday, May 20, 2015

Good News from the North

I’m going to ask you to try something difficult.

Imagine that the state of Texas held elections for governor and state legislature, and that – as a result of those elections – the Republican Party not only lost the governor’s mansion and the statehouse, but finished third.

Imagine, further, that the winning party was not the Democrats, but a fourth party which had never before tasted power in that state – a party with environmental and social policies well to the left of the Democrats’ positions.

Imagine, finally, that the newly-elected governor of Texas was a petite, blonde lawyer from a hipster neighborhood of, say, Austin.

Impossible, right?  Such a thing could never happen in America.

But it could happen in Canada.

On May 5, the citizens of Alberta elected a commanding majority of 53 New Democrats to their provincial legislature – up from their previous four members.

As a result, Rachel Notley, a 51-year-old labor lawyer and advocate for special-needs children, will be Alberta’s first left-of-center premier in half a century.

Ms. Notley – everyone calls her “Rachel” – will command a 60% legislative majority, which includes 20-year-old Thomas Dang, a computer science major at the University of Alberta.
  
The pro-business Progressive Conservatives – who have dominated Alberta for all those decades – were reduced to a small minority.  The Official Opposition will be the Wildrose Party – this is Canada, eh? – a coalition of libertarians and social conservatives.

I mention all this here for several reasons. 

First, it’s unlikely that many Americans will be aware of this sea-change.  Our polarized national media – obsessed with our own surplus of obscure presidential candidates, Baltimore, and Tom Brady’s somewhat deflated reputation – managed to cover Britain’s recent elections – but I didn’t hear a peep about Alberta.

Not even on NPR – which is rapidly losing any resemblance to a serious news network. 

Indeed, were it not for Facebook, I might still be living in ignorance of the most remarkable political story in some time.

My second reason is the sheer, stunning unexpectedness of the event.

Keep in mind the challenge I set forth at the beginning of this piece.  Alberta isn’t Texas, but it’s the closest thing Canada has to Texas – a Prairie province which is home to the environmental Hell-on-earth known as the Alberta tar sands.

Alberta, in other words, is the source of the horrid gunk which promoters want to push through the Keystone XL pipeline – if that absurd boondoggle is ever built – and, when the inevitable ruptures occur, into the streams and groundwater of the American Plains.

Yet the people of Alberta – apparently indifferent to the economic power of the oil industry – have voted out that industry’s partisan defenders and elected a new leader who felt compelled, in her first day, to reassure Big Oil that her party isn’t out to destroy them.

Just tax them, and regulate them.  And take a long, hard look at new pipeline projects.

Beyond this, I’m not at all sure what to say.  I used to know a little about Canadian politics, but – during the Harper years – I lost interest.  For some time now, Canada has seemed to be drifting in the direction of American politics – and I wouldn’t wish our present political system on anyone.

Perhaps it’s time I got a new subscription to Maclean’s, Canada’s leading news magazine.  I get the feeling Rachel Notley and her party are going to be making news for some time to come – especially if they forge an alliance with British Columbia, which has the greenest regional government in North America.

There are battles to be fought, and – having lived my whole life in Virginia – it’s not easy to be optimistic.  Sometimes, it’s good to be reminded that there are places – not that far away – where English-speaking people enjoy a political system which isn’t entirely run by corporate money.

Perhaps, someday, we Americans will take our own country back – though not, I suspect, until we develop some new political parties of our own.

For the moment, though, I’m happy for the good people of Alberta.  I wish them well.

A hundred years from now, if we humans don’t prove entirely self-destructive, the widespread use of fossil fuels will be as unthinkable as human slavery.  

We’ll be using advanced technologies to get most of our power from the ultimate source of all energy on this planet – the Sun – with assistance from conservation and other renewable sources.

And some countries – the ones which have taken the lead in the great conversion from fossil fuels to sustainable energy – will be the economic leaders of the planet.

At present, the United States is still positioned to become first among those leaders, but neither American political party has offered a plan for making that happen.

Canada might now be poised to outstrip us.  If the good folks north of the border can shake off the political dominance of Big Oil – as has apparently happened in Alberta – powerful resources are in place to make our neighbors the New World’s leader in New Energy.

Too bad we're not yet prepared to join them.

Saturday, April 18, 2015

Wasted on the Young?

On Sunday mornings, my local public radio station broadcasts a BBC News segment called “More or Less” – a regular feature exploring key statistics which describe our changing world.

This Sunday, presenter Ruth Alexander interviewed Dr. Hans Rosling, a Swedish expert on international public health.  Dr. Rosling, who possesses a delightful sense of humor, is founder of the “Ignorance Project” – an attempt to bring citizens of the Western world up-to-date about shifting realities in what they insist on calling the “developing world”.

Reversing roles, Dr. Rosling posed three questions to Ms. Alexander.

First, he asked about the measles vaccine, which public health experts agree is the most important vaccine for preventing deaths among young children.  What percentage of the world’s children receive the measles vaccine?  The choices were 20%, 50%, or 80%.

Second, Dr. Rosling asked about the world’s population of children under fifteen.  In 1950, there were fewer than one billion children.  This number had doubled by 2000.  What is the projected number of children in the year 2100?  The options were 2, 3 or 4 billion.

Finally, he turned to the percentage of the world’s population living in desperate poverty.  What trend has prevailed over the past twenty years?  Has that percentage doubled, remained stable, or decreased by half?

Listening, I made my guesses along with the presenter.  I actually got two right.  More than 80% of the world’s children receive the life-saving measles vaccine.  The percentage of people living in extreme poverty has been halved in the past two decades.

I was quite wrong about population trends.  At the end of this century, the projected population of children will be back around 2 billion – about what it was in 2000.  But this will be a crowded century.  World population will rise to 10 or 11 billion before it starts to decline.

I had no idea.

Dr. Rosling put these same questions to attendees at the prestigious World Economic Forum – at Davos, Switzerland.  The world’s political and corporate leaders did poorly, outscoring random guessing on only one question in three.

Most educated people, Dr. Rosling said, would do worse – for this reason:  We don’t continue to educate ourselves after we leave college or graduate school.  Our image of the world might be more or less accurate in our early 20s, but thereafter, it grows increasingly outdated.

Now, here, I must stop referring to Dr. Rosling.  “More or Less” is great radio – and I intend to sign up for the podcast – but it only runs ten minutes.  Ms. Alexander had to thank Dr. Rosling and sign off.

But the lesson of these three questions remains to be examined.  In a sense, it's no marvel that even the Davos crowd did so poorly on Dr. Rosling’s pop quiz.  Our educational system – our very view of what education is – concerns itself almost entirely with young people.

From time immemorial, the notion of education has focused on transferring knowledge about the world – as it is – to young people. 

Consider our images of education:  parents teaching toddlers their letters; a professor in front of a classroom; a scoutmaster conducting a knot-tying session in a forest glade; a coach helping an athlete improve his technique. 

Each involves an older person passing along time-honored lore to a younger one.

And there’s nothing wrong with that – except that it doesn’t always work, for three reasons:

First, we live in a world which is changing at a faster rate than it has ever changed – at least, since humans evolved.

Second, we live longer lives, on average, than humans have lived in the past.  Thus, the amount of change which takes place between childhood and the end of active adulthood is enormous – and would be, even if change weren’t happening so fast.

Third, as citizens of a republic – and citizens of a planet which, with technology, seems to be moving in a more democratic direction – our need to keep up with our changing world is greater than ever.

The fact that most of us don’t keep up is usually blamed on the fact that we’re busy.

But, looked at another way, it might be said that we're too busy because we divide up the tasks of a lifetime in a way that no longer makes sense.  When we're young,  we're too busy learning, and not busy enough dealing with the "real world".  Thereafter, we're too busy with everyday problems to continue educating ourselves.

And it doesn't have to be that way.

An infant born today, in America, can expect to spend twenty of her first twenty-four years in school.   More, if she wants to enter a profession.

After that, with the exception of job-related training, she will likely spend sixty or seventy years outside the realm of public education – ending her life in a world she simply doesn’t understand.

Perhaps it’s time we moved away from front-loading education so entirely – giving young people a taste of reality before their mid-20s, and building serious continuing education into the lives of adults.

Perhaps we need to get young people into the adult world a few years earlier – re-organizing public education, through a range of technologies, so that it continues to take place throughout a citizen’s entire lifespan.

Educating young people is essential, but it’s not enough. 


Perhaps – as with youth – too much of our educational effort is wasted on the young. 

Monday, April 6, 2015

Things Worth Learning

To return to a familiar theme, I posit this:  American education lacks a sense of mission, and consequently, manages to spend colossal sums without accomplishing much.

Having no mission, American educators – and the politicians who have invaded and usurped the educational system – have adopted two default positions.

First, because both political parties are entirely subservient to the lords of unsustainable, corporate consumer capitalism, education has increasingly come to be linked to the sort of job-training responsible companies once did for themselves.

Today’s great corporations – which already avoid paying their fair share of taxes – demand that the burden of training their employees be funded by those of us who do.

For the corporations, this not only represents an enormous savings.  It also means that – having invested little or nothing in training their workers – they can casually discard individual employees, or whole battalions of them.

For the nation, it means that young people are, year by year, less prepared for their primary responsibility – that of citizenship in a self-governing republic.

Second, under the leadership of George W. Bush – who benefitted less from his own education than any president since Warren Harding – the United States adopted politicians’ pet project of imposing high-stakes standardized testing at the Federal level.

High-stakes testing is a politician’s dream.  By making teachers and local administrators strictly responsible for whether students memorize a finite body of useless information, politicians can have it both ways.  If the kids do well filling in their bubble-sheets, politicians can claim credit for how well schools are doing.  If the kids do poorly, citizens will be inclined to blame the teachers – not the politicians.

Heads, I win.  Tails, you lose.

As a result of these two trends – the replacement of education for citizenship by training for vanishing jobs, and the replacement of teacher-led pedagogy by a top-down testing regime – our schools increasingly turn out young people who can’t see beyond the present.

Offered nothing of enduring value upon which to exercise their curiosity and critical intelligence, today’s kids are ever more focused upon the evanescent fascinations of the internet.

What’s timeless yields to what’s trending.  And the schools offer no resistance.

Of course, it’s inevitable that youth will be drawn to novelty.  It’s the nature of adolescence to attend to the new, the fashionable, even the outrageous.

But the job of education – in every civilization worthy of the name – has involved balancing this natural proclivity for ephemera with the disciplined study of enduring classics.

Schools dedicated to achieving this balance produce graduates who will grow into citizens capable of sustaining the nation.  Schools that fail turn out herds of perpetually-distracted sheep, willing to perform mindless – even soulless – work in return for the means to purchase ever more useless stuff. 

And here’s the great irony of it all:  The products of post-classical education are, of all American generations, the most insistent upon their own individualism – even as they follow the herd into the electronic marketplace, the mega-church, or the two-option voting booth.

For forty years now, America’s schools – even its elite universities – have done nothing so well as turn out people who insist on thinking for themselves, but who lack the essential equipment for doing so.
 
People who know no history, no philosophy, no literature – nothing of the classics of humanity’s past – lack the capacity to challenge the present or imagine a different future.

Stuck in the Valley of the Present, they cannot even imagine the vistas open to the few who climb the slopes and gaze out on the sunlit mountaintops and dark valleys of the past – or the mist-shrouded topography of the future.

It isn’t difficult to believe we live at the end of an age – not in the apocalyptic sense, but in the historical sense.  Our particular brand of modernity has become both irrational and unsustainable. 

Just look at how we live.  Our particular brand of capitalism is not based – as was Adam Smith’s – on more efficiently meeting basic human needs, but on mindless consumerism, driven by inescapable, non-stop advertising.   When we started building enormous complexes of rental units to store the stuff we cannot cram into our closets, attics, basements and garages, that mindlessness became apparent.

But it’s more than that.  Our economy is also based on recklessly plundering finite natural resources; heedlessly fouling the only planet yet known to be capable of supporting human life; and destroying the habitats of other species upon whom our own lives – and our sense of beauty and wonder – depend.

And there are far too many of us, living increasingly longer lives.  As our lives decrease in meaning and quality, we substitute quantity – both in numbers and in years.

Indeed, our mad insistence on mere human existence as being valuable in itself is proof enough that we no longer understand the notion that there are things – beyond having a pulse or minimal brain-stem function – which make human life worth living.


But then, how could we?  Our schools no longer teach these things.

Saturday, February 14, 2015

Before We Spend More

In 1969, when I matriculated at UVA, America’s colleges and universities were in the process of abandoning the time-honored requirement that all undergraduates study a common curriculum during their first two years. 
  
At the time, this was celebrated as a reform – allowing students more freedom to pursue their interests, while ending the privileged status of courses focused on the literature, history, and philosophies of “dead white males”.

As it turned out, of course, the reform proved nothing of the kind.  It was part of an ambitious power grab by colleges and universities.  The goal, as so often with institutions of any kind, was empire-building – in this case, by exponentially expanding college enrollments.

The opportunity was there.  Baby Boomers were reaching college age.  The Vietnam War was at its peak – with nearly half-a-million young men serving in Indochina at any given moment.  And, with draft exemptions for college students, parents who could afford to pay a son’s college costs would certainly do so – rather than see him shipped off to Southeast Asia.

The table was set for colleges and universities to grow – building new dorms and enormous lecture halls; admitting a flood tide of middle-class boys less interested in academics than in avoiding jungle warfare; and financing the whole venture by taking full advantage of generous Federal student aid, made available by LBJ’s Higher Education Act of 1965.

Colleges and universities saw an opportunity to expand their “mission” – to become, not an option for the very bright or very well-off – but a necessary rite of passage for every middle-class youth eager to climb America’s economic ladder.

But this newly swollen generation of undergraduates – skeptical of its elders and, often, unaccustomed to academic rigor – demanded the end of the traditional curriculum.  They wanted courses that were “relevant” – and easier.

Their demands won considerable faculty support.  Professors whose departments were not represented in the old core curriculum – especially the so-called “social sciences” – saw an opportunity to increase enrollments at the expense of English, History, Foreign Languages, Mathematics and “hard sciences” such as biology, physics, chemistry, and astronomy.

Increased departmental enrollments required additional faculty – which meant more institutional power for department chairs, more prestige for their academic fields, and more jobs for their graduate students. 

The result had much to do with shaping the modern university, which – like its dining halls – is now more concerned with catering to students’ tastes than with offering a sound, balanced diet.

Higher education has become a smorgasbord, with departments and star teachers competing for students, while sacrificing rigor for popularity.  This is why, in today’s university, grade inflation makes anything less than a “B+” an occasion for formal complaints, the intimidation of instructors, and the intervention of “helicopter parents”.

The traditional function of a university – the development of a common vocabulary of ideas and cultural references, derived from the study of time-honored classics – has given way to academic faddism.

Departments have become rival fiefdoms, competing for students, prestige and resources.  At today’s university, there’s a major for every taste.

Departments of Economics, Commerce, and Political Science bid for students aspiring to join the global establishment.

The enticements of professional victimhood lure others into such dubious majors as Women’s Studies, African-American Studies, LGBT Studies, etc.

And for those who prefer frat houses to the library, the mushy majors – Psychology, Sociology, Speech Communication, Sports Management, etc. – offer paths to graduation without excessive mental strain.

As for the old core subjects – other than Mathematics and the “hard” sciences – they, too, have learned to pander. 

What passes for History at today’s university offers a politically-correct, vaguely leftist – yet safely pro-establishment – alternative to the muscular realism which once defined the field.

The study of literature has become more about deconstruction and negation than about the pursuit of wisdom, compassion, and beauty.

Philosophy, Rhetoric and Classics languish near death.

The downward race goes on and on, with universities and departments outbidding each other to offer less and less substance to more and more students – with Federal dollars paying the freight.

Having abandoned its traditional role as advocate for classical learning and unifying principles, the modern university wallows in narcissism and cultural diversity.

Yet, in this context, President Obama – among many – speaks of making college free for all who wish to attend.

Really?

If I were President – before handing the universities still more money and power – I’d insist on guaranteeing that they are teaching something worth studying. 

I’d achieve that by requiring that – after two years of undergraduate study – every student seeking further Federal financial aid pass a battery of substantive examinations in the subjects that once formed the core curriculum, not excluding mathematics, a second language, and at least one “hard” science.

To be sure, the Federal government lacks jurisdiction to mandate curriculum at private – or state – institutions of higher education.

But if these institutions covet yet higher enrollments, subsidized by additional Federal spending, we have the right to insist that they teach something worth knowing.