Saturday, April 18, 2015

Wasted on the Young?

On Sunday mornings, my local public radio station broadcasts a BBC News segment called “More or Less” – a regular feature exploring key statistics which describe our changing world.

This Sunday, presenter Ruth Alexander interviewed Dr. Hans Rosling, a Swedish expert on international public health.  Dr. Rosling, who possesses a delightful sense of humor, is founder of the “Ignorance Project” – an attempt to bring citizens of the Western world up-to-date about shifting realities in what they insist on calling the “developing world”.

Reversing roles, Dr. Rosling posed three questions to Ms. Alexander.

First, he asked about the measles vaccine, which public health experts agree is the most important vaccine for preventing deaths among young children.  What percentage of the world’s children receive the measles vaccine?  The choices were 20%, 50%, or 80%.

Second, Dr. Rosling asked about the world’s population of children under fifteen.  In 1950, there were fewer than one billion children.  This number had doubled by 2000.  What is the projected number of children in the year 2100?  The options were 2, 3 or 4 billion.

Finally, he turned to the percentage of the world’s population living in desperate poverty.  What trend has prevailed over the past twenty years?  Has that percentage doubled, remained stable, or decreased by half?

Listening, I made my guesses along with the presenter.  I actually got two right.  More than 80% of the world’s children receive the life-saving measles vaccine.  The percentage of people living in extreme poverty has been halved in the past two decades.

I was quite wrong about population trends.  At the end of this century, the projected population of children will be back around 2 billion – about what it was in 2000.  But this will be a crowded century.  World population will rise to 10 or 11 billion before it starts to decline.

I had no idea.

Dr. Rosling put these same questions to attendees at the prestigious World Economic Forum in Davos, Switzerland.  The world’s political and corporate leaders did poorly, outscoring random guessing on only one question in three.

Most educated people, Dr. Rosling said, would do worse – for this reason:  We don’t continue to educate ourselves after we leave college or graduate school.  Our image of the world might be more or less accurate in our early 20s, but thereafter, it grows increasingly outdated.

Now, here, I must stop referring to Dr. Rosling.  “More or Less” is great radio – and I intend to sign up for the podcast – but it only runs ten minutes.  Ms. Alexander had to thank Dr. Rosling and sign off.

But the lesson of these three questions remains to be examined.  In a sense, it’s no marvel that even the Davos crowd did so poorly on Dr. Rosling’s pop quiz.  Our educational system – our very view of what education is – concerns itself almost entirely with young people.

From time immemorial, the notion of education has focused on transferring knowledge about the world – as it is – to young people. 

Consider our images of education:  parents teaching toddlers their letters; a professor in front of a classroom; a scoutmaster conducting a knot-tying session in a forest glade; a coach helping an athlete improve his technique. 

Each involves an older person passing along time-honored lore to a younger one.

And there’s nothing wrong with that – except that it doesn’t always work, for three reasons:

First, we live in a world which is changing at a faster rate than it has ever changed – at least, since humans evolved.

Second, we live longer lives, on average, than humans have lived in the past.  Thus, the amount of change which takes place between childhood and the end of active adulthood is enormous – and would be, even if change weren’t happening so fast.

Third, as citizens of a republic – and citizens of a planet which, with technology, seems to be moving in a more democratic direction – our need to keep up with our changing world is greater than ever.

The fact that most of us don’t keep up is usually blamed on the fact that we’re busy.

But, looked at another way, it might be said that we're too busy because we divide up the tasks of a lifetime in a way that no longer makes sense.  When we're young,  we're too busy learning, and not busy enough dealing with the "real world".  Thereafter, we're too busy with everyday problems to continue educating ourselves.

And it doesn't have to be that way.

An infant born today, in America, can expect to spend twenty of her first twenty-four years in school.   More, if she wants to enter a profession.

After that, with the exception of job-related training, she will likely spend sixty or seventy years outside the realm of public education – ending her life in a world she simply doesn’t understand.

Perhaps it’s time we moved away from front-loading education so entirely – giving young people a taste of reality before their mid-20s, and building serious continuing education into the lives of adults.

Perhaps we need to get young people into the adult world a few years earlier – re-organizing public education, through a range of technologies, so that it continues to take place throughout a citizen’s entire lifespan.

Educating young people is essential, but it’s not enough. 


Perhaps – as with youth – too much of our educational effort is wasted on the young. 

Monday, April 6, 2015

Things Worth Learning

To return to a familiar theme, I posit this:  American education lacks a sense of mission, and consequently, manages to spend colossal sums without accomplishing much.

Having no mission, American educators – and the politicians who have invaded and usurped the educational system – have adopted two default positions.

First, because both political parties are entirely subservient to the lords of unsustainable, corporate consumer capitalism, education has increasingly come to be linked to the sort of job-training responsible companies once did for themselves.

Today’s great corporations – which already avoid paying their fair share of taxes – demand that the burden of training their employees be funded by those of us who do.

For the corporations, this not only represents an enormous savings.  It also means that – having invested little or nothing in training their workers – they can casually discard individual employees, or whole battalions of them.

For the nation, it means that young people are, year by year, less prepared for their primary responsibility – that of citizenship in a self-governing republic.

Second, under the leadership of George W. Bush – who benefitted less from his own education than any president since Warren Harding – the United States adopted the politicians’ pet project of imposing high-stakes standardized testing at the Federal level.

High-stakes testing is a politician’s dream.  By making teachers and local administrators strictly responsible for whether students memorize a finite body of useless information, politicians can have it both ways.  If the kids do well filling in their bubble-sheets, politicians can claim credit for how well schools are doing.  If the kids do poorly, citizens will be inclined to blame the teachers – not the politicians.

Heads, I win.  Tails, you lose.

As a result of these two trends – the replacement of education for citizenship by training for vanishing jobs, and the replacement of teacher-led pedagogy by a top-down testing regime – our schools increasingly turn out young people who can’t see beyond the present.

Offered nothing of enduring value upon which to exercise their curiosity and critical intelligence, today’s kids are ever more focused upon the evanescent fascinations of the internet.

What’s timeless yields to what’s trending.  And the schools offer no resistance.

Of course, it’s inevitable that youth will be drawn to novelty.  It’s the nature of adolescence to attend to the new, the fashionable, even the outrageous.

But the job of education – in every civilization worthy of the name – has involved balancing this natural proclivity for ephemera with the disciplined study of enduring classics.

Schools dedicated to achieving this balance produce graduates who will grow into citizens capable of sustaining the nation.  Schools that fail turn out herds of perpetually-distracted sheep, willing to perform mindless – even soulless – work in return for the means to purchase ever more useless stuff. 

And here’s the great irony of it all:  The products of post-classical education are, of all American generations, the most insistent upon their own individualism – even as they follow the herd into the electronic marketplace, the mega-church, or the two-option voting booth.

For forty years now, America’s schools – even its elite universities – have done nothing so well as turn out people who insist on thinking for themselves, but who lack the essential equipment for doing so.
 
People who know no history, no philosophy, no literature – nothing of the classics of humanity’s past – lack the capacity to challenge the present or imagine a different future.

Stuck in the Valley of the Present, they cannot even imagine the vistas open to the few who climb the slopes and gaze out on the sunlit mountaintops and dark valleys of the past – or the mist-shrouded topography of the future.

It isn’t difficult to believe we live at the end of an age – not in the apocalyptic sense, but in the historical sense.  Our particular brand of modernity has become both irrational and unsustainable. 

Just look at how we live.  Our particular brand of capitalism is not based – as was Adam Smith’s – on more efficiently meeting basic human needs, but on mindless consumerism, driven by inescapable, non-stop advertising.   When we started building enormous complexes of rental units to store the stuff we cannot cram into our closets, attics, basements and garages, that mindlessness became apparent.

But it’s more than that.  Our economy is also based on recklessly plundering finite natural resources; heedlessly fouling the only planet yet known to be capable of supporting human life; and destroying the habitats of other species upon whom our own lives – and our sense of beauty and wonder – depend.

And there are far too many of us, living increasingly longer lives.  As our lives decrease in meaning and quality, we substitute quantity – both in numbers and in years.

Indeed, our mad insistence on mere human existence as being valuable in itself is proof enough that we no longer understand the notion that there are things – beyond having a pulse or minimal brain-stem function – which make human life worth living.


But then, how could we?  Our schools no longer teach these things.

Saturday, February 14, 2015

Before We Spend More

In 1969, when I matriculated at UVA, America’s colleges and universities were in the process of abandoning the time-honored requirement that all undergraduates study a common curriculum during their first two years. 
  
At the time, this was celebrated as a reform – allowing students more freedom to pursue their interests, while ending the privileged status of courses focused on the literature, history, and philosophies of “dead white males”.

As it turned out, of course, the reform proved nothing of the kind.  It was part of an ambitious power grab by colleges and universities.  The goal, as so often with institutions of any kind, was empire-building – in this case, by exponentially expanding college enrollments.

The opportunity was there.  Baby Boomers were reaching college age.  The Vietnam War was at its peak – with nearly half-a-million young men serving in Indochina at any given moment.  And, with draft exemptions for college students, parents who could afford to pay a son’s college costs would certainly do so – rather than see him shipped off to Southeast Asia.

The table was set for colleges and universities to grow – building new dorms and enormous lecture halls; admitting a flood tide of middle-class boys less interested in academics than in avoiding jungle warfare; and financing the whole venture by taking full advantage of generous Federal student aid, made available by LBJ’s Higher Education Act of 1965.

Colleges and universities saw an opportunity to expand their “mission” – to become, not an option for the very bright or very well-off – but a necessary rite of passage for every middle-class youth eager to climb America’s economic ladder.

But this newly swollen generation of undergraduates – skeptical of their elders and, often, unaccustomed to academic rigor – demanded the end of the traditional curriculum.  They wanted courses that were “relevant” – and easier.

Their demands won considerable faculty support.  Professors whose departments were not represented in the old core curriculum – especially the so-called “social sciences” – saw an opportunity to increase enrollments at the expense of English, History, Foreign Languages, Mathematics and “hard sciences” such as biology, physics, chemistry, and astronomy.

Increased departmental enrollments required additional faculty – which meant more institutional power for department chairs, more prestige for their academic fields, and more jobs for their graduate students. 

The result had much to do with shaping the modern university, which – like its dining halls – is now more concerned with catering to students’ tastes than with offering a sound, balanced diet.

Higher education has become a smorgasbord, with departments and star teachers competing for students, while sacrificing rigor for popularity.  This is why, in today’s university, grade inflation makes anything less than a “B+” an occasion for formal complaints, the intimidation of instructors, and the intervention of “helicopter parents”.

The traditional function of a university – the development of a common vocabulary of ideas and cultural references, derived from the study of time-honored classics – has given way to academic faddism.

Departments have become rival fiefdoms, competing for students, prestige and resources.  At today’s university, there’s a major for every taste.

Departments of Economics, Commerce, and Political Science bid for students aspiring to join the global establishment.

The enticements of professional victimhood lure others to such dubious majors as Women’s Studies, African-American Studies, LGBT Studies, etc.

And for those who prefer frat houses to the library, the mushy majors – Psychology, Sociology, Speech Communication, Sports Management, etc. – offer paths to graduation without excessive mental strain.

As for the old core subjects – other than Mathematics and the “hard” sciences – they, too, have learned to pander. 

What passes for History at today’s university offers a politically-correct, vaguely leftist – yet safely pro-establishment – alternative to the muscular realism which once defined the field.

The study of literature has become more about deconstruction and negation than the pursuit of wisdom, compassion, and beauty.

Philosophy, Rhetoric and Classics languish near death.

The downward race goes on and on, with universities and departments outbidding each other to offer less and less substance to more and more students – with Federal dollars paying the freight.

Having abandoned its traditional role as advocate for classical learning and unifying principles, the modern university wallows in narcissism and cultural diversity.

Yet, in this context, President Obama – among many – speaks of making college free for all who wish to attend.

Really?

If I were President – before handing the universities still more money and power – I’d insist on guaranteeing that they are teaching something worth studying. 

I’d achieve that by requiring that – after two years of undergraduate study – every student seeking further Federal financial aid pass a battery of substantive examinations in the subjects which once formed the core curriculum, not excluding mathematics, a second language, and at least one “hard” science.

To be sure, the Federal government lacks jurisdiction to mandate curriculum at private – or state – institutions of higher education.


But if these institutions covet yet higher enrollments, subsidized by additional Federal spending, we have the right to insist that they teach something worth knowing.

Thursday, December 4, 2014

The Biggest Loser

The great majority of my friends – both in real life and on social media – vote Democratic.  In the wake of the midterm elections – which went so badly for Democratic candidates – most of these folks are looking for someone to blame.

Many blame the unlimited availability of money – much of it supplied by organizations which need not report their sources – to flood the media with attack ads and other nonsense.   I share their indignation.

Many blame the voters – especially young and non-white voters – who stayed away by the millions, allowing a minority of older and middle-aged whites to control dozens of closely contested elections.  I understand their concern, while admitting that I had to drag myself to the polls this year.

Many blame institutional purveyors of mendacity, such as Fox News and the AM talk jocks, who have done so much to lower the political IQ of the nation over the past twenty years.  Again, I share their indignation, but I don’t think a majority of the nation can be blamed because a minority chooses to self-propagandize – if not self-lobotomize – by listening to non-stop nonsense.

But for all the blame-throwing, few of my Democratic-leaning friends have begun to admit what has been, up until now, unspeakable.  The one person most responsible for the Democrats’ defeat is the man who had the most to lose – President Barack Obama.

Six years ago, many in this country – and seemingly everyone overseas – hailed Mr. Obama’s election as the dawn of a new era.  It was hardly that, though the President started off fairly well.

After all, the American economy was trembling on the brink of a second Great Depression – brought about by years of deregulation under both Democratic and Republican presidents.

The reasons for this crisis are too complicated for most of us to understand – involving, as they do, new kinds of theft made possible by the migration to Wall Street of hundreds of highly-trained mathematicians for whom our society seems to have no better use. 

But it’s not so complicated that most of us can’t grasp this fact:  The outgoing Republican President, George W. Bush, and his incoming Democratic successor, Mr. Obama, managed to coordinate their efforts in order to avert the worst. 

I have little doubt that future generations of historians will rank Mr. Bush far down the list of American presidents, but his efforts in those final months probably saved him from reaching the abyss inhabited by James Buchanan, Andrew Johnson, and Warren G. Harding.

Similarly, no matter how badly things go in the next two years, Mr. Obama’s role in salvaging the economy will almost certainly keep him out of the historical basement.

But not, perhaps, by much.

Mr. Obama’s problem is that he never figured out how to leverage the incredible potential of his office.  Never having actually run anything – other than the Harvard Law Review – he has consistently demonstrated an incomprehension of the uses of power.

On the one hand, Mr. Obama has over-estimated his ability to make things happen simply by expressing his opinion.  True to his law review background, he has never hesitated to editorialize about situations at home or overseas. 

Now, because he is president, his words have always gained immediate, global circulation.  The problem is that a presidential editorial is news for twenty-four hours.  If the president then shifts his attention to some other issue, instead of hammering away at one, important point, his words – however well-written and well-spoken – quickly fade away.    

A president of the United States can – absolutely – focus the attention of the world on almost anything he chooses.  But unless he stays on that issue – preaching, explaining, educating, mobilizing and motivating – he will not realize the true power of the “bully pulpit”.

And this is the curious thing about Mr. Obama.  A President’s single greatest power is his ability to educate the American people to the existence of a problem, set forth the solution he proposes – and rally millions to his support.

Yet, for all his gifts as a public speaker – for all his background as a college teacher – this President has been an abysmal failure as America’s educator-in-chief.  Time and again, he has opened a new policy initiative with a brilliant speech – only to move immediately behind closed doors, seeking to formulate a compromise with people who have no desire to meet him halfway.

Where Lincoln, TR, FDR, Truman, JFK, Reagan – or many presidents of lesser historical stature – would have rallied public support sufficient to compel their opponents to deal, Mr. Obama has allowed his opponents to dominate the political debate for his entire term.

He has, simply stated, failed to make the case for any of his policies – in most cases, leaving the floor to his opponents, who have not hesitated to make the most of their opportunities.


Thus, this gifted communicator has lost control of both houses of Congress, lost the initiative for the rest of his term – and probably lost his chance at an honored place in history.

Monday, November 10, 2014

Broken

This year, I failed to do my civic duty. 

I voted, of course.   I always vote. 

But I’ve never regarded the mere act of voting as sufficient. 

Since the Golden Age of Athens, in the fifth century BCE, the whole idea of democracy has rested on the assumption that most citizens will be reasonably well-informed; willing to engage in public discourse; and strongly inclined to think for themselves.

Ideally, these citizens will also attempt to balance their natural self-interest with an equally powerful commitment to the common good – what our Founders called the commonwealth.

Citizenship in a democracy, or a democratic republic, involves active engagement.  A citizen should speak out – at least among his or her neighbors.  And speaking out should involve some courage – some risk:  the risk of offending family, friends or neighbors; the risk of being corrected on the facts or challenged on one’s reasoning; the risk of being proved wrong.

American democracy once involved a robust – even rowdy – exchange of views.  In big cities, small towns, and isolated villages, the issues which divided the great deliberative bodies at our national and state capitals were also debated, with equal fervor, among people who knew each other well. 

Out of those local controversies, citizens gained a clearer understanding of the issues.   Sometimes, they were also able to spot rising young leaders who saw things with unique clarity or expressed themselves particularly well.

Today, in most communities, Americans seem to prefer not getting involved.  Except in presidential years, most of us don’t even vote.  And, for those who do, voting has come to be regarded as sufficient.

It’s not.

Among the commentariat – the media pundits and political scientists – this lack of public engagement is often deemed apathy.  Personally, I’ve never seen apathy as being the problem.  Indeed, from what I can tell – from personal conversations and on social media – interest in national and global affairs has been on the rise for some time. 

Certainly, that has been true since 9/11.  The trend became even more pronounced after the banking crisis which precipitated the so-called Great Recession.

But this growing public interest has yet to translate itself into public involvement – and therein lies the problem of our times.

In a functional democracy, citizens believe that they can make a difference.  They take action.  When citizens are convinced that they wield no actual power – that their votes and opinions don’t matter; that there’s no point in volunteering for a campaign, or displaying a bumper sticker or yard sign; that sending a modest contribution to a preferred candidate will have no impact – when that situation obtains, democracy is broken. 

And when a great nation – long accustomed to the blessings of liberty and self-government – loses confidence in its democracy, one of two things will happen:  reform, or revolution.

Personally, like most people, I prefer reform.  But our political system, as presently constituted, seems to have become incapable of reforming itself.  The system of campaign finance regulation – never robust – has been gutted by a Supreme Court which is far too politicized to serve its proper constitutional functions.

Political advertising and partisan shouting have replaced virtually all other forms of public discourse – reducing debate to simple-minded slogans, empty symbolism, and the worst sorts of defamation of character.

With only two major political parties – both utterly dependent on unregulated, and often secret, contributions from wealthy individuals, corporations, and unions – no mechanism of reform appears to exist.

Meanwhile, with two-party political warfare confined to sensational – but often unimportant – issues, efforts to meet the great and serious challenges that face us continue to be ignored, postponed, filibustered, or mired in partisan gridlock. 

Little wonder that so many citizens feel alienated from the entire political process.  Yet, given our history, it’s impossible to believe that the American people will long remain willing to live under a system which pretends to be “government of the people, by the people, for the people” – but which is actually none of those things.

An explosion is coming. 

May it come soon!

If this explosion is to be peaceful and political – if it’s to be an explosion of reform, rather than revolution – the majority of Americans need a mechanism with which to wrest power from institutions which no longer work. 

I have long thought that the proper mechanism is a third party – committed to complete reform of our political process, and to other issues which are regularly ignored by the two-party duopoly. 

For years, now, I’ve tried to think of an alternative scenario – a scenario in which one or both of the existing parties initiates the reforms we need to restore American democracy.

But I can envision no such scenario, and no one else has suggested anything that seems workable.

Last Tuesday, I voted.  But that was hardly enough to claim that I’d done my civic duty.

Does anyone out there share the sense that citizenship in this once-great democratic republic demands more than casting a ballot?


Is anyone else willing to act? 

Friday, October 31, 2014

The Fall'll Probably Kill Ya!

I read recently of a decision by New York State’s Board of Regents eliminating the requirement that high school students complete a year’s study of both US History and Global History in order to graduate.

Apparently, this decision is related to the so-called “Common Core” movement which – unlike Ebola – has reached epidemic proportions in this country.

As an old history teacher, I was, of course, dismayed.  But on reflection, the Regents’ decision struck me as a matter of little significance – something akin to the moment when Butch and Sundance are about to jump off an absurdly high cliff into the river below.  Sundance worries that he can’t swim, and Butch laughs:  “Why, you crazy – the fall’ll probably kill ya!”

This country’s educational system does such a poor job of teaching history that – unless we dramatically change things – we’re already doomed as a free society.  Of all the subjects which citizens can study, the only one which offers any preparation for meeting the challenges of the future is the study of the past.

For, if history doesn’t precisely repeat itself, there are patterns.  An understanding of history is, for a society, analogous to the wisdom an individual gains over the course of a long life.  Everyone makes mistakes.  Those who survive those mistakes – and learn from them – have a shot at wisdom.

History is a society’s collective wisdom. 

America’s educational system gave up on History during the Vietnam era, when colleges and universities expanded rapidly to profit from the tuitions of hundreds of thousands of young men who didn’t want to go to war.  Arriving in unprecedented numbers, this influx of students – whose interest in college had far more to do with survival than learning – demanded all sorts of absurd “reforms”.

Among these was the elimination of general education requirements.  Before 1960, most American colleges and universities required that all students take a set of core subjects – American History, Western Civ, English and American Lit, Composition, a couple of science courses, one foreign language to the level of basic fluency – even physical education.

Your prospective major made no difference.  Future lawyers took calculus.  Future rocket scientists studied poetry.  Everyone grumbled through calisthenics and ran laps.  And everyone learned a respectable smattering of the history of their country and the civilization from which it sprang.    

Which – in terms of the survival of American democracy – was the most important part of this common curriculum.  Because History – along with its allied subjects, biography and geography – is the absolute prerequisite for intelligent, active citizenship in a democracy.

It was probably a good thing, in those days, that future lawyers and politicians had to sweat through elementary calculus.  Higher math teaches humility – something today’s lawyers and politicians clearly lack. 

It was probably good, too, that future scientists, physicians, and engineers learned a little literature.  The more we push back the frontiers of knowledge – the more we find ourselves able to do – the more we need some basis for thinking about what it all means.  Those are the questions which poets and playwrights have been wrestling with for millennia. 

But what was indispensable was that all college-educated Americans – those whose educational attainments would make them the natural leaders of their future communities – learned something about history. 

History is the essential study of the leader.  Always has been.  Always will be.

Read up on any great leader, of any nation, from any period, and you will find that he or she not only studied history as a young person – but continued to read and study it as an adult. 

History teaches us many things.  Above all, because the patterns within and among human societies tend to repeat themselves, history teaches us to recognize dangers before they become obvious – or before it’s too late.

Some years ago, when reasonable people could still question the dangers of anthropogenic global climate change, I wrote a piece comparing Al Gore’s efforts to alert Americans to this danger with Winston Churchill’s efforts to alert Britain to the dangers of Adolf Hitler.

One angry reader responded, furiously insisting that Gore was no Churchill.  He missed the point, which was that democratic societies – confronted with a threat calling for self-discipline, sacrifice, and years of unrelenting effort – will go through all sorts of contortions to deny that a threat is real.

Democracies are fortunate if they have prominent leaders willing to risk telling citizens things they don’t want to hear. 

History teaches leaders how to lead – by adopting the successful methods, and avoiding the mistakes, of those who have gone before.

Lincoln was a lifelong student of George Washington – and, before taking office, he read up on the presidency of Andrew Jackson, who had faced an earlier secession crisis.

Theodore Roosevelt was a devoted student of Lincoln, and actually wrote a biography of Alexander Hamilton.

As wartime leader of Great Britain, Winston Churchill faced the necessity of pulling together a coalition of incompatible partners to prevent Hitler’s Germany from conquering the world.  In the decade before he took power, Churchill wrote a six-volume biography of his ancestor, John Churchill, Duke of Marlborough, a brilliant diplomat and soldier of the late 17th and early 18th centuries.

Serving William III and Queen Anne, Marlborough had pulled together a coalition of incompatible partners to prevent Louis XIV’s France from conquering Europe.

Churchill, you might say, spent the 1930s doing his homework.

A country led by serious students of history can achieve remarkable things.  A country led by those ignorant of history risks disaster.

Because America no longer teaches its citizens history, it must soon either cease to be a great country, or cease to be a democracy – ceding power to an educated elite who have taken the trouble to learn it.

By far the safer course is to teach our children the history they will need to govern themselves.


But to do so, we will need to overrule an educational elite which does not understand history because – having attended college since 1970 – they never learned it.

Fear and Stupidity


I don’t own a television.  I do listen to public radio – and, during baseball season, some sports radio – but, since neither normally carries political advertising, I’m largely spared the annual flood of nonsense through which American political campaigns are conducted.

I’m glad to miss out on the political ad wars.  The issues which politicians believe voters care about are sometimes trivial, sometimes important.  But the solutions which candidates offer – when they bother to offer any – would embarrass the folks who air late-night infomercials for “miracle” products.

And politicians don’t even offer a money-back guarantee.

Occasionally, between elections, you hear an elected leader offering actual, practical solutions.  During a campaign, all you hear is dumb – and dumber.

Still, not listening to the political ads, I sometimes miss things.  Recently, I was astounded to learn – via posts on social media – that some of my friends have become convinced that our government should ban international flights in order to prevent an Ebola epidemic here. 

The source of these panicky posts appears to be a coordinated campaign by Republican candidates for the House and Senate – though some terrified Democrats have apparently climbed aboard the bandwagon.

Now, I’m used to the inevitability of candidates offering up stupid policy ideas during political campaigns.  But an international flight ban isn’t just stupid – it could be suicidal.

Of course, this hasn’t stopped a majority of Americans telling pollsters they support a ban.  

No surprise there.  Americans will fall for anything – for a little while.  The good news is that most Americans – given time – will get back in touch with their native common sense.

That will have to happen soon, if we aren’t to end up electing a bunch of irresponsible fear-mongers to office.  But in an election campaign, two weeks is an eternity.  I’m betting the Republicans created their Ebola panic a couple of weeks too soon.

We’ll see.

It might be a little early for common sense to reassert itself, but – since I’m not running for anything – let’s give it a try.

In the first place, let’s understand that there are basically no direct flights from West Africa to the United States.  To get from Monrovia, Liberia, to JFK, Dulles or Hartsfield, you normally fly through Europe.

So right there, banning flights from West Africa to the US is nonsense.  There are no flights to ban.

But maybe the idea is to prevent anyone flying from West Africa from entering the US.  How would that work?

Well, clearly, US authorities could monitor passenger manifests and prevent the entry of passengers from West Africa who had taken connecting flights through Europe.

But suppose someone in West Africa really needs to get to the US – on business, for school, to visit family – and there’s a travel ban.  The obvious solution would be to take two non-connecting flights.  Fly to Rome or Paris; spend a day or two dining well; then fly into the US as a passenger from that European city.

Or you book connecting flights from West Africa to Toronto Pearson – and cross into the US by land transport.

Now, understand, anyone doing this would be violating the proposed travel ban.  But assume you’re in West Africa; you’re absolutely certain you don’t have Ebola; and you have important reasons for getting to America.

Is a travel ban going to stop you?

Consider our record of success at preventing illegal immigration – or the importation of marijuana and cocaine. 

Guess what?  In today’s world, you can’t keep people from crossing borders.

Not even North Korea can do that – and North Korea is a police state with two short, militarized borders.

Common sense says a flight ban would be unworkable.  But at the beginning of this piece, I suggested it might also be suicidal.

Here’s why.

At present, US authorities automatically screen everyone flying into the country from nations in which Ebola has erupted.  These passengers are asked several key questions.  Their temperatures are taken.  They are instructed on what to do if they begin to develop symptoms.

Thus far, passengers from West Africa have been cooperative with these sensible screening procedures.  They’re not intrusive, and – really – no one wants to bring Ebola into this country, or pass it on to his loved ones or business associates.

But suppose we imposed a flight ban – and passengers from West Africa started avoiding that ban by taking non-connecting flights or by flying into Canada.

People sneaking in via an indirect route could hardly be expected to present themselves to US authorities for screening once they arrived.  They’d be here – among us – but we wouldn’t know anything about them.

And that’s where things get scary.

Because, sooner or later, someone will enter this country who has been infected – and doesn’t know it.  Under our present screening regime, there’s every chance a symptomatic person would promptly contact proper authorities for treatment.

But if he had sneaked in to avoid a travel ban, there’s a fair chance he’d delay doing so.  And that delay is where the proposed travel ban becomes dangerous.

Fear-based political campaigns are nothing new.  But when politicians propose stupid, dangerous policies in order to win elections, they demonstrate their unworthiness to hold public office.


Let’s hope common sense kicks in soon enough to punish this fear-mongering nonsense.