Sunday, October 30, 2011

some thoughts on education


Beginning in the 1950’s, largely because of the G.I. Bill, America started becoming the best-educated country in the world—if you measure that by the number of people with post-secondary educations.

Throughout the ‘50’s and even into the ensuing decade, the vast majority of college enrollees waded into curricula that really hadn’t changed much since the 19th century, to wit, curricula heavily slanted toward the liberal arts. The sciences and math weren’t ignored, far from it, but they weren’t the focus either.

That began changing during the Kennedy administration when the great “race to the moon” commenced and JFK urged young Americans to get much more heavily into math and the sciences. 

Among the results of that push were a period of American dominance in hard sciences like Physics and Chemistry, and in applied sciences like Engineering, rocket science and bio-engineering, that stretched through the 1970’s, 1980’s and even into the 1990’s. American graduate programs became the envy of the world, and both the universities that supported those programs and the country as a whole benefitted enormously from the inflow of the best young minds from Europe and Asia clamoring for admission to American universities.

Despite the high profile that science and math enjoyed, however, the bulk of American university students continued to major in liberal arts areas like English, History, Political Science or Communication. What’s interesting is that the vast majority of liberal arts graduates, from the 1950’s forward, left college and began making a living at something that had little or nothing to do with what they studied in school. They became salesmen, merchants, public relations people, farmers, small business owners, tradespeople—or they left undergraduate life and enrolled in a professional school; they became doctors, lawyers, architects, teachers, nurses.

By the mid-1960’s, the one constant had become that if you wanted a salaried job, you needed a college degree. In most cases it didn’t matter much what the degree was in, but you had to have one.

What doesn’t get noticed so often when the history of education is discussed is that, starting even before WW II and continuing in even stronger fashion after it, there were very comfortable, middle-class livings to be made that didn’t require anything beyond a secondary education—in some cases, not even that. America during that period was the manufacturing capital of the world, and, thanks largely to unions, hourly wages in most areas of manufacturing were more than adequate to support a family and even pop for an annual vacation.

The 1950’s, ‘60’s and ‘70’s were also boom times for all sorts of construction in this country—everything from roads and bridges to suburban housing.  So if you didn’t go to college, and you didn’t fancy an assembly line, you apprenticed for a few years as a mason or a carpenter or a plumber or an electrician and still made a very comfortable living for yourself and your family.

What’s the point of all this nostalgia? Simply this. Times have changed, and many of the things that made the 20th century an “American Century” had, by the 1990’s, become bureaucratized, ossified, gentrified and more problem than solution. The great push for education that started in the ‘50’s led to bloated quasi-unions like the NEA and the AFT, both of which seem more concerned that no teacher be fired than that no bad teacher be retained. It led to equally bloated Education Schools, which gave theory and pedagogy greater import than knowledge of a subject and, partly to justify their existence with numbers, rapidly became populated by students who found majoring in math too difficult and so decided to learn to teach it instead. By the turn of the century, it was commonplace for the bottom third of each new freshman class to gravitate heavily toward schools of Education.

Put simply, America is no longer the envy of the world so far as education is concerned, and one result is that we are no longer the envy of the world economically either. In my view, those two sad facts are related. One more, relatively recent phenomenon has to be added to the mix. Next year, if projections hold true, total national student loan debt will exceed 1 trillion dollars. That’s with a “T.” On average, a graduate of a four-year school will owe in excess of $26,000. Restrict that to graduates of “elite” schools and the number frequently reaches into six figures, with an average of nearly $50,000.

What many young people—perhaps most—are discovering is that they can’t afford to buy a car, or a new washing machine or, God forbid, a house, because their monthly student loan payment squeezes those kinds of things out of the budget. More to the point, they find that their English major really doesn’t make them all that attractive to the companies that are actually hiring today.

There are a lot of things that need to be done to fix the mess this country is in right now—most of them, unfortunately, requiring a degree of political will and sense of “country first” that just doesn’t exist. 

I’d like to propose one set of changes that wouldn’t require a lot of money and might not even run into a lot of resistance—except from the higher education establishment.  Here, in no particular order, are the changes. Note that these would apply only to public institutions of learning.  The Harvards of the world could continue doing as they please.

1. Change the emphasis in four-year undergraduate programs from the liberal arts to math, engineering and science, particularly computer science. Make all the humanities (English, History, Religion, Political Science, etc.) subjects that can constitute minors, not majors, and perhaps require every student to have two of them. Majors would have to be chosen from the sciences and math.

2. Extend government-paid tuition to everyone with a minimum score of 24 on the ACT or 1300 on the SAT. Those are the folks that studies indicate most commonly do well in college and in fact graduate. Require repayment of whatever funds a student received from the government if the student fails to graduate.

3. Do away with Education Schools and revise the minimum qualification for teaching in a public K-12 institution to either completion of a two-year major in a humanities subject at a two-year college, or completion of a degree from a four-year college, plus a paid, year-long internship in a public school. Require a teacher certification examination, similar to what nurses, accountants, etc., take now, at the end of the internship year.

4. Create, at federal expense, a minimum of two “trade schools” in every state, with the actual number in each state proportional to the state’s population. These would obviously need to be residential schools. Students would transfer to such a school after completing their second year of secondary education and would spend their final two years there. The schools would offer students a variety of trades to specialize in, and the school day would be organized the way it currently is in, for example, arts magnet schools: students would spend the morning hours on traditional secondary education subjects and the afternoon learning their trade.

5. Make teacher retention in public K-12 schools a function of annual student/peer/administration review with peer review by itself given weight equal to that of student and administration review combined.  Tenure would result from 10 consecutive years of positive reviews, but tenured faculty would be subject to quadrennial reviews that could lead to dismissal or revocation of tenure.

6. In public universities and colleges, faculty would become eligible for tenure after 6 years, but, upon receiving tenure, would be subject to quadrennial reviews that could lead to dismissal or revocation of tenure.

The most obvious effect of these proposals would be to substantially lower the number of students attending four-year schools and substantially raise the number attending two-year schools. As I said, higher education wouldn’t like this. What it would accomplish, however, is a redirection of post-secondary education resources into those areas most likely to facilitate economic growth domestically and technological competitiveness globally.

It would also have the effect of significantly shrinking the student loan industry that has grown up, and more importantly, significantly reduce the number of graduates who can’t afford to become full-time consumers because they owe too much in student loans.

Just as importantly, it would once again make trained mechanics, machinists, tool-and-die makers, electricians and so forth available to American companies, which have been complaining for years that they have jobs in those areas they can’t find qualified people to fill.

And perhaps most important of all, revamping the way teachers are trained and retained would go a long way toward restoring America’s intellectual competitiveness in the global economy. 

You know, until the middle of the 19th century or thereabouts, most of the world was agrarian, and the education required to be successful was attuned to that reality. When industrialization occurred, education requirements changed, and, for the most part, educational resources were re-directed accordingly. In the developed world at least, we are now moving beyond an industrial society to a science and technology economy, and it seems to me that education needs to change once again to accommodate the new reality.






Monday, October 24, 2011

a little thought please


I named this blog alittlethoughtplease because it seemed to me that, in public discourse particularly, what most frequently isn’t happening is thought that goes much beyond whether this is a Republican position or a Democratic one. If anything, that tendency is more prevalent now than it was then, so I thought that with this entry I might try applying a little thought to several of the most current issues.

Let’s start with the proposal, coming now from at least some elements of both parties in Washington, that we should declare a “one time tax holiday” during which corporations could repatriate the money they currently have stashed in offshore tax havens. A Republican bill in the House would tax that money at just 5.25%; the bill’s Senate counterpart, sponsored by Democrat Chuck Schumer, would set the rate at 8.75%, unless the company created new jobs, in which case the rate would drop to 5.25%. The “thought” behind both proposals is that the infusion of cash into the country would enable companies to create jobs.

Let’s apply a little thought to that thought. First, we’ve gone down that road before, in 2004. That tax holiday resulted in eye-popping executive bonuses, significantly increased dividends and large stock buybacks. So far as anyone has been able to determine, it resulted in zero (or near zero) new jobs. There is nothing in either of the current proposals to prevent the same thing from happening again. Remember that famous definition of insanity?

Second, American corporations are currently sitting on over 2 trillion dollars in cash reserves, and have been since late in 2009. Unemployment has been stuck at 9% (or higher) through all that time. By what logic would allowing companies to swell their already bulging cash reserves even further induce them to start doing something they haven’t done in nearly a decade?

Finally, the congressional Joint Committee on Taxation has stated unequivocally that a tax holiday at 5.25% would increase the budget deficit by nearly 80 billion dollars over the next 10 years.  By what logic is there any fiscal responsibility in that?

Moving on, let’s take a look at the mantra, coming now from both parties, that “small businesses create jobs.” I’m trying to remember the last time I heard any politician talk about anything remotely related to the economy without invoking that maxim. I’m not sure what the thought behind this notion is, unless it’s simply that championing tax cuts and deregulation for “small businesses” is more politically palatable than doing the same for Exxon Mobil.

A little thought seems appropriate to me. First, 61% of all American businesses have four employees or fewer. Nearly two-thirds of American businesses, in other words, would double their payroll by adding four jobs. And since the majority of those small businesses are convenience stores, artisan shops, or trade-skill companies like plumbers, roofers and electricians—all of which of necessity operate on razor-thin margins—there would have to be an almost unimaginable sudden increase in demand for their services before it would make sense to expand their payrolls. In the vast majority of cases, Joe the Plumber and his two minimum-wage helpers can easily handle all the work they get.

A little thought suggests that most small businesses have no reason to create jobs, and the ones that do, because they are small to begin with, aren’t going to affect unemployment numbers much.

More telling are the numbers indicating just how much economic impact small businesses actually have. The Treasury Department defines small businesses as ones with annual earnings between $10,000 and $10,000,000—which cuts a fairly broad swath, I think we’d all agree. Indeed, 99% of American businesses fall into that swath, but they account for only 17% of total business income, and, more to the point, only 23% of them pay any wages at all.

What does create jobs is not small business but a small subset of it that we call start-ups—new businesses. For nearly every decade since the turn of the last century, new businesses have been the largest contributors to job growth. More specifically, new businesses that catch on and start to grow (Apple would be an example). It’s interesting, therefore, to note that start-ups in the last decade produced less than half the number of new jobs that start-ups had created in previous decades. The reason, most signs indicate, is something called “allocative inefficiency.” In the last decade, most of the capital available for start-ups went to businesses creating exotic financial instruments, not businesses that actually employed people.

But both parties take it as a given that if taxes were lowered and regulations relaxed, small businesses would grow. We’ll come back to this notion in a moment, but for now it’s worth noting that when the National Federation of Independent Business surveyed its members last year, “poor sales”—also known as weak demand—was blamed for lack of growth far more frequently than taxes and regulations combined.

What that suggests is that a program that put people back to work, even if it required government (hence taxpayer) money, would be the best thing we could do for small businesses.

That brings us then to a third idea currently being espoused by both parties—to wit, taxes and regulations are stifling economic growth.  (To be fair, this is the holy grail of Republican economic thinking; Democrats don’t challenge its validity often, but don’t champion it as loudly.)

The thought involved here doesn’t seem to be an economic one as much as a philosophical one—the notion that government is bad and capitalism is good. 

The lack of thought here is perhaps most apparent in the clear disconnect between what people (and politicians) want in the abstract and what they want in the real world. In the abstract, for example, a politician can rail against a government agency like OSHA, but if that politician’s son is injured or killed while doing a summertime assembly-line job, that politician will be the first to scream for stricter adherence to OSHA standards.

Similarly, it’s easy to bray loudly about government regulation run amok at the Food and Drug Administration, but when an E. coli outbreak occurs, the braying is just as loud about regulations ignored or not enforced.

A little thought suggests that everyone (except maybe Ron Paul and his progeny) recognizes that without some government regulation, capitalism is an inherently destructive machine, one designed to maximize income in any way possible by minimizing expenses in any way possible.

What politicians (and big chunks of the population) do, however, is “suspend their disbelief,” to borrow a theatre term—meaning they pretend not to know what in fact they do know—and allow their natural antipathy toward government to prompt the “taxes and regulations kill jobs” mantra.

So let’s look at that.  We can’t make definitive statements about the future because, as Aristotle famously said, “what happens is manifestly possible, else it would not happen.”  We can however make definitive statements about the past, because it has happened.  The period from 2002 till 2010 was a period of unprecedented low corporate and personal taxes.  It was also a period during which virtually no new regulations governing business were put in place, and those that were in place were, at record levels, rescinded, diminished in scope or simply not enforced. 

Those were also the years during which a trillion-dollar federal budget surplus became a multi-trillion-dollar deficit. They were also the years in which economic activity was the most anemic it had been since the 1980’s. In point of fact, what growth there was would have been even more anemic had it not been for the Halliburtons and Blackwaters of the world, defense companies that grew exponentially and did so almost entirely at taxpayer expense.

And while it would not be accurate to say that employment numbers went down during that period, it would be accurate to say they didn’t grow in any significant way. It would also be accurate to say that all the major factors that would produce the Great Recession, and the unemployment numbers we now have, were put in place during those years—the housing bubble and the financial-sector meltdown primary among them.

The odd thing here is that the Bush administration commissioned a study aimed at determining what the “economic cost” of government regulation was.  Nothing about this was included in the study’s text, but it seems safe to assume the expectation was that the study would demonstrate that regulations have a negative effect on the economy. On the contrary, it demonstrated that the economic benefits of regulations were at least double their costs.

It doesn’t take an Einstein to figure out that cleaner air means fewer people getting sick, or that higher fuel-efficiency requirements for autos mean more opportunities for start-ups to develop the new technologies those requirements will demand—and so on. Similarly, you don’t need a Mensa card to understand that fighting two wars while cutting taxes is likely to run up a deficit. What a little thought suggests is that regulations actually protect jobs (and people), and that taxes play a major, positive role in the quality of life this country enjoys.

Can’t end this without looking at taxes—specifically taxes on the rich.  This is not a bi-partisan issue.  The idea that taxes on the rich kill jobs is pretty much owned by the Republicans. 

Let’s set aside the philosophical debate about the necessity or even the fairness of a progressive tax system.  There are a lot of arguments—and good ones—on both sides of that issue.  It doesn’t matter really which side you’re on—the fact of the matter is that we have a progressive tax system, have had it for many decades, and have a societal and governmental structure now that is irrevocably tied to it.  And one of the dictates of that system is that the wealthier one is, the higher the percentage of that wealth one should pay.

Currently, incomes under $20,000 (double that if filing jointly) pay no federal income tax. The next lowest rate is 15%, and the percentages proceed upward through 28% for most of what we call the middle class, to about 32% for the wealthy, and up to 36% for the very wealthy. If you support a progressive system, that would seem a reasonably fair set of numbers. A little thought, however, makes you start to wonder.

When you start to look at the tax code, you notice immediately that there are an awful lot of exceptions and gray areas and even areas that don’t actually seem to be covered at all. The exceptions are usually called “deductions” and many of them are very straightforward and apply pretty much equally to everyone.  If you have a mortgage you can deduct the interest on it; most taxes that you pay at the local level you can deduct from your federal liability.  Education expenses mandated by your job and paid from your pocket can be deducted; a percentage of medical expenses can be deducted.  And so on.

As you dig deeper into the code however you begin to find a lot of exceptions that really only apply to significantly higher income levels, or to incomes generated in specific ways.  For example, income generated from investment is taxed as a capital gain, the maximum on which is 15%.  The expense of flying a personal jet for business purposes is deductible, but obviously only if you’re wealthy enough to have a personal jet.

The result is that the effective rate actually paid by a middle-class guy in the 28% bracket is going to be somewhere in the 25% or 26% range. By contrast, the effective rate paid by someone in the 35% bracket is closer to 20%.

So the first canard that should be disposed of is the notion that the wealthy in the United States pay taxes at a higher rate than in other Western countries.  In most European nations, the top tax rate is actually higher than here, but even in those where it is slightly lower, the actual tax rate of America’s wealthiest is lower still. 

All that is important because when we look at the incomes of those in the highest U.S. tax bracket, the first thing we notice is that much—in many cases, most—of that income isn’t taxed as regular income. Because it’s derived from investments, it’s taxed at the 15% capital gains rate—which is why Warren Buffett famously pays a lower overall rate than his secretary does. The Sage of Omaha couldn’t care less if the income tax rate for his bracket were raised to 90%; it wouldn’t affect him, because his income doesn’t come from wages or salaries.

But that’s not the real point. The real point is that the top 1% of American earners have very little impact, positively or negatively, on jobs. If the CEO of Exxon Mobil saw his tax rate go from 36% (now) to, say, 40% or even higher, that would have nothing whatsoever to do with whether Exxon Mobil created more jobs. I can’t find the exact figure right now, but something in the neighborhood of 60% of the people in that infamous top 1% derive more than 90% of their income from investments. What income tax rate is set for them is irrelevant because they have very little income that is taxed as ordinary income. More importantly, even if we changed the rules and taxed investment income like salaries and wages, the higher rate wouldn’t affect their ability to create jobs, because they don’t create any in the first place.

What about the guy who owns several convenience stores and draws an income from them of $250,000—currently the minimum income level being looked at as “wealthy”? The “taxes are bad” folks will tell you that raising that guy’s tax would inhibit his ability to expand his business and create jobs.

Again, a little thought please.  Let’s say that guy’s analysis indicates that opening another store, and hiring another 10 employees to do so, would net him an additional $100,000/year in taxable income.  Right now, he would owe $35,000 of that in income tax.  If his rate were increased to the 38% it was under Clinton, he would owe $38,000 in taxes.  Does it really make sense that he would turn down the opportunity to make $62,000 more than he now does because, if the government had left his tax rate alone, he would have made $65,000 more? 
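For anyone who wants to check that arithmetic, here it is spelled out as a quick back-of-the-envelope calculation: a minimal sketch in Python, assuming (as the example above implicitly does) that the flat 35% and 38% rates apply to the entire additional $100,000. Real marginal-rate math would shift the exact dollars a bit, but not the conclusion.

```python
# Back-of-the-envelope check: does a three-point rate hike kill the incentive to expand?
# Assumes the flat rates from the example above are applied to the whole added income.

additional_income = 100_000          # extra taxable income from opening the new store

for rate in (0.35, 0.38):            # current rate vs. the Clinton-era rate cited above
    tax = additional_income * rate
    kept = additional_income - tax
    print(f"At {rate:.0%}: owes ${tax:,.0f} in tax, keeps ${kept:,.0f}")

# Output:
# At 35%: owes $35,000 in tax, keeps $65,000
# At 38%: owes $38,000 in tax, keeps $62,000
```

Either way, he pockets more than $60,000 a year that he wouldn’t otherwise have; the $3,000 difference is noise next to that.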

It worries me a little that so much of our discourse today seems based on political agenda rather than on what even rudimentary thinking indicates is the true nature of the problem or its most practical solution. What should matter to all of us is not whether we are Democrats or Republicans, conservatives or progressives, but what honest and objective examination of the facts tells us needs to be done. Unfortunately, that seems to be something we do only when the situation has become so dire we have no other choice.

Sunday, October 16, 2011

religion or cult


Robert Jeffress, a Baptist evangelical, caused a bit of a stir at the recent Values Voter Summit (to which only Christian Protestants were invited—presumably because only they have values worth summit-ing about) when he declared the Mormon religion a “cult.”

When I read about that I was mildly amused and a good deal more than mildly irritated. 

What amused me is that Jeffress’ statement was a practically paradigmatic example of the pot calling the kettle black. Almost no matter which definition of the word “cult” you choose, all religions are one.

I went to the Oxford English Dictionary and found that the first definition (preferred and most generally applicable) is “a system of religious veneration and devotion directed toward a particular figure or object.” Can anyone say Baptist?  Catholic?  Jew?  Muslim?  Duh.

The second definition, probably preferred by Jeffress, is “a relatively small group of people having religious beliefs or practices regarded by others as sinister.” Sort of the way Baptists view Catholics?  The way Catholics view Jews?  The way Jews view Muslims?  By this definition, a cult is whatever religion you aren’t. 

But what about the third definition? Maybe it’s where Jeffress was coming from. The OED’s third definition is “a misplaced or excessive admiration for a particular person or thing.” So, Mormonism is a cult because its members have excessive admiration for Joseph Smith. Got it. But what about Baptists with Christ? Ah, not the same, Jeffress would no doubt say. Smith is just a man who called himself a prophet, someone delivering the word of God; Christ is God. Really? Then why did he say he was delivering the word of God? Why didn’t he say he was delivering the word of himself?

The point is, every religion (at least every Western religion) has been brought to us by a man. Which goes back to my original point: all religions are cults. That isn’t meant to demean religion; it is simply to point out the obvious—all religions demand of their followers that they “venerate a particular figure or object,” and therefore all religions are cults.

That, of course, wasn’t really Jeffress’ point or his concern.  He knew he was preaching to a room full of preachers and that referring to any cult that wasn’t their cult as a cult would be immediately understood as a way of damning that cult.  In fact, he went on to make his real point, which was that  “those of us who are born again followers of Christ should always prefer a competent Christian to a competent non-Christian like Mitt Romney.”  Here's where my irritation arises.

That Jeffress made that pronouncement in his introduction of Rick Perry to the congregations—oops, convention—is reason enough to question his intelligence (Perry is certainly Christian, but I can’t imagine a definition of “competent” that would apply to him), but that skirts the larger question his statement raises.

At least the way I read the situation, Christ didn’t tell his followers to worship him; he told them to worship God. Joseph Smith didn’t entreat Mormons to worship him, but to worship God, more or less (see below) the same God Christ was talking about. The difference between a Baptist and a Mormon isn’t the God they worship; it’s whose direction they follow in doing so. If the savior is God, what difference does it make whose prayer book one uses to worship him?

What bugs people like Jeffress about Mormons is that Smith presumed to suggest that God’s angel had told him that Christianity had gone astray and that further “scriptures” were needed to correct what had gone wrong. Those turned out to be, largely, the Book of Mormon. To evangelicals like Jeffress, the Christian Bible contains the whole of God’s revealed Word, and God stopped talking once the Roman church decided which gospels were actually true. (That it took the church three centuries after Christ to do that doesn’t seem to bother anyone.)

Christians also don’t like the idea that, in the Book of Mormon, God the Father and God the Son are treated as actual flesh-and-blood beings, even though that beggars logic far less than the notion of an incorporeal being who can’t be experienced except through the life of a very corporeal being who was also God.

I write this not to diminish the value of religion or the authenticity of faith. Marx referred to religion as the “opiate of the people,” and almost all religions hold that statement in great contempt. But if you stop and think about it, Marx had a point, and it’s not even, necessarily, an irreligious point. Religion and opium are alike in that they both dull our cognitive faculties and allow us to move through life taking inordinate bliss in its pleasures while largely ignoring its unpleasant realities. In the sense that both make life easier to live, they aren’t all that bad.

Where religion, like drugs, becomes problematic is when it becomes extreme—when my cult looks wrathfully on all other cults, even to the extent of attempting to exterminate them. If Jeffress finds his fundamentalist Christianity comforting, fine—he should luxuriate in it. What he should not do is condemn Mitt Romney (or anyone else) for finding comfort in a different cult, and what he especially should not do is suggest that Romney’s belief in his cult disqualifies him for an occupation that, by definition, both transcends all cults and encompasses them.