Archive for January, 2012
It’s almost 4:30 in the afternoon, and on most days I wouldn’t be here trying to write a blog post.
I get up very early in the morning. On the best of days, my 4:30 is like your 9:30. I’ve been up and running around for hours, done a full day’s work on both writing and teaching, and I’m ready to just fall over and let myself be incoherent.
And this has been far from the best of days. I have what is probably the tail end of my traditional beginning-of-the-term cold–and that really is good news, because sometimes it’s a beginning-of-the-term pneumonia, and I hate that stuff.
Anyway, I’m worn out and incoherent from the off.
And, on top of that, when I came back from teaching my single class, we had a two-hour-long power outage, which occasioned a lot of everybody running around being petrified that They Were About To Do It Again.
If you don’t know what I mean, try googling “CT” “power outage” and “October snowstorm.”
But here I am, and the reason for that is that I’ve been hoping to be able to get on and say something for a couple of days.
Cheryl mentioned “national myths,” and I want to protest–to the extent that we use the word “myth” to mean “not true,” I do NOT want to teach schoolchildren “national myths.”
I suppose that that is one way you could go about doing what I was talking about, but it’s not the best way, and it’s not necessary.
It’s perfectly possible to tell the national story without mythologizing it–in either direction.
We don’t have to tell stories about George Washington and the Cherry Tree to teach the significance of the 4th of July–any more than we have to go into paroxysms of silliness about racist old white male slaveholders who cared about nothing but securing their own property to note that slavery was wrong and a bone of contention from the beginning.
I don’t know what it is about human beings, but most of us seem to think that a story about nastiness and badness (preferably with an unhappy ending) is “truer” than a story about people doing good and noble things.
We also want our historical figures to have personal lives that match the moral purity of the cardboard-cut-out characters in very bad children’s books.
That’s why, I think, so many of us, when we do find somebody we admire, do all we can to just blot out any part of that person’s life that would make us uncomfortable or upset.
Welcome to the endlessly delusional cult of Che Guevara.
In the real world, of course, people are never so all of one piece. Great men are seldom good ones. Good men are seldom great ones. And then you get a figure like Robert Carter, whose decades of twisting confusion do not make the stuff of Hollywood movies, even though in the end he was the single one of the Founding Fathers who actually, truly did the right thing about slavery.
If only he’d had the sense to do it with his head held high and his eyes blazing, and not a single moment of indecision or confusion or fear.
If only he’d been a cartoon superhero, and not…just like us.
I don’t want to tell lies to schoolchildren, I want to tell them the truth, and I want to tell all of them the same truth.
Somebody once asked W.H. Auden what children should learn in school, and he said, “It doesn’t matter, as long as they all learn the same things.”
I think it does matter, but I get the point about the same things–we build a common identity by building common experiences.
There was a lot more of that in the Fifties and early Sixties than there is now, and not just in the content of the schoolwork being presented in tens of thousands of public schools.
We had only three television channels, so we all watched the same programs. Our high schools had football and cheerleaders and proms. We could meet up on college campuses or draftee-populated boot camps hundreds of miles from home and still have things to talk about.
It feels to me like there is less and less of that lately. We’ve become alien to each other in ways more fundamental than I’m used to.
Half the reality shows I come across on television–you REALLY have to keep the remote out of my hand after 8 pm–seem to be “laugh at the stupid hillbilly” shows.
Our political discourse sounds like two alien races from two different planets that hate each other on sight.
It’s getting virtually impossible to find people to do things like run library bake sales or coach Little League or man the tables at the charity store.
We’re either all Americans together, or we’re a lot of little interest groups only out for ourselves.
And I know there are people who say that’s what we’ve always been–
But they might be a bit surprised when the reality finally gets here.
I have no idea if I’ve made any sense at all.
Let me start out like this–it is always something of an omen (of what, I’m not sure) when Sunday morning starts out with Beethoven and not with Bach.
I’ve been sick as anything for two days–I’d LOVE to have a recording of how my class went on Friday; it’s a three-hour, one-day-a-week thing, and I think I might have been blithering.
Anyway, I’ve been sick for two days, and I’m still more sick than not, but I suppose it’s a better sign than it might be that the Beethoven is the 3rd and not the 9th, or, God help us, the 5th.
Maybe I’m just panicking. The problem with conking out for a couple of days is that there ends up being a lot of work needing to be done.
But let me start out by addressing–ack, I AM sick, I can’t remember the name–anyway, the comment that said that schoolchildren should not learn that the American Revolution started on July 4, 1776 with the issuance of the Declaration of Independence, because this is “not true.”
Is that the case? Is it “not true” that July 4, 1776, when the Declaration was issued, was when the American Revolution started?
It’s perfectly true.
And how do I know that?
I know it because the men who made that Revolution said that’s when it started, and that is the date from which they forever afterward pegged the existence of the United States as a nation.
The attempts to date the Revolution from various occurrences in 1775–the shot heard round the world!–are like trying to date WWII from Hitler’s occupation of the Rhineland or his annexation of Czechoslovakia.
It’s almost certainly the case that the one set of incidents (occupation, annexation, shot heard, etc.) made the next set of incidents (actual war, actual revolution) almost inevitable–but that is not the same thing as saying that those incidents were the actual start of the actual war that followed.
The reason lies partially in that “almost”–as it turned out, in both cases, war did follow. But in neither case was it inevitable that war should follow.
Over the long course of the British Empire, there were many periods where such incidents occurred in one colony or another, many of them far more violent than anything that happened here in 1775.
I am, at the moment, reading about one of them–the Indian Mutiny, which was exponentially bloodier and more violent than anything that happened here even during the Revolution, and occasioned a much more complete loss of control of the colony than anything that would happen here until we actually won the contest.
If the British had mopped up the “incidents” and reasserted control, the shot would not have been heard around the world for long, and we would not now be talking about an American Revolution, because no such Revolution would have occurred.
The July 2nd/July 4th thing I find more egregious, because it’s part of what I think of as the “we’re too smart and sophisticated to believe in fairy tales” school of how to teach American history to schoolchildren.
I’m not saying that the person who posted that comment here is of that school, but that I’m fairly sure his or her teachers were–or if not her teachers, then their teachers.
The problem with it is varied and multiple.
There is, of course, the fact that if independence had been voted but never proclaimed, the vote wouldn’t have mattered a damn–but I think there’s a better case to be made for the inevitability of Revolution after the vote than after 1775.
The larger issue is the fact that the men who were actually there, and who voted for independence in the first place, didn’t date the Revolution from the 2nd. They dated it from the 4th.
Why, exactly, should I take the backdating reconfigurations of modern historians–or, worse yet, modern textbook writers–over the plain testimony and continued support of the people who actually fomented the revolution in question?
Not only was July 4 celebrated as the day Revolution actually started–Revolution, not revolt in the hopes of getting British concessions, which was one other thing the events of 1775 might have resulted in–but it was celebrated with much more fanfare and public passion than anything we’ve ever seen in our lifetimes.
One of the most distinct memories I have of reading any piece of literature is a scene from Henry James’s The Bostonians, where Olive and her feminist allies take time out from their attempts to stage a sexual revolution to march around their little house on Cape Cod with sparklers, proclaiming the wonders of the Glorious Fourth.
Maybe it’s just that I can’t imagine the Feminist Majority Foundation doing that today.
Relegating the July 4th date to a silly fairy tale that is of no real importance anyway does a lot of bad things. First, it’s not historically accurate–and it’s certainly not “the truth.” Second, it leaves those schoolchildren unable to penetrate vast parts of our culture that have not gone away.
And third, and most important, it is one of those things working to obliterate that culture.
And yes, I know a lot of people would like that culture obliterated–it’s all racism, sexism and homophobia anyway, and the Revolution was just about a lot of rich white guys trying to protect their property, and–well, you know the drill.
But those things aren’t the truth anyway, and in giving them to sixth graders as if they were, you do something with immediate ramifications: you raise generation after generation of citizens who see no reason why THEIR money should go for supporting THOSE people–after all, what do we have to do with them?
I hear a lot these days on the subject of how awful we’ve become since the Great Depression, when we were all willing to pull together and help each other out through the bad times.
But the Generation that lived through the Great Depression and the Second World War was taught all those silly fairy tales about the national story, and came out of it as one people, all Americans together.
The bottom line is this: you can have a national story that every schoolchild learns, and a national story that will recruit those schoolchildren into a defense of their own culture.
Or you can have every man, woman, and child for him and herself.
For what it’s worth, Charlou said a lot when she was here, and I still don’t know which of the fourteen things a day she posted she was supposed to be right about.
So I’ll just let that one go.
And go make more tea.
It’s the beginning of a new term, with exactly one week gone, and I am where I always am–getting royally, throat-achingly sick. The throat thing is, of course, soreness, which is how I always start getting sick. My older son used to get strep every new term, and I’m just glad that hasn’t been the problem with me.
And I will admit, I did expect something more from that other issue–it was, after all, presented in a publication widely regarded as “liberal,” and the liberal side in the US is not usually fond of saying that US workers will have to adapt themselves to living and working like the Chinese in order to make a living.
And yet, I’m not sure what other conclusions could have been drawn from the issue presented the way it was. Robert had a suggestion, but it wasn’t a suggestion the article actually made.
Since reading that article, though, I’ve been having the very weird experience of watching the evening news and the nightly political opinion shows (this year: Bill O’Reilly and Rachel Maddow. I miss Keith Olbermann) and thinking about how incredibly insular American discourse is at this point in time.
We talk about one tax policy or another, one education policy or another, but it’s all as if we were the only people in the room.
Which brings me to the people in my rooms, most of them 18 years old, and most of them so dismally ignorant of just about anything that I end up not knowing what I can say and be understood.
AB posted “Charlou was right,” and that’s fine, except that he didn’t tell us right about what.
And if he meant right about the idea that we can’t teach American students simple, elementary facts about history and culture–I don’t believe it.
Of the things my students this year do not, in the majority, know:
1) That the American Revolution was started by the signing of the Declaration of Independence on July 4, 1776.
2) That it was fought against England
3) That it was fought because America was then an English colony and wanted to stop being one.
Now this is not, as the saying goes, rocket science. This is the kind of thing most American kids of my generation knew by the time they were six or seven, if only because they recited little poems about it and took part in pageants about it in kindergarten, or because they went to Fourth of July parades where some guy dressed up in a tricorner hat and made a speech about it.
It seems to me that what is missing these days is not some kind of big directed teaching effort to get kids to memorize one thing or another, but an environment where certain kinds of things are part of the everyday fabric of school life.
The issue is not whether we can pound kids into committing a string of facts and dates to memory, but whether we can make sure they know certain things–certain things central to our existence as a nation.
And we do that–or we always did–by not leaving the information to the history units in our classrooms. We do it by including the same information in English (short stories and novels about the Revolutionary War and poetry like The Midnight Ride of Paul Revere), in art (drawing pictures of events in the Revolutionary War) and on and on and on.
Certain kinds of information therefore become part of the atmosphere every kid breathes.
And we don’t do that any more.
And there’s another thing–some information is not optional. We cannot survive as a country unless we can pass that information down to our children.
If Charlou was right that we can’t stuff kids’ heads full of facts that they don’t want to know because those facts are “irrelevant to their lives,” then we’re going to die, and that’s the end of it.
The goal of education is not to adapt the definition to what we’ve decided kids just naturally can and want to learn, but to find a way to get them to learn things that are absolutely vital to our survival and theirs whether they want to learn those things or not.
And I still say we did a pretty good job of it for generation after generation.
Those kids of my father’s and my father-in-law’s generation who mostly didn’t have the opportunity to “go to college” all knew this stuff.
To say that kids today can’t learn it is either a) to say that we’ve gotten stupider over time or b) to use code for race in a way that’s very ugly indeed.
And, for what it’s worth, I think there’s a lot of that going on.
In the meantime, the kids in my classrooms who are being given the wonderful “opportunity to go to college” are, of course, not being given the opportunity to go to college at all.
At best, most of them are being given the opportunity to get that old eighth grade education, just four years late and at a terrific cost both to the taxpayers and to their own debt burdens.
We are constantly and unconsciously assuming equivalences.
The issue is not who has, and who has not, graduated from high school.
It is not who has, or who has not, had the “opportunity to go to college.”
The issue is who has and who has not reached the relevant skill levels in literacy, numeracy, and general knowledge.
We say “graduated from high school” as if getting a diploma from a school guarantees that the student has reached those levels–but it doesn’t. Year after year, schools graduate students like mine, who cannot write a grammatical sentence, cannot understand written material at the level of an editorial in USA Today, cannot add three two-digit figures in a row, don’t know what continent China is on, and come up with the kind of answers my kids did about the American Revolution (it started after the American Civil War, probably in around 1840).
Robert wanted to know where the Jane had gone who thought that this present educational (especially college-educational) system was ready for collapse.
I haven’t gone anywhere–THIS part of the system is ready for collapse, because the bottom line is that the competencies aren’t arbitrary.
They are what those kids actually need to function on the job and in life as citizens.
If they don’t have those competencies, it won’t matter if they have a shiny new diploma saying they’re “college graduates.”
On the basics, I see a lot of businesses up here trying to run tutoring classes in basic English, especially, for their new hires.
With “college” eating up six to eight years and costing a bomb in the process, I don’t think it will be long before those same businesses decide they need another sorting mechanism for entry level positions–one that, for instance, actually tells them what those new hires can really do.
As to whether that’s going to translate to a collapse further up the line–I think it will.
It will just take longer.
Okay, I’ll admit it.
I expected that last comment thread to go on much longer than it did, and to generate a lot more heat.
But since it didn’t, I’d like to ask a question.
If you read that article Michael Fisher posted–the one on working in the US and in China–what did you get out of it?
My tendency was to go, “well, if the Chinese workers are more flexible, better trained and more self disciplined than the American workers are, the Chinese workers deserve to win.”
This seems to me so obvious that I have a hard time figuring out what else I’m supposed to think, or what it is I’m supposed to support doing to change things.
The American engineer highlighted in the article doesn’t want to work the way a Chinese worker works. He wants weekends and vacations and time with his family and all the rest of it, and that’s going to be fine as long as somebody does not come along who is willing to forgo those things for the sake of beating out the competition.
We certainly can’t stop the Chinese from working the way they do, so what exactly was I supposed to take away from that piece?
I logged on to FB this morning to find that a number of people had posted links to this article–a review, in a British newspaper called The Telegraph, of Charles Murray’s new book, Coming Apart: The State of White America, 1960-2010.
Some of you may remember Charles Murray. A few decades ago, he wrote a book called The Bell Curve, with a colleague named Richard Herrnstein. That book posited that since intelligence is (demonstrably) at least 50% due to heredity, and since the genes for intelligence did not appear to be equally distributed by race, the chances were good that the percentage of each race found at each incremental level of intellectual ability would not be the same as their percentage in the population at large.
Unlike a lot of people who went yelling and screaming–he’s saying black people are stupid!–about this book, I actually bothered to read it, not once, but twice.
And I had some issues with the analysis M and H put forward.
But that’s not what I want to talk about here.
What I want to talk about here is this: a number of people who did the yelling and screaming, and who declared M and H to be nothing but racist bastards for writing The Bell Curve, not only posted links to this Telegraph article, but seem to be indicating that they largely approve of its message.
Now, I haven’t read the new book–it isn’t even out yet–but on the basis of the review, it seems to be this: intelligence is significantly hereditary, and in the world we live in now, smart people marry smart people. They then produce smart children, who in turn marry other smart people and have smart children of their own.
Smart people are better able to handle the advanced scientific, technological and literacy-based skills necessary to holding jobs and other positions at the very top of the academic and economic pyramid.
They are therefore becoming a “cognitive elite”–and a hereditary one at that, blocking off any chance anybody else has to rise in the social hierarchy.
Now, I’ve got a lot to say about this, but here’s the obvious thing:
1) You can’t have it both ways. This is essentially the same message Murray was delivering in The Bell Curve, just starting at the other end of the problem. You can’t call the man a racist son of a bitch for saying what he said in the first book and then hail him as a prophet for saying this. One thesis assumes the other–the thesis of Coming Apart cannot be true unless the thesis of The Bell Curve is also true.
2) My guess is that when I read this, I’m going to have the same problems with the analysis that I had with The Bell Curve, largely having to do with the way he is defining “intelligent” and the way in which “intelligence” is in fact recognized as intelligence in this society at this time.
But looking past that sort of thing for a moment, and assuming we’re all on the same page when it comes to defining intelligence–which we’re probably not–may I please point out that the idea that there’s a “cognitive elite” is nothing new?
There has always been a cognitive elite. Smarter people have always done better than less smart people, all things being equal.
But smarter people often do better than less smart people even when all things aren’t equal. I went to Vassar because a woman whose name I share looked around a hellhole of a little fishing village on a remote Greek island, decided she wasn’t having any, and devised and carried out a plan to get the money to come to America.
Her parents were sending her brothers, but they wouldn’t send a girl. Her parents sent her brothers steerage. When she finally got on a boat with her own ticket, bought with money she’d made herself from a little business she’d started herself, she had a stateroom.
She was nineteen.
Of course, that story is about someone who was not just a smarter person, but had another whole set of qualities–and if I go off on that tangent someday, I’d say that the other qualities are, in fact, much more important than the intelligence.
In the meantime, I would just like to say that life is not fair. Some of us are born prettier, smarter, with better concentration or a natural sense of the underlying music–with things that are just better than other people get the shot at.
But this is not new, and recognizing it does not change anything about the world we live in.
What is new is
3) A social and economic sorting process that is increasingly, and increasingly monopolistically, located in the schools.
The number of ways that this is a really, really, really bad idea and ought to stop would take about forty volumes to list, but we should notice that the big problem with this is that it is not actually a system for sorting out who is more intelligent than whom.
It will do a certain amount of that, of course–but it will also exclude vast segments of the more-intelligent population who have other (and often valuable) traits that don’t fit well with schools. That’s why there’s always a solid chunk of kids who have board scores near or at perfect and grades that are in the toilet.
Of course there’s always the possibility that
4) As the world evolves, the level of intelligence, skill, talent and self discipline necessary to function in it even minimally will also rise, and a hunk of people at the bottom of those levels will no longer be able to get by at all.
Or, to put it more bluntly, it may be the case that the level at which somebody would be deemed “mentally handicapped” would rise.
In some ways, I think that we can say that that is demonstrably the case even now.
In most ways, though, I think the fear is misdirected.
It’s certainly true that you need to be far more intelligent (and all the rest of that stuff) to be a general practitioner, never mind a specialist, and that half the people admitted to med school in 1920 probably couldn’t get through the doors now.
But it is also the case that jobs at the other end that once required at least a minimum of skill (cashier, for instance) have been deliberately reconfigured in a way that means you need less and less intelligence and other skills to fill them.
And day to day living operations–keeping your bank account sane, or banking at all, for instance–have been digitized to make it possible for people to function even if they’re barely literate or numerate at all.
There’s good money to be made in figuring out ways to accommodate people who otherwise couldn’t keep up.
But, of course, that isn’t Murray’s worry, and it isn’t the worry of the people who posted this article over and over again. Their worry
5) is that we’ll come to a day when the genetics have rigidified, all the smart people will be married to other smart people and producing only smart children. All the stupid people will be married to other stupid people. We will forever afterward be living in a world of hereditary caste, where there will be no escape for the people on the bottom of the ladder, because they’re all congenitally dumb and can produce only congenitally dumb children.
I have a whole set of objections to that scenario–in fact, I think it’s fundamentally ridiculous–but I should note that IF that was the case, if group X was really, actually, smarter and more skilled than group Y, and that intelligence and skill meant they could perform more important functions for society at large…then yes, that’s what ought to happen, and the only just outcome would be to let them be that cognitive elite.
But I’m not worried about that, because the fear is based on a profound misunderstanding of statistics.
Here, however, is what I’m worried about:
6) Beyond all the arguments about students and teachers, testing teachers, testing students, blah, blah, blah, there is the reality of life in some of our schools.
There is, for instance, the phenomenon of “service classes.” A service class is a class a student can take for half credit, in which he or she does none of the academic work, but instead helps the teacher by fetching supplies, cleaning up the classroom, and doing other chores.
The class and the grade appear on the transcript as “Algebra I” or “American History,” with no indication that the student has done no academic work whatsoever.
I first heard about this practice in an article by Jonathan Kozol, and I thought he was making it up. In the years since, I’ve asked almost all my classes–and yes, the practice does exist, at the very least in New York, New Jersey, and Connecticut.
It’s been outlawed, but it still exists.
It might explain why, in my class the other day, half the students didn’t know that the Declaration of Independence was signed on the Fourth of July, and could not tell me what the American “war of independence” was about.
Let’s start worrying about this crap before we work ourselves up about a “cognitive elite” who are simply born smarter than those people over there.
7) I’m with Michael on two points–I don’t think testing teachers does much good, and I don’t think performance evaluations of anybody for anything are much other than a joke.
And I’d LOVE to dump all those “bogus education degrees” and bring in highly educated people with real degrees and pay them like real professionals.
But the reason that won’t happen has nothing to do with evil, tight-fisted Republicans.
If we tried to make such a change, we’d have to fire better than 90% of the public school teachers now working.
The vast majority of them are required to have those bogus education degrees to be allowed to teach at all. Even candidates who start out with a “real” degree are usually forced to take a MAT (master of arts in teaching) or other education certification requirements before they’re allowed in a classroom.
And the teachers’ unions scream bloody hell at even the slightest deviation from their stranglehold over certification.
I’m not expecting it’s going to happen soon.
From what I understand, something seems to have gone wrong with the blog site since the last time I posted, and several people found that there was no way to post comments to that last post.
I assume this couldn’t have been a general problem with the site, since people continued to post comments to the post before that post, but I don’t know.
It would be good to know if this is something I need to get fixed, so I’d appreciate it if as many people as possible could post comments to THIS post, if it’s possible, or send me an e-mail saying it couldn’t be done.
The comments don’t have to be “real” comments. They just have to say something like “hey, I got on!” or whatever.
That way, I’ll know whether I actually need to do something, or if that was just a glitch in that particular post.
I’ll admit that a couple of things went oddly wrong in the writing of it, but I didn’t take them seriously at the time.
In case you wonder where I’ve been, the term started yesterday–the real term, with students in classrooms–and I’ve got a book due March 15, so I’ve been a little distracted.
I’ve also been indulging in odd little bits of nostalgia, or something.
There are literally thousands of books lying around in my house. Most of them are more or less recent–that is, no more than 25 years old, and therefore bought or given to me since I married Bill–but a remarkable number are in fact leftovers from my childhood.
I’ve still got a fairly extensive collection of Nancy Drew, for instance, including the copy of The Ghost of Blackwood Hall, the first book I ever got to pick out for myself, which my mother bought for me at Malley’s in New Haven when I was six. It was her prize for me for being “good” at the eye surgeon’s office when we went in to see about getting my crossed eyes fixed.
That’s one of those things. My mother had crossed eyes as a child, in an era when surgery for that sort of thing was (at least) not common, and she’d been so traumatized by it that she insisted on “doing something” about mine as soon as she saw them.
It’s interesting to me, because hers corrected themselves by the time she hit high school, and so did those of at least one of her brothers. In all likelihood, mine would have, too. She wasn’t willing to wait.
A couple of days ago, I stumbled across another book I remember from my childhood–William F. Buckley’s God and Man at Yale.
It was published the year I was born, and I bought my first copy of it when I was somewhere between ten and fourteen, which was also the time when I had my first magazine subscriptions, to The New Yorker and National Review.
Before you get the idea that I was a preternaturally precocious politico and a conservative to boot, what interested me about National Review and God and Man At Yale both was not the politics, but the Yale.
Or, at least, what I imagined Yale to be.
Yale’s undergraduate college was, at that time, all-male and showing no signs of being interested in admitting women. It wouldn’t begin to admit women until the year after I graduated from high school–and yes, I applied and got turned down.
But to me, Yale was a set of beautiful college Gothic buildings stretched out across the center of New Haven, Connecticut, where I would sometimes walk when I had some time in that place to myself. I would buy books at the Yale Co-op and then go wandering around listening to people talk.
And they did talk. Undergraduates walking on sidewalks arguing about Locke and Hume, Henry James and Jane Austen.
If Buckley’s book is to be believed, I either got lucky and caught the good stuff in a sea of the mediocre and trivial, or I only remembered the stuff I heard that made me happy.
Anyway, that’s what I was looking for–a place where people read books the way the people I knew watched television, a place where people talked about Locke and Hume the way the people I knew talked about each other.
In one way, I got very lucky indeed.
It was an era before the endless celebrity gossip we’re inundated with at the present, so at least I didn’t have to put up with chatter about the sex life of Pat Boone. Or Elvis.
God and Man at Yale is a strange little book in a lot of ways.
It was a significant best seller almost immediately, in spite of the fact that it does not even pretend to be a discussion about general trends in education or to be equally applicable to all colleges and universities.
Instead, it’s an essay on political and religious life at Yale by a recently graduated alumnus, and its guiding thesis is that Yale alumni should take a more active part in the running of the university than they do.
It is a book, in other words, in the middle of the great transformation of universities from being collegialities to being institutions run for the benefit of faculty alone.
And sometimes it can be difficult to understand what’s going on, because the vision of the nature of the university is so different from anything we’ve had since at least the Sixties that I had to keep adjusting and readjusting my sense of what is “normal” in academic life.
Maybe the best way to say it would be that Buckley’s Yale is more like Sayers’s Oxford in Gaudy Night than it is like Yale, or anywhere else, today.
But other things struck me in reading this book, and mostly they had to do with a feeling of being out of time.
In some ways, Buckley’s Yale is far more familiar than it ought to be.
1) We seem to be making a lot of the same arguments that were being made right after the war, and making them in largely the same terms.
There is, for instance, the endless talk about “income inequality,” which appears to have been a catch phrase even then, and to have many of the same defenses.
There’s also a lot about how really, the day when individuals could do for themselves is long gone, and in the increasingly complex world we need increasingly complex government to protect us from “monopolies,” which (also like now) aren’t actually monopolies but just businesses large enough to “influence the market.”
2) In cases where the terminology has changed, I still found it familiar, because it is largely the terminology used by Ayn Rand to describe the relationship between the different sides of the political divide.
In other words, we aren’t talking about “left” and “right” or “liberal” and “conservative,” but about “individualist” and “collectivist” theories of government, society and human nature.
Atlas Shrugged had not been written at the time this book was published, so I have to assume that the terms were used because they were the terms that were current at the time Buckley (and later Rand) published.
Rand used these terms all her life. They are not the terms you will find in Buckley’s work later in his career. His vocabulary moved with the times.
But it’s interesting, nonetheless, because so much of Buckley’s analysis of the nature and workings of “collectivist” thought is nearly identical to Rand’s, with the obvious exception of their differences in regard to the nature and import of Christianity.
And that’s interesting because Buckley had nothing but contempt for Rand as a writer, a philosopher, or a human being, and wasn’t shy of saying so.
National Review savaged Atlas Shrugged when it was published, and professed itself astonished that anybody would read the thing. Maybe, the magazine opined, it was that people wanted “the dirty bits.”
To Buckley, of course, Christianity brought two things to the table that were absolutely necessary to the foundation and maintenance of a free society: a) the concept of the individual human being as being of infinite worth and value in his individuality (that is, not as a member of a group), and b) the concept of “rights” as being something prior to and superior to social institutions meant to observe or violate them.
Rand, of course, saw Christianity as inherently collectivist in its valorization of altruism and insistence that men and women bow to the dictates of a God rather than define their own values and morality.
Back in the days when politics was about something more than an endless war over class markers–Volvos! NASCAR! Chicken-fried steak! Brie!–it was, I think, a pretty good shorthand way of explaining the differences between conservatism and libertarianism.
3) Virtually nothing at all has changed in the last fifty years in the names the Left calls the Right.
Attacks by the Right have changed considerably, and the focus of right wing criticism of the Left has done a near 180-degree turn from the Sixties.
I could take almost any of the reviews of God and Man at Yale, though, and publish them tomorrow, and you wouldn’t be able to tell when they were published.
I don’t know exactly what I think about that. The Right suffers, I think, from the loss of its high intellectual end, and conservatism especially suffers from the recent dearth of voices like Buckley’s, which wanted to preserve Western Civilization and not just “the way we did things in Hope, Mississippi when I was six.”
There is, however, something kind of odd about a Left that seems to be frozen in time, and that time being close to a hundred years ago now.
4) And, as a note, rereading this book gave me a piece of information I hadn’t had before.
One of my favorite organizations on the planet is called (these days) the Intercollegiate Studies Institute.
It publishes books and lectures and does studies and runs a website all in the aid of the Great Tradition and classical Liberal Arts Studies.
It has become, these days, “agnostic” about evolution, of course–because in the Politicization of Everything, all that matters is that we each consistently take sides.
But aside from that, the organization makes me very happy most of the time, and it appears in Buckley’s book in its original form as an organization on the Yale campus meant to bring Yale back to the Great Tradition.
So maybe, when I was fourteen, I wasn’t so far off in thinking that Yale presented a possible avenue into my fantasy world of people who lived for books and ideas, where wanting those things made you cool instead of stupid.
Of course, these days, Yale likes to turn down multi hundred million dollar bequests to found departments of Western Civilization–so there’s that.
And they did turn me down not just the once, but eventually three times in all, which says something else, I suppose, about the both of us.
I’ve got to go correct student papers.
I know, I know. I should have started this post earlier today, but I was too fascinated by the event playing out on my television set–a My Little Pony marathon, which is the way Greg wants to spend his birthday.
If you’ve never seen this thing, all I can tell you is that it ought to be classified as a form of terrorism. It’s so twee, I wanted to throw acid in its face.
On the other hand, it did put an end to the manic dancing around the living room declaring himself “an adult!” I have been informed that neither I nor anybody else is any longer allowed to see his grades or discuss his progress without his permission, or talk to his doctors unless he says I can, or…
I have pointed out that I’m still paying for everything, and if I wasn’t he’d be in the soup–but apparently that isn’t mentioned in the law.
He looked it up on the Internet.
So, I retreated in here, preparatory to making the kid the biggest batch of curry I ever have.
While I’m waiting for that, though, I thought I’d clear up some things–especially some things about “nuns.” The quotes are necessary, because the women most of us have learned to call “nuns” are not “nuns” under Roman Catholic Canon Law–they’re only “religious sisters.” The term “nun” is restricted to women who take solemn vows and are therefore cloistered–they don’t teach or nurse or do any other work in the world.
But first, a bit of cultural adjustment is in order–in the US, there is no legal requirement that any business provide any kind of benefit for their employees. Some businesses offer employee pensions, some don’t. Some offer health care coverage, some don’t.
The Obama health care bill requires employers with 50 employees or more to offer health insurance–but like a lot of other things about the bill, it’s being contested in the courts. We’ll see how that works out.
But, back to the nuns.
The women we call nuns didn’t exist in the Catholic Church until the Counter-Reformation. Until then, all orders of religious women were cloistered. The women who joined those orders did not teach or nurse or do any of the things we think of “nuns” as doing.
And, to this day, the women who teach and nurse and all the rest of it are not nuns in the Catholic Church–they are “women religious” or “religious sisters,” and they take what are known as “simple” rather than “solemn” vows.
It took nearly a century for the Vatican to approve of such orders. Before that, it tended to feel that “women religious” working in the world would give rise to scandal. They weren’t being entirely paranoid. The promiscuity of nuns was one of the great charges of the Protestant reformers.
On top of that, a religious order is not an employer. Women do not join religious orders, and religious orders do not admit women, in order to do some kind of secular work.
Religious orders exist “for the greater glory of God,” and their primary function is always the worship and honor of God. Anything else they do is secondary.
Every once in a while I’ll run across an essay by a secular writer who is totally shocked to have found out just this about Mother Teresa–she didn’t really go out to help the poor! she thought it was more important to worship God!
Yep. That’s how that works.
And the orders are very careful to explain to young women wanting to join that although their “apostolate” may be teaching or nursing, joining the order does NOT mean that you will ever get to be a teacher or a nurse.
Young women who join religious orders are required to spend at least some time on probation, between 3 and 9 years, depending on the order. First they are postulants. Then they are novices. Then they are “tertiary professed” (they make a vow to God and the Church to stick with the order and live under its rule for three years). Only after all that do they take “final vows,” in which they promise to stay with the order and live under its rule for life.
Final vows are considered to be a serious thing in the Catholic Church, and even now it can be very difficult to get released from them. For most orders, it would require an okay from Rome. If you just walk out, you are functionally excommunicate–you can no longer receive the sacraments in a Catholic Church. And it’s the kind of excommunicate you can’t get rid of just by going to Confession.
Consider the fact that you CAN get rid of the automatic excommunication after an abortion just by going to Confession.
Up until the 1970s, the system worked on the same lines as it had since the 18th century, and it worked rather well.
You joined the order at 18, right out of high school, and spent the first three years (one postulant, two novice) at the motherhouse. Then the order packed you off to education–nurse’s training, teacher training, college pre-med (several orders trained some of their sisters as doctors) and medical school, whatever.
And the order paid for it. All of it. Every last dime.
The order then sent you out to the institutions it ran, or to institutions with which it was contracted (Catholic hospitals run by other religious orders, for instance).
You got paid absolutely nothing. You got room and board, and that was it. You were expected to have no money on your person unless you were authorized to have it for some authorized, specific purpose.
In most cases, your order wasn’t paid for your services, either. It took in donations, and kept the institutions running on those. In the case of parish schools, it got the convent and room and board for the sisters, and otherwise charged–again–absolutely nothing.
For years, this made it possible for the Catholic Church to provide services for poor and working class people at incredibly low cost. Tuition at parish parochial schools was rock bottom cheap, so that even families with six or ten children could get them through. Catholic colleges were often rock bottom cheap, too–the Jesuits used to make a point of it.
Anyway, you went on that way until you got old, and then when you were ready to retire you went back to the motherhouse and lived out the rest of your life. Your food and shelter and books and companionship and medicine were all taken care of.
And that ran into a wall after Vatican II, when sisters began leaving their orders in droves, and the supply of young women looking to join up slowed to a trickle.
Without younger sisters to support older sisters, there was often nobody to support older sisters in orders that had never thought twice about money and had never charged much of anything for their services.
Which meant, of course, that they also hadn’t paid into the Social Security System.
So, these days, the orders charge just like anybody else, parochial schools are expensive, and a Catholic hospital is run pretty much like any other hospital–and, if you ask me, the world is a poorer place for it.
But no, I don’t think it’s “disgraceful” that religious orders don’t provide pensions for nuns or monks or sisters who leave, whether they’ve done a job or not.
They signed up to pray, not to work. It wasn’t a job to begin with.
I think I hear a break in the My Little Pony thing…
Well, what can I say? The simplest explanation for why I’m writing this this early in the morning is that it’s that time of year again. The new Gregor is due in about ten weeks, and I somehow have to cut most of the manuscript before St. Martin’s will be willing to publish it.
That’s how I write. I do a first draft by just going at it and not really thinking about what I’m saying. The draft ends up being 800 or 900 pages long. Then I cut away everything that doesn’t look like an elephant.
At least, I hope I do.
Cutting is actually a lot harder than writing. When you cut, especially when you cut a lot, you have to make sure you’re not cutting things that need to be left in in order for the book to make sense.
With Cheating at Solitaire, I managed to cut out the stuff that explained the title of the novel. Originally, there was a fair amount about how these Pop-Tart starlets “bought” friends–acquired entourages by paying all the bills, by putting people on payroll, and that kind of thing.
That was the point of the title. These people look surrounded by friends, but in fact they don’t really have any.
But none of that made it into the book, and so for weeks afterwards, I got a little rainfall of e-mails wanting to know what the title meant.
It was a shame, really, because it was a very good title.
With this new one, I’m unlikely to have a problem with the stuff that explains the title–maybe because it can’t be explained–but there will be other things, and I absolutely guarantee you that I’ll cut at least one clue necessary to working out the mystery.
This will result in a plaintive note from the copyeditor telling me that whatever is going on on page 264 makes no sense.
She’ll be right, too.
The other thing that gets to me is that when I write books, I read, and I read a lot. I’ve usually got some social issue I’m trying to think through by putting it in various people’s heads.
I think that probably makes me less of a stellar seller than I might be, and my guess is that I’m even less of one because I try–if I’m going to do an issue at all–to see all the sides of it and not make one side look good and the other look stupid.
This not only makes people on every side angry, it makes them angry for exactly the same reason–left or right, liberal or conservative, chances are that you’ll think I’m on the other side.
Of course, sometimes (as in Blood in the Water, which comes out in a couple of months) I’m not looking at any social issues at all.
And sometimes, I’m not looking at any as the focus of the book, but one or another creeps up in the private life of a character.
And sometimes, an issue that comes up in the construction of a character leads me to read things that lead me to read other things that end me up with an issue I didn’t even know existed–and that doesn’t make the book, but it does make me pause.
This time, the issue that isn’t part of a book concerns an organization called The Clergy Project, founded by the evolutionary biologist and New Atheist Richard Dawkins.
You can find its web site here:
I first heard about it from an article in Freethought Today, which is the monthly newspaper of the Freedom From Religion Foundation. The FFRF are the people who go around demanding to be allowed to place a Freethought sign next to any Christmas creche on public property.
The law says the creche can only be there if there is an “open forum” that allows the presentation of all points of view. FFRF puts up a sign from another point of view.
Every year or so, Bill O’Reilly goes ballistic about this, as if it were something new. I think FFRF has been putting up a sign in the Wisconsin capitol building for close to two decades.
At any rate, somebody sent me a complimentary copy of the December issue of Freethought Today, and there was the article about the Clergy Project.
If I understand it correctly, The Clergy Project tries to be a kind of support group and decompression chamber for members of the clergy who no longer believe.
The reason for providing a support group for clergy is that clergy, unlike ordinary believers, often can’t just pick up stakes and go. They’ve got jobs, families, pension plans, entire lives invested in their religion.
Walking away can be not only intellectually disorienting–try changing your mind on something central like this sometime; it’s kind of like being high, but in a bad way–but materially disastrous.
A man whose education consists of a string of theology degrees and who is already fifty or so years old is not going to have an easy time in the secular job market.
Back in the 1960s and 1970s, there were similar organizations set up to help women leaving Roman Catholic religious orders in the wake of Vatican II. The issues were, I think, similar, if not necessarily a matter of the loss of faith.
A woman who had spent twenty years of her life wearing Medieval (literally) clothing and arranging her life to bells had a certain amount of decompressing to do before normal life could feel normal.
I don’t think The Clergy Project is a bad idea. In fact, I think it’s largely a good one, and my guess is that it comes in handy for a number of people who, having found themselves adrift, need help to negotiate their way into another way of life.
What struck me, however, was The Clergy Project’s claim that its members include many active members of the clergy in all denominations.
Active means that these men and women are still occupying the pulpits of their churches, that they are preaching the Gospel and giving “spiritual counseling,” even though they no longer believe in God or think there’s anything like a spirit to be counseled.
The old nuns’ organizations were based on the needs of women who wanted to leave their orders, but most of those women had not lost their faith.
I’ve known a number of ex-nuns in my life, and most of them have been devout and enthusiastic Catholics. It was life in a religious community they couldn’t handle. They had no problems with God.
The issues would be a lot dodgier, I would think, for the men and women in Dawkins’s organization.
It is specifically an organization for people who have lost their faith. They do not believe in God any more. In some cases, they have arrived at the conclusion that faith itself is a malevolent thing.
What I think I’m having trouble understanding is how they manage their day to day lives in respect to their work–how they go on with the sermons, the counselings, the Sunday Schools, and all the rest of it.
I know that it’s not really feasible for some of them to just get up and leave. There are all those material considerations involved. Walking away from an entire way of life and everybody you’ve known in it, risking your marriage and your relationships with your children, is not something most of us could do, ever, no matter what the issue. I certainly don’t know if I could.
But maybe because I think of religion as something fundamental to a personality–religion or the lack of it, I suppose–I have a hard time understanding just how such men and women go about doing the daily routine once they no longer believe.
It seems to me that it would take an almost superhuman effort to go on saying words you felt were meaningless or pernicious.
And yet, obviously, people do it and go on doing it for years.
And that makes me wonder about something else–how many people are there in other lines of work who have analogous problems: psychologists who no longer believe in psychology, teachers of Women’s Studies who have come to believe identity politics are a crock, apostles of the market who think capitalism is an anarchic mess, and prophets of socialism who have become convinced that the world is full of welfare queens.
Let’s face it. An awful lot of what we all do these days depends on complicated webs of assumptions and conventional wisdom that do not, on examination, have much to do with the real world. We tend to notice the disconnect when it occurs in our opponents’ thinking, but not when it occurs in our own.
But reality will intrude, and sometimes it does, and it lands us–
Well, we live in an age of taking sides.
You’re with us or you’re against us. If you so much as hint that the other side might be making a point, or be honest about what they say they believe–well, obviously you’re one of THEM, and to be banished from polite company ever after.
You don’t have to be a religious believer, or a clergyperson, to lose your life when you have a change of heart.
I wonder how many of us are out there now, repeating the same old same old, because not to would be to cut ourselves off from friends, family, maybe even work.
Which only goes to show.
I can be depressing no matter what the topic.
It occurs to me, looking through these posts, that I’ve been no fun lately. Sometimes I think I’ve been no fun for years, but I don’t have the patience to go looking for that many blog posts.
And, to tell you the truth, I’m not feeling all that upbeat today, either, although I at least know what I’m cooking for dinner, which is better than I’ve been doing for the last week.
Mostly I think I’ve just been wondering how people manage when they don’t think about the things I think about as a matter of course. Politics. Literature. America’s Next Top Model.
I also keep running across people who declare that they don’t think about anything. I don’t mean that they’re stupid–even stupid people think about things; stupidity is not usually about not thinking but about not thinking well–but that they claim to be more or less blank, with the internal screen turned off most of the time, unless something comes up that gets their attention.
A part of me is stubbornly convinced that this is not actually possible, no matter how often these people tell me it is. I’m not sure my brain ever turns all the way off, even when I sleep. God only knows my dreams seem to be both convoluted and bizarre.
On the other hand, the things I do think about–other than the usual worrying–don’t seem to make much sense either.
The New Hampshire primary has come and gone, for instance, and I paid a good deal of attention to it, but I can’t really think why.
My tendency is to feel that the Republicans are not serious about fielding a candidate for this election.
I think this because there are credible Republican candidates out there, but none of them is running–and the candidates that are running are, with the exception of Romney, just completely bizarre.
Granted that Michele Bachmann faced an unusually hostile press, she also conducted her campaign with less competence than the average candidate for Student Council President.
And Newt, God bless him, has more baggage than a Fifth Avenue luggage store.
As for Romney, he’s so blow-dried and plastic, he looks like he was manufactured last week in Taiwan. And it’s not like he has a lot to say.
Every once in a while, one of the candidates I have no use for otherwise will come up with an idea that I really love, but I will ultimately appear to be the only person listening to it.
If the press had spent less time googling the more exotic definitions of Rick Santorum’s last name, they might have taken note of his signature tax plan, which would a) reduce the tax brackets to two, 10% and 28%, and b) eliminate ALL deductions except for home mortgage, children, charitable contributions, retirement savings, and health care expenses.
This ought to be a very interesting idea to both sides of the political divide. If it is what it says it is, it would not only simplify the tax code so that ordinary people could actually understand their income tax forms without shelling out their cash for high or low level accountancy advice, but it would constitute the single largest increase in taxes for the rich since the income tax was introduced in the first place.
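The arithmetic of a two-bracket scheme is simple enough to sketch. Everything here beyond the two rates is an assumption–in particular the $50,000 threshold where the 10% rate gives way to the 28% rate, since I don’t know the particulars of where the plan draws that line:

```python
# A minimal sketch of a progressive two-bracket income tax, assuming
# rates of 10% and 28% as described above. The $50,000 threshold is a
# made-up figure for illustration only.
BRACKETS = [(50_000, 0.10), (float("inf"), 0.28)]

def tax_owed(taxable_income: float) -> float:
    """Tax each slice of income at the rate of the bracket it falls in."""
    owed = 0.0
    lower = 0.0
    for upper, rate in BRACKETS:
        if taxable_income <= lower:
            break
        slice_amount = min(taxable_income, upper) - lower
        owed += slice_amount * rate
        lower = upper
    return owed
```

Under these assumed numbers, someone with $40,000 in taxable income would owe $4,000, while someone with $100,000 would owe $5,000 on the first slice plus $14,000 on the second–which is the point: with the deductions gone, high incomes can no longer whittle the taxable base down first.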
I don’t know the particulars, however, because we’re all too busy wondering when an abortion is really an abortion.
And I wouldn’t vote for Santorum if you paid me money and made me Queen of England, but I’d like to hear that idea floated out there, discussed, considered.
Sometimes the things I think about are just sort of oddly formless. I think about mystery novels a lot, of course, but I also wonder why so many of the new ones are so oddly drifty. They’re not fair play, exactly, and cozy isn’t the issue–you can get lots of well plotted fair play cozy mysteries.
No, in a lot of the ones I see lately, it’s almost as if nobody–writer, editor, reader–is much interested. And I’m including readers in this, because a number of them sell reasonably well, even if they don’t end up on the NYT’s Best Seller List.
In a couple of cases, I think I can explain it by the fact that the series characters are very engaging. I actually have one series I follow mostly because I want to know What They’re Going To Do Next. In other cases, I just don’t get it.
But then, I’m very aware of the fact that I have no real sense of why novels sell and why they don’t.
This is especially true in mystery novels, because my taste in mystery novels is very specific and not the usual sort of thing.
And I don’t usually pick books to read because they’re best sellers. I don’t reject them for that reason, either. It’s just that I don’t usually know what’s on the lists. I read sort of the way I write. I like what I like the way I like it, and then I don’t worry about it.
But Robert’s post about how to have a best seller got me wondering, because those aren’t the things I thought were popular.
The thing about writing a novel where everybody is better off now that the villain is dead, though, is what I think of as one of the perennial problems of the mystery genre.
The simple fact of the matter is that most people who end up murdered by something other than a random robbery end up murdered for a reason. They tend to be unpleasant people in a number of ways.
You can, of course, construct a mystery where the victim is the saintly old lady at the end of the block whose house happens to sit on a fortune in gold or oil and who gets killed as a way to rob her blind–but you really can’t do that over and over again, and it’s not all that realistic even once.
People do not usually commit murders willy nilly, or on automatic pilot. They usually have to be pushed to the wall. If your victim is the sort of person who can push somebody to the wall, the chances are he’s making a lot of people miserable, and not just the murderer.
And there is, of course, the perennial issue between legal and absolute justice.
There’s some good Christie–and good P.D. James–putting forward the proposition that the law must prevail, and justice is not done if it isn’t, no matter how vile a person the victim might be.
But there is also a fair amount of very good work–Christie, again, with Murder on the Orient Express–putting forth the proposition that justice is larger than the law, and sometimes needs to prevail in spite of the law.
It’s the old “would you have killed Hitler?” question, although the victims are almost never Hitler.
I haven’t really noticed an increase in this kind of thing recently, although I may just have been reading the wrong books. My problem with the premise tends to be that it’s been overdone to death.
For what it’s worth, there’s a book out there, The Blue Diary, by Alice Hoffman, that does a sort of interesting riff on this kind of thing.
In it, it turns out that one of the town’s most respected and beloved citizens, a faithful husband, good father, community leader and volunteer, hard worker and all the rest of it–
It turns out that this man had, some 25 years before, strangled to death his then-girlfriend and fled the scene. The police have been looking for him ever since.
And in the beginning, the situation goes the way these situations tend to go. There is a great public outcry against arresting this guy and trying him–after all, it was all those years ago, and his life has been exemplary ever since; he really is a good person, a better person than most people.
The longer the situation lasts, however, the more–well, I don’t know how to describe it. The book came out at about the time that one of the old Weathermen was discovered living under an assumed name somewhere, and this book always seemed to me to be Hoffman’s riff on that.
She’s not sympathetic.
Maybe the truth is that the fictional construct of the murderer whose murder does not define him–whether because he killed somebody heinously awful, or because he lived an exemplary life ever afterwards–is just that, a fictional construct.
I told you I was drifty today.
Okay, I couldn’t think of a decent post title this morning–sometimes it gets like that.
And I have been thinking a lot about the next part of the Defense, which starts with the Enlightenment and goes from there.
In an odd way, that big blackout came at just the right time–I had just gotten to the point where I was going to look at the next place where it looks as if a reintroduction of the Liberal Arts led to a better overall moral practice in society at large, and then the lights went out.
It sometimes feels like some kind of cosmic metaphor.
But let me get back to the issue that is still, oddly, at hand.
Mique says that there are a lot bigger problems than the fact that agencies composed of unelected functionaries get to make “regulations” that are in fact laws—such as earmarks, for instance.
But I don’t agree, and for a number of reasons.
First, though, let me say that I don’t think it is possible to have ANY kind of law, no matter how passed, that would not result in lawsuits at some time and place. As long as citizens are allowed to challenge laws in court, they will.
And I think that’s fine.
Making sure that all laws are non-contradictory with other laws and as clear as possible is, however, the necessary condition of making them just.
The difference between earmarks and agency-issued “regulations” is, however, fundamental–earmarks are bad practice, but they’re within the parameters of democratic government. Agency “regulations” are not. By whatever name we call them, they are laws issued by unelected bodies.
That is the very essence of what democratic government is not. The very same law–for instance, that raw meat and cooked meat cannot be stored in the same container in a restaurant kitchen–is acceptable when passed as legislation by elected representatives and unacceptable when issued as a regulation by unelected agencies.
The issue is in the nature of the imposition of the law, not the content of it.
Earmarks, on the other hand, are just the latest wrinkle on an old problem with the structure of the United States government–and part of the issue is that the problem is not entirely a problem.
In the US, the chances that a Congressman will be elected have very little to do with his or her positions on big issues like abortion or the war in Iraq, and much more to do with what is called “constituent service.”
If you have a problem with the Social Security people, or the EPA, or any other government agency, what you do is call your Congressperson’s office. That office then “looks into it” for you–and quite often solves the problem one way or the other.
A Congressman is reelected largely on the basis of what his constituents can say he is doing for them. Poor constituent service will kill a Congressman’s reelection no matter what his politics. That’s why Waterbury, CT dumped Gary Frank for a Democrat, even though Frank’s ideological positions were MUCH closer to those of his constituents than Chris Murphy’s will ever be.
But “constituent service” can also mean bringing projects to your district that help the local economy or create jobs.
Before earmarks became the method of choice for this, any Congressman trying to get some benefit for his district had to contend with the opposition of lots of other Congressmen who also wanted benefits for theirs or who wanted the same benefit–and with a process that could make the entire issue a little too public come the next election.
Earmarks provided a way to get nearly everybody everything they wanted–the Congressman and his constituents got their projects and could go back to their districts claiming to have “done something” for the voters, and a lot more of those projects got done than would have been done otherwise, because the process of approving them wasn’t as public and therefore didn’t generate as much heat as it might have.
This is as true of cases where the earmark expends funds on grants or favors to private companies as of cases where it directs federal projects to a specific area. Private companies employ people in the district.
There were other methods of doing the same thing in the past, and if we get rid of earmarks, something else will show up to do the same thing in the future.
The only way to stop the process–the ONLY way–would be to prevent the US government from making such grants to anybody, anytime, at all.
And that would stop lots of things both sides want–it would put an end to government investments in green jobs and new technologies as well as grants to Congresswoman Smith’s favorite strip mining company.
So today it’s earmarks, and tomorrow it is something else. But earmarks do not in any way violate the fundamental legitimacy of a democratic government, and rule by unelected agencies does.
A system whereby laws are enacted not by the representatives of the people but by functionaries who have not been elected to anything, and where the content of those laws is determined not by negotiation and compromise among the elected representatives of the people but by designated “experts” who are presumed to know better than the citizens they rule–
Such a government is no longer of the people, by the people and for the people. It’s an oligarchy, and a real one–far more real than “the corporations are running our lives.”
The corporations aren’t. The agencies increasingly are, and they are increasingly bold about instituting regulations they know perfectly well the mass of people will oppose.
And the consequences are considerably more frightening than anything I can think of that a corporation could do to me.
Take, for instance, a little item in my latest copy of Reason magazine. Reason is the monthly publication of the Reason Foundation, the country’s largest libertarian organization and one that–so oddly!–seems to spend a big whacking hunk of its time on personal liberty issues instead of valorizing businessmen and dreaming of a free market utopia.
Gee, I wonder how that happened.
Anyway, the item was this–and the people here from Pennsylvania can check it out for me.
Pennsylvania has a “child abuse registry” rather like the “sex offender registry.” It’s a public list of people who have committed child abuse.
Except–80% of the people on it have been convicted of no crime whatsoever. They have been placed on the list by state social service agencies, which are not required to presume their innocence, get a warrant for the search of private houses, adhere to a beyond-a-reasonable-doubt standard of proof, or in any other way recognize what are supposed to be the accused’s Constitutional rights. What’s more, they base their “findings” on regulations that have been passed by no legislature and standards that they are allowed to institute largely without democratic oversight–and they are completely and utterly unaccountable to anybody.
They can’t even be sued.
(It was a PA case where a six year old boy was removed from his parents’ house on charges of sexual abuse that later turned out to have been made by a neighbor with a grudge, then put into foster care, where he was raped by his foster father and contracted HIV. The parents challenged the state law that forbids suits in such cases, and Sandra Day O’Connor cast the deciding SCOTUS vote against the parents on the assumption that the need to protect “the children” meant that suits could not be allowed. Everybody makes mistakes.
And that’s why I’m not in favor of “tort reform.” We already have tort reform for most government agencies.
It’s also why I was never a big fan of Sandra Day O’Connor.)
Anyway, I’m not worried about earmarks, and I know that getting governments to pass laws and forbidding agencies to issue them won’t solve all our problems.
But a government passing laws is a legitimate democratic system, no matter how far it gets its head up its ass.
A government where laws are issued by agencies is not a democratic system at all.
And one where such agencies cannot be held accountable on any level–cannot even be sued when they do something as life-destroying as the illustrations above–is beyond just out of control.