Tuesday, May 9, 2017

There Will Be Blood

You may remember the movie, There Will Be Blood, from 2007. In case you haven't seen it, it's a movie about an oil entrepreneur in the early twentieth century, who becomes rich over the course of the film by doing awful things. At the very end, he snaps, and it culminates in this famous scene:


A lot of people don't like this scene because they find it overwrought. I disagree with that assessment: I have witnessed people, both in person and on video, undergoing psychoses and emotional breakdowns, and they acted in much the same way. It has been years since I last watched the whole movie, but the scene brought up something that has been on my mind lately. This movie seems to be about ruthlessness, or American cynicism, or capitalism, and that's probably what its makers intended, but to me it looks like an indictment of the optimistic modernism of the early 20th century. This is not incompatible with (what I assume are) its intended themes.

In a nutshell: there was an optimism in the United States in the early 20th century that we'd reached the end of history, and There Will Be Blood showed the final outcome of that optimism. Let me explain.

Whenever I read contemporary American texts from the early 20th century, I can almost feel the smugness radiating off the screen, or page, as the case may be. People then were convinced that they had found The One Way Forward for humanity, that a benign liberal humanism would create a perfect world. There was a lot of Utopian thinking, a lot of faith in progress. The Theory of Evolution was still relatively new and technology was advancing by leaps and bounds. Skyscrapers, a kind of symbol of the 20th century that set it apart from the 19th, were already being built in New York and Chicago. Thanks to technology, man was finally master of his own destiny.

One found this same optimism in Europe up until World War One. After World War One, the Europeans became disillusioned with modernity: T.S. Eliot wrote The Hollow Men, and everybody realized that scientific progress was being used to make horrifying weapons, not Utopias. The Europeans saw their countries ravaged by the first World War, and the ironic pessimism of postmodernity was already in its nascent stages. In the United States, however, the first World War was something for returning veterans and newspapers, not a visceral event that everybody saw firsthand. There was no blood and thunder on the shores of the United States itself. Americans wouldn't see that firsthand until the 21st century, when a couple of planes were flown into the World Trade Center, an event that would sound the death-knell for American optimism. That process has begun, and we are still working through it, although not everyone can see it yet.

There Will Be Blood is a lovely film because it shows the final outcome of this. The unfettered freedom of the individual, the secular liberalism, and the optimistic humanism all culminate in a charnel orgy where the most vicious person claws their way to the top. You can almost see the beast emerging from the optimism of modernity as Daniel Day-Lewis's character, a rich, pragmatic oil tycoon, hobbles forward, face twisted in rage while pointing at himself and screaming "I am the Third Revelation!"

The really sad thing is that American capitalism was better than the alternatives. Neither European fascism nor Communism was able to provide a better society than American capitalism. In fact, they were even worse. All three were different attempts at creating a workable modern world. The aforementioned death of American optimism has led to an amusing phenomenon where ineffectual revolutionary movements have popped up, complete with clueless college students proclaiming the virtues of anarchism while engaging in "anti-fascist" demonstrations. Sorry, guys, I think you got the wrong century. Nobody wants it.

The crossroads the world is coming to is the point where modernity itself is seen to be defective, and we realize that the whole modern project was unworkable to begin with. I do not know what will replace it, but we can't keep it up forever. Capitalistic American liberalism has proven superior to Communism and fascism only because of its ability to maintain a more open society and, more importantly, because it has simply survived longer. Whether it's Josef Stalin smoking a Herzegovina cigarette while signing off on a list of gulag-bound political enemies, or Hitler cramming people into ovens, or an oil tycoon beating some poor loony preacher's head in with a bowling pin, modernity always seems to terminate the same way: the beast emerges and devours.

Tuesday, May 2, 2017

Misanthropy, pt. 8 - Homogenized and Full of Rats



An organism that breeds quickly and eats anything you put in front of it will survive nearly anywhere, because the requirements for its survival are simple. It doesn't need a lot of specific conditions in place to keep breeding. Such an organism can eat nearly anything and keeps breeding regardless of whether the surrounding area can support its offspring, so it spreads like the Plague and soon it's everywhere. When there is little food, mass die-offs happen, and then the process repeats as soon as a new source of food is found. Yeast reproduce quickly until the alcohol they produce kills them, for example. By contrast, organisms that are more complex and have specific needs are easy to kill, and much harder to keep alive. A bird that nests only in mature trees and has a specific diet cannot live in the little copse in your suburb, because it must have very specific conditions in order to stay alive. This is one reason you see the same kinds of animals over and over in suburban neighborhoods: only species that can eat leaves and garbage and live anywhere can live in such a place. You see deer, squirrels, rabbits, the same few species of bird, and maybe one or two other kinds of animal that can eat anything and breed quickly. If you want a lot of biodiversity, you need a complex ecosystem with lots of niches to fill, so those specialist organisms that have specific survival requirements can have a place to fit in. There is a principle in biology called the competitive exclusion principle, which states that if two species are competing for the same limited resource, one of them will eventually be driven to extinction; accordingly, if there are very few ecological niches, very few kinds of resources, you end up with very few species. For example, if all you have is huge amounts of one kind of food, the landscape is eventually dominated by one species that eats only that kind of food. If you homogenize the world, if you turn everything into a giant suburb, you lose the specialists, the species that occupy unique niches. A homogeneous ecosystem with a low degree of complexity will support fast-breeding omnivorous pests and little else.
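
If the competitive exclusion principle sounds abstract, here is a toy numerical sketch of it. Everything in it is invented for illustration - a bare-bones Lotka-Volterra-style competition model, not a description of any real ecosystem: two species deplete the same single resource, one of them tolerates crowding a little better, and the other is driven toward zero.

    # Toy competitive-exclusion sketch: two species, one shared resource.
    # Every parameter value here is made up purely for illustration.

    def simulate(steps: int = 200_000, dt: float = 0.001):
        n1, n2 = 10.0, 10.0      # starting populations: generalist (n1), specialist (n2)
        r1, r2 = 1.0, 1.0        # per-capita growth rates
        k1, k2 = 1000.0, 800.0   # the generalist tolerates more total crowding
        for _ in range(steps):
            total = n1 + n2      # both species draw down the same resource
            n1 += dt * r1 * n1 * (1 - total / k1)
            n2 += dt * r2 * n2 * (1 - total / k2)
        return n1, n2

    if __name__ == "__main__":
        generalist, specialist = simulate()
        print(f"generalist: {generalist:.1f}, specialist: {specialist:.6f}")

Run it and the specialist's population collapses to effectively nothing while the generalist settles at its carrying capacity: one resource, one survivor, which is the whole point of the principle.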

The point of this little philosophical excursion into ecology is that, taken as metaphor, this can illuminate something about human society, but first, we have to lay a little more groundwork. When we began to industrialize in the 1800s, we decided that standardization was the coolest thing ever and began standardizing everything. Fast forward a century or two and we've standardized everything - everything! Whether or not you're a good person is decided with reference to your "mental health," which has, in the United States anyway, its own special rubric (the DSM). Whether or not you're a good thinker is decided by whether you jumped through the appropriate hoops to get the right certificates. This standardization has helped in some areas (medicine, technical fields) and absolutely killed others (everything else).

Now, given the two observations above, what should we suspect about human society? As I said above, homogeneous ecosystems favor fast-breeding omnivorous pests; a simple homogeneous environment favors vermin because it lacks the nuance to support any organism more complex, which makes such environments boring and depressing. And, since industrialization, human society is just such an environment, in a metaphorical sense. So, what kind of person is dominant in our time? Sure, there are thinkers and writers and leaders, but they're all just people who jumped through the right hoops. A rat with a professional certificate is still just a rat.

Thoughts On Writing

These are not hard and fast rules. These are just ideas you could use, if you felt so inclined.
  1. One way to express yourself is to forget about expressing "yourself" and start expressing things. Consciously putting yourself into your writing is a niche skill for ironic comedians or people writing autobiographies. If you want the "real you" to come through, focus on making the most objective expression you can of things outside of yourself. If you're describing a fictional character, describe them as though you were writing an objective account of a real person. Your personality (the thing you want to express so badly) will come through when you write this way.
  2. When a comic book artist draws a brick wall, they sometimes forego drawing every brick in favor of drawing a cluster of bricks here and there. Looking at the picture, you get the impression that the entire wall is made of bricks. This is a useful method for descriptive writing. You needn't list every detail. Just describe the most salient aspects of the thing, and the reader's mind will complete the picture. If I tell you that I'm in an old library, and that there's a person sitting at an antique wooden desk in that library behind a pile of dusty books, then what does that person look like? In all likelihood, you've already decided that the desk has carven claw feet and that the person behind it is either a wizardly old man or an old lady who looks like a Victorian schoolteacher.
  3. The surest way to screw up when you're making art is to do whatever you think people expect you to do, rather than making work that reflects what's going through your head on a day-to-day basis.

Friday, March 24, 2017

"It's a social construct!"

Suppose I point at an object. A normal, everyday object, like a door. I ask you, "Is that an elephant?"
You ask me, "Are you drunk? Are you high? Are you insane?"
"I'm not crazy," I assure you, "This is philosophy."
You sigh in resignation.
"No," you say, "That's not an elephant."
When you say, "That's not an elephant," that is all you mean: that's not an elephant, you lunatic. You don't mean, "That's not something that we attach the arbitrary linguistic sign 'elephant' to as a consequence of the use of that sign in a language game," or any other such piece of philosophical analysis. You just mean that it's not an elephant.

But there's a problem. If I ask that fatal question, "What do you mean by that?" then we can get caught up in all manner of irrelevancies. You might respond, "It's not a creature with a trunk and four legs and gray skin." And I might decide to be a smartass and say, "So if I cut off an elephant's trunk, it's no longer an elephant?" This can go on and on. I'm not saying we should avoid such semantic tangles just because they're annoying. Such debates are important! The abortion debate is, for many people, a semantic question about the meaning of the word "baby." But that's not what I want to focus on here.

The first concept I want you to grasp here is a little meta to the whole discussion about whether or not something is a baby or an elephant or whatever. The first concept is this: we can argue over whether my car is still a car when it's disassembled and in pieces in my garage. But we know that an oak tree is not a car. More importantly, when we say that the car I'm leaning against isn't an oak tree, we're not talking about words. Most importantly, some things are definitely cars, and some things are definitely not cars. I want you to digest that. Now, for the hard step: I want you to combine those two ideas. First, that there are some things that are definitely cars and some that are definitely not, and second, that when we say so, we're not talking about words. What happens when you combine those two ideas?

What happens is this: you realize that just because something is vague doesn't mean that it's not real. Yeah, there are borderline cases, like the dis-assembled car, where something is kind of a car and kind of not a car. But that doesn't mean that there is no such thing as a car. It doesn't mean that cars aren't real things or that cars, to use a somewhat abused turn of phrase, are socially constructed. It means that there are cars, and things that aren't cars, and things that are kind of cars.

There's a problem, though. Remember a few paragraphs ago, when I said that the best way to end up in an endless semantic tangle was to ask, "What do you mean by that?" Then we end up debating about words. Now, there's nothing wrong with debating about words. The fact that we do that isn't a problem. The problem is this: we can abuse reasoning about words to say absurd things. Philosophers get accused of doing this a lot, but really, philosophers spend most of their time trying to escape this problem, not create it. If I ask you what you mean by "That's not an elephant," and you respond by listing the attributes of "elephant" as "Creature with four legs and a long nose," I can be a smartass and ask if an anteater is an elephant. The fatal step, the real problem, is this: I can then use that line of reasoning to decide that there's no such thing as an elephant, and if you insist that there is, I'll demand that you nail it down precisely. And no matter how precise your definition is, I can keep on finding counterexamples and tangling you up until you give up, and possibly throttle me out of frustration. It takes a lot of verbal acumen for someone to stop me in my tracks by saying, "You've put the cart before the horse. Just because we understand something through language doesn't mean that that thing is as arbitrary as language, any more than seeing you with my eyes means that you exist only inside of my eyes."

I want to stress, again, that there is nothing wrong with asking for a precise definition. While this essay wouldn't cut it in an academic journal, I am, all the same, being somewhat more careful with my words here than is needed for ordinary conversation. I am doing so because it is useful for the purposes of this essay. And sometimes, you really do need to nail things down to a high degree of precision. But that step in reasoning at the end, where I claim that there's no such thing as an elephant, is still wrong. It's a fallacy. A mistake in reasoning. If my thinking leads me to reject the existence of elephants, then I've screwed up somewhere, full stop, even if I try to weasel out of it with some hair-splitting about how I really do believe that there are elephants, even though I don't. And, to the credit of those pesky philosophers everyone hates for doing this, they do, in fact, have a name for this mistake: "The Sorites Fallacy," for the Sorites Paradox that outlines this very problem. It's a very old idea. That people still make this mistake is a painful indicator that we moderns are perhaps not as clever as we'd like to think. We're still making the classic philosophical mistake of identifying the way we know about something with that very thing. More than twenty centuries later, all of the dreadfully clever and enlightened moderns laugh at the ignorance of the past while making shadow puppets in Plato's cave.
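
For anyone who hasn't met it, the Sorites Paradox in its standard textbook form runs roughly as follows (this is the classical statement, not anything original to this post), where H(n) means "n grains of sand make a heap":

    \begin{aligned}
    &\neg H(1) && \text{one grain is not a heap} \\
    &\forall n\,\bigl(\neg H(n) \rightarrow \neg H(n+1)\bigr) && \text{one more grain never makes a heap} \\
    &\therefore\ \forall n\,\neg H(n) && \text{so no number of grains is ever a heap}
    \end{aligned}

Both premises look harmless, but the conclusion is absurd. The elephant argument above plays exactly the same trick, swapping grains of sand for trunks and legs; the mistake lies in treating the vagueness of a predicate as proof that nothing falls under it.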

The reason that the phrase "socially constructed" gets so much hate is that, when people argue that such-and-such is socially constructed, they often abuse this line of reasoning and commit the fallacy above. Somewhere along the line, a few French philosophers and their followers rediscovered this problem and, while pointing out some of the legitimate challenges it raises, still committed the classic postmodernist error of getting far too excited about a relatively minor problem. This French school of philosophy became enormously influential, and, at the time of this writing, the humanities departments of most universities in the Western world are saturated by this kind of thinking, without even knowing it. Liberal arts students, or, as N+1 Magazine likes to call them, "The Theory Generation," are indoctrinated into a particular worldview based upon this very narrow era of philosophy. As the indoctrination progresses, they are told that they are learning "critical thinking" from well-meaning professors. Your average university student thinks that their outlook is just how enlightened people think. They're convinced that, if you confront your deeply-held assumptions about how things are, you will think the way they do. "If you'd just think for yourself, you'd agree with me!" They don't realize they've accepted a particular worldview, because they were told that they've transcended worldviews. They don't think they've been indoctrinated, because they were told that they were learning how to avoid being indoctrinated. They don't think, because all matters of thought have been settled for them. And if you point this out to them, if you suggest the idea that perhaps they're doing the very thing they accuse everyone else of doing, they'll give you an ironic smile and very patiently say, "Well, you haven't really thought it through."

Wednesday, March 22, 2017

Misanthropy, pt. 6 - Inclusion

The fastest way to ruin anything is to include everyone. I can give you a conceptual argument here and point indirectly at the principles from which this follows, but that approach doesn't work for most people. Your average Joe wants to see examples and work backwards from those - this, incidentally, is why your average Joe is so bad at math and deductive reasoning. Out of deference to Joe, I'm going to start with some examples. It's in list format, thus easily digestible and familiar, so you don't have an excuse not to read it.

Bear in mind that the following is not a facile critique of consumerism. It cuts to a deeper level, i.e. inclusiveness as the parent of consumerism.

  1. Public spaces. When I was young, I used to ride my bike several miles out of town to a little park, nestled in a valley by a tiny redneck village, because it was my favorite park. It was my favorite park because you could walk around without seeing depressing things, like used condoms and beer bottles. It was my favorite park because when I went there, I was often the only one around, so I didn't have to deal with suspicious people asking what a long-haired teenage boy in a Black Sabbath shirt was doing in an ostensibly public park. It was my favorite park because there were no well-worn jogging trails or people in shorts and Under Armour shirts jogging around with iPods complaining about the mosquitoes biting them on their intentionally exposed skin. It was my favorite park because its unpopularity meant it was not landscaped or manicured, so you occasionally saw "unacceptable" things like fallen trees, and even (gasp!) dangerous things like snakes or steep hills. It was my favorite park because it was sanitary in the right way (no cigarette butts or Budweiser cans), and also unsanitary in the right way: snakes, animal crap, and rotting tree-trunks that you had to climb over to continue. Ever try climbing over a rotting tree-trunk? You learn to test it with your foot first to see if it will hold your weight. Navigating a place like that is a skill, which makes it inconvenient for people who want to lose weight by riding around the place on an expensive name-brand mountain bike.

    This touches on another problem of inclusion; including everyone leads to a kind of entropy. Once the general public has claimed a space as its own, it becomes homogenized. How many manicured suburban parks have been created by indignant suburbanites complaining about the bugs and fallen trees? There are millions of places like that - and any place that isn't like that quickly homogenizes into another milquetoast outdoor rec-center, as soon as the hordes of respectable people find it.

    Saddest of all is this: that little park I went to wasn't even that wild! It was just a park that the really important people had forgotten about, and thus served as a mildly out-of-line little retreat. At the borders of our all-inclusive paradise, one occasionally finds something that is, optimistically speaking, mildly interesting compared to the stuff you find in the center.

  2. Popular music. How often do you hear the phrase, "Top 40s shit"? Too often. Much (though not all) popular music is designed to appeal to the greatest number of people, and, in doing so, lacks any interesting qualities besides surface-level novelty. Occasionally, you get a "sad" song in minor key (novelty) or a song with a sitar in the background or something. But strip away that single embellishment and you see the same skeleton as you do in every popular song. I don't mean this in a strictly music-theoretic sense (e.g. chord progressions), but in the sense that what you always find, at bottom, is a cheap trick designed to stick in your head long enough to make you buy an album or a concert ticket.

    Of course, there are rock bands with interesting musical ideas, and hip-hop artists with interesting lyrics, and so on. But these are inevitably niche artists; King Crimson can't approach the popularity of Miley Cyrus, precisely because bands like King Crimson insist on making music that won't appeal to everyone who hears it.

    It's true that the classical music of yesteryear was popular music in some sense. Mozart composed background muzak for rich people's parties. But even that did not bear the burden of appealing to tens of millions of people within ten seconds of coming on the radio. The speed of transmission granted by modern technology, especially radio, made it possible to accumulate enormous profits from including anybody who could afford a radio, or had a friend who could afford one.

    As said earlier, though, consumerism is a symptom, not the disease. The Soviet Union forced even talented composers like Shostakovich to create shallow tripe designed to appeal to huge numbers of people. In fact, an aspect of Soviet musical doctrine was the vilification of "formalism", or music that was considered too complex to appeal to the proletariat. The prevalence of this phenomenon outside of a capitalist society points to a deeper root for the problem than capitalism or consumerism.

  3. Subcultures. One reason that hipsters are so widely reviled is that they tend to suck the life out of authentic ideas by wearing them ironically, thus reducing them to yet another empty fashion statement. Fashion, taken in a broad sense (not just clothing but fashions in general), is what most of us use for distraction. By the time you realize it's a farce, another fashion has arisen to distract you a while longer. But in order to be effective, fashion has to be completely exoteric; anyone who can afford to buy the New Thing is admitted, and those who do so have nothing in common besides having the New Thing.

    We are accustomed to rolling our eyes at the youth who abandon a subculture as soon as it becomes popular. But people don't do this just to be rebellious and contrarian and cool, although that is certainly part of it. A bigger part of it is identity. The identity that one acquires from participating in a subculture is only worthwhile because that subculture is not an all-inclusive carnival that accepts anyone with the twenty-two-fiddy needed for admission. The identity of a subculture only means something when the subculture is a group for only a few people with very specific interests. Once the general public stampedes in, it just becomes another idle distraction.

    Again, we are accustomed to rolling our eyes and dismissing such things as "Cool Kids' Clubs" for people who desperately need an identity to cling to. What we never ask is, "Why do adults feel the need for a Cool Kids' Club?" It's not just immaturity on the part of the people who do it (although it is also that). It is, in large part, because those of us born into an age of inclusion are desperate for anything that we can authentically call our own. We didn't rebel as teenagers solely because of a hormonal imbalance or bitterness toward our parents. We rebelled because the teenage years were when we first began to sense that something was amiss.

Now that you've read the handy-dandy examples presented in whiz-bang list format, I'll go ahead and lay out the basic principle. The following is dense and may require several read-throughs to fully digest, but I have faith in my readers.

Entities - from a physical object like a tennis ball to an abstract thing like a musical subculture - are defined by their boundaries. A boundary delineates what is and is not included in that entity. Boundaries can be vague, like the question of where blue ends and green begins, but the existence of borderline cases does not mean that everything is the same. We can argue over whether or not a dis-assembled chair is still a chair, but the thing I'm sitting on right now is definitely a chair, and a blue whale swimming around somewhere in the Pacific right now is definitely not a chair. You can have borderline cases, and you can quibble over how "real" the categories are, but the fact remains that, if you remove all of the boundaries of an entity, you remove the entity.

The preceding paragraph is, no doubt, enough to set off the "non-socially-acceptable idea" alarm bells in the heads of many educated people. The first response will be calculated disdain, but as this idea gains more traction - and it will - it will have to be confronted. The next step will be quibbling over whether or not we should care about categories, and that resistance will quickly be swept aside by the simple pragmatic fact that people want meaning, and modernity can't provide it. No amount of Sophistical obfuscation can quiet the psychic scream born of regimented egalitarian nihilism. Of course, the academics will not take note of this fact, to their great disadvantage.

Saturday, March 18, 2017

Misanthropy, pt. 5 - Top Monkey

Hell isn't other people. Hell is yourself. - Wittgenstein

If you're a bitter person, you may be tempted to interpret all human behavior as posturing for power and status. Once you stop being bitter, or become less bitter, you'll take a more relaxed view, but honesty still dictates that most of the stuff people do involves posturing. Watch a group of friends sitting around a table and talking. When they tell stories, when they tell jokes, when they give their opinions, when they decide what bar to go to, there is always posturing involved.

There's more to it, of course. Human motivation is complex. If I give change to a homeless guy, perhaps I'm doing it to look good to the people around me, or to cement myself as the powerful giver and the homeless guy as the weak receiver. But I am also doing it because he looks hungry and I figure he could use a cheeseburger. Our actions don't always boil down to just one thing. That realization will help you stop being so paranoid and bitter if you're willing to internalize it. You have to know when to put your foot down and when to let it go. You have to accept that power dynamics are part of human interaction, and if you're going to have a friend group then you have to spend some time maintaining your status, as simian as that task is. But it becomes less loathsome when you realize that the power dynamics are only part of the deal, albeit a very large part. You will be less disgusted by people, and human interaction will seem less revolting, if you realize that it's not all about being top monkey.

But there's a catch.

The catch is that, sometimes, status isn't 100% of everything, but it's still 95% of everything, and that's not much better. And these situations, especially in terms of employment, are situations that cannot be avoided. The problem is that much of what you want out of life is tied to your status, and if you don't have much patience for posturing, you're gonna be limited in what you can have. That's just how the chips fall.

This line of thinking inevitably runs into a practical objection. At that point, somebody will say this: "Look, I know you think you're too good to do the tribal ego-dance with the rest of us, but you're not. So come down off of your cloud and play top monkey. You're just afraid that you're going to lose. It's just weakness."

Up to a point, this is a legitimate response. If you're so terrified of what people do to each other that you can't leave your house, you have a problem. They're not gonna eat you. The response is legitimate, but only up to a point. It is true that you have to play the status game to function. But - and there is no nice way to say this - the status game is just ridiculous. It's silly. I win, and now I'm sitting on my monkey-throne, wearing my monkey-crown, and I have the status. I won the trophy. I got the sticker. I'm king of the monkeys. Hooray! This is the kind of thing you hear from bitter people, true, but people don't say that posturing is silly because they're bitter. It's the other way around: they're bitter because they think posturing is silly. And when a bitter person says that posturing is silly, they're right!

More than silly, it's also corrosive. Anyone with significant life experience can tell you that competition for status brings out the worst in people. A preoccupation with increasing status is the first step to becoming evil. A corporate executive who founds a nature preserve as a PR move, and then dumps toxic waste upstream of that very nature preserve because it's cheap and technically legal, is a good example of this. So is the girl blowing her boss in the bathroom to get a promotion. So are the schoolyard bully and the absent workaholic father. So are cog children.

Remember what I said a few paragraphs ago, about how our actions seldom boil down to just one thing? Bitterness is sometimes the product of a distorted, skewed outlook. But very often, it is also the product of being honest with oneself. You can stop being bitter once you figure out how to accept the truth without being bitter. In this case, the way you do that is by taking the wide view. You have to see how power dynamics fit in with the rest of human behavior. You have to understand that posturing is part of the package, but not the whole thing.