Tuesday, May 9, 2017

There Will Be Blood

You may remember the 2007 movie There Will Be Blood. In case you haven't seen it, it's about an oil entrepreneur in the early twentieth century who becomes rich over the course of the film by doing awful things. At the very end, he snaps, and it culminates in this famous scene:


A lot of people don't like this scene because it's overwrought. I disagree with that assessment, because I have witnessed people (both in person and on video) undergoing psychoses and emotional breakdowns who acted much like this. It has been years since I last watched the whole movie, but it brought up something that has been on my mind lately. This movie seems to be about ruthlessness or American cynicism or capitalism, and that's probably what its makers intended, but to me, it looks like an indictment of the optimistic modernism of the early 20th century. This is not incompatible with (what I assume are) its intended themes.

In a nutshell: there was an optimism in the United States in the early 20th century that we'd reached the end of history, and There Will Be Blood showed the final outcome of that optimism. Let me explain.

Whenever I read contemporary American texts from the early 20th century, I can almost feel the smugness radiating off the screen, or page, as the case may be. People then were convinced that they had found The One Way Forward for humanity, that a benign liberal humanism would create a perfect world. There was a lot of Utopian thinking, a lot of faith in progress. The Theory of Evolution was still new and technology was advancing by leaps and bounds. Skyscrapers, a kind of symbol of the 20th century that set it apart from the 19th, were already being built in New York and Chicago. Thanks to technology, man was finally master of his own destiny.

One found this same optimism in Europe up until World War One. After the war, the Europeans became disillusioned with modernity; T.S. Eliot wrote The Hollow Men, and everybody realized that scientific progress was being used to make horrifying weapons, not Utopias. The Europeans saw their countries ravaged by the first World War, and the ironic pessimism of postmodernity was already in its nascent stages. In the United States, however, the first World War was something for returning veterans and newspapers, not a visceral event that everybody saw firsthand. There was no blood and thunder on the shores of the United States itself. Americans wouldn't see that firsthand until the 21st century, when a couple of planes were flown into the World Trade Center, an event that sounded the death-knell for American optimism. We are still working through that process, but it has begun, even if not everyone can see it yet.

There Will Be Blood is a lovely film because it shows the final outcome of this. The unfettered freedom of the individual, the secular liberalism, and the optimistic humanism all culminate in a charnel orgy where the most vicious person claws their way to the top. You can almost see the beast emerging from the optimism of modernity as Daniel Day-Lewis's character, a rich and pragmatic oil tycoon, hobbles forward, face twisted in rage while pointing at himself and screaming "I am the Third Revelation!"

The really sad thing is that American capitalism was better than the alternatives. Neither European fascism nor Communism was able to provide a better society than American capitalism. In fact, they were even worse. All three were different attempts at creating a workable modern world. The aforementioned death of American optimism has led to an amusing phenomenon: ineffectual revolutionary movements have popped up, complete with clueless college students proclaiming the virtues of anarchism and engaging in "anti-fascist" demonstrations. Sorry, guys, I think you got the wrong century. Nobody wants it.

The crossroads the world is coming to is the point where modernity itself is seen to be defective, and we realize that the whole modern project was unworkable to begin with. I do not know what will replace it, but we can't keep it up forever. Capitalistic American liberalism has proven superior to Communism and fascism only as a result of its ability to maintain a more open society and, more importantly, the fact that it has simply survived longer. Whether it's Josef Stalin smoking a Herzegovina cigarette while signing off on a list of gulag-bound political enemies, or Hitler cramming people into ovens, or an oil tycoon beating some poor loony preacher's head in with a bowling pin, modernity always seems to terminate the same way: the beast emerges and devours.

Tuesday, May 2, 2017

Misanthropy, pt. 8 - Homogenized and Full of Rats



An organism that breeds quickly and eats anything you put in front of it will survive nearly anywhere, because the requirements for its survival are simple. It doesn't need a lot of specific conditions in place to keep breeding. Such an organism can eat nearly anything and keeps breeding regardless of whether the surrounding area can support its offspring, so it spreads like the Plague and soon it's everywhere. When there is little food, mass die-offs happen, and then the process repeats as soon as a new source of food is found. Yeast, for example, reproduce quickly until the alcohol they produce kills them.

By contrast, organisms that are more complex and have specific needs are easy to kill, and much harder to keep alive. A bird that nests only in mature trees and has a specific diet cannot live in the little copse in your suburb, because it must have very specific conditions in order to stay alive. This is one reason you see the same kinds of animals over and over in suburban neighborhoods: only species that can eat leaves and garbage and live anywhere can live in such a place. You see deer, squirrels, rabbits, the same few species of bird, and maybe one or two other kinds of animal that can eat anything and breed quickly. If you want a lot of biodiversity, you need a complex ecosystem with lots of niches to fill, so those specialist organisms with specific survival requirements have a place to fit in.

There is a principle in biology called the competitive exclusion principle, which states that, if two species are competing over one resource, one of them will be driven to extinction; accordingly, if there are very few ecological niches, very few kinds of resources, you end up with very few species. For example, if all you have is huge amounts of one kind of food, the landscape is eventually dominated by one species that eats only that kind of food.

If you homogenize the world, if you turn everything into a giant suburb, you lose the specialists, the species that occupy unique niches. A homogeneous ecosystem with a low degree of complexity will support fast-breeding omnivorous pests and little else.
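For the curious, the competitive exclusion principle is easy to see in a toy simulation. Here is a minimal Python sketch (my own illustration, not anything from the biology literature; all parameters are invented) of two species drawing down one shared resource, where the species that tolerates scarcer resources slowly drives the other to extinction:

```python
# Toy illustration of competitive exclusion: two species deplete the same
# single resource, modeled with a Ricker-style population update.
# All parameter values (r, K1, K2, starting sizes) are made up.
import math

def simulate(steps=500, r=0.5, K1=100.0, K2=80.0):
    """Species 1 tolerates scarcer resources (higher K), so it wins out."""
    n1, n2 = 10.0, 10.0
    for _ in range(steps):
        total = n1 + n2  # both populations draw on the same resource pool
        n1 *= math.exp(r * (1 - total / K1))
        n2 *= math.exp(r * (1 - total / K2))
    return n1, n2

n1, n2 = simulate()
# Species 2 collapses toward extinction; species 1 settles near its own K.
```

Note that the two species differ only in how much scarcity they can tolerate; even that small asymmetry over one shared resource is enough to eliminate one of them, which is the principle in miniature.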

The point of this little philosophical excursion into ecology is that, taken as metaphor, this can illuminate something about human society, but first, we have to lay a little more groundwork. When we began to industrialize in the 1800s, we decided that standardization was the coolest thing ever and began standardizing everything. Fast forward a century or two and we've standardized everything - everything! Whether or not you're a good person is decided with reference to your "mental health," which has, in the United States anyway, its own special rubric (the DSM). Whether or not you're a good thinker is decided by whether you jumped through the appropriate hoops to get the right certificates. This standardization has helped in some areas (medicine, technical fields) and absolutely killed others (everything else).

Now, given the two observations above, what should we suspect about human society? As I said above, homogeneous ecosystems favor fast-breeding omnivorous pests; a simple homogeneous environment favors vermin because it lacks the nuance to support any organism more complex, which makes such environments boring and depressing. And, since industrialization, human society is just such an environment, in a metaphorical sense. So, what kind of person is dominant in our time? Sure, there are thinkers and writers and leaders, but they're all just people who jumped through the right hoops. A rat with a professional certificate is still just a rat.

Thoughts On Writing

These are not hard and fast rules. These are just ideas you could use, if you felt so inclined.
  1. One way to express yourself is to forget about expressing "yourself" and start expressing things. Consciously putting yourself into your writing is a niche skill for ironic comedians or people writing autobiographies. If you want the "real you" to come through, focus on making the most objective expression you can of things outside of yourself. If you're describing a fictional character, describe them as though writing an objective expression of a real person. Your personality (the thing you want to express so badly) will come through from writing this way.
  2. When a comic book artist draws a brick wall, they sometimes forego drawing every brick in favor of drawing a cluster of bricks here and there. Looking at the picture, you get the impression that the entire wall is made of bricks. This is a useful method for descriptive writing. You needn't list every detail. Just describe the most salient aspects of the thing, and the reader's mind will complete the picture. If I tell you that I'm in an old library, and that there's a person sitting at an antique wooden desk in that library behind a pile of dusty books, then what does that person look like? In all likelihood, you've already decided that the desk has carven claw feet and that the person behind it is either a wizardly old man or an old lady who looks like a Victorian schoolteacher.
  3. The surest way to screw up when you're making art is to do whatever you think people expect you to do, rather than making work that reflects what's going through your head on a day-to-day basis.

Tuesday, April 25, 2017

There Are No Rules


  1. Humans make rules to keep themselves in line. The "Thou shalts" of the Old Testament are rules. Compare St. Paul: "For me all things are permissible, but not all things are beneficial." The contrast comes down to a single statement: there are no rules, only consequences.
  2. Rules are useful fictions. If a behavior is bad most of the time, then we prohibit it. It's usually bad to speed through a red light, so we make a rule against that. But you would speed through a red light to prevent an arsonist from burning a school with children inside. The rule is a fiction; we only follow it because of the consequences. (This is complicated, though, because a society of people that adheres strictly to rules will experience many good consequences, provided the rules are well-crafted.)
  3. What is the difference between a good person and an evil one? If I am evil, I will only consider the consequences that impact me and not the ones that impact other people; if I am stupid, I will fail to even consider those; if I am crazy, I will purposefully cause harm to myself. One can switch between these three at different times and in different places. Many people, perhaps even most people, are all three: crazy, stupid, and evil. Notice that evil is compatible with gaming the rule-system while stupidity and craziness are compatible with blindly following rules.
  4. Remember Kierkegaard's Knight of Faith and Nietzsche's Ubermensch! Neither has much regard for rules. We slip into slave morality and despair when we develop a belief in rules over consequences.

Sunday, April 23, 2017

Intellectual Time-Bomb

The internet age has created an intellectual time-bomb, which is already in the process of detonating and will continue to do so for some decades. Events like this play out over a long time. The basic propositions that will make this clear to you are as follows:
  1. The study of the humanities in general, and philosophy in particular, confers, among other things, a deep and nuanced understanding of the propagation of ideas and heightened abilities of persuasion. This does not come from studying some specific set of texts, but simply from being immersed in the study of said subjects for a long period of time (i.e. five to ten years). The study of logic and rhetoric is key.
  2. The internet has granted heightened access to the corpus of philosophical and other humanist work, allowing people to study the humanities without the need for university tuition or approval.
  3. The internet has made observation of, and participation in, the ideological and cultural milieu easier than at any prior point in history.
Elaboration on Proposition 1: a philosopher is, in some cases, a lonely soul brooding over abstruse problems in solitude. However, there is another side to philosophy, which is effective speaking and persuasion. It has been thus from Socrates down through the present day. More importantly, philosophical study will show you how ideas are challenged, how they decay, and how they change. There has been much ado about "memetics" over the past decade or so, but memetics is just one lens for understanding the play of ideas in a society. Any philosopher worth his salt can, given access to enough information (easily available now, thanks to the internet), trace the history of an idea, observe its origins, and, in many cases, predict its possible outcomes with a fairly high degree of accuracy. There is no complete, quantified natural science of the spread of ideas and manipulation of culture, which means that steering a society is more art than science, with elements of both. In many cases, it resembles alchemy or the occult, and one sees this reflected in terms like "meme magic."

Elaboration on Proposition 2: there was a time when the only people with access to the knowledge requisite for understanding (and thus, manipulating) society, culture, and ideology, were academics and people with expensive degrees. This is no longer the case. The internet has made it possible for anyone with a modicum of intelligence and a strong Will to Power to educate themselves in philosophy and other related subjects and gain the knowledge previously confined to an elite intellectual class, with little to no need for participation in the university system. There are already a few such individuals; one of them is writing this essay. Anybody with a political axe to grind can teach themselves everything they need to know.

Elaboration on Proposition 3: there was a time when the only people you could influence without a lot of money or a prestigious social position were personal acquaintances. This is no longer the case. Anybody who is willing can teach themselves how social media works, and begin spreading their ideas this way. This normally amounts to a sap with no influence shouting into the wind. However, once an individual establishes even a small presence and studies the right subjects, they wield disproportionate power relative to their place in society. This is indirect or "soft" power. One need not "go viral" or invest any money. All that is required is a modicum of intelligence, a substantial education in the humanities, and sheer, bullheaded persistence. This last is in place whenever someone is passionate enough to feel the need to engage in such activity, and this is quite often the case with political ideologues.

A few minutes' reflection on the above makes the point clear: the internet age has created thousands of people who are disgruntled with the status quo and also highly adroit at disrupting established thought patterns by means of social media. The recent crackdowns on "fake news" and "extremism" are inadequate and ultimately futile responses from established institutions to a threat that will never go away. The genie is out of the bottle, the milk's out of the glass, the cat's out of the bag, and the internet is swarming with people who are hungry for political disruption and know exactly how to accomplish it. The political Right, in particular, has become brutally effective in this regard. Hillary Clinton outspent Donald Trump during the 2016 American Presidential election by roughly a factor of two, yet still lost. Armies of establishment propagandists cannot overcome the untold numbers of people who spread ideological propaganda in their free time. The former must be paid; the latter do it for free, and there are more of them. The outcome is clear.

My own position is that many older and established institutions will slide into irrelevance, including academia, the media, and, with advancing technology, untold numbers of centralized services (taxi companies give way to Uber). So if we are to keep philosophy alive, it will have to be done outside of the academy. I came to this conclusion independently, but certain academics have also recognized this and are not reluctant to say so, although the fact that they do so anonymously is telling. They have a few ideas about how to maintain philosophy, and so do I. My own vision must wait for another entry, however.

Saturday, April 22, 2017

What's it good for?

So, if you ask a professional philosopher about wisdom or the good life, they will thrust their nose in the air and inform you disdainfully that philosophy isn't about being "wise" or any such rubbish. It's about dealing with a bunch of technical problems made up by academics so they could have a job. The professional philosopher does not care why they get paid, so long as they do. As the years have gone by, I have increasingly lost patience with that kind of thing. If you're a geeky analytical guy like me, that stuff is fun, but making a life out of it? No, thank you. Sure, people like Descartes and Spinoza and Hobbes were spending their time on obscure and difficult subjects, but there were reasons for it besides "I want to wear a suit and get paychecks, but not have a real job."

So what's philosophy good for? I could let go of the word "philosophy," but I don't want to do that. Those ancient Greek guys in togas and the people who replied to them down through the ages were doing something important, and the people who assumed the title of "philosopher" around the turn of the 20th century aren't doing it any more. Perhaps an academic would ask me what I think that important thing is, and I confess I can't rigorously define it in a way that would be acceptable to your very prestigious department. But your attitude says everything, Mr. Academic. You want to go to conferences and get paychecks, and be around clever and hard-working opportunists like yourself - and that's it.

What's it good for? Well, being reflective is good, if you want to become wise. I'm not wise, but I think philosophy may help with that. It's the meaning of the word, dumbass: "love of wisdom." Academic philosophy has outlived its usefulness. You can do philosophy of science all you want, but the physicists still don't respect you. Neither does anyone else outside of your field, really. The clock is ticking. Give it 50 years, if that.

Friday, April 21, 2017

Genuine

What does it mean to be genuine? I guess it means unmediated behavior, not consciously planned, not an affectation. If I ask you to tell me your favorite food, and you respond by saying "I like stale cheeseburgers from McDonald's" in an attempt to be funny and ironic, then that's not genuine. If you reply, "I dunno, steak I guess?" and really mean (?) that, then you're being genuine. If I tell you to cut it out and tell me what you actually like, then I'm asking you to be honest, and honesty is connected to genuineness. The idea is to say something just to say it, rather than seeking a certain reaction from the other person.

There's another side to this, which is that a lot of people have, in their minds somewhere, this little box of goodies they like to think of as "The Real Me." There are certain sets of behaviors and opinions that such people believe to be authentically theirs. When such a person asks you to be genuine with them, they are assuming that you have a similar box of goodies somewhere, and they want to see what's in it. If you do not think of yourself as possessing such a box of goodies, then you'll be a little thrown off when someone asks to see The Real You. Many of these people approach this in a less than healthy way.

This is where it gets hairy. Social behavior is just that: social. You act differently around Friend A than you do around Friend B. Which one is more authentic? Trick question: neither. All of your social behavior is, in some aspects, a performance. Sure, there's such a thing as dishonesty, where you conceal your real intentions from the other person. But if you make your intentions clear, and the other person acknowledges that you are being honest, then what else is there to do? There is a turn of phrase, "You can't get blood from a turnip." You're a turnip, and they're trying to wring blood out of you. "Show me your box of goodies!" they yell, and all you can do is shrug helplessly and say, "There's nothing in here, man."

The same thing incidentally happens when people are asking you to display an emotional reaction when you're not having one. Someone asks, "How does that make you feel?" regarding something that doesn't have an emotional effect on you. "How do you feel about so-and-so?" they might ask, and you simply don't have any feelings about so-and-so, besides "Meh." Then you're caught in a dilemma. You can fake an emotional reaction, at which point they'll detect the fakery and ask for your real reaction, or you can honestly tell them that you haven't got a reaction, at which point they'll get angry at you for not having an emotional response to something they care about and begin demanding that you display some kind of emotion ("Show me the goodies!"). What to do?