Friday, March 31, 2017

The Big Problem

The brain is a gift. But it's also a curse, and it may well do us in - not through war, necessarily, but through a flaw intrinsic to having such a complex cognitive system. There is a tragic flaw in humans, a disconnect in our nature that leads to our troubles, and I think it lies at the root of the vast majority of human problems.

Humans, you see, have big brains. Huge brains. Huge, complicated, sophisticated, extremely powerful brains. The human brain is the most devastating natural weapon on Earth. Chimpanzees are smaller than us but twice as strong; our muscles are weak because so much of our metabolic output goes to feeding our gigantic brains. Yeah, a chimpanzee can maul a human. And humans, if we felt so inclined, could waste the entire natural range of the chimpanzee with a salvo of nuclear weapons. Sorry, chimps, no contest. Even a group of primitive humans with spears can take down an elephant or a mammoth, build walls to keep the predators out, build guns to kill one another. Warfare is terrifying because no creature is as capable of killing you as another human. When you go to war against a group of humans, you have decided to enter combat with the most dangerous animal on Earth. A chimpanzee will chew your face off; a human will just drop a thermobaric explosive on your neighborhood and you will die, either in the resulting fireball or by being hurled into a burning wall by the shockwave. There are no chimpanzees that can kill me right now, because there aren't any for miles around. But there are several powerful humans who could decide to push a button and very quickly murder me and everyone around me. You're scared of the coyotes you hear outside at night? Pfft. Watch your next door neighbor.

(This is one reason that sports like boxing and MMA are so amusing; two weak primates beating on one another, and a bunch of other weak primates acting tough about it. If it were a real fight, then one of the challengers would produce a pistol and end the bout in 5 seconds.)

And the human brain can do other things, too, like writing Beethoven's 7th Symphony or painting the Mona Lisa or going to the fucking moon. Or creating our own little worlds, in movies and books, and telling other people about them. We can even create little worlds inside of our heads, and use those worlds as shorthand for the real world, which is how we get around - and therein lies the problem. You must understand that humans explain things to themselves. We have little mental models of how things work. We make little maps of the world.

When I turn the key in my car's ignition, and the car starts, I explain this to myself by saying that turning the key in the ignition made the car start. I have a little map, a little mental model, in my head, and that little mental model says that turning the key in the ignition makes the car start. This belief is deeply ingrained, so if I turn the key in the ignition, and the car doesn't start, I assume that there is something wrong with the car, not that I am wrong about how keys and ignitions and cars work. When I go up a hill in my car and my ears pop, I assume that it's because of a pressure difference. If I don't feel my ears pop, I assume that there's something wrong with my ears, not that I was wrong about how ears work and how pressure affects them. I don't assume that I was wrong, but that my ears were wrong. More realistically, I might assume that my ears did pop, and that I just didn't notice because I wasn't paying attention. When I put water in the freezer, and the water doesn't freeze, I assume there's something wrong with the freezer, because my mental model says that, at the atmospheric pressure of my dwelling, water freezes at (more or less) 0 degrees Celsius, which is 32 degrees Fahrenheit, and I'm not going to drop that belief if it makes more sense to say that my freezer has a problem.

This highlights a basic truth about human cognition: when things don't go as we expect, when the car doesn't start or our ears don't pop or the water doesn't freeze, we have two options. We can believe either that something about the circumstances was different from what we assumed, or that our mental model was wrong. The big problem with humans is that sometimes we choose wrong. Sometimes we alter our mental model when we ought to blame circumstances, and sometimes we blame external circumstances when we ought to change our model.

Enter hubris.

Many, though not all, religions have a similar theme. The theme is that humans do things that are bad or wrong when we place ourselves at the center of the universe, when we assume that we must be right, when we assume that we must be good, when we assume that it must be so because we think it so. Christianity calls this "pride" and says that it's the deadliest sin - "Satan is a liar!" In Buddhism and the other Indian religions, you'll hear talk about the ego and the veil of Maya (illusion) and similar ideas, all of which come down to people mistaking their little mental model for the genuine article. Hubris is the primary human failing, our tragic flaw - and I note that the original Greek word for a tragic flaw, hamartia, meant "to miss the mark" or "to err."

You have to be careful here. You have to understand that this is not just a panegyric on open-mindedness or a schoolmaster's admonition to be reasonable. I'm not just telling you to keep an open mind, or to critique your cultural context, or whatever. I'm telling you that you are a finite and tiny part of a huge universe, and that, if you jump rank, you're going to pay for it. Open-mindedness and critical thinking and reasoning and logic and awareness of cognitive biases are all well and good, but they will ultimately fail unless they are grounded in humility. I am telling you to be humble. Contrary to popular belief among "educated" people, there is a natural order to things, and it is reflected in the fact that your mind is far less complex than the real world. Ignore this at your own peril.

Tuesday, March 28, 2017

The Populist Wave (tantrum warning: politics)

Looking at the state of politics in Europe and many of its colonial offshoots, one sees a number of parallels to the early 20th century - and, for that matter, to the 1960s. I have heard the phrase (attributed to Mark Twain, but God knows if he really said it), "History doesn't repeat itself, but it rhymes." The present age is the beginning of a new stanza, and its structure will be a variation on that of the previous one.

1. The Early 20th Century
The early 20th century was a time of upheaval between two rival ideological currents, culminating in two World Wars. I am not asserting that World War III is around the corner, or that we're all staring down Armageddon next Tuesday, of course, only noting that upheavals tend to come to a head at some point, whether that takes the form of warfare or something else. From the late 19th century up through the middle of the 20th, we see the great powers at a crossroads. On the one side, we have a definite Marxist current; we are all well aware of the Marxist revolutions in Russia and Asia, but there were Communist parties operating throughout Europe and the Americas. On the other side, we have the fascist movements, which were also common throughout Europe and the Americas. The big one was Nazism, of course, but most of us under-educated Americans are at least dimly aware of Mussolini's Italian fascism and Franco's fascist Spain. Those with a little more education are aware of the later Hispanophone Catholic fascism of Augusto Pinochet's Chile, and the presence of other such movements in South America.

The point of the preceding paragraph is something that many in our age miss: the struggle in the early 20th century was not between a monolithic fascist Axis and democratic Allies, but between two rival ideologies that fought it out in dozens of countries, culminating in a fight between Allied powers and Axis powers. There was a British Union of Fascists. There was a German Communist Party. Two ideologies were struggling for supremacy in many nations, and this struggle was one of many factors that shaped the alliances in World War II.

Democracies were largely in the middle, but ended up allying with the Communists for geopolitical rather than ideological reasons. Any pretension on the part of Americans or Brits to having always been fervent anti-fascists is just a pose. Winston Churchill, to Mussolini: "If I had been an Italian, I am sure I would have been entirely with you from the beginning to the end of your victorious struggle against the bestial appetites and passions of Leninism." (one of many sources for this quote) Churchill, Truman, and Roosevelt were all playing the geopolitical game, and acted as a deciding factor in the struggle between fascism and Communism.

Another way to put it is that we, the public, have an incomplete picture of the early 20th century. The incomplete picture goes like this: fascists take over some countries, Communists take over other countries, and the democracies and Communist states duke it out with the fascists. This is true, so far as it goes, but we have to remain aware of the ideological and cultural struggles that helped to shape that conflict.

The aftermath is well known, of course; the Marxist states either turned into hybrid planned-economy capitalist systems (China) or tin-pot dictatorships (North Korea, etc.), or just went completely off the rails and shriveled (USSR). This left us with a unipolar world, in Europe and the United States at any rate, where a sort of social capitalism leaning either toward capitalism (USA) or socialism (Western Europe) prevailed. Democratic consumer-socialism, or something of that nature, had won.

We can argue, of course, over the purity of the current system, whether it's purely domineering and exploitative for the purposes of capitalism or contains some element of social progressivism. But I think it's safe to characterize it as a kind of hegemonic globalism, where the military primacy of wealthy countries (particularly the United States) allows for a world order that sees the slow spread of a particular way of life. Even in the exploited third-world nations, we still see a rapidly-expanding middle class and increasingly omnipresent technology (India, China). And the stated intention of the United States to bring democracy to countries around the world only reinforces this view. The social-capitalist system, with its bulwark of hegemonic globalism, has won so far.

The family of systems embraced in Western Europe, former European colonies like the United States and Canada, Japan, South Korea, and so on, would probably escape a strict definition, save that they are all relatively democratic with extensive social welfare and a high degree of private enterprise. The countries that use this family of systems are roughly equivalent to what journalists refer to in cloying, lofty terms as "the free world." I will call this family of systems neoliberalism, since that seems to be the term that best fits the phenomena I have described. I willingly admit that this term is vague, and that there are serious and substantial questions as to where and how it applies, but for the purposes of this essay, it will be enough. I can't define every nuance of neoliberalism, but if you think of it as encompassing a number of different mainstream political orthodoxies, you'll be close. It's not a single ideology; it refers to the present system and its ability to define what is and is not acceptable - a loose family of ideas that nonetheless defines the political mainstream in much the same way that topography and the presence of trees define a forest biome.

2. The Present Day
Today, we see a definite paradigm shift going on, but this time, the playing field is different. In the first place, there is no "middle ground" with two opposite poles. Rather, what we see is one ideology (the prevailing neoliberalism) that stands in the center, surrounded on all sides by an enormous and highly diverse field of challengers that are typically subsumed under the "populist" label. Right off the bat, the terminology is an issue, because "populism" here is a blanket term that basically means "anything that's not neoliberalism."

(There is, of course, the old revolutionary left, but said revolutionary leftism exists in much the same sense as a species in a zoo "exists" despite being extinct in the wild. A single shop in California run by the IWW does not a worldwide proletarian movement make. And a few aging, senile professors quoting Proudhon do not constitute a new radical front. Pity them; they started a revolution and nobody came.)

2.1 The Political-Intellectual Complex
One need only look at the number of Supreme Court justices whose law degrees came from Ivy League schools, for instance, to see this political-intellectual complex in action. Another example is the media. The Guardian, the BBC, MSNBC, and CNN all seem very different from one another - different, that is, until you read Russia Today or al-Jazeera. Upon reading those sources, we are immediately struck by the homogeneity of the media in the free world. There seems to be a diversity of opinions in our media, until we are exposed to sources outside of our geopolitical sphere. All of a sudden, our free media seems to be marching in lockstep, albeit a non-centrally-coordinated lockstep. To quote one of those aging leftist professors,
The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum – even encourage the more critical and dissident views. That gives people the sense that there’s free thinking going on, while all the time the presuppositions of the system are being reinforced by the limits put on the range of the debate.
 - Noam Chomsky
Enter "populism." With the advent of the internet age, and the slow buildup of dissatisfaction with the current system, we begin to see sudden eruptions of political unrest from outside of the neoliberal system. To put it simply, between the indifferent machinations of global market force and the broad availability of information (I would not have had access to Russia Today before the internet came about), the smallfolk sniffed the air and concluded that there was "something rotten in the state of Denmark."

The "populist" label bears further discussion. The use of the term by media outlets and academic writers is both instructive and highly amusing. In the first place, the term "populist" is completely inaccurate. Time magazine alleges that the right-wing populist Tea Party was a creation of the Koch brothers, which leads one to wonder what is so populist about a Tea Party created at the behest of the ruling class. On the other hand, the Trump phenomenon was emphatically not a creation of the Kochs, given the frequency with which Trump and the Kochs are at loggerheads with one another. What we have here, it seems to me, is an entrenched political-intellectual complex, or a broad (informal, incidental, cultural) cooperation between government, academia, journalism, and media. This is not to be taken in a conspiratorial sense. There is no conspiracy. There is only an interplay of institutions that regresses to a status quo.

(If this is hard to picture, think of the feminist conception of "The Patriarchy," with which my readers will no doubt be familiar: there's no single group controlling things, only a broad group of people whose shared tendencies drive them in a more-or-less similar general direction.)

As stated before, the so-called "populist wave" is really an umbrella term for anything lying outside of neoliberal orthodoxy - an orthodoxy that is, counterintuitively enough, quite loosely defined. The main components of this populist wave are discontent with the present system, and distrust of formerly authoritative institutions owing to the recent explosion of available information via the internet. One sees in the label "fake news" an attempt to patch the leaking Titanic with a band-aid.

The neoliberal orthodoxy is, as stated before, loosely defined and kept in place through the technique mentioned by Chomsky. As a result, one occasionally finds outliers, even among the established media outlets. There are plenty of hoaxes on the internet, but the term "fake news" bespeaks an increasing anxiety among established institutions. Whether or not the term is used maliciously or deceptively, it is still an attempt to maintain credibility and legitimacy in the eyes of an increasingly disenchanted public.

Now that I've criticized some of the rhetorical (perhaps propagandist) baggage surrounding the term "populist," I would like to examine the nature of the "populist wave" itself.

2.2 The "Populist Wave"
The populist wave is more like a populist sea, in the center of which floats the leaky hulk of neoliberalism. The surf is becoming rougher by the day.

The most striking feature of the present populist wave is that it is largely a conservative counter-culture. I use the word "largely" because there are notable exceptions, such as the left-populists who supported Bernie Sanders in the 2016 American Presidential Election. But even these left-populists find themselves more comfortable with right-populists than with the neoliberal establishment; one need only look at the number of disaffected Bernie Sanders voters defecting to Trump to understand this.

However, this conservative counter-culture is not like the fascist movements of the 1930s, or even a conservative bizarro-reflection of the revolutionary Left of that era. If anything, it resembles the beat and hippie movements. The conservative counter-culture consists largely of disaffected middle-class youth. Millennials are overwhelmingly left-leaning, but the generation following them is actually more conservative than the older generations. More important, however, is this: the right-wing component of the "populist wave" appears to be far more active than its left counterpart. A cursory glance at the editorializing of the large newspapers treats us to an unholy, bloodcurdling scream of horror at the right-wing populist wave threatening to engulf us all. Trump's election, and the Democratic party's blackballing of Sanders, serve as signs that this populist wave (really a non-establishment wave) has teeth, and that it leans decidedly to the right.

Much like the beatniks and hippies, this is a primarily cultural movement, a fact that seems to elude most pundits. The political consequences, such as Trump's election, Brexit, and so on, are just institutional wave-crests of a deeper cultural current. This explains why so many people have trouble grasping exactly how the populist wave works; they look only at the ripples on the surface, and do not take into account what's going on underneath.

3. What's next?
How this all will pan out is anyone's guess. The swarm of anti-establishment ideologies currently competing for dominance, both with the establishment and with one another, is difficult to predict. The one thing I can predict with any degree of confidence is that the present paradigm will have vanished by the time the feeding-frenzy stops.

However, I have an inkling, a suspicion, a hunch about what will cause one of those movements to succeed. The problem with all of them is fundamental: they know what they dislike, but they don't know what they do like; they know what they don't want, but not what they want. The political-intellectual complex cannot survive, and many of its competitors are not fully compatible with one another. So, what determines which competitor wins?

Simple: the ideology that offers a coherent, pragmatic, workable alternative to the present paradigm will carry the day. Right now, everybody is dissatisfied, and wants something different. The writings on this very blog are laced with said dissatisfaction. We all hate the way things are, except for the very few people who get rich off of the present situation. The ideology that wins will be the one that can offer a constructive alternative vision. That's the key.

Monday, March 27, 2017

Playing Stupid

A really common reaction you'll get from people when you point out a reality they don't like is that they'll pretend to be too dumb to grasp what you're saying. It's known as "playing stupid."

The scenario is this: you are having an argument with another person. The argument can be about whether or not Obama was a good president, or whether it's ethical to eat meat, or if it's worth it to get a degree in art history - anything, really. It doesn't matter what subject you're arguing about, as long as it's emotionally loaded. Anyway, if you argue your opponent down, they will eventually run out of arguments. They'll run out of points. When they've unambiguously lost the argument and have no response left, that's when they play stupid. When they're out of arguments, they just play dumb.

You'll make the final argument, the one that clearly shows that what they hold to be true is not really true. You lay it all out, carefully, demonstrating the clear line from A to B that shows that they're wrong. And they won't get it. More accurately, they will get it, but they'll pretend that they don't. They'll make a confused face. They'll have trouble hearing you ("Huh? Huh? Huh?"). They'll ask you to repeat yourself a few times. Arguments that they grasped effortlessly a few minutes ago are suddenly too complex for them.


When someone plays stupid, they don't cross their eyes and say "Duhhhh, me no understand." That obviously wouldn't work. Instead, they'll act as if you're talking nonsense, which is a subtle, passive-aggressive way of signaling that they don't understand what you're saying. There is a layer of deception here; the person is pretending not to understand you, but they do so by acting as if you're not making sense, because "That doesn't make any sense" is the conversational equivalent of "I don't understand that."


This is where playing stupid moves to the next stage. After acting confused, suddenly pretending to have trouble hearing you ("Huh? Whaaaaat?"), they'll begin to act as if you're just speaking gibberish. They'll throw their hands up. They'll roll their eyes. They'll give you the condescending "Oh, you" jibe. This is a case of projection; literally pretending to be stupid doesn't work when you know damn well that they're not stupid (even very intelligent people do this), so instead they project it onto you. Basically, they make themselves too dumb to understand what you're saying, and then act as if you're the idiot. And if you ask them to paraphrase you, they'll respond with a strawman. When they summarize your argument, they'll make it quite obvious that they didn't listen to you at all, and get your argument completely wrong. And if you say, "That's not what I said," they'll respond, "Yes it is." They will become completely oblivious to what they're doing. And they'll tune you out as well.


Again, I have to stress that the people who do this, very often, are not really stupid people. In fact, I have known highly intelligent people who play stupid. It's a common thing to do. It's the number one defense against dangerous ideas: defensive fake stupidity. Another point: the person who plays stupid does not do so consciously, or at least, they don't seem to. They don't say to themselves, "Uh-oh, I lost the argument, time to play stupid." They aren't thinking to themselves, "Haw haw! Now I've got him confused because he can't figure out that I'm just pretending not to understand!" No, it's all very subconscious. They covertly flip the switch on their ability to process arguments, turning off their power of reasoning, while looking away from the hand that's doing it, and then forgetting that they did so. Then they ignore you.

I'm not a psychologist. But I'm going to do some armchair psychology here, with full acknowledgment that I could be wrong. I'm not asserting this as a truth, but it's my best explanation for why people play stupid. My explanation is this: playing stupid is rooted in denial, which is the typical human response to being wrong when you don't want to be wrong. If you're presented with a line of reasoning that leads to a conclusion you don't like, and you know that the line of reasoning isn't flawed, then the only way to protect yourself is to temporarily become too stupid to understand the reasoning. If I can't understand the argument that proves me wrong, I don't have to acknowledge that I'm wrong.

To go a little deeper into this: humans have little "maps" in their minds that they use to navigate the world, just like a person can use a real map to navigate terrain. They have little mental models, little virtual worlds that they use to understand the real world. This is all well and good for a cognitively complex species such as humans. The trouble is that the map is not a strict matter of data and reasoning, but also includes values. If I draw a map of a piece of terrain, it's because I wanted to draw that map, because my values compelled me to draw it. Our maps are tangled up with the things we want. The things we want, our values, our goals, our morals, are all part of the map, all help shape the map, all factor into how the map is drawn. The way I want things to be affects my perception of how things are. This doesn't mean that we can never be objective, of course; the couch I'm sitting on right now is, in fact, here, even when I'm not looking at it. The issue, though, is that those squishy subjective values can get in the way of seeing things the way they are.

Because values are part of those little maps in our heads, we can get attached to our maps. Because the way we want things to be affects our perception of how things are, we can get awfully attached to certain things being true. So when someone points out that our maps aren't accurate, and painstakingly demonstrates why, and we don't have a counterargument, what can we do? Simple: fail to grasp the argument, and you don't have to change your map.

Modern society inadvertently encourages this behavior with its tendency toward specialization. If you have a modicum of intellect, you know that human knowledge is so wide-ranging that nobody can know everything. For example, if you become a lawyer, you may become a copyright lawyer, or a patent lawyer, or a lawyer who specializes solely in understanding the laws surrounding water rights in some county in southern California. Because everything is so specialized, we are subliminally encouraged to ignore anything outside of our own specialty. This gives people an out. If you make an argument that they want to ignore, they can just wave their hands and say, "Well, I don't understand that, because it's outside of my field of expertise." This has a falsely-modest flavor: "Oh, you're so smart, I can't understand your argument that proves me wrong!" Again, passive-aggression in defense of willful delusion. A society that encourages you to think inside a bubble makes inside-the-bubble thinking much easier to perpetuate. When you're trained, informally, to ignore stuff that falls outside of your area, you have a ready-made excuse to ignore any argument you don't like by deciding, arbitrarily, that the argument falls outside of your area.

Deceptions of this kind are most effective when they're coupled with self-deception. Notice that, when a person plays stupid, they momentarily convince themselves that they don't understand what you're saying. There's no easier way to tell an effective lie than to momentarily believe it, and humans are masters of lying and deception. Clever monkeys.

Friday, March 24, 2017

Progress

When I make progress on something, I have either made a positive change or brought it closer to completion. Social progress exists, if by "progress" you mean "fixing problems." This bare thesis, though, runs the risk of becoming a troll truism and motte-and-bailey argument quite quickly if we don't keep a handle on it. So, in the classic tradition of over-educated slackers who take an analytical scalpel to everything in sight, let me dissect this idea.

First of all, if you want to say that progress is a result of solving problems, then it has to be possible to fail. The present (Western) society has, by many measures, solved all kinds of problems. In their great generosity, progressives will tell you all kinds of things that our current society fails to do. Strangely enough, though, the listed failures inevitably consist of modern projects that we have not pursued with sufficient vigor. We never get anything wrong that somebody else gets or has gotten right; we only get things wrong because we haven't gone far enough. We're the pinnacle of achievement, by our own standards, like everyone else. This is a particularly dicey point; naturally, one uses one's own standards when evaluating things for oneself (what else?), but there's something fishy about an argument that claims superiority for a society based on that society's own standards.

This is where the motte-and-bailey argument comes in. In case both of the people who will read this are too deprived of leisure time to read the essay linked above, I will summarize it: a motte-and-bailey argument is one where you have a strong, ridiculous claim and a weak, trivial one. You make the strong ridiculous claim. When your opponent counters by pointing out that the strong ridiculous claim is, in fact, ridiculous, you re-counter by saying, "No no no, I'm just saying [weak trivial claim]." You then go on to say, "Because of [weak trivial claim], it follows that [thing that follows from the strong ridiculous claim but not the weak trivial one]."

For example, I claim that our physical bodies would not exist unless we perceived them through a social lens [strong ridiculous claim]. When you rightfully tell me I'm full of horse shit, I respond by saying, "No! I'm only saying that we understand bodies through a social lens." [weak trivial claim] You agree to this, because it's damn-near common sense for anyone who has thought about the issue. I then go on to say, "Therefore, not only gender, but biological sex itself, is socially constructed [follows from strong ridiculous claim], because biological sex is understood through the lens of gender [follows from weak trivial claim]."

Oops.

Progress follows a similar line. The progressively-minded philosophers of our age like to openly and ostentatiously repudiate the idea of progress as a grand teleological plan operating throughout history. Most of them will also reject the idea that progress is the completion of society by bringing it in line with a transcendent sense of the Good; we moderns have advanced too far to believe in such a thing. They define a very weak, trivial notion of progress, and then go on to argue as if the bit about the transcendent Good were true. Motte, meet bailey.

This is not an exaggeration; I had one professor who claimed that, if a society went through two states, A and B, with A occurring first and B second, and if B were better than A by the standards of A, then that society had progressed. There was nothing in this notion of progress that would prevent us from arriving back at a medieval state of affairs, and it hardly differed from a bare description of change rather than progress. But he defined it this way, because he needed a notion of progress for his variety of philosophical pragmatism to work.
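
To spell out why that definition is so weak, here it is in my own notation (mine, not his):

% The professor's criterion, as I understood it: a step from A to B
% counts as progress whenever B is better as judged by A's own standards.
\[
B \succ_A A \;\Longrightarrow\; \mathrm{Progress}(A \to B)
\]

where $\succ_A$ means "better by the standards of state A." Since each comparison consults only the standards of the earlier state, a chain of such "progressive" steps can wander anywhere - including back into the medieval state of affairs mentioned above - because nothing in the criterion points the chain in any particular direction.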

The people who are most concerned with being and appearing as "adults" tend to be children, and this applies to grown-up children as well. The hard truth we have to face is that our society, while good in many respects, and also superior to past societies in many respects, perhaps, maybe, possibly, has a few problems besides not pursuing its current goals hard enough. This opens up a can of worms, such as the question of what is virtue and what is the good life. Those are difficult questions, which I daresay we aren't too grown-up to ask.

"It's a social construct!"

Suppose I point at an object. A normal, everyday object, like a door. I ask you, "Is that an elephant?"
You ask me, "Are you drunk? Are you high? Are you insane?"
"I'm not crazy," I assure you, "This is philosophy."
You sigh in resignation.
"No," you say, "That's not an elephant."
When you say, "That's not an elephant," that is all you mean: that's not an elephant, you lunatic. You don't mean, "That's not something that we attach the arbitrary linguistic sign 'elephant' to as a consequence of the use of that sign in a language game," or any other such piece of philosophical analysis. You just mean that it's not an elephant.

But there's a problem. If I ask that fatal question, "What do you mean by that?" then we can get caught up in all manner of irrelevancies. You might respond, "It's not a creature with a trunk and four legs and gray skin." And I might decide to be a smartass and say, "So if I cut off an elephant's trunk, it's no longer an elephant?" This can go on and on. I'm not saying that we should steer clear of such semantic tangles just because they're annoying. Such debates are important! The abortion debate is, for many people, a semantic question about the meaning of the word "baby." But that's not what I want to focus on here.

The first concept I want you to grasp here is a little meta to the whole discussion about whether or not something is a baby or an elephant or whatever. The first concept is this: we can argue over whether my car is still a car when it's disassembled and in pieces in my garage. But we know that an oak tree is not a car. More importantly, when we say that the car I'm leaning against isn't an oak tree, we're not talking about words. Most importantly, some things are definitely cars, and some things are definitely not cars. I want you to digest that. Now, for the hard step: I want you to combine those two ideas. First, that there are some things that are definitely cars and some that are definitely not, and second, that when we say that, we're not talking about words. What happens when you combine those two ideas?

What happens is this: you realize that just because something is vague doesn't mean that it's not real. Yeah, there are borderline cases, like the disassembled car, where something is kind of a car and kind of not a car. But that doesn't mean that there is no such thing as a car. It doesn't mean that cars aren't real things or that cars, to use a somewhat abused turn of phrase, are socially constructed. It means that there are cars, and things that aren't cars, and things that are kind of cars.

There's a problem, though. Remember a few paragraphs ago, when I said that the best way to end up in an endless semantic tangle was to ask, "What do you mean by that?" Then we end up debating about words. Now, there's nothing wrong with debating about words. The fact that we do that isn't a problem. The problem is this: we can abuse reasoning about words to say absurd things. Philosophers get accused of doing this a lot, but really, philosophers spend most of their time trying to escape this problem, not create it. If I ask you what you mean by "That's not an elephant," and you respond by listing the attributes of "elephant" as "Creature with four legs and a long nose," I can be a smartass and ask if an anteater is an elephant. The fatal step, the real problem, is this: I can then use that line of reasoning to decide that there's no such thing as an elephant, and if you insist that there is, I'll demand that you nail it down precisely. And no matter how precise your definition is, I can keep on finding counterexamples and tangling you up until you give up, and possibly throttle me out of frustration. It takes a lot of verbal acumen for someone to stop me in my tracks by saying, "You've put the cart before the horse. Just because we understand something through language doesn't mean that that thing is as arbitrary as language, any more than seeing you with my eyes means that you exist only inside of my eyes."

I want to stress, again, that there is nothing wrong with asking for a precise definition. While this essay wouldn't cut it in an academic journal, I am, all the same, being somewhat more careful with my words here than is needed for ordinary conversation. I am doing so because it is useful for the purposes of this essay. And sometimes, you really do need to nail things down to a high degree of precision. But that step in reasoning at the end, where I claim that there's no such thing as an elephant, is still wrong. It's a fallacy. A mistake in reasoning. If my thinking leads me to reject the existence of elephants, then I've screwed up somewhere, full stop, even if I try to weasel out of it with some hair-splitting about how I really do believe that there are elephants, even though I don't. And, to the credit of those pesky philosophers everyone hates for doing this, they do, in fact, have a name for this mistake: "The Sorites Fallacy," for the Sorites Paradox that outlines this very problem. It's a very old idea. That people still make this mistake is a painful indicator that we moderns are perhaps not as clever as we'd like to think. We're still making the classic philosophical mistake of identifying the way we know about something with that very thing. More than twenty centuries later, all of the dreadfully clever and enlightened moderns laugh at the ignorance of the past while making shadow puppets in Plato's cave.
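
For the record, the classic sorites chain can be written out in shorthand like this (my notation, not anything original to this post, with H(n) standing for "n grains of sand make a heap"):

% The sorites chain: both premises look harmless on their own,
% yet together they yield the conclusion that heaps don't exist.
\[
\neg H(1), \qquad \forall n\,\bigl(\neg H(n) \rightarrow \neg H(n+1)\bigr) \;\vdash\; \neg H(100000)
\]

The jump from "the boundary of H is vague" to "there is no such thing as H" is exactly the jump from "elephant is hard to define precisely" to "there's no such thing as an elephant."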

The reason that the phrase "socially constructed" gets so much hate is that, when people argue that such-and-such is socially constructed, they often abuse this line of reasoning and commit the fallacy above. Somewhere along the line, a few French philosophers and their followers rediscovered this problem and, while pointing out some of the legitimate challenges it raises, still committed the classic postmodernist error of getting far too excited about a relatively minor problem. This French school of philosophy became enormously influential, and, at the time of this writing, the humanities departments of most universities in the Western world are saturated with this kind of thinking, without even knowing it. Liberal arts students, or, as N+1 Magazine likes to call them, "The Theory Generation," are indoctrinated into a particular worldview based upon this very narrow era of philosophy. As the indoctrination progresses, they are told that they are learning "critical thinking" from well-meaning professors. Your average university student thinks that their outlook is just how enlightened people think. They're convinced that, if you confront your deeply-held assumptions about how things are, you will think the way they do. "If you'd just think for yourself, you'd agree with me!" They don't realize they've accepted a particular worldview, because they were told that they've transcended worldviews. They don't think they've been indoctrinated, because they were told that they were learning how to avoid being indoctrinated. They don't think, because all matters of thought have been settled for them. And if you point this out to them, if you suggest the idea that perhaps they're doing the very thing they accuse everyone else of doing, they'll give you an ironic smile and very patiently say, "Well, you haven't really thought it through."

Wednesday, March 22, 2017

Misanthropy, pt. 6 - Inclusion

The fastest way to ruin anything is to include everyone. I can give you a conceptual argument here and point indirectly at the principles from which this follows, but that approach doesn't work for most people. Your average Joe wants to see examples and work backwards from them - this, incidentally, is why your average Joe is so bad at math and deductive reasoning. Out of deference to Joe, I'm going to start with some examples. They're in list format, thus easily digestible and familiar, so you don't have an excuse not to read them.

Bear in mind that the following is not a facile critique of consumerism. It cuts to a deeper level, i.e. inclusiveness as the parent of consumerism.

  1. Public spaces. When I was young, I used to ride my bike several miles out of town to a little park, nestled in a valley by a tiny redneck village, because it was my favorite park. It was my favorite park because you could walk around without seeing depressing things, like used condoms and beer bottles. It was my favorite park because when I went there, I was often the only one around, so I didn't have to deal with suspicious people asking what a long-haired teenage boy in a Black Sabbath shirt was doing in an ostensibly public park. It was my favorite park because there were no well-worn jogging trails or people in shorts and Under Armour shirts jogging around with iPods, complaining about the mosquitoes biting them on their intentionally exposed skin. It was my favorite park because its unpopularity meant it was not landscaped or manicured, so you occasionally saw "unacceptable" things like fallen trees, and even (gasp!) dangerous things like snakes or steep hills. It was my favorite park because it was sanitary in the right way (no cigarette butts or Budweiser cans), and also unsanitary in the right way: snakes, animal crap, and rotting tree-trunks that you had to climb over to continue. Ever try climbing over a rotting tree-trunk? You learn to test it with your foot first to see if it will hold your weight. Navigating a place like that is a skill, which makes it inconvenient for people who want to lose weight by riding around the place on an expensive name-brand mountain bike.

    This touches on another problem of inclusion: including everyone leads to a kind of entropy. Once the general public has claimed a space as its own, it becomes homogenized. How many manicured suburban parks have been created by indignant suburbanites complaining about the bugs and fallen trees? There are millions of places like that - and any place that isn't like that quickly homogenizes into another milquetoast outdoor rec-center as soon as the hordes of respectable people find it.

    Saddest of all is this: that little park I went to wasn't even that wild! It was just a park that the really important people had forgotten about, and thus served as a mildly out-of-line little retreat. At the borders of our all-inclusive paradise, one occasionally finds something that is, optimistically speaking, mildly interesting compared to the stuff you find in the center.

  2. Popular music. How often do you hear the phrase, "Top 40s shit"? Too often. Much (though not all) popular music is designed to appeal to the greatest number of people, and, in doing so, lacks any interesting qualities besides surface-level novelty. Occasionally, you get a "sad" song in a minor key (novelty) or a song with a sitar in the background or something. But strip away that single embellishment and you see the same skeleton as you do in every other popular song. I don't mean this in a strictly music-theoretic sense (e.g. chord progressions), but in the sense that what you always find, at bottom, is a cheap trick designed to stick in your head long enough to make you buy an album or a concert ticket.

    Of course, there are rock bands with interesting musical ideas, and hip-hop artists with interesting lyrics, and so on. But these are inevitably niche artists; King Crimson can't approach the popularity of Miley Cyrus, precisely because bands like King Crimson insist on making music that won't appeal to everyone who hears it.

    It's true that the classical music of yesteryear was popular music in some sense. Mozart composed background muzak for rich people's parties. But even that did not bear the burden of appealing to tens of millions of people within ten seconds of coming on the radio. The speed of transmission granted by modern technology, especially radio, made it possible to accumulate enormous profits from including anybody who could afford a radio, or had a friend who could afford one.

    As said earlier, though, consumerism is a symptom, not the disease. The Soviet Union forced even talented composers like Shostakovich to create shallow tripe designed to appeal to huge numbers of people. In fact, an aspect of Soviet musical doctrine was the vilification of "formalism", or music that was considered too complex to appeal to the proletariat. The prevalence of this phenomenon outside of a capitalist society points to a deeper root for the problem than capitalism or consumerism.

  3. Subcultures. One reason that hipsters are so widely reviled is that they tend to suck the life out of authentic ideas by wearing them ironically, thus reducing them to yet another empty fashion statement. Fashion, taken in a broad sense (not just clothing but fashions in general), is what most of us use for distraction. By the time you realize it's a farce, another fashion has arisen to distract you a while longer. But in order to be effective, fashion has to be completely exoteric: anyone who can afford to buy the New Thing is admitted, and those who do so have nothing in common besides having the New Thing.

    We are accustomed to rolling our eyes at the youth who abandon a subculture as soon as it becomes popular. But the reason that people do this is not just to be rebellious and contrarian and cool, although that is certainly part of it. The bigger part is identity. The identity that one acquires from participating in a subculture is only worthwhile because that subculture is not an all-inclusive carnival that accepts anyone with the twenty-two-fiddy needed for admission. It only means something when the subculture is a group for a few people with very specific interests. Once the general public stampedes in, it just becomes another idle distraction.

    Again, we are accustomed to rolling our eyes and dismissing such things as "Cool Kids' Clubs" for people who desperately need an identity to cling to. What we never ask is, "Why do adults feel the need for a Cool Kids' Club?" It's not just immaturity on the part of the people who do it (although it is also that). It is, in large part, because those of us born into an age of inclusion are desperate for anything that we can authentically call our own. We didn't rebel as teenagers solely because of a hormonal imbalance or bitterness toward our parents. We rebelled because the teenage years were when we first began to sense that something was amiss.

Now that you've read the handy-dandy examples presented in whiz-bang list format, I'll go ahead and lay out the basic principle. The following is dense and may require several read-throughs to fully digest, but I have faith in my readers.

Entities - from a physical object like a tennis ball to an abstract thing like a musical subculture - are defined by their boundaries. A boundary delineates what is and is not included in that entity. Boundaries can be vague, like the question of where blue ends and green begins, but the existence of borderline cases does not mean that everything is the same. We can argue over whether or not a disassembled chair is still a chair, but the thing I'm sitting on right now is definitely a chair, and a blue whale swimming around somewhere in the Pacific right now is definitely not a chair. You can have borderline cases, and you can quibble over how "real" the categories are, but the fact remains that, if you remove all of the boundaries of an entity, you remove the entity.

The preceding paragraph is, no doubt, enough to set off the "non-socially-acceptable idea" alarm bells in the heads of many educated people. The first response will be calculated disdain, but as this idea gains more traction - and it will - it will have to be confronted. The next step will be quibbling over whether or not we should care about categories, and that resistance will quickly be swept aside by the simple pragmatic fact that people want meaning, and modernity can't provide it. No amount of Sophistical obfuscation can quiet the psychic scream born of regimented egalitarian nihilism. Of course, the academics will not take note of this fact, to their great disadvantage.

Sunday, March 19, 2017

A Day In The Life


You wake up in the morning. You live in a Western country and it's March, so you have shifted your clock forward by an hour for some reason. There is a vague justification for this as a way to reduce the use of electric lights or something, but you just know it as one more arbitrary thing you have to do because everyone else does it. No point in complaining, though. Teenagers complain about how arbitrary and pointless everything is. You're an adult, so you just accept it.

You get up and open your fridge to make yourself a sandwich. The cheese slice packaging says “cheese product” rather than “cheese,” and the chocolate milk has a line of fine print on the label advertising that it's made with 10% real chocolate! You think about getting some tuna to put on the sandwich, and, rubbing your eyes, remember that you're out. It's fine – you read in the news earlier that 60% of the tuna that Americans eat is not really tuna, but a blend of other fish and perhaps things that are not fish at all, so God knows what you'd have been eating had you had any “tuna” lying around. You reach for some bologna instead and look at the label: “blend of pork, chicken, beef...” - the list goes on. The label also alerts you that the bologna “may contain” a number of other things, and is, in any case, mostly water.

After coffee and a cigarette, you leave for work. Whatever you do for work, the basic premise is the same: you do some tasks. The tasks are laid out and standardized, and your job is to do them quickly and efficiently enough that you don't get fired and replaced with someone who is willing to work harder for the same amount of money. You don't resent this, accepting it as a reality of life. Your coworkers are the sort of people you'd rather not be around. You perhaps speak with your supervisor, whose desk is made of wood chips compressed together with some kind of synthetic glue and overlaid with a veneer made to look like real wood. Your supervisor is an inoffensive guy. He's perfected that plastic smile and professional demeanor that people learn when they get an office job. He's nice enough and doesn't make you do too much arbitrary, pointless shit just to impress his own supervisor. He just wants to keep his job, and you understand that. You do what he asks. You like the guy.

Throughout the workday, your mind wanders. Where will you go next in life? You haven't got a family or even any pets. If you wanted, you could chuck it all and go live on the other side of the country, where everything would be exactly the same as it is here. You could go back and get your degree and get a job like the one your supervisor has. You could try and find someone to marry, but that sounds like a bad idea in this day and age. Marriage is out-dated anyhow. Everyone is doing that poly-whatsit stuff and the old-style relationships are becoming a thing of the past. Even if you're monogamous, it'll last a few years and then you'll both find some other kind of novelty to stimulate yourselves for a while. You could join the military, and get deployed to some place with lots of rocks and sand dunes because somebody in a suit who comes from old money thought it would be a good idea to send you there, but at least the danger would be real. You shake your head. No, it all takes too much effort. This is easier.

You come home. Your apartment is silent. You drink some beer out of a plastic bottle and go to bed.

Saturday, March 18, 2017

Misanthropy, pt. 5 - Top Monkey

Hell isn't other people. Hell is yourself. - Wittgenstein

If you're a bitter person, you may be tempted to interpret all human behavior as posturing for power and status. Once you stop being bitter, or become less bitter, you'll take a more relaxed view, but honesty still dictates that most of the stuff people do involves posturing. Watch a group of friends sitting around a table and talking. When they tell stories, when they tell jokes, when they give their opinions, when they decide what bar to go to, there is always posturing involved.

There's more to it, of course. Human motivation is complex. If I give change to a homeless guy, perhaps I'm doing it to look good to the people around me, or to cement myself as the powerful giver and the homeless guy as the weak receiver. But I am also doing it because he looks hungry and I figure he could use a cheeseburger. Our actions don't always boil down to just one thing. That realization will help you stop being so paranoid and bitter, if you're willing to internalize it. You have to know when to put your foot down and when to let it go. You have to accept that power dynamics are part of human interaction, and if you're going to have a friend group then you have to spend some time maintaining your status, as simian as that task is. But the task becomes less loathsome, and people become less disgusting, when you realize that the power dynamics are only part of the deal, albeit a very large part - that it's not all about being top monkey.

But there's a catch.

The catch is that, sometimes, status isn't 100% of everything, but it's still 95% of everything, and that's not much better. And these situations, especially in employment, cannot be avoided. The problem is that much of what you want out of life is tied to your status, and if you don't have much patience for posturing, you're gonna be limited in what you can have. That's just how the chips fall.

This line of thinking inevitably runs into a practical objection. At some point, somebody will say this: "Look, I know you think you're too good to do the tribal ego-dance with the rest of us, but you're not. So come down off of your cloud and play top monkey. You're just afraid that you're going to lose. It's just weakness."

Up to a point, this is a legitimate response. If you're so terrified of what people do to each other that you can't leave your house, you have a problem. They're not gonna eat you. The response is legitimate, but only up to a point. It is true that you have to play the status game to function. But - and there is no nice way to say this - the status game is just ridiculous. It's silly. I win, and now I'm sitting on my monkey-throne, wearing my monkey-crown, and I have the status. I won the trophy. I got the sticker. I'm king of the monkeys. Hooray! This is the kind of thing you hear from bitter people, true, but people don't say that posturing is silly because they're bitter. It's the other way around: they're bitter because they think posturing is silly. And when a bitter person says that posturing is silly, they're right!

More than silly, it's also corrosive. Anyone with significant life experience can tell you that competition for status brings out the worst in people. A preoccupation with increasing status is the first step to becoming evil. A corporate executive who founds a nature preserve as a PR move, and then dumps toxic waste upstream of that very nature preserve because it's cheap and technically legal, is a good example of this. So is the girl blowing her boss in the bathroom to get a promotion. So are the schoolyard bully and the absent workaholic father. So are cog children.

Remember what I said a few paragraphs ago, about how our actions seldom boil down to just one thing? Bitterness is sometimes the product of a distorted, skewed outlook. But very often, it is also the product of being honest with oneself. You can stop being bitter once you figure out how to accept the truth without being bitter. In this case, the way you do that is by taking the wide view. You have to see how power dynamics fit in with the rest of human behavior. You have to understand that posturing is part of the package, but not the whole thing.


Tuesday, March 7, 2017

Cog Children

He broke off, and she fancied that he looked sad. She could not be sure, for the Machine did not transmit nuances of expression. It only gave a general idea of people - an idea that was good enough for all practical purposes... Something "good enough" had long since been accepted by our race.
- E.M. Forster, The Machine Stops
If you spend enough time hanging around people raised by the upper-middle to upper class, you will eventually run into someone who was raised like an employee, by parents who thought of themselves as supervisors. I call these people cog children. Everything in such a person's childhood was transactional. Good parents teach their children how the world works and how to handle money, but there are families in the upper half of society for whom this is the primary objective of parenting: teaching the child to make money. Of course, the parents don't think of it in those terms, but as teaching the child to be "successful", which means the same thing. This is done by constantly reminding the child of two things: first, that they are only as valuable as what they can produce, and second, that they are expendable. Cog children, in my experience, tend to do well professionally, because they grew up learning to be employees. They also tend to have a short fuse and a marked dependence on alcohol or antidepressants, because they grew up learning to be employees. The few who don't do well professionally typically either self-destruct or end up living the most counter-cultural and unconventional lifestyles they can conceive of, partially to feed their thirst for an experience outside of the corporate world that they effectively grew up in, but mostly to spite their parents.

Modernity's favorite thing to create is the single-use item, good for some specific function and useless for anything else. The gears in a machine that have no value outside of the gear assembly, the aluminum can that you throw out as soon as you're done with it, and cog children are all examples of this. They were raised in a manner that fostered any trait conducive to making money, and only those traits. As a result, you will find that cog children are superficially charming, or at least acceptably stiff and professional, because that is how one acts in an office job. You will find that they are hard-working, clever, opportunistic, organized, and conscientious, at least when something's in it for them. They present well and are alert to other people's expectations of them. Otherwise, their salient personality trait is a subtle (sometimes not so subtle) undercurrent of competitiveness in everything they do, which is why they're so defensive, neurotic, and insecure.

For those of us who are not cog children, they are an alternately pitiable, irritating, and dangerous presence in our lives. You learn to see the warning signs: the fake niceness and plastic smile that reminds you of a front-desk receptionist in a way that makes your skin crawl, the aggression ranging from subtle prickliness to outright snarling that emerges in response to any perceived challenge, and so on. Some cog children manage to deprogram themselves, but the ones who cannot do so see life as one long struggle to claw their way to the top. The first thing they do upon interacting with you is size you up as a potential competitor, and if you qualify, they take an aggressive stance. If not, they become disdainful or dismissive. It is best to play possum and take the second option, because then they'll leave you alone.

As previously stated, cog children are one more obstacle in the way of temperaments not well-suited to navigating modernity. As with so many such obstacles, the recommended course of action is disengagement: avoid!

Thursday, March 2, 2017

Authentic

What's "authentic?" That peace-sign pendant you bought is an authentic hippie pendant if it was hand-made in the 1960s by a really stoned guy sitting in a microbus parked by a pond in Oregon somewhere. It's authentic if it came from the right place, at the right time, under the right circumstances. You could make another pendant that looks just like it, but it wouldn't be authentic, because just looking right or being made of the right material isn't enough. It's about where it came from. It's about the origin. Authenticity is something that can't be reproduced, like the origin or history of that hippie pendant, or the ring you got your spouse at marriage, or the watch your father gave to you. It could be a (relatively) cheap tungsten ring, or a semi-expensive watch with some plastic components, but that doesn't matter, because the other rings and watches like it aren't the authentic ones.

Authenticity tends to make people mad. A few examples:
  1. A girl I knew once told me a story about how the redheaded girls at her elementary school were being picked on for being redheads. She suggested to the redheaded girls that they make a club for themselves, where they could all hang out together. A non-redheaded girl reported to the teachers that the redheads had made a special club that non-redheads couldn't be in, and the teacher broke up the club.
  2. At my own elementary school, during group exercises, the teachers sometimes specified that you couldn't be in a group with the people you always hung out with, that is, with your friends. Those groups, those close bonds that united people at the cost of excluding others, those friendships, had to be dissolved by fiat.
  3. There are people who will become defensive if you tell them that you own a purebred dog. They might cite the genetic defects that commonly occur in purebred dogs, but if you listen closely, you'll hear a note of sneering resentment there - and the bit about genetic defects is there to rationalize the resentment. There's a certain joy in knocking you off of your (putative) high horse. You own a purebred dog, huh? You think you're better than me? How dare you!
Authenticity, simply put, is the enemy of equality. Equality means that we all have equal value, and it's something we all tacitly accept. And that means that nothing can be authentic, because the only way for a thing to be authentic is if it can't be reproduced through a mechanical process, and if it can't be reproduced that way, then we can't all easily have it. And so authenticity raises our hackles, because I'm as good as you.

(It never occurs to anyone that we could just stop assigning a dehumanizing numerical value to people and treat things qualitatively instead - we live in an age of computers, after all.)

This is why hipsters exist: we, as people, want uniqueness and authenticity, but we were born into an age where equality is something like sacred scripture. Hipsters get mocked a lot because a hipster is someone who cares more about appearing authentic for reasons of social status than about actually being authentic. But this is an inevitable consequence of modernity; authenticity is socially valuable, but it competes with equality, so we settle for the appearance of authenticity rather than authenticity itself. Of course, this, in itself, is inauthentic. Enter the hipster's overblown sense of irony. If you can obscure your real intentions behind a thick facade of convoluted irony, then you never have to face up to the hard question of whether you value equality more than authenticity or vice-versa.

You ought to prefer authenticity. Will you care about other people's opinions? Sure you will; you're a human, and thus a social animal. But work hard to minimize how much that affects your behavior. Over time, this fundamentally alters how you see things, because you build up a long history of prying yourself away from the mechanistic way of thinking. They can mock you, of course; let them. Like spit aimed at the sun, their spite will fall back onto their own faces.

Wednesday, March 1, 2017

Misanthropy, pt. 4 - "Mad at the World"

"He's mad at the world."

We assume that the person in question is dysfunctional somehow. He's mad at the world because he was mistreated at some point, or because he's immature, or because he hasn't accomplished anything, or because he hasn't had enough "experiences." One thing we don't do is ask, "Is he mad at the world for a good reason?" It's hard for us to ask that question for two reasons, one psychological and the other situational.

The psychological problem is that a lot of us have successfully created defense mechanisms to cope with the parts of modern society that are dysfunctional. If these defense mechanisms are unhealthy - that is, if they involve lying to ourselves - then we will immediately decide that anyone who is mad at the world has an internal problem and is mad for no good reason, because admitting otherwise upsets our illusion that everything is a-OK. The correct course of action here is to replace the unhealthy and dishonest defense mechanisms with healthy, honest ones that allow us to accept the problems of the world, rather than pretend they're not there, without letting them get us down. But that's hard work, so most of us opt to be lazy and just lie. This is why, when people see that you're dissatisfied with the way society works, they assume that you're neurotic or somehow impaired, rather than accepting that you haven't yet found a way to deal with it while being true to yourself; this second possibility doesn't occur to them because they've never tried to do so themselves. And God help you if you explain what you're doing, because then they'll get really nasty.

The situational problem is a little more tricky. The problem is that it's hard to notice something that is always in front of your eyes. If you went around your whole life with green goggles that were sutured to your face, and you weren't aware of them (perhaps registering them as part of your body), then you would not notice that everything looked green, because it would never have occurred to you that things could be any other color. If someone asked if you could only see green, you might scratch your head and say, "What does 'green' mean?" It's much the same way with modernity. Every criticism of modernity has the flavor of an empty cliche or a truism for this reason. If someone points out that having your entire life hemmed in by arbitrary bureaucratic rules and lists is demeaning, dehumanizing, depressing, and a whole bunch of other words beginning with de-, you might just shrug it off as a cliche because those arbitrary rules and impersonal lists are just an accepted part of your existence. And you've given up on change; after all, this is all you've ever known. But historically speaking, it hasn't been this way for very long.