by Dr. Joseph Suglia

When did writing stop having to do with writing?  Of the many attempts to communalize literature, none is more dangerous than the sway of the current ideology: the consensus, and consciousness, that writing has nothing to do with writing.  You will hear readers talk about “plot” (in other words, life).  You will hear them talk about the “author.”  But writing?  Writing has nothing to do with writing.  No one cares whether a book is well-written anymore.

* * * * *

Mark Z. Danielewski is not very much interested in language.  He cares more about graphics than he does about glyphs.  No words live in his House of Leaves.  It is a house of pictures, not of words.  It is a house in which words only exist as blocks of physical imagery.

Allow me to cite a few not unrepresentative sentences/fragments from House of Leaves:

1.) “A hooker in silver slippers quickened by me” [296].  Danielewski, scholar, thinks that “to quicken” means “to move quickly.”

2.) “Regrettably, Tom fails to stop at a sip” [320].  I convulse in agony as I read this sentence.

3.) “Pretentious,” too often, is American for “intelligent.”  It is a word that is often misapplied.  However, in the case of House of Leaves, it must be said that Danielewski uses German pretentiously.  In a book that is littered with scraps of the German language, shouldn’t that language be used properly?  “der absoluten Zerissenheit” [sic; 404 and elsewhere — a Heideggerean citation] should read “die absolute Zerrissenheit”–the genitive is never earned.  “unheimliche vorklaenger” [sic; 387] should read “unheimliche Vorklänge” and does not mean “ghostly anticipation.”  Whenever Danielewski quotes the German, he is being pretentious–that is, he is pretending to know things of which he knows nothing.

It is impossible to escape the impression that Mark Z. Danielewski does not want to be read.  Noli me legere = “Do not read me.”  The House of Leaves is a book to be looked at, not one to be read.  Its sprawling typographies and fonts distract the reader from the impoverished prose.

Words are reduced to images, to pictures.

* * * * *

When did writing stop having to do with writing?  When novels became precursors to screenplays.  With the rise of mainstream cinema came the denigration of literature.  The visual overthrew the verbal.  Around the same time, imaginative prose began to be dumbed well down.  There are two infantile reductions at work, both of which are visible in House of Leaves: a dumbing-down of language and an accent on the optical (as opposed to the verbal).

Such infantile reductions are everywhere in evidence whenever one picks up a contemporary American novel.  We can thank America for the coronation of the idiot and for an all-embracing literary conformism.  Even stronger writers, these days, morosely submit to the prevailing consolidation of a single “literary style.”  A style that, of course, is no style at all.  And these same writers, listlessly and lifelessly, affirm in reciprocal agreement that the construction of a well-wrought sentence isn’t something worth spending time on.  Or blood.

How self-complacent American writers have become!  The same country that produced Herman Melville, William Faulkner, and Saul Bellow has given birth to Mark Z. Danielewski.  Nothing is more hostile to art than a culture of complacency.

There was, I’m sure, something very refreshing about Charles Bukowski in the 1970s, when the vestiges of a literary academism still existed.  Mr. Bukowski, I am assuming, would be dismayed to uncover the kindergarten of illiterate “literati” to which he has illegitimately given birth.  His dauphin, Mark Z. Danielewski.

Weaker students of literature might feel invigorated by the Church of Literary Infantilism, yet even they know that the clergy engenders nothing sacred or profane.  This explains their virulent defensiveness when anyone, such as myself, dares to write well or explore another writer’s engagement with language.  “Writing doesn’t matter,” you see.  They have never luxuriated in the waters of language; they have never inhabited a world of words.  Words don’t interest them; people do.  And literary discussions have degenerated to the level of a bluestockinged Tupperware party.  If you like the main character, the book is “good.”  If a book is warm and friendly, that book is “good.”  If a book reassures you that you are not a slavering imbecile–that is to say, if you can write better than the book’s “author”–that book is “good.”  If a book disquiets you or provokes any kind of thought whatsoever, that book is “bad.”  If a book has an unsympathetic main character, that book is “bad.”  If a book is difficult to understand, that book is “bad,” and so forth and so on.  Whatever exceeds the low, low, low standards of the average readership, in a word, is blithely dismissed as “bad.”

Things grow even more frightening when we consider the following: These unlettered readers are quickly transforming into writers.  That would be fine if they knew how to write.  And if the movements of language were valued, culturally and humanly, their noxious spewings would find no foothold.  The literature of challenge has been supplanted by the litter of the mob, with all of its mumbling solecisms and false enchantments.  The problem with mobs, let us remind ourselves, is that they efface distinctions.  They do everything in their power to make the distinguished undistinguished.  And so instead of James Joyce, we have bar-brawling beefheads (e.g. Chuck Palahniuk), simian troglodytes (e.g. Henry Rollins), and graphic designers / typographers (e.g. Mark Z. Danielewski).

Instead of poeticisms, we have grunts.  We have pictures.  We have graphic design and cinema.

* * * * *

Someone said to me: “I am a good writer, but I don’t know how to spell.”

Someone said to me: “No writer is better than any other.”

* * * * *

America is responsible for the production of more linguistic pig **** than any other country in the world.  There is absolutely nothing surprising about this statement.  After all, America is the only country that celebrates stupidity as a virtue.  How could things be otherwise?

At the poisonous end of the democratization process, which is indistinguishable from the process of vulgarization, every jackass on the street sees himself as an “author.”  His brother, his grandmother, and his step-uncle: they, too, regard themselves as “authors.”  After all, they think–inasmuch as they are capable of thinking–“Writing has nothing to do with writing.  If Mark Z. Danielewski can be published, so can I!”  (Yes, their desire is “to be published,” as if their lives would be inscribed on the page, disseminated, filmed, and thus rendered meaningful.)  We live in an age of all-englobing and infinitely multiplying cyber-technologies, where stammering imbeciles mass-replicate their infantile scribbles, but let us not deceive ourselves: If a “writer” is simply one who writes, then they are writers; however, one should reserve the word “author” only for those who are profoundly committed to the craft of verbal composition.

* * * * *

Judging from a purely technical point of view, House of Leaves is consistently faulty, fraught with excruciating Hallmark banalities and galling linguistic errors.  Hipster Mark Z. Danielewski is seemingly incapable of composing a single striking or insightful sentence.  It astonishes me that anyone ever considered his tinker-toy bromides to be publishable.  The House of Leaves is a house that is neither well-appointed nor ill-appointed.  It is simply not appointed at all.

* * * * *

Who cares about language anymore?  No one in America even questions the assumption that good writing does not matter.  And this assumption is no longer limited to America–a horrific logophobia is spreading throughout the globe.  The impetuses that motivate this tsunami of “literary” vomit are the following ideological assumptions: the fallacies that 1.) everyone is entitled to be an author (this is a particularly nasty perversion of the democratic principle) and that 2.) the visible improves on the verbal.  American letters have been reduced to the gibbering and jabbering of semiliterate simpletons, driveling half-wits, and slack-jawed middlebrows.  It’s only a matter of time before the English stop caring about language, as well.

When you live in a culture of complacency, a culture of appeasement, a hypocritical culture that assures you that you write well even if you don’t, there is only one way out.  There is nothing for the strong and serious student of literature to do but to write for himself, to write for herself, for his or her own sake.

Joseph Suglia

Analogy Blindness: I invented a linguistic term. Dr. Joseph Suglia


Over the years, I have invented a number of words and phrases.  Genocide pornography is one that I am especially proud of (cf. my essays on Quentin Tarantino); anthropophagophobia is another word that I coined, which means “the fear of cannibalism” (cf. my interpretation of Shakespeare’s As You Like It).  I would like to introduce to the world (also known as Google) a new linguistic term:

analogy blindness (noun phrase): the inability to perceive what an analogy represents.  To be lost in the figure of an analogy itself, while losing sight of the concept that the analogy describes.


The Analogist: Polygamy is like going to a buffet instead of a single-serve restaurant.  Both are inadvisable.

The Person Who Is Blind to the Analogy: People love buffets!


The Analogist: Being taught how to write by Chuck Palahniuk is like being taught how to play football by a one-legged man.

The Person Who Is Blind to the Analogy: A one-legged man who knows how to coach football?  That’s great!


The Analogist: You should not have reprimanded her in such a rude manner for taking time off from work.  You treated her as if she were guilty of some terrible offense, such as plagiarism.

The Person Who Is Blind to the Analogy: But plagiarism is bad!


Derived from Hui-neng: When the wise person points at the Moon, the imbecile sees the finger.

Joseph Suglia

A commentary on HUMAN, ALL-TOO-HUMAN by Nietzsche / MENSCHLICHES, ALLZUMENSCHLICHES: Nietzsche and Sam Harris


A commentary by Joseph Suglia

MAM = Menschliches, Allzumenschliches. Ein Buch für freie Geister (1878); second edition: 1886

VMS = Vermischte Meinungen und Sprüche (1879)

WS = Der Wanderer und sein Schatten (1880)

The following will not have been an interpretation of Nietzsche’s Human, All-Too-Human.  It will have been a commentary: Comment taire? as the French say.  “How to silence?”  In other words: How should the commentator silence his or her own voice and invisibilize his or her own presence in order to amplify the sound of the text and magnify the text’s image?

An interpretation replaces one meaning with another, or, as Heidegger would say, regards one thing as another.  A commentary adds almost nothing to the text under consideration.

Nietzsche’s Psychological Reductionism and Perspectivalism

Human, All-Too-Human is almost unremittingly destructive.  For the most part, it only has a negative purpose: to demolish structures and systems of thought.  However, there is also a positive doctrine within these pages, and that is the doctrine of total irresponsibility and necessity (to which I will return below) and the promise of a future humanity that will be unencumbered by religion, morality, and metaphysics.

In the preface of the second edition (1886), Nietzsche makes the thrust and tenor of his book clear with the following words: The purpose of the book is “the inversion of customary valuations and valued customs” (die Umkehrung gewohnter Wertschätzungen und geschätzter Gewohnheiten).  The highest ideals are reduced to the basest human-all-too-humanness of human beings.  This is a form of psychological reductionism: Once-good values (love, fidelity, patriotism, motherliness) are deposed.  The man who mourns his dead child is an actor on an imaginary stage who performs the act of mourning in order to stir up the emotions of his spectators—he is vain, not selflessly moral.  The faithful girl wants to be cheated upon in order to prove her fidelity—she is egoistic, not selflessly moral.  The soldier wants to die on the battlefield in order to prove his patriotism—he is egoistic, not selflessly moral.  The mother gives up sleep to prove her virtuous motherliness—she is egoistic, not selflessly moral [MAM: 57].

The inversion of valuations leads to an advocacy of the worst values: vanity and egoism (but never the vaingloriousness of arrogance, against which Nietzsche warns us for purely tactical reasons).  As well as lying.  Nietzsche praises lying at the expense of the truth to the point at which lying becomes the truth, and the truth becomes a lie that pretends that it is true.  This, of course, is a paradox, for anyone who says, “There is no truth, only interpretations of truth” is assuming that one’s own statement is true.

Again and again, Nietzsche phenomenalizes the world.  Appearance (Schein) becomes being (Sein): The hypocrite is seduced by his own voice into believing the things that he says.  The priest who begins his priesthood as a hypocrite, more or less, will eventually turn into a pious man, without any affectation [MAM: 52].  The thing in itself is a phenomenon.  Everything is appearance.  There is no beyond-the-world; there is nothing outside of the world, no beyond on the other side of the world, no επέκεινα.

As far as egoism is concerned: Nietzsche tells us again and again: All human beings are self-directed.  I could have just as easily written, All human beings are selfish, but one must be careful.  Nietzsche does not believe in a hypostatized self.  Every individual, Nietzsche instructs us, is a dividual (divided against himself or herself), and the Nietzsche of Also Sprach Zarathustra (1883-1885) utterly repudiates the idea of a substantialized self.  To put it another way: No one acts purely for the benefit of another human being, for how could the first human being do anything without reference to himself or herself?: Nie hat ein Mensch Etwas gethan, das allein für Andere und ohne jeden persönlichen Beweggrund gethan wäre; ja wie sollte er Etwas thun können, das ohne Bezug zu ihm wäre? [MAM: 133].  Only a god would be purely other-directed.  Lichtenberg and La Rochefoucauld are Nietzsche’s constant points of reference in this regard.  Nietzsche never quotes this Rochefoucauldian apothegm, but he might as well have:

“True love is like a ghost which many have talked about, but few have seen.”


“Jealousy contains much more self-love than love.”

Whatever is considered “good” is relativized.  We are taught that the Good is continuous with the Evil, that both Good and Evil belong to the same continuum.  Indeed, there are no opposites, only degrees, gradations, shades, differentiations.  Opposites exist only in metaphysics, not in life, which means that every opposition is a false opposition.  When the free spirit recognizes the artificiality of all oppositions, s/he undergoes the “great liberation” (grosse Loslösung)—a tearing-away from all that is traditionally revered—and “perhaps turns [his or her] favor toward what previously had a bad reputation” (vielleicht nun seine Gunst dem zugewendet, was bisher in schlechtem Rufe stand) [Preface to the second edition].  The awareness that life cannot be divided into oppositions leads to an unhappy aloneness and a lone unhappiness, which can only be alleviated by the invention of other free spirits.

What is a “free spirit”?  A free spirit is someone who does not think in the categories of Either/Or, someone who does not think in the categories of Pro and Contra, but sees more than one side to every argument.  A free spirit does not merely see two sides to an argument, but rather as many sides as possible, an ever-multiplying multiplicity of sides.  As a result, free spirits no longer languish in the manacles of love and hatred; they live without Yes, without No.  They no longer trouble themselves over things that have nothing to do with them; they have to do with things that no longer trouble them.  They are mistresses and masters of every Pro and every Contra, every For and every Against.

All over the internet, you will find opposing camps: feminists and anti-feminists, those who defend religious faith and those who revile religious faith, liberals and conservatives.  Nietzsche would claim that each one of these camps is founded upon the presupposition of an error.  And here Nietzsche is unexpectedly close to Hegel: I am thinking of Nietzsche’s perspectivalism, which is, surprisingly, closer to the Hegelian dialectic than most Nietzscheans and Hegelians would admit, since they themselves tend to be one-sided.  In all disputes, the free spirit sees each perspective as unjust because one-sided.  Instead of choosing a single hand, the free spirit considers both what is on the one hand and what is on the other (einerseits—andererseits) [MAM: 292].  The free spirit hovers over all perspectives, valuations, evaluations, morals, customs, and laws: ihm muss als der wünschenswertheste Zustand jenes freie, furchtlose Schweben über Menschen, Sitten, Gesetzen und den herkömmlichen Schätzungen der Dinge genügen [MAM: 34].  It is invidiously simplistic and simplistically invidious to freeze any particular perspective.  Worse, it is anti-life, for life is conditioned by perspective and its injustices: das Leben selbst [ist] bedingt durch das Perspektivische und seine Ungerechtigkeit [Preface to the second edition].  A free spirit never takes one side or another, for that would reduce the problem in question to the simplicity of a fixed opposition, but instead does justice to the many-sidedness of every problem and thus does honor to the multifariousness of life.

There Is No Free Will.  Sam Harris’s Unspoken Indebtedness to Nietzsche.

Let me pause over three revolutions in the history of Western thought.

The cosmological revolution known as the “Copernican Revolution” marked a shift from the conception of a cosmos in which the Earth is the center to the conception of a system in which the Sun is the center.  A movement from geocentrism (and anthropocentrism) to heliocentrism.

The biological revolution took the shape of the theory of evolution (“It’s only a theory!” exclaim the unintelligent designers), which describes the adaptation of organisms to their environments through the process of non-random natural selection.

There is a third revolution, and it occurred in psychology.  I am not alluding to psychoanalysis, but rather to the revolution that predated psychoanalysis and made it possible (Freud was an admirer of Nietzsche).  Without the Nietzschean revolution, psychoanalysis would be unthinkable, and Twitter philosopher Sam Harris’s Free Will (2012) would never have existed.

I am alluding to the revolution that Nietzsche effected in 1878.  It was a silent revolution.  Almost no one seems aware that this revolution ever took place.

It is a revolution that marks the turning-away from voluntarism (the theory of free will) and the turning-toward determinism, and Nietzsche’s determinism will condition his critique of morality.  Nietzschean determinism is the doctrine of total irresponsibility and necessity.

[Let it be clear that I know that Spinoza, Hume, Hobbes, Schopenhauer, et al., wrote against the concept of the free will before Nietzsche.]

The free will is the idea that we have control over our own thoughts, moods, feelings, and actions.  It conceives of the mind as transparent to itself: We are aware in advance of why we do-say-write-think the things that we do-say-write-think.  This idea is false: You no more know what your next thought will be than you know what the next sentence of this commentary will be (if this is your first time reading this text).  It is only after the fact that we assign free will to the sources of actions, words, and thoughts.  Our thoughts, moods, and feelings—e.g. anger, desire, affection, envy—appear to us as isolated mental states, without reference to previous or subsequent thoughts, moods, and feelings: This is the origin of the misinterpretation of the human mind known as “the free will” (the definite article the even suggests that there is only one).  The free will is an illusion of which we would do well to disabuse ourselves.

We do not think our thoughts.  Our thoughts appear to us.  They come to the surfaces of our consciousness from the abysms of the unconscious mind.  Close your eyes, and focus on the surfacings and submersions of your own thoughts, and you will see what I mean.

This simple exercise of self-observation suffices to disprove the illusion of voluntarism.  If your mind is babbling, this very fact of consciousness refutes the idea of free will.  Mental babble invalidates the voluntarist hypothesis.  Does anyone truly believe that s/he wills babble into existence?  Does anyone deliberately choose the wrong word to say or the wrong action to perform?  If free will existed, infelicity would not exist at all or would exist less.  After all, what would free will be if not the thinking that maps out what one will have thought-done-said-written—before actually having thought one’s thought / done one’s deed / said one’s words / written one’s words?

Belief in free will provokes hatred, malice, guilt, regret, and the desire for vengeance.  After all, if someone chooses to behave in a hateful way, that person deserves to be hated.  Anyone who dispenses with the theory of the free will hates less and loves less.  No more desire for revenge, no more enmity.  No more guilt, no more regret.  No more rewards for impressive people who perform impressive acts, for rewarding implies that the rewarded could have acted differently than s/he did.  In a culture that accepted the doctrine of total irresponsibility, there would be neither heroes nor villains.  There would be no reason to heroize taxi drivers who return forgotten wallets and purses to their clients, nor would there be any reason to heroize oneself, since what a person does is not his choice / is not her choice.  No one would be praised, nor would anyone praise oneself.  No one would condemn others, nor would anyone condemn oneself.  Researchers would investigate the origins of human behavior, but would not punish, for the sources of all human thought and therefore the sources of all human behavior are beyond one’s conscious control / beyond the reach of consciousness.  It makes no sense to say / write that someone is “good” or “evil,” if goodness and evilness are not the products of a free will.  There is no absolute goodness or absolute evilness; nothing is good as such or evil as such.  There is neither voluntary goodness nor voluntary evilness.

If there is no free will, there is no human responsibility, either.  The second presupposes the first.  Do you call a monster “evil”?  A monster cannot be evil if it is not responsible for what it does.  Do we call earthquakes “evil”?  Do we call global warming “evil”?  Natural phenomena are exempt from morality, as are non-human animals.  We do not call natural phenomena “immoral”; we consider human beings “immoral” because we falsely assume the existence of a free will.  We feel guilt / regret for our “immoral” actions / thoughts, not because we are free, but because we falsely believe ourselves to be free: [W]eil sich der Mensch für frei hält, nicht aber weil er frei ist, empfindet er Reue und Gewissensbisse [MAM 39].  No one chooses to have Asperger syndrome or Borderline Personality Disorder.  Why, then, should someone who is afflicted with Asperger syndrome or Borderline Personality Disorder be termed “evil”?  No one chooses one’s genetic constitution.  You are no more responsible for the emergence of your thoughts and your actions than you are responsible for your circulatory system or for the sensation of hunger.

Those who would like to adumbrate Nietzsche’s “mature” thought should begin with Human, All-Too-Human (1878), not with Daybreak (1881).  Nietzsche’s critique of morality makes no sense whatsoever without an understanding of his deeper critique of voluntarism (the doctrine of free will): Again, the ideas of Good and Evil only make sense on the assumption of the existence of free will.

Anyone who dispenses with the idea of free will endorses a shift from a system of punishment to a system of deterrence (Abschreckung).  A system of deterrence would restrain and contain criminals so that someone would not behave badly, not because someone has behaved badly.  As Nietzsche reminds us, every human act is a concrescence of forces from the past: one’s parents, one’s teachers, one’s environment, one’s genetic constitution.  It makes no sense, then, to believe that any individual is responsible for what he or she does.  All human activity is motivated by physiology and the unconscious mind, not by Good or Evil.  Everything is necessary, and it might even be possible to precalculate all human activity, through the mechanics of artificial intelligence, to steal a march on every advance: Alles ist notwendig, jede Bewegung mathematisch auszurechnen… Die Täuschung des Handelnden über sich, die Annahme des freien Willens, gehört mit hinein in diesen auszurechnenden Mechanismus [MAM: 106].

If you accept the cruelty of necessity (and is life not cruel, if we have no say in what we think and what we do?), the nobility of humanity falls away (the letter of nobility, the Adelsbrief) [MAM: 107].  All human distinction is devalued, since it is predetermined—since it is necessary.  Human beings would finally recognize themselves within nature, not outside of nature, as animals among other animals.  I must cite this passage in English translation, one which is not irrelevant to this context and one which belongs to the most powerful writing I have ever read, alongside Macbeth’s soliloquy upon learning of his wife’s death: “The ant in the forest perhaps imagines just as strongly that it is the goal and purpose for the existence of the forest as we do, when we in our imagination tie the downfall of humanity almost involuntarily to the downfall of the Earth: Indeed, we are still modest if we stop there and do not arrange a general twilight of the world and of the gods (eine allgemeine Welt- und Götterdämmerung) for the funeral rites of the final human (zur Leichenfeier des letzten Menschen).  The most dispassionate astronomer can scarcely feel the lifeless Earth in any other way than as the gleaming and floating gravesite of humanity” [WS: 14].

The demystification of the theory of free will has been re-presented by Sam Harris, who might seem like the Prophet of the Doctrine of Necessity.  Those who have never read Nietzsche might believe that Dr. Harris is the first person to say these things, since Dr. Harris never credits Nietzsche’s theory of total human irresponsibility.  If you visit Dr. Harris’s Web site, you will discover a few English translations of Nietzsche on his Recommended Reading List.  We know that Dr. Harris’s first book (unpublished) was a novel in which Nietzsche is a character.  We also know that Dr. Harris was a student of Philosophy at Stanford University.  He would therefore not have been unaware of the Nietzschean resonances in his own text Free Will.  Why, then, has Dr. Harris never publicly acknowledged his indebtedness to Nietzschean determinism?

Nietzsche Is / Is Not (Always) a Misogynist.

In 1882, Nietzsche was sexually rejected by Lou Andreas-Salomé, a Russian intellectual, writer, and eventual psychoanalyst who was found spellbinding by seemingly every cerebral man she met, including Rilke and Paul Rée.  Since the first edition of Human, All-Too-Human was published four years before, Salomé’s rejection of Nietzsche cannot be said to have had an impact on his reflections on women at that stage in the evolution of his thinking.

Nietzsche is sometimes a misogynist.  But I must emphasize: He is not always a misogynist.

At times, Nietzsche praises women / is a philogynist.  To give evidence of Nietzsche’s philogyny, all one needs to do is cite Paragraph 377 of the first volume: “The perfect woman is a higher type of human being than the perfect man” (Das vollkommene Weib ist ein höherer Typus des Menschen, als der vollkommene Mann).  Elsewhere, Nietzsche extols the intelligence of women: Women have the faculty of understanding (Verstand), he writes, whereas men have mind (Gemüth) and passion (Leidenschaft) [MAM: 411].  The loftier term Verstand points to the superiority of women over men.  Here, Nietzsche is far from misogynistic—indeed, he almost seems gynocratic.

Nor is Nietzsche a misogynist, despite appearances, in the following passage—one in which he claims that women tolerate thought-directions that are logically in contradiction with one another: Widersprüche in weiblichen Köpfen.—Weil die Weiber so viel mehr persönlich als sachlich sind, vertragen sich in ihrem Gedankenkreise Richtungen, die logisch mit einander in Widerspruch sind: sie pflegen sich eben für die Vertreter dieser Richtungen der Reihe nach zu begeistern und nehmen deren Systeme in Bausch und Bogen an; doch so, dass überall dort eine todte Stelle entsteht, wo eine neue Persönlichkeit später das Übergewicht bekommt [MAM: 419].

To paraphrase: Nietzsche is saying that the minds of women are fluxuous and not in any pejorative sense.  He means that multiple positions coexist simultaneously in the consciousnesses of women.  Personalities are formed and then evacuate themselves, leaving dead spots (todte Stellen), where new personalities are activated.  This does not mean that the minds of women contain “dead spots”—it means that they are able to form and reform new personalities, which is a strength, not a weakness.  And yet does he not say the same thing about his invisible friends, the free spirits?  Free spirits are also in a state of constant flux, and their fluxuousness, while necessarily unjust to their own opinions, allows them to move from opinion to opinion with alacrity and to hold in their heads multiple opinions at the same time.  Free spirits have opinions and arguments, but no convictions, for convictions are petrific.  Free spirits are guiltless betrayers of their own opinions [MAM: 637] and goalless wanderers from opinion to opinion [MAM: 638].

Why would the substitution-of-one-position-for-another, intellectual inconstancy, be considered something negative?  Is not the ability to substitute a new position for an older one with alacrity a trait of the free spirit?  And is the free spirit not Nietzsche’s ideal human being—at least before the overhuman takes the stage?  Such is my main argument: Free-spiritedness is womanliness, and free spirits are womanly, if we accept Nietzsche’s definitions of “free-spiritedness” and of “womanliness.”

This is not to deny the strain of misogyny that runs throughout Nietzsche’s collected writings.  Yes, Nietzsche does write unkind and unjustifiable things about women—some of his statements about women are downright horrible and indefensible.  My objective here is to highlight the polysemy and polyvocality of his writing, its ambiguity.  For a further discussion of Nietzsche’s ambiguous representations of the feminine, consult Derrida’s Spurs, wherein he analyzes the figure of the veil in Beyond Good and Evil.

To say or write that Nietzsche is always a misogynist would be to disambiguate his work—if by “Nietzsche” one is referring to the paper Nietzsche.  (For a series of accounts of Nietzsche as a human being, see Conversations with Nietzsche: A Life in the Words of His Contemporaries, published by Oxford University Press.)  Nonetheless, let us pause over the historical, living human being Friedrich Nietzsche, who was male, and his relation to one historical, living human being, who was female: Marie Baumgartner, the mother of one of Nietzsche’s students and his sometime French translator.  In the original manuscript of Mixed Opinions and Maxims, the first appendix to Human, All-Too-Human, Nietzsche wrote: “Whether we have a serpent’s tooth or not is something that we do not know until someone has put his heel upon us.  Our character is determined even more by the lack of certain experiences than by what we have experienced” [VMS: 36].  In a letter to Nietzsche dated 13 November 1878, Marie Baumgartner wrote: “I would gladly have added to your very striking maxim: ‘a woman or mother would say, until someone puts his heel upon her darling or her child.’  For a woman will not silently allow something to happen to them that in most cases she patiently accepts for herself.”  Nietzsche was so affected by Baumgartner’s rather delicately worded suggestion that he modulated the text to reflect her proposal.  If Nietzsche regarded women as inferior (and he never did), why would he take seriously something that a female reader wrote about his manuscript—so seriously that he modified his manuscript to incorporate her words?  The fact that Nietzsche reflected Marie Baumgartner’s suggestion in the revision of his manuscript is evidence enough that he respected the intelligence of this particular woman—the grain of his own writing confirms that he respected the intelligence of women in general and even considered women in general to be more intelligent than men in general.

Nietzsche Was Not an Atheist, if by “Atheist” One Means “Someone Who Does Not Believe in God.”

Nietzsche tells us, in Paragraph Nine of the first volume, “Even if a metaphysical world did exist, it would be nothing other than an otherness [Anderssein] that would be unavailable and incomprehensible to us; it would be a thing with [purely] negative characteristics.”

My question (which has been inspired by Nietzsche) is the following: Why do we even care about the beyond?  Should questions such as “Is there life after death?” not be greeted with apathy?  Why are we engaged with such questions to begin with?  Do not such questions merit indifference rather than seriousness?

Questions such as “Does God exist?” and “Is there life after death?” cannot be answered scientifically or logically.  We do not require their answers in order to live.  All of us live out our lives without knowing the answers to such questions.  Not merely that: It is entirely possible to live out our lives without ever ASKING or PURSUING such questions—and would we not be better off for not having done so?

Let me put it another way: Do the questions “Why does the world exist?” and “Why is there being rather than nothing?” not presuppose a reason for existing and a reason for being?  I am looking at you, Heidegger.

The Nietzsche of 1878 is not an atheist, if by “atheist” one means “someone who does not believe in God.”  Those who contest the existence of a deity or deities are practicing a form of skiamachy.  Nietzsche, on the other hand, is someone who considers questions about the existence of God, or of any extra-worldly transcendence, to be superfluous.  Otherworldliness is not something that can be discussed, since it is purely negative.

Moreover, the Nietzsche of Human, All-Too-Human is not merely not an atheist.  He is also not a philosopher, if by “philosopher” we mean someone who speculates about imaginary worlds, an imaginary world-builder.  Nietzsche will not become a philosopher, speculative or otherwise, until the very end of his period of lucidity, with the doctrines of the Eternal Recurrence of the Always-Same and the Will to Power.

Nietzsche Contradicts Himself.  Often.  But This Is Not a Flaw in His Thinking.

Nietzsche contradicts himself—often—but this is not a flaw in his thinking.  He tells us to stop using the word “optimism” [MAM: 28] and then uses the word himself, without any perceptible irony, in other sections of the book.  After scolding us for believing in heroes, he warmly sponsors the “refined heroism” (verfeinerten Heroismus) of the free spirit who works in a small office and passes quietly into and out of life [MAM: 291].  In Paragraph 148 of the first volume, Nietzsche claims that the poet alleviates (erleichtert) life—this seems to contradict his claim, five paragraphs later, that “art makes the heart of the thinker heavy” (Die Kunst macht dem Denker das Herz schwer), that listening to Beethoven’s Ninth Symphony infuses the listener with the heavy feeling of immortality, with religious and metaphysical conceptions.  If Nietzsche contradicts himself, and he does, this is because free-spiritedness is multitudinous, multi-perspectival, self-contradictory thinking.  Free-spiritedness is multi-spiritedness.

Aphorisms Inspired by Nietzsche

On Religion and Politics

What is religious is political, and what is political is religious.

On Morality

Morality depends on opportunity.

On Communication

A word means something different to you than it does to me, which means that communication is impossible: Nothing is communicable save the power to communicate the impossibility of communication.  (Nietzsche suggests that the worst alienation is when two people fail to understand each other’s irony.)  Consciousness of this fact would liberate us from the bitterness and intensity of every sensation.

On Interpretation

The mind is geared not toward what has been interpreted, but toward that which has not been interpreted and might not even be interpretable.  Nietzsche: “We take something that is unexplained and obscure to be more important than something that has been explained and made clear” [MAM: 532].

On the Voice

We often disagree with someone because of the sound of his or her voice.  We often agree with someone because of the sound of his or her voice.

On Salvation

In a 1966 interview with Der Spiegel, Heidegger claimed: “Only a god can save us.”  This statement must be revised: Not even a god could save us now.

On Censorial America

In contemporary America, you may be prosecuted and persecuted for what you think, insofar as what you think is available in language.

Joseph Suglia

A Critique of David Foster Wallace: Part Two: A Supposedly Fun Thing That I Will Never Do Again / “E Unibus Pluram: Television and U.S. Fiction” / “Getting Away from Already Being Pretty Much Away from It All” / “David Lynch Keeps His Head”

An Analysis of A SUPPOSEDLY FUN THING THAT I WILL NEVER DO AGAIN (David Foster Wallace) by Joseph Suglia

I have written it before, and I will write it again: Writing fictionally was not one of David Foster Wallace’s gifts.  His métier was, perhaps, mathematics.  It is possible that David Foster Wallace was a talented theorist of mathematics (I am unqualified to judge one’s talents in the field of mathematics), but he was an absolutely dreadful writer of ponderous fictions (I am qualified to judge one’s talents in the field of literature).

Wallace’s essay aggregate A Supposedly Fun Thing that I Will Never Do Again (1997) is worth reading, if one is an undiscriminating reader, but it also contains a number of vexing difficulties that should be addressed.  I will focus here upon the two essays to which I was most attracted: “E Unibus Pluram: Television and U.S. Fiction” and “David Lynch Keeps His Head,” a conspectus on the director’s cinema from Eraserhead (1977) to Lost Highway (1997).  Wallace seems unaware of Lynch’s work before 1977.

In “E Unibus Pluram,” Wallace warmly defends the Glass Teat in the way that only an American can.  He sees very little wrong with television, other than the fact that it can become, in his words, a “malignant addiction,” which does not imply, as Wallace takes pains to remind us, that it is “evil” or “hypnotizing” (38).  Perish the thought!

Wallace exhorts American writers to watch television.  Not merely should those who write WATCH television, Wallace contends; they should ABSORB television.  Here is Wallace’s inaugural argument (I will attempt to imitate his prose):

1.) Writers of fiction are creepy oglers.
2.) Television allows creepy, ogling fiction writers to spy on Americans and draw material from what they see.
3.) Americans who appear on television know that they are being seen, so this is scopophilia, but not voyeurism in the classical sense. [Apparently, one is spying on average Americans when one watches actors and actresses on American television.]
4.) For this reason, writers can spy without feeling uncomfortable and without feeling that what they’re doing is morally problematic.

Wallace: “If we want to know what American normality is – i.e. what Americans want to regard as normal – we can trust television… [W]riters can have faith in television” (22).

“Trust what is familiar!” in other words.  “Embrace what is in front of you!” to paraphrase.  Most contemporary American writers grew up in the lambent glow of the cathode-ray tube, and in their sentences the reader can hear the jangle and buzz of television.  David Foster Wallace was wrong.  No, writers should NOT trust television.  No, they should NOT have faith in the televisual eye, the eye that is seen but does not see.  The language of television has long since colonized the minds of contemporary American writers, which is likely why David Foster Wallace, Chuck Klosterman, and Jonathan Safran Foer cannot focus on a single point for more than a paragraph, why Thomas Pynchon’s clownish, jokey dialogue sounds as if it were culled from Gilligan’s Island, and why Don DeLillo’s portentous, pathos-glutted dialogue sounds as if it were siphoned from Dragnet.

There are scattershot arguments here, the most salient one being that postmodern fiction canalizes televisual waste.  That is my phrasing, not Wallace’s.  Wallace writes, simply and benevolently, that television and postmodern fiction “share roots” (65).  He appears to be suggesting that they both sprang up at exactly the same time.  They did not, of course.  One cannot accept Wallace’s argument without qualification.  To revise his thesis: Postmodern fiction–in particular, the writings of Leyner, DeLillo, Pynchon, Barth, Apple, Barthelme, and David Foster Wallace–is inconceivable outside of a relation to television.  But what would the ontogenesis of postmodern fiction matter, given that these fictions are anemic, execrably written, sickeningly smarmy, cloyingly self-conscious, and/or forgettable?

It did matter to Wallace, since he was a postmodernist fictionist.  Let me enlarge an earlier statement.  Wallace is suggesting (this is my interpretation of his words): “Embrace popular culture, or be embraced by popular culture!”  The first pose is that of a hipster; the second pose is that of the Deluded Consumer.  It would be otiose to claim that Wallace was not a hipster, when we are (mis)treated by so many hipsterisms, such as: “So then why do I get the in-joke? Because I, the viewer, outside the glass with the rest of the Audience, am IN on the in-joke” (32).  Or, in a paragraph in which he nods fraternally to the “campus hipsters” (76) who read him and read (past tense) Leyner: “We can resolve the problem [of being trapped in the televisual aura] by celebrating it.  Transcend feelings of mass-defined angst [sic] by genuflecting to them.  We can be reverently ironic” (Ibid.).  Again, he appears to be implying: “Embrace popular culture, or be embraced by popular culture!”  That is your false dilemma.  If you want others to think that you are special (every hipster’s secret desire), watch television with a REVERENT IRONY.  Wallace’s hipper-than-thou sanctimoniousness is smeared over every page.

Now let me turn to the Lynch essay, the strongest in the collection.  There are several insightful remarks here, particularly Wallace’s observation that Lynch’s cinema has a “clear relation” (197) to Abstract Expressionism and the cinema of German Expressionism.  There are some serious weaknesses and imprecisions, as well.

Wallace: “Except now for Richard Pryor, has there ever been even like ONE black person in a David Lynch movie? … I.e. why are Lynch’s movies all so white? … The likely answer is that Lynch’s movies are essentially apolitical” (189).

To write that there are no black people in Lynch’s gentrified neighborhood is to display one’s ignorance.  The truth is that at least one African-American appeared in the Lynchian universe before Lost Highway: Gregg Dandridge, who is very much an African-American, played Bobbie Ray Lemon in Wild at Heart (1990).  Did Wallace never see this film?  How could Wallace have forgotten the opening cataclysm, the cataclysmic opening of Wild at Heart?  Who could forget Sailor Ripley slamming Bobbie Ray Lemon’s head against a staircase railing and then against a floor until his head bursts, splattering like a splitting pomegranate?

To say that Lynch’s films are apolitical is to display one’s innocence.  No work of art is apolitical, because all art is political.  How could Wallace have missed Lynch’s heartlandish downhomeness?  How could he have failed to notice Lynch’s repulsed fascination with the muck and the slime, with the louche underworld that lies beneath the well-trimmed lawns that line Lynch’s suburban streets?  And how could he have failed to draw a political conclusion, a political inference, from this repulsed fascination, from this fascinated repulsion?

Let me commend these essays to the undiscriminating reader, as unconvincing as they are.  Everything collected here is nothing if not badly written, especially “Getting Away from Already Being Pretty Much Away from It All,” a hipsterish pamphlet about Midwestern state fairs that would not have existed were it not for David Byrne’s True Stories (1986), both the film and the book.  It is my hope that David Foster Wallace will someday be remembered as the talented mathematician he perhaps was and not as the brilliant fictioneer he certainly was not.

Joseph Suglia