Commencements and graduations have seasons—we’re just finishing up with one now—and, more and more, I think they have vintages. Year to year, a defining mood emerges out of the web of exhortation and prospect that all that pomp and circumstance lets loose. Some years seem full of inspirational stories; some years seem full of manifestos; some years seem primarily jokey. (All years, of course, have all of these going on to some extent.)
With my antennae more or less permanently oriented toward music and the arts, the defining mood of this year’s commencement season has been realism. This is a year in which, it seems, society is determined not to let students of the arts out into the world without making sure they’re painfully aware of what awaits them, financially and socially. A May 2013 report from the Georgetown University Center on Education and the Workforce, showing relatively high unemployment and low median income for graduates in the arts, got a lot of play this spring. On the PBS NewsHour, Paul Solman devoted an entire segment to the dim job prospects facing music graduates.
At the Columbia University School of the Arts, David Byrne not only relayed a host of such statistics to the graduates, he did so, in either a bout of détournement or defeat, via the preferred expressive medium of corporate America: a PowerPoint presentation. (Rachel Arons reported on Byrne’s address for The New Yorker.) The message to the graduates: the jobs aren’t there; the opportunities aren’t there; the money’s not there. It’s not a new message; I’ve been hearing it for as long as I’ve been in music, one way or another. And I’ve come to find it amazing—amazing that people can be in the grip of a delusion so strong that they don’t even realize it.
I’m referring to the critics, of course. Who did you think I was talking about?

***
Here’s one of the more sweeping and true things that C. Wright Mills ever wrote:
The first rule for understanding the human condition is that men live in second-hand worlds.
Mills used to be better known than he is today. He was a classically trained sociologist who honed his considerable analytical skills in a very specific time and place: Cold-War America. A Texan who moved east, he cultivated an outsider image, living hard, riding a motorcycle to and from his Columbia office. Toward the end of his life, a yearning to transform analysis into action led him to become one of the first Castro apologists among the American left—his 1960 “pamphlet” Listen, Yankee! marked the zenith of his celebrity, but its time-capsule nature might well symbolize Mills’s posthumous fade: as many of the assumptions of the Cold War became obsolete, Mills’s analysis of them became ever harder to apply.
This is unfortunate, since, on those occasions when Mills went global with his conclusions, he pinpointed tensions and deep currents that, far from becoming obsolete, have since only metastasized. The quote above is from one of a series of lectures that Mills gave at the London School of Economics during the 1958-59 term, which were also broadcast on the BBC Third Programme (and later published in the BBC’s house magazine). Mills called this lecture “The Cultural Apparatus.” This was also to be the title of a book-length study of American intellectual life, a book that remained unfinished when Mills died in 1962. It would have covered one of his favorite topics: the relationship between ideas and power, how the intellectual community aspired to influence the power elite (a term Mills coined), how those in power exploited their connection to the world of ideas.
Mills was unusually attuned to the way culture creates second-hand worlds, sets of assumptions and ready-made judgments that we try to fit onto experience, even if the fit isn’t so good. He was also attuned to the multiplicity of cultures, those beyond art and science and intellectual pursuits. One in particular fascinated Mills: the culture of capitalism.
As early as 1951, Mills was attentive to how post-war capitalism marginalized and even infantilized creative pursuits. In his book White Collar, Mills noted that most diagnoses of worker dissatisfaction and alienation in the wake of the Industrial Revolution could be traced back to the gulf between the ideal of craftsmanship—where there is “no ulterior motive in work other than the product being made and the processes of its creation”—and the reality of modern work: “The model of craftsmanship is an anachronism….As a practice, craftsmanship has largely been trivialized into ‘hobbies,’ part of leisure not of work.” A couple of years later, in an opinion piece for the New York Herald Tribune, Mills reiterated that assessment (“Today many people have to trivialize their true interests into ‘hobbies,’ which are socially considered as unserious pastimes.”) before going on to chart how the culture of business and capitalism had infected even leisure itself, “that tired frenzy by which we strive for the animated glee we call fun.” He offered a counterproposal, one worthy of any charge to the graduates:
We ought to judge the quality and level of our personal culture by the best that has been achieved anywhere and any time, and we ought to go further than that: with our material equipment, and the more ample time it might make available, we ought to project our ideals even higher than the best mankind has ever achieved. Were we to do this, seriously and imaginatively, we would see that our choice is between genuine leisure, which enlarges the feeling and reason, and spurious leisure, which blunts the very capacity for truly personal experience.
The catch? “The first thing to be said about this choice,” Mills pointed out, “is that most Americans never get to make it.” Why not? Because the culture of business has become so ingrained in society that the choice is forestalled—as impractical, as quixotic, as irresponsible. Because, for artists, it’s capitalism that’s the second-hand world. And it is not a good fit.

***
Every tally of post-graduate income, every analysis of the financial worth of the arts, every management-speak probe into the wherewithal of artistic institutions: they all prioritize the culture of capitalism and business over the culture of art. We do it all the time, of course. Occasionally, it’s even necessary. But every time, the balance shifts a little.
The thing about such talk about the arts is that it’s not even wrong. Statistics and graphs, tables and percentages—the factual nature of it disguises the damage that it does. It’s like a live recording of a concert in which the crowd noise is mixed so high that it drowns out the music. It’s not technically inaccurate; it may even (based on some concerts I’ve been to) be a somewhat faithful documentation of the experience. But it misses the point, to the detriment of its own reason for being.
Still, we have to talk about it, don’t we? People have to eat. People have to pay rent. People have to make money. It’s ridiculous to think we shouldn’t talk about it, right? And it feels productive to talk about it. It feels incisive: getting down to brass tacks. Or the other cliché, which stalked Byrne’s address, as The New Yorker reported: “Claire Simno, from New Orleans, liked how Byrne had confronted the ‘elephant in the room,’ and felt confident about the future of her son, Jeff, a theatre grad.”
In other words, it’s being realistic. And we all have to be realistic. But exactly what mountain is that commandment coming down from? Why is realism necessary to survive in the world we’ve ended up in? Realism and lack of sentiment are, after all, virtues of the business world. So you might wonder if any message bearing those virtues is getting anywhere near to the root of the matter, to the Precambrian levels of encrusted assumptions that pass for conventional wisdom. Short answer: nope. Because if you really start overturning deep assumptions about anything, if you really start to question just why the elephant is in the room in the first place, you don’t sound realistic or pragmatic; you sound instead like—well, you sound like Ludwig Wittgenstein sounded to Bertrand Russell when they first met, when the philosophical conversation turned on yet another large mammal in a small space. Russell:
My German engineer, I think, is a fool. He thinks nothing empirical is knowable—I asked him to admit that there was not a rhinoceros in the room, but he wouldn’t.
You sound like a fool. Fools have had a hard time of it since the advent of industrial capitalism—the term has come to be synonymous with a dupe, a sucker. A fool and his money are soon parted. What worse insult is there in a free-market society?

***
It is probably not an accident that the last refuge of the fool is in the theater. The arts.
The fool in Philip Barry’s 1928 play Holiday is Ned Seton, only son and youngest child of the wealthy Seton family. Ned’s oldest sister, Linda, is restless and rebellious, constantly chafing at the straitjacket of privilege. The middle child, Julia, is pretty but vague—Linda tries to protect her; Ned is skeptical of whether she wants protecting. The ignition of the plot is Julia’s engagement, to Johnny Case—who, it turns out, sees business as only a means to an end, wanting only to make enough money to retire young, to the horror of the old-money Seton patriarch, and the delight of Linda Seton.
Ned is pleasant and ineffectual on the surface. He is also a truth-teller—we know this because he is, at the same time, in a Philip Barry play and a drunk. (Drunks are always truth-tellers in Barry’s plays. His best-known play, The Philadelphia Story, hinges on the fact that certain characters will only tell the truth when they’re drunk.) Ned wilts in the face of his father’s demands; he is not taken seriously. But he has the fool’s prerogative of saying what others won’t, be it about his late mother (“Drink to Mother, Johnny—she tried to be a Seton for a while, then gave up and died.”) or about Julia (“At bottom she’s a very dull girl, and the life she pictures for herself is the life she belongs in.”). It is Ned who teaches Linda an existential lesson in the guise of the “game” of drunkenness:
NED: Swell game. Most terribly exciting game.
LINDA: You—get beaten, though, don’t you?
NED: Sure, but that’s good, too. Then you don’t mind anything—not anything at all. Then you sleep.
LINDA [She is watching him, fascinated]: How—long can you keep it up?
NED: A long while. As long as you last.
LINDA: Oh, Ned—that’s awful!
NED: Think so?—Other things are worse.
LINDA: But—where do you end up?
NED: Where does everybody end up? You die—And that’s all right, too.
(This is reminiscent of Pompey, the fool in Shakespeare’s Measure for Measure: “[H]e that drinks all night, and is hanged betimes in the morning, may sleep the sounder all the next day.”)
It is significant, then, that in the 1938 film version of Holiday (the second such adaptation, such was the play’s popularity), Ned is a composer. When Johnny finds Linda in the playroom—once the children’s domain, now Linda’s refuge—he notices the piano, the guitar, the drums tucked against the wall, long unused: Ned’s instruments.
LINDA: He could’ve been a fine musician.
JOHNNY: What do you mean, could’ve been?
LINDA: If father hadn’t interfered.
When Ned wanders in (“I haven’t been in this room in years.”), Linda prods him into playing some of his music—the “Seton Concerto in F major.” Ned sits down and, after a few mocking bars of “Bei Mir Bist Du Schön” and a sisterly reproach (“Oh, Neddy, stop—I’ve been boasting about you!”), we get a breeze of Gershwin-esque haze, lovely, fleeting, and unfocused—rather like Ned himself—and then it is abruptly cut off by Julia’s entrance into the scene. At the end of the story, Ned remains trapped in the family, even as Linda and Johnny escape. His tragedy is that he knows exactly how trapped he is.
The screenplay for Holiday was co-written by Donald Ogden Stewart. A famous humorist and raconteur—something like the David Sedaris of his day—Stewart had appeared in the original Broadway production of Holiday: the part of Nick Potter was, in essence, Barry’s stylized version of his friend. The two were on the same wavelength in their fascination with the rich, equal parts envy and critique, a portrayal in which wealth is as much of a trap as poverty. If it was Stewart who made Ned into a composer, an anticipatory manifestation of Mills’s warning about the way capitalism trivialized the arts into hobbies, then it would prove personally prophetic as well. After moving to California and making a handsome living writing screenplays for the studios, Stewart eventually began to feel guilty about it. “I had won all the money and status that America had to offer—and it just hadn’t been good enough,” he wrote. “The next step was Socialism.”
He knew that he had the priorities wrong; but that doesn’t mean he got them right. Socialism was still a materialist lens, capitalism’s mirror image—the criteria were still profit and loss, haves and have-nots. Stewart’s commitment was sincere and unwavering: he was proud of his convictions. He ended up blacklisted, and lived the last three decades of his life in exile in England. He had realized that there was something beyond the culture of business and capital; he just wasn’t able to trust that it could be art.

***
To start with, we don’t trust the language anymore. Maybe we never did: most current language about music and art is still, in large part, similar to Romantic language about music and art, and Romantic language is, not to put too fine a point on it, purple as a bruise. That was the point—the original Romantics figured that if any language was going to get at the effect of beauty, it would be at the far edges of sense and sensibility. It worked for a while, but it also made it all the easier for the simplistic, seemingly more objective language of business and analysis to push it aside, to make it sound, by comparison, ridiculous.
This is another game, this one so painstakingly explored by Wittgenstein in the years after he refused to admit that a rhinoceros was not in the room. It’s a language game. Wittgenstein, too, went out to the edge of language, but in a microscopic rather than a macroscopic way. He frogged the knit of language, the way we organize words and propositions into games, with their own definitions and rules. Philosophical problems, for Wittgenstein, were language-game problems, the result of thinking that we’re applying one set of rules when another is already in play, the assumptions invisible in plain sight (“The aspects of things that are most important for us are hidden because of their simplicity and familiarity.”), our thinking lacking the perception to realize we’re even playing the game. “The confusions which occupy us arise when language is, as it were, idling,” he wrote, “not when it is doing work.”
In his final work, Philosophical Investigations (from which I have been quoting), Wittgenstein keeps setting up simple language games to train the reader to notice when the grander games come to bear. One of these games—which I find especially apt, since it stakes its ground at the nexus of market realism and the arts, the intersection of aspiration and expectation—concerns wishes.
439. In what sense can one call wishes, expectations, beliefs, etc. “unsatisfied”? What is our prototype of non-satisfaction? Is it a hollow space? And would one call that “unsatisfied”? Wouldn’t this be a metaphor too?—Isn’t what we call non-satisfaction—say, hunger—a feeling?
In a particular system of expressions we can describe an object by means of the words “satisfied” and “unsatisfied”. For example, if we stipulate that a hollow cylinder is to be called “an unsatisfied cylinder”, and the solid cylinder that fills it “its satisfaction”.
441. By nature and by a particular training, a particular education, we are predisposed to express wishes in certain circumstances…. In this game [that is, the “satisfied”/”unsatisfied” game], the question as to whether I know what I wish before my wish is fulfilled cannot arise at all. And the fact that some event stops my wishing does not mean that it fulfills it. Perhaps I wouldn’t have been satisfied if my wish had been satisfied.
In this game, the question as to whether I know what I wish before my wish is fulfilled cannot arise at all. There is a specter haunting artists in capitalist societies: the specter of regret. Within the ingrained societal habit of talking about achievement and satisfaction in market-based terms, it is regret that unites C. Wright Mills’s disaffected worker, Donald Ogden Stewart’s guilt, Ned Seton’s tragedy. The game is rigged: you can either regret choosing a career in the arts, or you can regret not choosing a career in the arts.
Or, maybe, you can simply decide not to regret. It’s harder than it seems, though. Mills knew how ingrained the conditioning was: “Growing up and working within it, educated by it, many cultural workmen today never feel the need to make political choices simply because they are in fact committed before the age of political consent.” So, in his linguistic way, did Wittgenstein: “One cannot guess how a word functions. One has to look at its application and learn from that,” he wrote. “But the difficulty is to remove the prejudice which stands in the way of doing so.” He added: “It is not a stupid prejudice.”

***
Just know this: realism, in the hard-nosed, nickel-and-dime business sense, is a way of maintaining the status quo. Wittgenstein and his rhinoceros excepted, every time I have heard or read someone confront the large animal in the room of whatever they’re facing, it is because they have given up on ever getting it out of the room at all. Brass tacks pin you down.
Everybody in the music world, I think, subscribes to the idea that music is more than just entertainment, that it is transformative, that listeners should be changed by the experience. But in the face of the encroachment of free-market and capitalist rhetoric and values into every corner of society, that sort of talk about music has been reduced to the level of platitudes. “Music can change the world!” sounds sentimental and unrealistic. But do we believe it or not? Maybe a statement like that isn’t extravagant enough. Art’s realism is no less real than capitalism’s realism, even if the respective vocabularies stand in disparate esteem. The first step toward resolving the disparity might be, literally, to talk the talk. The danger? You might get lumped in with fools. But it’s fools who know the score; and anyone who calls you unrealistic isn’t really interested in anything beyond cosmetic changes anyway.
In the midst of working on the Philosophical Investigations, in one of his many notebooks, Wittgenstein jotted down an unusually emphatic command:
Don’t for heaven’s sake be afraid of talking nonsense! But you must pay attention to your nonsense.
My thoroughly unqualified advice to the graduates? Don’t stop until everyone else is paying attention to your nonsense as well.