As Marc Bousquet has shown in How the University Works, a 1989 document called the “Bowen Report” predicted that the mid-1990’s would see “a substantial excess demand for faculty in the arts and sciences,” with five jobs open for every four applicants. Instead, there proved to be one job for every three applicants. Let me repeat that, because it’s stunning, and because it’s one of the best summaries we have of exactly what went on: the Bowen Report predicted that the mid-1990’s would see “substantial excess demand,” with five jobs open for every four applicants. Instead, there proved to be one job for every three applicants. We were taught that the old abundance of academic jobs that had blessed our mentors in their youth (the putative ancestor of the 1990’s fictitious “excess demand”) had been based on the expansion of the American economy, and that both would simply continue to the end of days.
In reality, by the end of the 1970’s growth had already shifted from the real economy of manufacturing production goods for export, to the fairytale economy (remember “Goldilocks”?) of finance and consumption. The vast deficits of the Reagan era signaled a normalization of unsound money backed by endless debt. Combined with twin oil bonanzas in Prudhoe Bay and the North Sea, this easy credit (that is, ballooning debt) created a false sense of economic security among the owning classes of the US and UK. That false sense of security was cultivated and maintained by a speculative investment market that required their “confidence” in order to function. The American dependence on debt forces all its participants–including the foreign Central banks that buy U.S. Treasury Bills–to maintain the fiction that one day, the ever-growing debt will be paid. That, however, would require that the dollar be backed by something other than the printing press and the Federal Reserve banks that loaned it into existence. The only choice for the participants is to insist (on those very rare occasions when the question is brought to bear) that one is simply betting on “future economic growth” to generate the vast wealth required to pay down all the interest and the principal. The need for this pretense (and others) led the national narrative–the journalism that drives “groupthink”–dangerously far from the solid ground of real economic and ecological events.
Just as neurotic symptoms arise as defenses against a problem that the patient has repressed down into the unconscious, the Reagan-Thatcher pathology of borrow-and-consume was a defense against the quickly repressed “malaise” of the 1970’s. Largely unremarked at the time, the years 1970 and 1971 saw two events whose significance for America’s future can’t be overstated: domestic oil production peaked in the Lower 48 states and went into irreversible decline, and the greenback was taken off the gold standard as Richard Nixon “temporarily” ended the Bretton Woods accord. Those are, of course, the two sides of a single coin. Without cheap and abundant hydrocarbon energy, it became harder and harder to convert raw materials into finished goods for export, and the finance economy ballooned to fill the gap left by a hollowed-out manufacturing sector. After the international oil shocks of 1973 (associated with the Yom Kippur War in Israel) and 1979 (the Iranian Revolution), American influence brought the price of Saudi crude very low. This was perhaps the main driver of the USSR’s collapse, since oil was and remains Russia’s chief export. In managing that Saudi relationship, President Reagan was building on the work of his Republican predecessor: after all, why hadn’t the dollar collapsed in 1971 when Nixon unmoored the greenback from gold and forced it to “float”? Because around the same moment, his administration also ensured that Saudi crude would only be sold in American dollars, no matter who the customer might be. That meant the rest of the world had to produce real goods and services, trade them for American dollars, and use those dollars to buy the petroleum they needed–whereas Washington could simply turn on the presses at the Fed and print the dollars up from nothing, getting America its Saudi oil essentially for free. This is a generalization, but the gist is right.
The process is called “petrodollar recycling”: America sells dollar-denominated paper to China in the form of Treasury Bills, bank bonds, corporate bonds, and stock. In exchange for these and other American IOU’s, China sends money to the US Treasury and to US banks and US firms. These then pay their American workers (for jobs that may or may not generate real wealth, the way manufacturing and agriculture do, but finance does not), who in turn spend their dollars on cheap imported Chinese goods at Wal-Mart and Target. The Chinese who receive these American consumer dollars can then spend them on Saudi crude oil to fuel Chinese manufacturing, and the process continues. Of course, every step of this sketch is a reductive distortion, but the general pattern is accurate, and it is the pattern that is just beginning to break down. Like many cycles, it looks sustainable if you only pay attention to the moving parts, not where they originate or end up. China accumulates endless mountains of increasingly risky American paper investment instruments (over a trillion as of this writing); America accumulates staggering debts; U.S. consumers spend money they don’t have and wind up indentured to banks and other creditors; and the biosphere on which we all depend is depleted to the brink of ecological disaster. Without those four inconvenient externalities, the system is just fine.
The year 1970 was a pivotal one for American power for the two linked reasons explained above, domestic oil scarcity and the fiat dollar. This helps explain the shifting sands on which American undergraduate and (especially) graduate education has been built. “The last year in which the notion of apprenticeship had any validity for the [academic] profession,” writes Cary Nelson (in “Resistance Is Not Futile”), “was 1970.” In the apprenticeship model, a graduate student earns her doctorate by taking courses, writing a dissertation, and learning to teach undergraduates by doing so throughout her graduate education (or at least after the first or second year of study is complete). The term “apprenticeship” applies because after the PhD is complete, the fully trained PhD steps into an academic job similar to the job of the mentor who trained her. That presupposes a flow of interested undergraduates on one side, proportionally matched by new full-time teaching posts on the other side. What has changed is not a decline in the number of college students in need of instruction – there is no shortage of undergraduates. What has changed is the way universities get their teaching done: they no longer hire PhDs and pay them an adequate lower-middle-class salary with medical benefits and a retirement provision, along with some basic guarantee of academic freedom. The faster way to say that is “tenure-track,” but the most controversial part of tenure – the alleged lifelong immunity to being fired – is not the salient point. This is: “Thirty-five years ago, nearly 75 percent of all college teachers were tenurable; only a quarter worked on an adjunct, part-time, or non-tenurable basis. Today, those proportions are reversed.” Marc Bousquet continues:
“If you’re enrolled in four college classes right now, you have a pretty good chance that one of the four will be taught by someone who has earned a doctorate and whose teaching, scholarship, and service to the profession has undergone the intensive peer scrutiny associated with the tenure system. In your other three classes, however, you are likely to be taught by someone who has started a degree but not finished it; was hired by a manager, not professional peers; may never publish in the field she is teaching; got into the pool of persons being considered for the job because she was willing to work for wages around the official poverty line (often under the delusion that she could “work her way into” a tenurable position); and does not plan to be working at your institution three years from now. In almost all courses in most disciplines using nontenurable or adjunct faculty, a person with a recently earned Ph.D. was available, and would gladly have taught your other three courses, but could not afford to pay their loans and house themselves on the wage being offered… Most undergraduate education is conducted by a super-exploited corps of disposable workers…often collecting wages and benefits inferior to those of fast-food clerks and bell-hops. According to the Coalition on the Academic Workforce survey of 2000, for instance, fewer than one-third of the responding programs paid first-year writing instructors more than $2,500 a class; nearly half (47.6%) paid these instructors less than $2,000 per class. At that rate, a full-time load of eight classes nets less than $16,000 annually and includes no benefits… Like Wal-Mart employees, the majority-female contingent academic workforce relies on a patchwork of other sources of income, including such forms of public assistance as food stamps…” (Bousquet, 2-3).
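The arithmetic behind Bousquet’s figures is easy to check. A minimal sketch: the per-class rate and the eight-class load are from the survey he cites; the comparison against the federal poverty line is my addition, using the HHS poverty guideline figures for 2000 (roughly $8,350 for a single person, $17,050 for a family of four), which are assumptions of mine, not figures from the text:

```python
# Sanity-check the adjunct-wage arithmetic Bousquet cites.
per_class = 2_000        # 47.6% of programs paid less than this per class (CAW survey, 2000)
full_load = 8            # classes per year treated as a "full-time" load

annual = per_class * full_load
print(f"Annual gross at a full load: ${annual:,}")   # $16,000 -- "nets less than $16,000"

# My addition: approximate 2000 HHS poverty guidelines for comparison.
poverty_single_2000 = 8_350
poverty_family_of_four_2000 = 17_050

print(annual < poverty_family_of_four_2000)  # a full load falls below the family-of-four line
```

So a writing instructor teaching a full eight-course load at the median-or-below rate grossed less than the poverty guideline for a family of four, which is what makes the comparison to Wal-Mart wages and food stamps more than rhetorical.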
The number of incoming undergraduates in need of professorial instruction at American universities has continued to grow; so has the number of PhD’s “produced” by graduate programs at the same universities. What has radically shrunk is the number of full-time tenurable faculty jobs where those PhD’s can be employed to do that undergraduate teaching, since the jobs have been replaced with super-exploitive “adjunct” professorships. “In the reality of structural casualization,” Bousquet continues, “the jobs of professors taking early retirement are often eliminated, not filled with new degree holders.” (80) During the same period, the number of administrators and the size of their salaries grew at alarming rates. Descriptions of the would-be new-faculty predicament were proffered by deans, department chairs, and disciplinary professional organizations such as the Modern Language Association (MLA), which ministers to English departments. Those descriptions were articulated in a management discourse whose primary allegiance is not to the university’s tuition-paying clientele, the faculty, or civil society at large, but to capital. So its terms were exclusively those of supply and demand, production and capacity. Since this presupposes an equivalence between higher education and any other product—such as the cars Henry Ford manufactured while formulating the kernel of modern corporate management doctrine—“Fordism” is the name given to the supply-side approach. The problem, from the Fordist perspective, is not that the good teaching jobs have been essentially stolen and alchemized into new stadiums, dorms, and Assistant Provost salaries; no, the problem is that too many PhD’s have been “produced” by graduate programs whose size must therefore be curtailed:
The Fordism of the discourse surrounding graduate education is a nearly unchanged survival of the dominant interpretive frame established between 1968 and 1970, when a freight train of scholarship decrying a Cold War ‘shortage’ of degree holders suddenly reversed itself in attempting to account for a Vietnam-era ‘surplus.’ (Bousquet, 190)
The now-venerable Full Professors who had arrived in the 1960’s were imprinted with (and misled by) an experience of opportunity abundance. Those hired in the 1970’s and 1980’s, by contrast, had the false impression that the conditions of their own scary-but-successful scramble for academic employment represented the worst the profession would ever throw at any fresh crop of candidates. Since they had achieved tenure anyway, surely their students, too, could sing “We Shall Overcome” and prove it true. These mentors and thesis advisors believed the Bowen Report’s fish story of an imminent 1990’s job-glut for a tangle of reasons: it appealed to their conscience (without the Report’s promise of a new golden age, they would be guilty of training new PhD’s for jobs that would never materialize: a cruel hoax); it appealed to their vanity (we strained uphill and made it to the heights, but you kids will sail downhill toward even greener pastures); and it concealed both the real situation and the reasons for it. Though the Fordist solution—curtail production of new PhD’s by shrinking or eliminating some doctoral programs and raising graduate admissions standards—was often discussed in meetings and in print, nobody ever took it seriously as an action program. Why not? After all, that program certainly sounds like it would produce the desired parity between new job candidates and the full-time faculty posts for which they had spent tens of thousands of dollars and years of strenuous effort preparing. Because such parity was never the real goal. Instead, hordes of earnest, wide-eyed, talented graduate students are annually accepted, educated, mentored, and released into the airless chamber of permatemp poverty in order to create and maintain a just-in-time supply of casualized low-wage teaching labor, large enough to serve the ever-growing classes of incoming undergraduates.
Why does graduate student Jane Doe willingly enter what often turns out to be a lifetime of teaching for sub-poverty wages, with no benefits, no office, no telephone, no photocopy privileges, and no respect? Because at each stage, it appears to be in her rational interest to do so. At first, she accepts it as a matter of course, because she’s still a mere graduate student with no rank, who needs to teach in order to become experienced. This is the apprenticeship model she carries in her head; it is the basis of her dignity and is not easily discarded. Then some of her classmates quit, often for lack of money to complete their studies; they continue to teach as adjuncts because they lack the PhD which the good jobs require. Ms. Doe completes the PhD. She continues to teach as an adjunct because in order to reach the eventual holy grail of a tenurable job, she must continuously keep a hand in the profession by teaching one course or more at all times. After a few years of relentless rejection from those scarce-or-fictitious tenurable jobs, she will continue to teach as an adjunct because (as even the most obtuse idealist will have realized by then) it is the only remaining way to practice the teaching vocation in such conditions. Search committees for the few real jobs that appear to open up (sometimes the funding falls through and the job proves illusory) tend to prefer the relative naïveté and boundless energy of the 28-year-old gente nuova. At that point the 40-year-old job candidate has fifteen years of adjunct teaching experience, three scholarly books, and a sheaf of peer evaluations: but her PhD is fifteen years old. So the credentials Dr. Jane Doe worked so hard to achieve have become not only useless, but disqualifying. Yesterday’s groomed wunderkind is tomorrow’s overqualified cynic.
I have another idea...