In The Hello Girls, historian and novelist Elizabeth Cobbs tells the story of America’s first women soldiers, more than two hundred women inducted into the U.S. Army Signal Corps in World War I to operate the newest communications technology, the telephone switchboard, at the front and behind the lines in France. These women served alongside male soldiers, handling vital communications and placing their lives in danger, at the same time as feminist activists at home were demanding a federal suffrage amendment. As Cobbs shows, the military service of these women pioneered a new engagement of women with the military and with the state. Altogether, the story told in the book explores how Americans mobilized for World War I, telephones transformed the United States, women joined the armed forces, suffragists won the vote, and women and men fought together for justice. It illuminates the battles that defined the twentieth century and still shape our own.
In the video below, Cobbs introduces the Hello Girls and all they meant to the war and the nation.
“This French fried fraud is attacking our way of life from atop the bestseller list.”
“Remember the old saying: the rich get richer, and the poor should really try being rich, it’s nice.”
“Well folks, I’m not gonna take it from this Pepé Le Punitive Tax Rate, here.”
“Why are you trying to tear down the Western economic system and replace it with socialist redistribution of wealth?”
“Doesn’t the United States—sittin’ on top of the pile here—don’t we have the right to have our economic system benefit us and maybe not the rest of the world?”
“Inequality is bad because you want mobility. But if I’m already mobiled up to the place I want to be, isn’t it better for me to pull up the ladder behind me? Because then I can be charitable with money that I don’t even miss, to all the poor people I can easily find.”
In 2001 Alice Randall published The Wind Done Gone, a novel reinterpreting Margaret Mitchell’s Gone with the Wind from the perspective of “Cynara,” a recently freed woman who’d been raised as a slave on Scarlett’s plantation. Cynara, a character absent from Mitchell’s book and the film that followed, was Scarlett’s half-sister, the daughter of Scarlett’s father and Mammy, the loyal slave character immortalized on screen by actor Hattie McDaniel. Despite taking care to avoid the use of proper names from Gone With the Wind, Randall and her publisher were sued for copyright infringement by Mitchell’s estate.
That “unauthorized parody” will be joined this October by Ruth’s Journey, an official Gone With the Wind prequel commissioned by Mitchell’s estate. Here, too, the book will focus on a person of color, this time Mitchell’s Mammy, whom author Donald McCaig gives the name Ruth. On a recent episode of her MSNBC show, Melissa Harris-Perry hosted HUP authors Micki McElya and Khalil Gibran Muhammad to discuss the origins of the Mammy archetype and the issues inherent in this renewed focus on Mitchell’s creation:
“Mammies,” as they have been described and remembered by whites, like all faithful slaves, bear little resemblance to actual enslaved women of the antebellum period. Black women did work in white homes, cooked innumerable meals, cared for white children, and surely formed emotional ties to white family members at times, but the mammy was—and is—a fiction. She is the most visible character in the myth of the faithful slave, a set of stories, images, and ideas that have been passed down from generation to generation in the United States, through every possible popular medium, from fine art and literature to the vaudeville stage and cinema, and in countless novelty items from ashtrays to salt and pepper shakers. These narratives are locked emotionally and politically to the slave narrative genre. Early versions produced in the antebellum period by proslavery white southerners were explicitly reactionary. The stories were designed to provide reassurance that their authors’ patriarchal benevolence was real, and was recognized and appreciated by those they enslaved. They were hurled northward in response to the publication of slave narratives detailing the horror and inhumanity of the institution, the speaking tours by activist runaways, and the impact of abolitionist works such as Uncle Tom’s Cabin. As personally satisfying as they were politically and economically potent, tales of faithful slavery appeared with ever greater frequency.
[...]
Accounts of enslaved people’s fidelity constituted the ultimate expression of southern paternalism, which held that the relationship of the master to the slave was removed from market forces and economic exigency and functioned more like a familial relationship between father and child based on a set of mutual obligations and responsibilities as well as affection. Proslavery theorists argued that this was very different from the cold contract of “free labor,” under which bosses owed nothing but wages to the laborers they employed and could fire them at will. Slave owners claimed, by contrast, to be responsible for providing every aspect of enslaved people’s well-being, including clothing, food, housing, and medicine, and they bore this burden for the lifetime of their slaves as their obligation. The only thing required of the carefree slave in this scenario was work and loyalty. The faithful slave narrative, however, went one step further to argue that enslaved people appeared faithful and caring not because they had to be or were violently compelled to be, but because their fidelity was heartfelt and indicative of their love for and dependence on their owners. At their core, stories of faithful slavery were expressions of the value, honor, and identity of whites. They had little if anything to do with the actual perceptions and attitudes of the enslaved.
One hopes McCaig had occasion to make himself familiar with McElya’s work before embarking on Ruth’s Journey.
Open access to ideas is fundamental to democracy and to what we do. As a public research library we want as many people as possible to be inspired by and learn from the collection in the infinite number of ways human beings express and communicate knowledge.
[...]
I am deeply concerned about the twin evils of anti-intellectualism and historical illiteracy that are apparent in many college classrooms and are most visible in our current political culture. In this climate, African American history and Black Studies are especially vulnerable. Given the breadth and depth of our holdings, and our mission to promote learning about and the interpretation of history and culture, the Center is well positioned to lead the way in promoting historical literacy. Two areas are especially important: national engagement and youth outreach. We will rebrand the Schomburg as a center for applied historical research. We want to develop mechanisms for our researchers to translate their work for public and media consumption, particularly when their expertise can shed light on topics of the day.
The live-streaming of events like this evening’s is one such mechanism. Visit the Schomburg Center’s YouTube channel for footage of the myriad lectures and conversations that make the Center’s programming so vital.
Marcus—whom we’ve also published several times—began the evening by addressing the controversy surrounding the book, which documents major Hollywood studios accepting direction from German authorities for a longer period and to a much greater extent than previously known.
We’ll pick things up with Marcus addressing the persistent calls for HUP to withdraw the book:
This book has been attacked because it is sensationalistic; because it’s one-sided; because it’s exaggerated. Well this is quite remarkable. Nobody asks that a book be withdrawn because it’s sensationalistic or exaggerated. The argument has also been made that, “Well, everybody knew this was going on. You could find it in Variety. There’s nothing new here.” You know, traffic accident, nothing to see, move along, move along. “But if there is anything new here it’s not important.” It’s like the argument that people made for generations about the Rosenbergs: “They didn’t give anything to the Soviet Union. And if they did it wasn’t anything important. It didn’t have any effect.”
Marcus went on to situate the response to Urwand’s book within a history of Jewish authors discussing Jewish matters and being met by outcry:
The campaign—and it has been among various people an organized campaign—against The Collaboration has been of such virulence and so hysterical, and with accusations that verge on the criminal, that it recalls other instances where Jewish authors have delved into forbidden material, and have come back with stories that a lot of people want to hear, and a lot of people don’t. When Hannah Arendt published Eichmann in Jerusalem, she argued that members of the Jewish councils in Eastern Europe worked with the Nazis to facilitate transport of Jews from the ghettoes to concentration camps, in hopes that the people they were supposedly representing would get better treatment. This was a small part of Hannah Arendt’s book, but it was an explosive argument. She was attacked not only by people in the press, but by colleagues, by lifelong friends. She was shunned. Her classes at the New School in New York were taken away from her. When Philip Roth published Portnoy’s Complaint, Gershom Scholem, the great Israeli historian, compared it to The Protocols of the Elders of Zion. Which I think no one has done of The Collaboration yet.
Forgive us, please, for continuing to dwell on Spielberg’s Lincoln. It’s not often that we have a rich trove of books on the very subject captivating the nation’s conversation. Wait, strike that, it’s pretty often, actually. It’s kind of our thing.
Anywho.
The film has a scene in which Lincoln and Radical Republican congressman Thaddeus Stevens have retreated from the grand reception of the president’s second inauguration down to a messy White House kitchen. Daniel Day-Lewis’s Lincoln, driven to engineer the passage of the 13th Amendment, is trying to convince Stevens, played by Tommy Lee Jones, to temper his usually forceful demands for racial equality, so as not to scare off the moderate votes required for the bill’s adoption. The scene, packed with quick Kushner dialogue, witty and wise, is one of the film’s great depictions of politics as the art of the possible.
In the course of their volley Lincoln expresses his admiration for Stevens’s “zeal,” but explains why he’s had to act with greater restraint than the congressman had wished:
If I’d listened to you, I’d’ve declared every slave free the minute the first shell struck Fort Sumter; then the border states would’ve gone over to the confederacy, the war would’ve been lost and the Union along with it, and instead of abolishing slavery, as we hope to do in two weeks, we’d be watching helpless as infants as it spread from the American South into South America.
That reference to South America is one of the very few moments in which Spielberg gestures towards the global context of the American Civil War, and it’s all the more striking for how matter-of-factly it’s dropped into conversation. The standard popular and historical gaze back at the Civil War—embodied pretty accurately by the film, save the above—sees the coming of the conflict as territorially bounded by regions that were only delineated by the war itself. The narrative, in other words, reads the parties created by the war as those that started it.
As Walter Johnson argues in River of Dark Dreams: Slavery and Empire in the Cotton Kingdom, a much-anticipated book whose publication is but a couple of months away, we need to ask not just what “the South” was seceding from, but what it was seceding to:
It takes no great insight (only a taste for heresy) to say that the story of “the coming of the Civil War” has been framed according to a set of anachronistic spatial frames and teleological narratives. It is resolutely nationalist in its spatial framing, foregrounding conflict over slavery within the boundaries of today’s United States to the exclusion of almost every other definition of the conflict over slavery. Because of the territorial condition of the regions under debate and the character of federal recordkeeping, the Missouri Compromise, the Compromise of 1850, and the Kansas-Nebraska Act produced tremendous archives that American historians have used to terrific effect. Yet for many in the Mississippi Valley (and for the president of the United States, who in 1852 devoted the first third of his State of the Union address to the topic), the most important issue in the early 1850s was Cuba, an issue that was related to but certainly not reducible to the question of territory gained through the Mexican War and the Compromise of 1850. Similarly, for many pro-slavery Southerners, especially in the Mississippi Valley, the issues of Nicaragua and the Atlantic slave trade were more important than the question of Kansas (dismissed by many as a fight over a place where no real slaveholder would ever want to live anyway) and more important than what was happening in Congress, from which they, in any case, expected very little. The standard narrative, that is to say, projects a definition of spaces which resulted from the Civil War—no Cuba, no Nicaragua, no Atlantic slave trade—backward onto its narrative of the description of the conflict over slavery before the war.
Much of this work has been done through the category of “the South,” which serves in its dominant usage as a spatial euphemism for what is in fact a conceptual anachronism: those states which eventually became part of the Confederacy. But what the “Southern position” was on any given issue... was subject to fierce debate at pro-slavery commercial conventions of the late 1850s, which are generally seen as hotbeds of secessionism. About the only things upon which those conventions could agree was that there was something called “the South” that was worth fighting for and that the election of a Republican president in 1860 would be grounds for secession. The ultimate grounds for secession represented a sort of lowest common denominator, a platform defined by what everyone involved agreed “the South” could not be.
We recently had a chance to speak with Johnson about pro-slavery imperialism, the political economy of slaveholders, and the inherently expansionist logic of cotton monocropping:
Incidentally, there’s another striking moment in Lincoln, in which the president and his Secretary of State are directly pleading for Kentucky congressman George Yeaman’s support for the amendment. “I saw a barge once, Mr. Yeaman,” Lincoln says, “filled with colored men in chains, heading down the Mississippi to the New Orleans slave markets. It sickened me, ‘n more than that, it brought a shadow down, a pall around my eyes.”
Those New Orleans slave markets are the subject of Johnson’s earlier work, Soul by Soul: Life Inside the Antebellum Slave Market. The book—recently the focus of Ta-Nehisi Coates’s “Effete Liberal Book Club”—lays bare the chilling day-to-day workings of the commodification of human beings. It’s a deeply affecting work, and a must-read for anyone unsettled by the tendency to reduce the era to faceless statistics and the congressional jockeying of white men.
From our modern vantage, warfare of the past often appears barbaric. The close quarters of combat and high death counts, all conducted at the whim of rulers, can strike us as monstrously inhumane. And yet, as James Whitman argues in The Verdict of Battle, pitched battles of the past were far more effective at containing their violence than today’s sprawling wars of idealism. In the article and video below, Whitman details the evolution of warfare, and highlights some lessons of the past.
-----
Pitched battles are gruesome events. Consider the description of the aftermath of the Battle of Solferino in 1859 given by Henri Dunant, the founder of the Red Cross: “The poor wounded men... were ghastly pale... They begged to be put out of their misery, and writhed with faces distorted in the grip of the death-struggle...” When we read such accounts it is difficult to imagine that anybody could regard a pitched battle with anything but horror.
Nevertheless for many centuries our ancestors thought of pitched battles as a good thing—in fact, a blessing for society. A pitched battle may be nightmarish for the soldiers who fight it, but it is a contained way of settling an international dispute. If a conflict can be decided by a day of concentrated killing on the battlefield, then violence can be prevented from spilling over to the rest of society. In fact, for many centuries staging a battle was deemed to be a perfectly acceptable, and highly desirable, legal procedure. In the Middle Ages, pitched battles were regarded as “judgments of God,” lawful ways of calling upon the Almighty to decide an international dispute. In the eighteenth century, they were regarded as settlement procedures, fought under a “tacit contract of chance.” Eighteenth-century battles were a kind of lawful wager, by which the warring sovereigns agreed to allow their conflict to be settled by the “chance of arms.”
To modern lawyers, such attitudes seem incomprehensible and repellent. We cannot accept the idea that international disputes could be resolved by deliberately staging a mass killing on a battlefield. The modern law of war is humanitarian law. It is founded on the proposition that war is a curse, and that conflicts should be resolved through peaceful means, with war used only as a desperate last resort.
The popularization of the term “military-industrial complex” is commonly traced to Dwight Eisenhower’s 1961 address upon leaving the presidency. But though the term’s origins are well known, argues Aaron B. O’Connell, few people remember the actual content of Eisenhower’s message. O’Connell, the author of Underdogs: The Making of the Modern Marine Corps, reminds us in an NYT opinion piece that what Eisenhower cautioned against was a permanent militarization of America, an unyielding state of preparation for and engagement in war that threatened to become a dominant feature of modern American life.
Despite Eisenhower’s warning, that’s essentially where we’ve arrived, led by civilians and politicians who project and protect the sanctity of American military institutions. From O’Connell’s piece:
From lawmakers’ constant use of “support our troops” to justify defense spending, to TV programs and video games like “NCIS,” “Homeland” and “Call of Duty,” to NBC’s shameful and unreal reality show “Stars Earn Stripes,” Americans are subjected to a daily diet of stories that valorize the military while the storytellers pursue their own opportunistic political and commercial agendas. Of course, veterans should be thanked for serving their country, as should police officers, emergency workers and teachers. But no institution — particularly one financed by the taxpayers — should be immune from thoughtful criticism.
“That which is left unexamined eventually becomes invisible,” he continues, “and as a result, few Americans today are giving sufficient consideration to the full range of violent activities the government undertakes in their names.”
As O’Connell explains in Underdogs, the Marines in particular have excelled at catapulting themselves to these heights of popular esteem. They’ve managed to become a force in American culture, he argues, largely by way of their own culture of force. O’Connell, himself a Marine reserve officer and assistant professor of history at the US Naval Academy, recently spoke with us about the book, and the concept of militarization:
For centuries, from Cape Cod to Newfoundland the return of fish, birds, and marine mammals—each in their season—sparked quiet rejoicing in fishing towns and outport villages. Many of those communities had few economic alternatives to harvesting the sea, and fishing folk chose to believe that the sea would provide forever. That belief dovetailed with the attitude of naturalists and scientists, who often insisted, at least until the mid-twentieth century, that the sea was eternal and unchanging, even though almost every generation of harvesters noted evidence to the contrary and raised disturbing questions about the perpetuity of the stocks on which they relied. Beginning in the nineteenth century, however, fishermen’s hard-won knowledge all too often disappeared as new technologies increased catches. Bumper catches obliterated memories of how the same number of men, with the same gear, fishing in the same place, had been catching fewer fish as time passed—an indicator that stocks were diminishing. Shoreside naturalists’ insistence that the sea was eternal and fishermen’s periodic loss of vernacular knowledge that stocks were declining reinforced each other. Combined, they camouflaged one of the northwest Atlantic’s great untold sea stories, a true tale of changes in the sea.
An irony sharp as a sculpin’s spines pervades that story. No profession has ever placed more emphasis on avoiding disaster than seafaring. Mariners instinctively anticipated danger, maintained a sharp lookout, and constantly scanned their surroundings for indication of the slightest problem. To relax vigilance was to court catastrophe. Yet disaster struck for both fish and fishermen, periodically in the seventeenth, eighteenth, and nineteenth centuries, then universally at the end of the twentieth century, in part because neither fishers nor scientists nor policymakers chose to believe that what they were seeing was happening. The sea was not immortal.
Tired? Probably. You can’t help but be yawny, especially during the workweek. That’s because the dictates of modern society rarely allow you to live in sync with your individual body clock, which is just one of three clocks that we’re constantly trying to balance. The others are the sun clock (daylight and darkness send us signals) and the social clock (that thing hanging on the wall or on your wrist).
For most of us, the social clock is boss. But when it requires us to live out of sync with our body clocks, we suffer from what chronobiologist Till Roenneberg calls “social jet lag.” Social jet lag can be just as powerful as the more familiar jet lag caused by traversing time zones, and because it’s a function of our daily lives it can be a chronic condition. And, as Roenneberg explains in the excellent video posted below, social jet lag has consequences:
Most people with social jet lag are more likely to be smokers. They are more likely to drink alcohol. And people with social jet lag drink a lot of caffeine during the day. And now we’ve discovered another consequence of living against the body clock, and that is if you are already a little chubby and not very thin, it is very likely that living against the body clock makes you even become obese. In view of all these consequences, it is really high time we do something against social jet lag.
Roenneberg and his colleagues have built an online body clock calculator to aid them in their research and to help you understand your own internal time, so head over to the Munich Chronotype Questionnaire to receive an evaluation of your chronotype. And then check out Roenneberg’s Internal Time for 24 short case stories meant to help you understand why you’re so tired.
The Harvard University Press Blog brings you books, ideas, and news from Harvard University Press. Founded in 1913, Harvard University Press has published such iconic works as Bernard Bailyn’s The Ideological Origins of the American Revolution, John Rawls’s A Theory of Justice, and Sarah Blaffer Hrdy’s The Woman That Never Evolved.