
Why Study the Ugliest Moments of American History? Reflections on Teaching Howard Zinn’s A People’s History of the United States

History professors often look for ways to use the past to inform present debates. With long-past events, that sometimes requires acrobatic leaps over centuries or millennia, but in my own courses on violence in American history, the connections are often pretty obvious. Every day, a stream of new or ongoing violent events invites historical consideration: years-long wars in Iraq, Afghanistan, and Syria; regular drone strikes in Yemen, Somalia, and Libya; armed militia groups parading in America’s streets; mass shootings and regular evidence of police misconduct; and violent demonstrations in America’s cities, including right here in Austin. I often ask my students: Do any of these events have historical precedents worth knowing about? Are there any long-standing structural forces at work that help explain how violence has operated in American society?

For those of us old enough to remember the Cold War – or the Rodney King riots, the Oklahoma City bombing, or the surge of militia movements in the 1990s – the answer to both of these questions is a pretty emphatic “yes.” But the first-year students in my War and Violence in American History course often know little about any of these events beyond the fact that some of them happened. (Don’t judge – how thoughtful about US history were you when you were 18?) Of course, for historians, knowing an event occurred is just the starting point. We strive to understand the surrounding historical context and we probe for causes, both particular and structural: What were the drivers of these events? Why did they happen when they did? Are there previous events that seem to have ignited from a similar spark?

When I ask these questions in my classes, the conversation moves quickly away from the present and back into the past: “Why are people protesting against police misconduct in Austin?” turns into “Has this happened before? How many protests have been sparked by police misconduct in previous eras?” Discussions of terrorism move from “Why would terrorists attack civilians?” to “How should we think about the Sons of Liberty or the Sullivan Expedition in the Revolutionary War?” In all of these discussions, the educational value comes not just from knowing what happened in the past, but from understanding how those events shaped later laws, policies, culture, decisions over resources, and the ideological frames that we use to make sense of the present.

To take my students through the long history of violence in America, I use a book that has been in the news lately: Howard Zinn’s A People’s History of the United States. This book is well-known – even controversial, both inside academia and out – partly because Zinn tackles some of the sacred cows of America’s national mythos: Is Christopher Columbus better remembered as a genius sailor or a genocidaire? What does it say about the progression of liberty that single women in New Jersey had the vote until 1807, when it was stripped from them by an all-male state legislature? Most important: why have so many history books focused almost exclusively on the stories of white, wealthy men whose total numbers have never approached half – or even a quarter – of the country’s total population?

Zinn’s answer to all of these questions is that there have always been long-standing structural inequalities in American society that have shaped everything from the writing of laws, to the ways they are interpreted, to the stories we tell today about the nation’s past. It is nice to think of America as one big family, Zinn explains, but telling the story that way conceals fierce conflicts in that family’s history “between conquerors and conquered, masters and slaves, capitalists and workers, dominators and dominated in race and sex. And in such a world of conflict, a world of victims and executioners, it is the job of thinking people, as Albert Camus suggested, not to be on the side of the executioners.”[1]  

Is telling America’s story primarily from the perspective of victims “biased” history? Of course it is – and you’ll never find a history book that isn’t. Writing history means making an argument, which necessarily means making choices about which stories to tell and whose voices to include. No historian can include every relevant event; if they tried, every history book would be an even more boring version of James Joyce’s Ulysses, with thousands of pages devoted to explaining a single day or decision. The key to good history is to make those choices judiciously, acknowledge conflicting evidence, and support one’s analytical claims with relevant historical facts and quotes from the people who were there at the time. (It helps to tell an entertaining story too, and that’s usually harder when telling people about things they’d rather not confront.)

Zinn not only understands that writing history involves picking and choosing – he explains it to his readers in his very first chapter. “In the inevitable taking of sides which comes from selection and emphasis in history, I prefer to try to tell the story of the discovery of America from the viewpoint of the Arawaks, of the Constitution from the standpoint of the slaves, of Andrew Jackson as seen by the Cherokees, of the Civil War as seen by the New York Irish, of the Mexican war as seen by the deserting soldiers of Scott’s Army, of the rise of industrialism as seen by the young women in the Lowell textile mills,” and so on.[2] He focuses on these perspectives precisely because he did not learn about them in his own undergraduate education in the 1950s, and because he thinks they are too important to ignore. That’s a bias, to be sure, but is it any more biased than the many histories that ignored such perspectives?  By what measure?

Beyond the question of bias, I often ask whether focusing on the stories of the victims of violence and oppression is “good history.” My students debate this question every week. And, to be clear, they don’t only read Howard Zinn. They also read critiques of Zinn, methodological pieces, and primary sources that either reinforce or complicate the arguments Zinn makes. I also invite my students to question my motives. After all, I assigned the readings – am I pushing my ideas on them for some ulterior purpose? That conversation sometimes gets pretty animated when they learn I spent a quarter of a century in the Marine Corps and that Howard Zinn was himself a World War II combat veteran. “Are veterans more likely or less likely to criticize the U.S. government and its foreign policies?” I usually retort. How could you research that question? Do veterans have more authority to speak on these issues than professional historians who haven’t served in the Armed Forces? Why or why not? Which credentials do you think matter most for explaining the past or commenting on issues in the present?

Screenshot from https://www.howardzinn.org/

Not everyone likes Zinn’s book. I have to admit that it is a sometimes depressing read. But my students are usually united in the belief that it is important to know about the ugliest parts of America’s past. We don’t focus on such moments to “make students ashamed of their own history,” or to “destroy our country,” as President Trump claimed at a White House Conference on American History last month. Rather, my goal – and, I think, Howard Zinn’s as well – is to invite students to confront the contradictions between rhetoric and reality in the American experience – contradictions that were present at the creation of the United States and which continue, even now, to undermine the government’s fundamental purpose to “establish justice, insure domestic tranquility, provide for the common defense, promote the general welfare, and secure the blessings of liberty to ourselves and our posterity.”

Every semester I ask my students if the laudable goals set down in the Constitution’s preamble have finally been fully achieved. No one thinks they have. If you do, how do you explain the rather startling absence of domestic tranquility or a shared sense of justice across the country today? And that’s why I’ll continue to teach and talk about A People’s History of the United States in the future. Doing so certainly won’t destroy our country. Quite the opposite: I believe such conversations are a necessary and urgent component of its healing.

Aaron O’Connell is Associate Professor of History at the University of Texas at Austin. Follow him on Twitter @OConnellAaronB.


[1] Howard Zinn, A People’s History of the United States (Harper Perennial Modern Classics, 2003), 10.

[2] Ibid., 10.

Wicked Flesh: Black Women, Intimacy, and Freedom in the Atlantic World by Jessica Marie Johnson (2020)

By Tiana Wilson

Many recent studies on chattel slavery in the Atlantic World have decentered the voices of the colonizers in an effort to creatively reimagine the inner lives of Black people, both enslaved and “free.” However, narrating the complex ways race, gender, and sexuality played out in a colonial setting beyond violence has proven difficult due to the brutal, inhumane conditions of enslavement. At the same time, the drastic imbalance of power raises questions about consent within sexual and intimate relationships. While most scholars of slavery have tended to shy away from such a contentious and messy topic, historian Jessica Marie Johnson presents a compelling analysis of how African women and women of African descent used intimacy and kinship to construct and live out freedom in the eighteenth century.

She demonstrates how free legal status, manumission from bondage, or escape from slavery did not protect Black women from “colonial masculinities and imperial desires for black flesh” that rendered African women as “lecherous, wicked, and monstrous” (14). Slaveowners, traders, and colonial officials attempted to exploit Black women’s bodies (enslaved or legally free) for labor. In response, Johnson argues, Black women defined freedom on their own terms through the intimate and kinship ties they formed.

Focusing on Black women in New Orleans, Wicked Flesh takes readers from the coast of Senegal to French Saint-Domingue and from Spanish Cuba to the US Gulf Coast in order to trace the varying experiences of Black women across the Atlantic world. Johnson draws on archival material written in multiple languages and dispersed across three continents, and uses a method that historian Marisa Fuentes describes as “reading along the bias grain” to offer an ethical historical analysis of her texts. Although the majority of sources Johnson utilizes were produced by colonial officials and slaveholding men, this methodology allows Johnson to carefully and innovatively piece together archival fragments, providing readers insight into the everyday intimate lives of Black women during this era. Intimacy, as Johnson explores, encompassed the “corporeal, carnal, quotidian encounters of flesh and fluid” and was the very thing that tied Black women to white and Black men. It was through these connections that women of African descent simultaneously endured violence and resisted colonial agendas. Wicked Flesh seriously considers the ways Black women fostered hospitable and pleasurable spaces on both sides of the Atlantic.

Johnson begins her narrative in West Africa, in the region between the Senegal River (north) and the Gambia River (south), also known as Senegambia. Senegal’s Atlantic coast saw Portuguese, Dutch, French, and Wolof trade alliances and struggles for power, but by 1659, the French had driven the Dutch from the northern area and founded the comptoir (administrative outpost) of Saint-Louis. It is in this comptoir that Johnson introduces readers to free African women like Seignora Catti, Anne Gusban, and Marie Baude, who all actively engaged in networks with European and African men.

Throughout chapters one and two, Johnson demonstrates the different ways free African women cultivated freedom in efforts to seek safety and security. This included participating in grand gestures of hospitality for French officials or marrying European men while rejecting their Catholic practices. These practices impacted three groups: free African women who had intimate ties with European and African men, captifs du case (enslaved people who belonged to comptoir residents), and Africans forced aboard slave ships bound for the Americas. Chapter three examines the latter, including Black women’s and girls’ horrific experiences of the long Middle Passage and how this forced migration produced a “predatory network of exchanges” that attempted to “dismantle their womanhood, girlhood, and humanity” (123).

Chapters four and five shift to the Gulf Coast region and encourage readers to reconceptualize the price of manumission for people of African descent – a price that extended beyond the material world. Through the lives of figures like Suzanne, the wife of a New Orleans “negro executioner,” Johnson further illustrates just how bound Black women’s freedom was to their intimate relations and kinship ties with men in power who were acting on behalf of the French colonial regime. When Suzanne’s husband, Louis Congo, initially entered into a contractual obligation with slaveowners or Company officials, he requested freedom for Suzanne too. However, French colonists rejected his demand and instead only allowed Suzanne to live with her husband if Louis agreed to grant the Company full use of his wife whenever it needed her. While some scholars might read this account as an example of a Black woman gaining her freedom through her husband’s occupation, Johnson critically assesses Suzanne’s lack of control over her own body and movement.

Diving deeper into the intricate ways women of African descent navigated French colonial power in New Orleans, Johnson’s fifth chapter follows girls like Charlotte, the daughter of a French colonial officer, who demanded manumission for herself. It is in this section that Johnson introduces the concept of “black femme freedom,” which “points to the deeply feminine, feminized, and femme practices of freedom engaged in by women and girls of African descent” (260). Scholars of Black and other women of color feminisms use the term “femme” to describe a queer sexual identity that is gendered in performances of femininity. Johnson finds this term productive in the context of eighteenth-century New Orleans because strands of resistive femininity and intimacy between women were present during this time. Black femme freedom details a type of liberation that went beyond masculine and imperial desires. It describes the importance of reading Black women’s intimate decisions to privilege themselves and each other in a world that violently privileged the position of slaveowners and husbands. An example of this Black femme freedom lies within Black women’s efforts to create spaces for pleasure, spirit, and celebration against French and later Spanish censorship of their behaviors. This included hosting night markets and wearing headwraps. The last chapter explores the shift in colonial powers and how free women of African descent used this change to claim kinship ties through the registration of their wills and testaments.

Wicked Flesh is a well-researched, beautifully written text that is an essential read for anyone interested in the intersections of slavery, gender, and sexuality. Following in the tradition of historians like Stephanie Camp, Jennifer Morgan, and Marisa Fuentes, Johnson’s work is a superb addition to this group of scholars, who are shifting the field of Atlantic history to critically engage with definitions of freedom for enslaved and legally free women of African descent during the seventeenth and eighteenth centuries. Graduate students, myself included, can (and likely will) use Johnson’s work as a model for problematizing white colonial sources while ethically utilizing contemporary theoretical frameworks to imagine and retell the lives of those silenced by institutional archives.

Image credits

Banner image – Ndeté-Yalla, lingeer of Waalo, Gallica, bnf.fr – Réserve DT 549.2 B 67 M Atlas – planche n °5 – Notice n° : FRBNF38495418 – (Illustrations de Esquisses sénégalaises) Image from Wikimedia Commons


TIANA WILSON is a Ph.D. Candidate in the History Department at the University of Texas at Austin.

This is Democracy – Supreme Court Confirmations: How Have They Changed?

Guest: Dr. Stephen Vladeck, Charles Alan Wright Chair in Federal Courts at the University of Texas School of Law

This episode discusses the current nomination controversies surrounding the Supreme Court and the relevance of the courts to young people today.

Zachary sets the scene with his poem, “By Allergy and Allegory”.

Stephen I. Vladeck holds the Charles Alan Wright Chair in Federal Courts at the University of Texas School of Law and is a nationally recognized expert on the federal courts, constitutional law, national security law, and military justice. Professor Vladeck has argued multiple cases before the U.S. Supreme Court and the lower federal courts; has served as an expert witness both in U.S. and foreign tribunals; and has been repeatedly recognized for his influential and widely-cited legal scholarship, his prolific popular writing, his teaching, and his service to the legal profession. Vladeck is the co-host, together with Professor Bobby Chesney, of the popular and award-winning “National Security Law Podcast.” He is CNN’s Supreme Court analyst and a co-author of Aspen Publishers’ leading national security law and counterterrorism law casebooks. And he is an executive editor of the Just Security blog and a senior editor of the Lawfare blog.

About This is Democracy

The future of democracy is uncertain, but we are committed to its urgent renewal today. This podcast will draw on historical knowledge to inspire a contemporary democratic renaissance. The past offers hope for the present and the future, if only we can escape the negativity of our current moment — and each show will offer a serious way to do that! This podcast will bring together thoughtful voices from different generations to help make sense of current challenges and propose positive steps forward. Our goal is to advance democratic change, one show at a time. Dr. Jeremi Suri, a renowned scholar of democracy, will host the podcast and moderate discussions.

Dead Babies in Boxes: Dealing with the Consequences of Interrupted Reproduction

This article first appeared in Nursing Clio. The original can be accessed here.

One morning in June 2019, two city workers in Lyon, France, pulled a plastic bag out of the river that runs through the city center and found it contained the body of a “late term fetus or a newborn baby thought to be less than a day old.” Such occurrences have a long history in France. About 240 years earlier, in 1781, a fisherman working on the river bank in the city center pulled out “the absolutely naked body” of a female infant with afterbirth still attached.1 Then and now, disposers and discoverers came face to face with the emotional, logistical, and legal challenge of handling the remains of interrupted reproduction, often in ambiguous and opaque circumstances in which differentiation between termination, miscarriage, stillbirth, and neonatal death was unclear. Even the terminology used at the time in medical texts and conversations was ambiguous – a woman who “injured herself” could have experienced a miscarriage or an abortion. Efforts to interrupt reproduction before quickening (then thought to occur around 18–20 weeks) were legal and understood as “restoring” women’s menstrual cycle.

In the seventeenth and eighteenth centuries, the bodies of dead preterm or newborn babies were found around the city of Lyon, almost always wrapped in fabric, often put in small pine boxes, and routinely left in or around churches. In 1631, for example, someone wrapped the foot-long cadaver of a baby girl in a coverlet, put it in a pine box that closed with two locks, marked a cross on top, and carried it to a parish church where it was left under the baptismal font. The next morning it was found by the church’s priests.2

The discoveries, documented in records from 1631 to the late 1780s, demonstrate that these practices were not secrets, although they were often the subjects of silence; participants chose carefully if or with whom they would share these stories. Women’s efforts to manage their fertility often involved allies, active and implicit, whose assistance could include disposing of the remains as well as interrupting reproduction. Networks of intimate partners, neighbors, midwives, and gravediggers knew how to get the remains buried or who to ask about how to do so.

Why did the fetuses or newborns die? If officials were alerted, they called surgeons, who performed autopsies on the spot. The surgeons observed injuries such as bruising, compressions, and drops of blood in the nose that indicated someone had pinched the baby’s nose to stop its breathing. A surgeon who examined the remains of a fetus in 1743 concluded that a botched abortion had endangered the life of the mother and a midwife had dismembered it to save her. Sometimes they concluded that the babies had been strangled.3 On occasion, treatments “to restore menstrual cycles” induced preterm miscarriages with newborns too small to be viable, or aid was not delivered to newborns, ensuring that they quickly expired.

Early-term interruptions produced clots of blood that women could dispose of in communal latrines found in the tenement buildings where workers lived. But later-term interruptions produced identifiable fetal remains. If women were suspected of intentionally interrupting reproduction, latrines were often searched. Women who delivered alone often did so in the public latrine, due likely to the simple matter of the volume of liquid labor produced, regardless of intent. When Estimiette Chretien, for example, claimed in 1715 that she had not known she was pregnant when she delivered a baby who fell down the latrine, a doctor followed the drains to the river where he “searched through fecal matter half a foot deep” but found only the afterbirth.4

Yet women who could call on intimate partners or other safeguarders could find help in these situations. After Marie Dutremblois “injured herself” and delivered a seven-month fetus in 1743, she was ill in bed for two weeks. The dead baby was put in a small box. Her boyfriend, François Noel, visited frequently, as did neighbors and friends. One day he asked her in front of other visitors, as they later recalled, what she “was going to do with the dead baby which was in a box.”5 That is, how did she plan to dispose of the remains?

Such events must often have involved discussion about what to do and the actions or involvement of people other than the woman. Someone had to carry the remains to leave them or bury them, and this person was not necessarily the woman who had just experienced interrupted reproduction. After Louise Gimet gave birth on her own in a rented room she shared with a landlady in 1724, for example, she wrapped the baby’s body in a bodice and hid it between the wall and the bed. When her landlady returned to find the floor covered in blood and amniotic fluid, she asked Louise if she had “injured herself,” which Louise denied. A neighbor joined the landlady to search the room, and they found the corpse. Louise insisted the baby was born dead. Although seemingly skeptical of her explanation, no doubt because she had hidden the body instead of reporting the stillbirth to an official, they asked another neighbor, “How do we get the baby buried?” In the middle of the night, that neighbor carried the body, wrapped in linen and in a pine box, toward their churchyard. She met a midwife on the way who took the box and said she would pay the gravedigger to bury it.6

On occasion, people took small boxes or cadavers wrapped in covers and left them in or near churches, clearly intending that they would be found quickly and given a churchyard burial. Sometimes, whoever carried the body buried it informally at night in the gardens of monasteries or convents. The pine boxes functioned as makeshift coffins, and the choice to associate the remains with a sacred space suggests that working people wanted to integrate these tiny remains into the neighborhood Christian community, even in these circumstances and unbaptized. Regardless of theology that emphasized how death without baptism would mean damnation, these burials indicate that ordinary people did not want to see fetal bodies as damned. To bury them meant they were not at fault.

Indeed, if the discovery of remains was reported to officials, they routinely ordered burial in the nearest churchyard. Even though surgeons processed the remains as legal evidence by placing red wax seals on the tiny bodies, court officials simply closed the reports with orders that the bodies be buried.

Nevertheless, remains usually were disposed of in ways that were intended to avoid any official involvement, as in the case of Louise’s neighbors. In another case, the body of a tiny six-inch-long fetus was found hidden in a pile of bricks in 1683. Others likely deposited remains in the two fast-flowing rivers that ran through the center of Lyon, where the currents quickly carried them away, as suggested by occasional recoveries from the waters. When two men pulled a sack out of the river in the city center in 1743, discovering the dismembered parts of a baby’s body, they simply buried it under a nearby bridge with a quickly fashioned cross.7

These deaths, disposals, and burials were surely occasions for intense and mixed emotions – relief as well as anxiety, sadness, or shock for the woman, the father if he was still on the scene, and others who either participated in the disposals or discovered the bodies. A surgeon who visited Louise Gimet described her as “overwhelmed and sick.” A teenager recalled that he was “frightened” when he dug up a tiny body. Surgeons sometimes noted that the bodies were putrefied, and discoverers occasionally recalled they first thought they had found a cat or a “creature.”8

The circumstances that brought a woman or her partner and allies to deal with fetal or neonatal remains, whether born alive or dead, seem in some ways timeless, even though the specific resources, networks, legal status, medical practice, and much else have changed. The circumstances of the fetal and infant remains discovered around Lyon were not unusual. Evidence of similar patterns exists elsewhere in France and in early modern England. These dead babies may remind us of a prom mom who gave birth alone in a bathroom at a high school dance and left the remains in nearby trash receptacles, or of a woman who left the remains of twins in a case in the ditch at the side of a country road, or of the difference legal abortion has made to women’s options and experiences in such situations. Yet if women’s vulnerability and women’s agency have persisted, these past patterns and current permutations remind us of the pressing need to guarantee appropriate support and services for a fundamental aspect of women’s lives.

Notes

  1. Of course, the long history of abortions and infanticide and the challenges of disposal goes back much further still. For the 2019 event mentioned here, see Lyon: un sac-poubelle avec un bébé mort jeté dans les eaux du Rhône. For the 1781 episode, see Archives départementales du Rhône (hereafter ADR), 7 April 1781. All the episodes described here are drawn from the Lyon archives, either from the official reports that were created when the discovery of an infant cadaver was reported to local officials (including descriptions of the circumstances and the reports of surgeons called to do autopsies) or from incidental discussions of dead babies in boxes in testimonies given when women filed paternity suits if their intimate partners refused to marry them. For more on all of these issues, see Julie Hardwick, Sex in an Old Regime City: Young Workers and Intimacy in France, 1660–1789 (New York: Oxford University Press, 2020).
  2. ADR 11G311, 5 January 1631. 
  3. ADR 11G311 1 November 1636; ADR 11G313 2 September 1743; ADR 11G313 13 February 1626, 1 November 1636, and 10 December 1676. 
  4. ADR BP2974, 2 July 1715. 
  5. ADR, BP3552 26 November 1743. 
  6. ADR BP3009, 18 July 1724; and ADR BP3544, 11 January 1725. 
  7. ADR 11G312, 11 May 1683; and ADR 11G313, 2 September 1743. 
  8. ADR 11G311, 25 January 1631; and ADR 11G313, 24 April 1744. For similar patterns around the findings of infant cadavers in southern France in terms of the use of boxes and lack of interest in prosecution, see Agnès Barroul, “Du geste à la parole: Infanticide et abandon d’enfants à Marseilles à la fin du XVIIIe siècle,” in Marseillaises: Les femmes et la ville, des origines à nos jours, eds. Yvonne Knibiehler et al. (Paris: Côtés-Femmes, 1993). For British women seeking to dig graves for infant cadavers in churchyards and other infant disposals, see Garthine Walker, “Just Stories: Telling Tales of Infant Death in Early Modern England,” in Culture and Change: Attending to Early Modern Women, eds. Margaret Mikesell and Adele Seeff (University of Delaware Press, 2003); and Garthine Walker, “Child-Killing and Emotion in Early Modern England and Wales,” in Death, Emotion and Childhood in Premodern Europe, eds. Katie Barclay, Kimberley Reynolds, and Ciara Rawnsley (Palgrave Macmillan, 2016).

Read the original article by Julie Hardwick at Nursing Clio here. And look out for a Not Even Past profile coming soon of Dr. Hardwick’s recently published book Sex in an Old Regime City: Young Workers and Intimacy in France, 1660–1789 (Oxford University Press, 2020).


The views and opinions expressed in this article or video are those of the individual author(s) or presenter(s) and do not necessarily reflect the policy or views of the editors at Not Even Past, the UT Department of History, the University of Texas at Austin, or the UT System Board of Regents. Not Even Past is an online public history magazine rather than a peer-reviewed academic journal. While we make efforts to ensure that factual information in articles was obtained from reliable sources, Not Even Past is not responsible for any errors or omissions. 

Primary Source: Technology in Paper – Interactive Design in Early Printed Books

By Aaron T. Pratt

Primary Source: History from the Ransom Center Stacks is a collaboration between the Harry Ransom Center and Not Even Past. It aims to … In this article, Aaron T. Pratt introduces …

Long before computer mice and touchscreens made it possible to manipulate complex digital interfaces with ease, volvelles were cutting-edge information technology, transforming the otherwise static surface of the page into an interactive vehicle for learning and knowledge creation.

From Latin volvere, a verb meaning “to turn,” “volvelle” emerged in the Middle Ages as a noun that emphasizes the dynamic nature of new rotating or “turnable” devices that were being made of parchment and paper. First designed for inclusion within medieval manuscripts, volvelles and other interactive diagrams proliferated with the advent of print. They demonstrate both the ingenuity of medieval and early modern thinkers and the technical skill of the printers and bookbinders who helped bring their ideas to broader audiences.

The large volvelles created by Leonhard Thurneisser zum Thurn (ca. 1530–ca. 1596) are among the most elaborate ever printed.

Leonhard Thurneisser zum Thurn, Dess Saturni Circkel und Lauff (Berlin: Leonhard Thurneisser, ca. 1575). Harry Ransom Center Book Collection, -f- BF 1724 T54 1575.

Thurneisser was a German polymath who earned a reputation as a miracle doctor. In 1571, he became the personal physician of Elector Johann Georg of Brandenburg after helping the elector’s wife through an illness. Johann Georg subsequently furnished him with space in Grey Abbey, a former monastery in Berlin. There, Thurneisser established his home, along with a library, laboratory, and printing operation.

Thurneisser designed and printed eight paper machines around 1575. Meant to accompany the second edition of his alchemical and astrological treatise, Archidoxa, they are tools to help readers determine the influence that the heavens would exert on their lives and the natural world. In essence, they are horoscope calculators.

In each, a woodcut frame surrounds the base of the device. At its foot, the frame depicts female alchemists with the tools and symbols of their craft, connecting alchemy to astrology as an associated art. The volvelles themselves are made from as many as seven distinct parts, and, with their curves and fine lines, they would have been a challenge to cut out and assemble.

The serpentine dragons that serve as pointers are particularly striking, but look closely and you can find detailed constellation charts among the lower layers.

Thurneisser’s volvelles remain an ambitious—and visually stunning—instance of early interface design. Some sets remain black and white, but the Ransom Center’s set has been colored by hand.

There are other examples of volvelles in the Ransom Center. Active during the later thirteenth and early fourteenth centuries, Ramon Llull (ca. 1232–ca. 1316) dedicated much of his prolific career as a writer to the goal of converting Muslims to Christianity, and in the process became one of the earliest—and most influential—volvelle-makers. A 1578 edition at the Center includes an instance of the volvelle he integrated into his Ars brevis. It combines the letters of a symbolic alphabet, which is designed to help readers contemplate important questions and arrive at Christian truths. For example, Llull writes that the combination “BCD” can help readers contemplate “whether any goodness is as infinitely great as eternity.” (Llull’s answer is “yes; otherwise all the greatness of eternity would not be good.”)

Ramon Llull, Ars brevis (Paris: Gilles Gourbin, 1578). Harry Ransom Center Book Collection, BV 5080 L85 1578.

Volvelles show up in other areas of learning, too. Take, for example, Gioseffo Zarlino’s (1517–1590) work on calendar reform. Best known today as a music theorist and composer, Zarlino published a book for the consideration of Pope Gregory XIII, who was eager to institute a new calendar that would bring Easter back to the time of year when it had been observed by early Christians. Zarlino’s proposed calendar would not end up being the one Gregory adopted in 1582, but the ingenious volvelle he designed remains an impressive piece of technology. By manipulating the discs, readers can reconcile the 28-year solar cycle of the Julian Calendar with the 203-year cycle that his book proposes. (Zarlino subdivides his cycle into seven 29-year periods, or “Great Years.” A solar cycle is the number of years before the days of the week recur on the same days of the month.)

Gioseffo Zarlino, De vera anni forma (Venice: Giovanni Varisco, 1580). Harry Ransom Center Book Collection, CE 10 Z298 Copy 1.

Such objects are not only beautiful. In its own way, each underscores the intimate relationship between new forms of knowledge and media design. Design communicates knowledge, yes, but it also creates it.

This article is revised and updated from a piece originally published in the online Ransom Center Magazine in January 2018.

IHS Climate in Context: Understanding Resilience in the History of Climate Change

Earth has warmed by over one degree Celsius since the pre-industrial period, and computer models suggest that human emissions could lead to another two degrees Celsius of warming by the end of the twenty-first century. The combination of the speed, eventual magnitude, global impact, and human origin of present-day warming has no parallel in Earth’s history. Yet natural forces have long transformed the climates in which human populations struggled to survive. For over a century, “climate historians” have attempted to uncover how populations responded to those changes.

Climate history as a field has been dominated by studies that use statistical or qualitative methods to link climate-driven harvest failures to food shortages, famines, outbreaks of epidemic disease, and conflict within or between societies. Some of the most influential books and articles written by climate historians conclude that sustained periods of cooling or drying caused some of history’s best-known civilizations to “collapse” – to rapidly lose political organization, socioeconomic complexity, and ultimately population.

Other studies assert that even many civilizations that escaped collapse during periods of cooling or drying still endured sustained subsistence crises that culminated in political transformation. In some accounts, crises engulfed entire continents, as in the fourteenth and seventeenth centuries, when tens of millions perished. In all this scholarship, the worst-affected civilizations were those whose subsistence strategies, hydraulic infrastructure, military or demographic pressures, and inefficient or unpopular governments left them vulnerable to environmental disruption.

Many climate historians study the past not only to enrich historical scholarship, but also to learn lessons that may inform efforts to plan for the future. Indeed, climate history now informs some of the direst predictions for humanity’s future, including the common assumption that present-day civilizations would disintegrate if global heating exceeds dangerous thresholds.

Yet in recent decades, archaeologists have uncovered striking evidence that hunter-gatherer communities and early agricultural settlements endured and exploited climatic shocks similar in magnitude – if not in origin – to those anticipated in some of the most alarming projections of future warming. A new generation of climate historians has also criticized the methods and assumptions common to most accounts of crisis and collapse in complex civilizations, particularly those of the Common Era.

Studies now reveal that even populations – especially Indigenous populations – once used as examples in narratives of climate-driven crisis in fact found ingenious ways to respond to new environmental realities. The traditional focus on societal catastrophe by climate historians today seems less like a reflection of past reality, and more like a distortion caused by selection biases that privilege periods of agricultural disruption in agrarian empires.

Climate historians therefore increasingly explore the resilience of past communities and societies to climate changes and anomalies. That term – “resilience” – today has many definitions and many critics, who argue, for example, that it can either hide the socially constructed power dynamics most responsible for vulnerability or privilege macro-scales of social analysis where profound disruption rarely occurs.

Yet a broad definition of resilience that emphasizes power dynamics and distinguishes between change across different scales in time and space is providing a powerful tool for climate historians. Resilience encourages us to cast our gaze away from the agrarian empires and abstract associations between human and climatic histories that have long dominated climate history. It compels historians to, for example, explore social responses to climate changes on local scales, to uncover ingenious responses to shifting environmental circumstances, and to explain how resilience for some came at a cost to others.

Resilience, in short, is helping us craft more complete accounts of the past – and, just maybe, helping us learn valuable lessons for the future. 


Dagomar Degroot is an Associate Professor of History at Georgetown University and an environmental historian who bridges the humanities and sciences to explore how societies have thrived – or suffered – in the face of dramatic changes in the natural world. 


My Journey Through Career Diversity

By Alejandra C. Garza

From the Editors: This is the first in a new series, Navigating the PhD and Beyond: Lessons from the AHA Career Diversity Initiative. In this article, Alejandra Garza introduces the Career Diversity for Historians Initiative and her role as a 2018-20 graduate student fellow. Each Friday we will feature an interview and connected materials assembled and curated by Alejandra as part of this important initiative.

“I don’t want to be a professor.”

Those seven words are probably not what a typical Ph.D. candidate is supposed to say.  But that’s exactly what I announced in 2015 as I considered the University of Texas at Austin as the potential site for my doctoral research. Thankfully, these seven words did not derail my admission. I breathed a sigh of relief when Dr. Alison Frazier, the graduate advisor, opened that Recruitment Weekend with a straightforward but hugely important statement: “We know not everyone wants to be a professor, or will obtain a tenure-track job. That is okay. We believe in the malleable Ph.D. and we will help you build your skill set.”

The idea of the “malleable PhD” became my driving motivation (along with my research) as I completed my graduate education at UT-Austin. I have not only felt comfortable saying that my career aspirations extend beyond faculty positions, but I have also seen the history department here actively engage with what is so often called “alt-ac.”

Before going any further, I would like to take this opportunity to interrogate the use of that phrase. “Alt-ac” implies that anything PhD holders achieve outside faculty jobs is “alternative” to what they “should” be doing, namely teaching at a university. It automatically creates an “other” category and isolates graduate students who do not aspire to be professors. A far better way to think of things is in terms of “faculty” and “non-faculty” jobs, both of which are outcomes to be aspired to.

Anthony T. Grafton and James Grossman explained it best in their article “No More Plan B,” which appeared in the October 2011 issue of Perspectives. They challenged graduate programs “to change our attitudes and our language, to make clear to students entering programs in history that we are offering them education that we believe in, not just as reproductions of ourselves, but also as contributors to public culture and even the private sector.” It was a cultural reset for graduate programs and kicked off what is now known as the Career Diversity for Historians Initiative run by the American Historical Association (AHA).

In 2013, the Andrew Mellon Foundation awarded the AHA a grant to pursue the Career Diversity Initiative beyond the research phase. In 2016, UT-Austin’s Department of History participated in Faculty Institutes and was one of 20 departments chosen for a Career Diversity Implementation Grant (CDIG) in 2018.

From 2018 to 2020, I was the student fellow alongside Dr. Frazier, who was the faculty fellow. During that time, I created a student advisory council with representatives from each cohort to ensure that career diversity was openly discussed and embraced within the department. We hosted a range of events that featured alumni, visiting historians, and administrators from across campus. We were also one of the first departments to welcome Annie Maxfield, the Director of Graduate Career & Professional Development at UT-Austin, as she expanded the Texas Career Engagement Office.

My goal as a fellow was to create an environment where no student felt like an outsider because of their career aspirations. I would also like to note that Career Diversity was not solely about non-faculty job exploration. As I once told the student advisory council, “We aren’t anti-professor, we are pro-jobs.” The goal was to show graduate students, and frankly professors too, that the skills we develop in this program are applicable no matter what we do when we leave.

For those reasons and many more, I am pleased to introduce a new series for Not Even Past as part of the AHA’s Career Diversity for Historians Initiative at UT-Austin. Called “Navigating the PhD and Beyond,” this series features interviews with alumni who candidly share their experiences on the Forty Acres, their journeys into the job market, and, more recently, COVID-19’s effects on their positions and daily lives.

My time as Career Diversity Fellow was one of the most important and fulfilling parts of my graduate school journey. I am proud of how far the department and the university have come in the last five years. I am grateful for the guidance of Dr. Frazier, Emily Swafford (Director of Academic and Professional Affairs at AHA), Dylan Ruediger (Coordinator, Career Diversity for Historians and Institutional Research at AHA), and everyone who participated in the workshops, panels, and discussions that I had the pleasure of organizing over the past two years.

I entered the graduate program at the University of Texas at Austin knowing what I didn’t want to do afterward, but I had no idea what I wanted to do. Now, I aspire to work in higher education administration, ideally in an office like Texas Career Engagement, where I can help future graduate students see how their degrees can help them be whatever they want to be. A graduate degree doesn’t limit you in any way; it does the exact opposite. Thanks to the AHA Career Diversity Initiative, I think more and more graduate students and professors are realizing this, and I am proud to share these interviews with Not Even Past’s readers.


The War in Afghanistan Is Nineteen Years Old: What Can It Teach Us about Violence in American History?

From the Editors: This article is accompanied by a comment from Jeremi Suri, the Mack Brown Distinguished Chair for Leadership in Global Affairs at the University of Texas at Austin. Such comments are a new feature for Not Even Past designed to provide different ways to engage with important new work.

This week marks the 19th anniversary of the start of the U.S. War in Afghanistan – an armed conflict that has been branded by some as the longest war in American history. But is that depressing label really accurate? If so, by what measures? If not, do terms like “endless” or “forever” wars have any explanatory value? What can they – or the Afghanistan war itself – teach us about patterns of violence in American history?

The first question – Is Afghanistan really the nation’s longest war? – seems like one of simple arithmetic. The war in Afghanistan began on October 7, 2001 and continues to this day, making it roughly twice as long as the nation’s previous longest war, Vietnam. This, critics charge, is not only long but new: a state of constant and seemingly endless “forever war” that is a departure both from the principles of the founders and from the isolationist tradition of the 19th and early 20th centuries.

As a military historian, I tend to wrinkle my brow when I hear claims like this, because none of those characterizations stand up well under scrutiny. In fact, there are several other candidates for America’s longest war besides Afghanistan – or Vietnam, for that matter. Most Americans just don’t know about them, or worse, want to ignore what they mean for America’s best stories about itself and its role in the world. Indeed, state-sanctioned, organized armed conflict has been so regular in the nation’s history that it is probably better to refer to Afghanistan as only our latest longest war – one of many in a pattern of armed conflicts that stretches back to before the nation’s founding.

What’s in a Name?

So what is America’s longest war? Part of the problem is that it is difficult to know when a state of war actually exists. Should we only consider declared wars? Can words alone create a formal state of hostilities, or are lethal actions the real determinants? Must we send soldiers “equipped for combat” into foreign territory – the standard established by Congress’s 1973 War Powers Resolution? Does there need to be a single, named enemy? Depending on how one answers these questions, the list of wars and their durations grow and shrink.

Let’s first dispense with the notion that a war must be declared by Congress for it to exist. There is much to criticize the U.S. Government for in the Afghanistan War, but fighting it without a formal declaration is neither new nor even unusual in American history. Wars do not come into being simply because Congress uses the verb “declare.” If that were true, then the entire history of the United States would include just five formal wars and none after World War II.

Before the United States began its first declared war (the War of 1812), it had already fought three undeclared ones: the Northwest Indian War (1790-1795), the Quasi-War with France (1798-1800), and the First Barbary War, against the Bashaw of Tripoli (1801-1805). Nor was there a congressional declaration of war in the nation’s most deadly conflict: the Civil War (1861-1865). This was neither an oversight nor an accident: President Lincoln insisted Southern secession was not a war, but an illegal rebellion, in order to scuttle Southern efforts to gain diplomatic recognition from the United Kingdom.

From these examples, it should be obvious that the words do not make the war. Rather, it is violence – and specifically, state-sanctioned, purposefully directed violence against a group or linked groups – that moves a nation from a state of peace to a state of war. Using that standard, Afghanistan isn’t the longest war in American history. In fact, it’s not even the longest war of most Americans’ lifetimes.

Searching for America’s Longest War

The Cold War ended just a decade before the Global War on Terror began, and that four-decade-long conflict involved regular, state-directed, lethal paramilitary operations – mostly covert – against Communist, Communist-linked, and Communist-leaning governments and insurgencies (as well as two “hot” wars in Vietnam and Korea). The War on Drugs, begun by President Nixon in 1971, has thus far lasted almost a decade longer than the Cold War and like Schrödinger’s Cat, seems to be both alive and dead today depending on which part of the U.S. government is doing the describing. 

Some might dismiss these two conflicts as “metaphorical wars” or mere turns of phrase to mobilize public support. But that’s too simple. The essential elements of warfare are violent means to achieve political ends, whether undertaken by the U.S. military, covert American paramilitary forces, or foreign proxy forces fighting with American weapons and coordination. Those types of operations were regular in the Cold War and are still ongoing in the War on Drugs. According to a 2017 CATO Institute policy analysis, “American-backed, anti-drug operations in Mexico . . . have resulted in some of the bloodiest years in Mexican history. In fact, since former Mexican president Felipe Calderón began using the [American-supplied] military to fight cartels, more than 85,000 people have been killed.” Given that statistic, it seems hard to claim the War on Drugs isn’t a war or that the United States isn’t somehow part of it. But even if we exclude these two as somehow too indirect to count, Afghanistan still wouldn’t be the nation’s longest war.

The presence of combat-equipped military forces conducting violent operations abroad is a good common-sense test of when a state of war exists, and we’ve had those in Afghanistan for nineteen years. But the U.S. Navy and Marine Corps also occupied Haiti from 1915 to 1934, where armed Marines undertook a mix of police training and institution-building, and occasionally battled insurgents – precisely the tasks still being performed in Afghanistan. Fighting in Haiti was intense at times, involving pitched battles and the awarding of at least eight Congressional Medals of Honor. If one of these two occupations is a war, isn’t the other one as well?

Significant numbers of U.S. forces also occupied and patrolled parts of China from 1912 to 1941 (29 years) to protect business interests and to respond to attacks on U.S. persons and facilities. It’s true that many of the “China Marines” at times enjoyed a relatively luxurious life in Shanghai and Peking, but the skirmishes and punitive landings were sufficiently numerous that by 1928, there were over 5,000 U.S. troops in China, equipped with mechanized vehicles, airplanes, and 44 separate naval vessels. In operations that the Congressional Research Service describes as occurring “continuously,” naval landing parties put down riots and battled “bandits,” while American gunboats and destroyers fired on Chinese boats and shelled the shores with four-inch deck guns.

But what about the relatively low intensity of these operations? Are these really wars or something short of them? One way to answer this question is to reverse the roles of the participants: If either China or Haiti had similar numbers of ships, planes, and troops stationed in the United States and firing on soldiers and civilians alike, would any American try to claim that the United States was somehow in a state of peace? My guess is no. 

Nineteen years in Haiti. Twenty-nine years in China. Forty years of Cold War, including just how many in Vietnam? There’s no straightforward answer, because the U.S. began providing military support to the French fighting in Indochina as early as 1950. After French forces were defeated at Dien Bien Phu, the Pentagon created a Military Assistance and Advisory Group or “MAAG” for the new country of South Vietnam, and sent staff, planes, and pilots to Saigon, some of whom had to fly through small arms fire to land their aircraft. U.S. combat advisors followed in 1961 and by 1962, there were 11,000 of them in the country, undertaking tasks that were almost identical to today’s train, advise, and assist missions in Afghanistan.

These facts have created considerable confusion about the “official” start date for the Vietnam War. The Vietnam Veterans’ Memorial names the 1956 murder of Technical Sergeant Richard B. Fitzgibbon, Jr. as the first American death of the war, and both the Pentagon and the Department of Veterans’ Affairs speak of a “Vietnam War Era” with three potential start dates: 1955 (the creation of the new MAAG), 1961 (the first combat advisors), and 1964 (the Gulf of Tonkin incident). So, depending on how one counts, the American war in Vietnam lasted for either seven years, or fourteen years, or twenty years, making it both longer and shorter than the current war in Afghanistan and the Global War on Terror.

Eight Decades of Continuous Conflict?

If neither Afghanistan nor Vietnam is the nation’s longest war, then what is? Actually, the U.S. government has already weighed in on this subject, albeit indirectly. Every year, the Federal Register publishes the U.S. government’s officially designated “periods of war” for calculating veterans’ benefits, and that list’s first entry spans more years than any other: “Indian Wars: January 1, 1817, through December 31, 1898, inclusive” – 82 years. (Of course, even this characterization understates the violence, because the military conquest of the continent didn’t begin in 1817; the U.S. had been in a state of constant conflict with various tribes since it became a nation and indeed, for more than a century before that.)

But can we even call these numerous conflicts a single war? Skeptics will charge that I am lumping when I should be splitting: the Indian Wars constitute a single continuous conflict only if one overlooks the numerous cultural and linguistic differences among the tribes and disregards the occasional periods of low violence or temporary peace.

I disagree on both counts. There were, of course, real, long-standing cultural differences among the many Indian nations of North America, but none of those differences stopped the Anglos from organizing themselves militarily to remove Native Americans from the lands whites wanted. Numerous military leaders and quite a few American presidents – the people actually ordering and carrying out the violence – considered all Native peoples a single, inferior race of savages who, in Andrew Jackson’s famous words, “must necessarily yield to the force of circumstances and ere long disappear.” This tendency to monolithize the enemy – which recurred in the Cold War and the War on Terror – not only meant constant, organized violence on the American frontier; it also produced offensive operations against Indians who were at peace with the United States at the time, such as the Gnadenhutten Massacre (1782), the Sand Creek Massacre (1864), and the Baker Massacre (1870), to name but a few. If the tribes’ many ethnic and cultural differences didn’t produce much differentiation on the battlefield, and if neutral parties – and sometimes U.S. allies like the Lower Creeks – also found themselves subject to conquest and dispossession, then it makes sense to call the centuries-long “settling” (read: conquest) of the continent a single, racially motivated war, and an act of ethnic cleansing to boot.

What about the occasional periods of temporary peace? Do those make it impossible to talk about a continuous war by whites to control the present-day United States? Hardly: federal troops, state and local militias, and armed vigilantes regularly attacked Native Americans even during the so-called periods of “peace.”


Take the conquest of Florida as an example. Military historians typically divide the wars against Florida’s native population into three: the First Seminole War (1816-18), the Second Seminole War (1835-42), and a much smaller Third Seminole War (1855-58). But the lines between these three conflicts are muddy at best, and for the Seminoles, life was often no safer between the wars than during them.

Organized, state-sanctioned violence against the Seminoles began long before Andrew Jackson illegally invaded Florida in 1818; in fact, state militias had been conducting raids into Seminole territory at least since American independence, and both the Jefferson and Madison administrations authorized covert military operations in Florida to wrest it from Spain. The constant threat of violence worked. In 1819, Spain ceded Florida to the United States, and raids and patrols were regular enough thereafter that today’s Seminoles still speak of a single “Seminole War,” not three separate conflicts. The State of Florida’s own Seminole Wars website admits something similar when it notes that “official and unofficial U.S. military expeditions” continued between the First and Second Seminole Wars (and this says nothing about raids by state and local militias, which were often even less constrained in their actions than federal forces). Given these facts, it is fair to describe U.S. actions against the Seminoles as a war that lasted from at least 1816 to 1842 – seven years longer than the current War in Afghanistan.

A Pattern of Endless Wars?

Given these facts, I think it fair to say that neither the War in Afghanistan nor the broader Global War on Terror is a major aberration in American history; rather, both are largely continuations of a long-standing American preference for using military force to pursue political objectives, even objectives perhaps better achieved with non-military tools. As historians like Richard Slotkin, Michael S. Sherry, and Andrew Bacevich have shown, war didn’t only create the United States; it has also been an essential glue, binding Americans together in a shared sense of purpose and community since the 18th century.

This process of building an imagined community through violence has made it fairly easy for presidents to gain public support for military operations outside the United States and to frame non-military problems in martial terms at home. It’s no wonder, then, that even as the United States is directly involved in very real wars in the Middle East, the Horn of Africa, and Central Asia, President Donald Trump is now using the language of warfare to describe federal actions against both peaceful protesters and the coronavirus, just as President Nixon did with drugs, President Johnson did with poverty, and President Franklin D. Roosevelt did with the Great Depression. Warfare isn’t just the preferred tool of foreign policy anymore; it has become the preferred metaphor for federal problem-solving.

What can thoughtful Americans take away from this depressing overview of little-known and hard-to-categorize conflicts? What can any of us do? The first step is for those who study history to raise our voices about just how common armed conflict has been in this nation’s history, particularly against non-white nations and groups.

Most Americans probably don’t consider themselves particularly warlike, but it’s long past time to recognize that the United States is, both in culture and in action. In addition to having the world’s largest defense budget and arms sales, the United States is now using military violence in at least 14 separate countries and has bases in 38 countries around the world. Another 120 or so countries receive U.S. military training or assistance – almost two-thirds of all the nations in the world. To manage all of these ongoing military operations (not to mention the near-constant military exercises the U.S. conducts alone or with partners), Pentagon planners have divided the world into six separate geographic “combatant commands,” ensuring that every square inch of land and water on earth is under the watchful eye of an admiral or general. For this and other reasons, in 2019, the Institute for Economics and Peace’s Global Peace Index ranked the United States as the third most militarized country in the world.

This isn’t to say that all or even most contemporary American military operations are either ill-intentioned or part of an earlier pattern of dispossession and conquest. But the history recounted above should at least force us to reject feel-good narratives that portray the United States as either fundamentally peace-loving or loath to use military force – two claims that are indefensible when measured against the historical record.

There’s a reason historian William Appleman Williams wrote of American “empire as a way of life” in 1980 – long before the current War on Terror: even then, using violence to advance American interests abroad had become ordinary enough to hide in plain sight. But the facts of that history haven’t hidden themselves. They have been pushed into hiding by advocates of American exceptionalism who downplay or dismiss the irrefutable truths of America’s violent past, usually for political purposes. And when that happens, it’s the historian’s job to push back.

Need a final piece of evidence of just how constant organized violence has been in American history? Look no further than the Congressional Research Service’s annual report on the subject, which shows that the U.S. has sent troops into combat abroad in 208 of the previous 222 years – a staggering 94 percent of the total years covered. That may be a pattern of “endless” or “forever wars,” but it didn’t start with Afghanistan, and it probably won’t end any time soon.

Violence and American History 

Jeremi Suri 


Aaron O’Connell’s essay is revisionist history at its best: it questions the peaceful development of American democracy described in countless textbooks, popular histories, and especially the rhetoric of national politics. O’Connell shows that war has been not the exception but the norm in American history. He surveys the evidence from the Indian Wars through the recent wars in Afghanistan and Iraq, and he includes the non-traditional American wars on poverty, drugs, and terror. Reading O’Connell, scholars and students must grapple with the many ways that violence is deeply embedded in every facet of the American experience.

Ubiquity, however, is not the same as consistency. O’Connell’s essay opens an important research and teaching discussion about how violence has changed over time. The Indian Wars were very different from recent wars in the composition of the fighting forces, the technologies deployed, and the purposes behind the conflicts. The consequences also varied considerably. Most of the pervasive wars in American history did not end with a decisive “win” or a clear “loss,” as our inaccurate sports analogies tell us they should. The conflicts lingered long, they bled into many other parts of society, and they left lasting marks on diverse groups. Today, for example, the location and curriculum of many of our schools reflect prior (and continuing) wars over resources and over whose experiences matter most.

Military history, therefore, is about much more than set-piece battles. It is, as O’Connell shows, the story of organized group violence for political and social purposes. To see how deeply violence has influenced American society, we must embed the field into all areas of study, as we do for race, economics, and gender. Doing so will not only improve our knowledge of the past; it will equip citizens to recognize the uses and misuses of violence in our own time.

O’Connell’s essay discourages expectations that we can eliminate violence from the American experience, but his narrative gives us hope that we can manage violence better for the purposes of our democracy. That is, after all, what current debates about police reform and racial justice are all about. Black Lives Matter is a powerful call to see the violence all around us, understand its inherited and unequal harms, and do something about it.

Aaron O’Connell is Associate Professor of American history at the University of Texas at Austin. Follow him on Twitter @OConnellAaronB. An earlier version of this article was presented at the Miller Center’s conference on “The Presidency and Endless War” at the University of Virginia.

Jeremi Suri holds the Mack Brown Distinguished Chair for Leadership in Global Affairs at the University of Texas at Austin. He is a professor in the University’s Department of History and the Lyndon B. Johnson School of Public Affairs. Dr. Suri is also the host of a weekly podcast, “This Is Democracy,” which can be listened to here or on the Not Even Past homepage.


Teaching Texas History in a Time of Pandemic: Reflections on Online Teaching


By Walter L. Buenger

Note: This article was originally published in Perspectives on History. You can see the original article here.

When asked near the end of the chaotic spring 2020 semester how my sudden shift to online teaching went, I responded, “It was twice the work and half the fun of teaching in person.” Once the semester was over, however, I received messages from students thanking me for carrying on close to normal and providing them with stability in a very stressful time. Some even proclaimed it their best learning experience ever. Perhaps students at the University of Texas at Austin are just forgiving and understanding of a professor who, after four decades of teaching nothing but in-person classes, was suddenly forced last March to adapt to teaching online. Still, perhaps both my summary judgment and the students’ were correct. Despite the difficulties and frustrations, my efforts to clearly illustrate the central message of the course and make the transition to online as seamless as possible may have paid off.

Empty Classrooms in Garrison Hall

Online teaching is indeed twice the work, especially for a first-timer. I quickly upgraded my technology at home while UT scrambled to provide the best platform for teaching remotely. Everything was new—new equipment at home, new systems to master, new difficulties in maintaining a classroom community and communicating with each student. Because I had students scattered around the world, I recorded each class so that anyone could attend asynchronously. That forced me to imagine a full classroom while switching back and forth from focusing the camera on me to screen-sharing quotations, photographs, music, or the lecture outline for the day. Thankfully, over half of the 40-person class attended synchronously, but looking at their faces on my screen, keeping an eye on the chat feature, responding to raised hands, and trying to maintain the normal flow added another level of difficulty. The whole experience felt like juggling while balancing on a beach ball. I never ended a class thinking that I had nailed it that day and that things went especially well.

Empty auditorium in Garrison Hall

Stress and worry account for the “half the fun,” but I also missed the human contact with students. It was impossible to read students’ body language for clues about how the class was going, or to look them in the eyes when responding to questions. I also missed the adrenaline rush after class, when students come up and declare “you made me think about this” or “that was a new idea for me.” Instead of inspiring students to think about the past and the evidence that supported new ways of viewing it, it seemed that far too often I was simply performing and pushing information down students’ throats. Class felt flat, with fewer probing questions than in a normal in-person class.

Given the challenges and off-putting nature of online teaching, why were students generally positive about the experience? Crucially, we had almost half a semester together in person. They knew me, and I knew them. We had built community, and they were comfortable asking questions. I had established a certain rhythm in each class, and I decided to follow that rhythm online to the extent possible—which turned out to be a wise decision. In my undergraduate course on the history of Texas, I would open the class with a recording of a song from a Texas artist or a song that especially fit the topic of the day. For example, when the topic was lynching and white supremacy, I opened with Billie Holiday singing “Strange Fruit.” With the camera on me, I spoke about the major argument for the day and the song. Then I shared a photo of the 1916 lynching of Jesse Washington in Waco, Texas, while I continued speaking. Then back to the outline of points to be covered for the day, and so on, until near the end when I shared a quote from a newspaper that illustrated the commonplace nature of lynching. That led to a concluding discussion of the questions and thoughts to be carried forward from that day. The experience was not identical to our face-to-face sessions, of course, and I had to pay special attention to fielding questions and comments from students as they came up. Yet the pattern was familiar enough to the students to allow them to continue to get something out of the course.

Empty classroom in Garrison Hall

I also doubled down on emphasizing the central organizing premise of the course—the Texas you thought you knew, the Texas you imagined, was probably not the real Texas of the past. I made a special point of focusing on that theme at the conclusion of each class, instead of letting it percolate naturally as I do in person. In other words, I simplified my approach and made it more direct as I encouraged them to think about the past. I tried to demonstrate how historians work and to make students historians in training.

I should also admit that because of other duties I was teaching only one course. Teaching that one course was exhausting, and I am not sure I could have maintained the focus and energy needed to teach multiple classes online effectively.

While I had hoped never to repeat the online teaching experience, the COVID-19 pandemic has worsened in Austin and across Texas since last spring. This fall, I am teaching one graduate class online, and I have reflected on what I learned from the spring. A graduate seminar is a different animal from an undergraduate class, but some lessons carry over.

  • Take the time in advance to familiarize yourself with the equipment and the platform you will use. 
  • Pay special attention to building a community of inquiring scholars. I will not have the advantage of two months of face-to-face time, but I plan to use such things as the discussion features in Canvas and Zoom breakout rooms. About two-thirds of the way through each session, I will divide the class into two breakout rooms, so students have a greater opportunity to share ideas in a small group. I will visit each group to answer questions and spur debate. Then when we reconvene as a whole, a different student each week will report back from the breakout rooms. By using such techniques, I hope to make the seminar a shared experience instead of an experience that leaves each student feeling isolated. 
  • Keep as much the same as possible—but realize that a three-hour Zoom seminar will be exhausting for everyone. We will need two breaks instead of one, and the change of pace provided by the breakout rooms should also help lessen fatigue. 
  • Keep it simple, and keep the central message front and center. 

Off I go into what I still hope is my last time teaching online. The experience may remain twice the work and half the fun, but at least I have a plan and evidence of modest past success. Stay optimistic and energetic. Our students need that.


Walter L. Buenger is Summerlee Foundation Chair in Texas History and Barbara Stuart Centennial Professor in Texas History at the University of Texas at Austin.

This article originally appeared as “Remote Reflections: Twice the Work and Half the Fun” in Perspectives on History in September 2020.

Images of Garrison Hall courtesy of Grace Goodman.



