
The past is never dead. It's not even past.

Not Even Past

John Hope Franklin: An Appreciation

Prof. Juliet E. K. Walker recalls being a student of the great pioneer of African American history and discusses his importance to history writing and teaching.

And take a look at a video lecture by John Hope Franklin in the collection of the Briscoe Center for American History at The University of Texas at Austin.

Filed Under: 1900s, Biography, Features, United States

Rebel With a Cause: Johnny Tremain (1957)

By Robert Olwell

Johnny Tremain Cover (via Wikipedia)

As a historian of early America, I study a subject that predates the invention of film or video, voice or music recording, or even photography. When I watch my modernist colleagues deliver multi-media lectures – including film clips, snatches of popular music or speeches, and photos – I feel a twinge of envy. The closest 18th-century Americans ever got to a multi-media presentation was to paint the words “No Stamp Act!” on the side of a porcelain teapot.

A few years ago, after discussing the origins and consequences of the Boston tea party, including reading contemporary newspaper accounts, letters, and cartoons, I treated my students (and, I will admit, myself) to the Boston tea party scene from Disney’s 1957 film Johnny Tremain, based on the 1943 novel by Esther Forbes. The depiction of the “tea party” as an oddly orderly act of vandalism is probably accurate, but one cannot escape the impression of watching 1950s Americans playing at being 18th-century revolutionaries, and not even bothering to wash the Brylcreem out of their hair. Of course, as I told my students (in self-defense) before showing them the clip, any historical film is a document of the age that produced it rather than of the age it depicts. So what can Johnny Tremain tell us about America in the 1950s?

At the start of the film, Johnny Tremain, a young orphan and silversmith’s apprentice, is portrayed as petulant, conceited, and disdainful of authority.  In one scene that anyone who has ever seen the movie will remember, Johnny breaks the rules by working on the Sabbath and gets his come-uppance when he accidentally puts his hand into a puddle of molten silver.  Audiences in the 1950s would likely have recognized Johnny as a colonial version of an emerging cultural phenomenon: the “teenager.” Newspapers, radio, and the new medium of television were full of stories of teenage “rebellion,” which, sociologists warned, was creating a new social problem called “juvenile delinquency.” In the spring of 1954, a Congressional sub-committee held televised hearings on the causes of juvenile delinquency that competed for viewers’ attention with the simultaneous, and subsequently far more famous, Army-McCarthy hearings.  The troubled and violent teen became a common character in popular films. Perhaps the best known were Marlon Brando in “The Wild One” (1953), and James Dean in “Rebel without a Cause” (1955).

Theatrical Release Poster, 1955 (via Wikipedia)

Hal Stalmaster, who plays Johnny Tremain, is no Brando or Dean in terms of acting skills, and Johnny’s angst is depicted far more crudely: with a crippled hand, he literally cannot find a place for himself in society.

Fortunately, the American Revolution arrives to give Johnny a healthy outlet for his destructive (and patricidal) impulses. At the tea party, Johnny gleefully smashes tea chests alongside approving and participating adults (Paul Revere, Sam Adams, and Joseph Warren). Even an on-looking British admiral admires the tea partiers’ politeness and principles. The only disapproving authority figure is, significantly, also Johnny’s only blood relation: his uncle, played with panache by Sebastian Cabot as an effete popinjay who is eventually revealed to be an anti-revolutionary loyalist and thus, in a phrase that was loaded with meaning in the 1950s, “un-American.” The film’s depiction of the American Revolution as an Oedipal conflict resembles (in a far less disturbing, and far more simplified, fashion) the version provided by Nathaniel Hawthorne in his 1831 short story “My Kinsman, Major Molineux.”

After the tea party, Johnny joins the “Sons of Liberty” and becomes active in the revolutionary underground. Eventually, he finds himself on Lexington green when the first shots in the war are fired, and takes part in the running battle to drive the redcoats back to Boston. The film ends that night with Johnny in the camp of the patriot army gathering outside of Boston, a warrior in the cause of American (and his own) freedom. But unlike Forbes’s novel, in which the outbreak of the war (and the death of a friend) forces Johnny to grow up and accept the responsibilities of adulthood, the film’s Johnny makes no such psychological breakthrough. He is the same callow, smart-alecky teen at the film’s end as at the beginning. War (and revolution) does not change him. In the film’s depiction of combat, it is disconcerting to watch Johnny cackle with laughter as he ambushes and kills the king’s soldiers (as if fulfilling a long repressed desire). The film also shies away from espousing any overt political ideology. James Otis, the only character in the film who tries to articulate a larger meaning to the struggle, is described and portrayed as mentally unbalanced.

The film’s reluctance to ground the revolution on either abstract ideals or nitty-gritty class struggle closely reflects the views of American historians writing in the 1950s, who argued that, by crossing the ocean, the colonists had left the ideological conflicts of the old world behind them and instead shared in a broad liberal consensus. In 1955, Louis Hartz argued, in The Liberal Tradition in America, that colonial American society was an egalitarian world of small property holders. Because it lacked either an aristocracy or a peasantry, concepts such as class (and class struggle) were meaningless there. Not that early Americans thought about politics much. Lockean, possessive individualists by nature rather than persuasion, they were blessed with a “charming innocence of mind.” For obvious reasons, these historians tended to focus their attention on the northern colonies, where slavery was of relatively small consequence.

In retrospect, one can readily see how the hopes and fears of the larger 1950s society shaped this so-called “consensus school” of early American history, both in its celebration of middle-class values and bourgeois conformity and in its dread of radicalism. Forbes’s novel, written in 1943, reflected the concerns of the depression era and was far more focused on issues of class and poverty than was the film. After a brief theater run – it premiered on July 4, of course – Johnny Tremain was broadcast on Disney’s weekly television program in 1958 and was rerun many times thereafter. Like me, the vast majority of Americans probably first saw Johnny burn his hand in the comfort of their living rooms. For this reason, Johnny Tremain perhaps should be compared not to contemporary movies but to 1950s television. More than Brando or Dean, the fictional teen whom Stalmaster’s Johnny Tremain most closely resembles is Eddie Haskell, played by Ken Osmond in the T.V. series Leave it to Beaver, who made his first appearance in 1957. Although Eddie is a wiseass and a troublemaker when adults are absent, his sycophancy in their presence indicates his desire to conform. Likewise, with the singular exception of his loyalist uncle, Johnny is deferential to all the adults in the film. Even when he takes up arms against the establishment, it is a reflexive, almost thoughtless act rather than the result of a deliberate decision to turn the world upside down or of a radical hope to build a new heaven and a new earth. Johnny Tremain’s version of the revolution is an orderly one, in which rebellious teens fall in line behind their patriotic elders, Brylcreem tubes in hand.

For more on history writing in the 1950s, see
Peter Novick, That Noble Dream: The “Objectivity Question” and the American Historical Profession
On the invention of juvenile delinquency:
James Gilbert, A Cycle of Outrage: America’s Reaction to the Juvenile Delinquent in the 1950s
On the Boston Tea Party, take a look at:
Alfred Young, The Shoemaker and the Tea Party: Memory and the American Revolution

Filed Under: 1400s to 1700s, 1900s, Biography, Cold War, Fiction, Gender/sexuality, Memory, Politics, Reviews, United States, War, Watch Tagged With: American Revolution, Cold War, film, Watch

George on the Lege, Part 3 – Redistricting

By George Christian

Judging from the past half century, only one thing is certain in this year’s redistricting battle in the Texas Legislature: the lawyers who attack and defend the maps will do very well, indeed.

The modern history of legislative and congressional redistricting begins, as it does across the southern United States, with the United States Supreme Court’s 1962 decision in Baker v. Carr (369 U.S. 186). Reversing decades of abstention from the “political” issue of redistricting, the court held that the federal courts were compelled to step in if state legislators violated constitutionally-guaranteed voting rights through the exercise of their redistricting power. Three years later Congress enacted the Voting Rights Act, which subjects specific states, including Texas, to statutory redistricting standards that protect minority voting rights and require prior federal approval of legislatively-drawn maps. It is important to note that while local jurisdictions, such as cities, counties, and school districts, are likewise subject to federal constitutional and Voting Rights Act standards, this brief overview focuses on the Legislature’s history of drawing districts for Congress and the state house and senate.

The Texas Constitution grants specific redistricting authority to the Legislature (Art. III, §§1, 28). Nothing requires the Legislature to redraw congressional districts at any particular time, and it has frequently drawn congressional maps in special session (for example, in 1971, 1981, 1991, and 2003). The constitution specifies that legislative redistricting for the Texas House and Senate must occur in the first regular legislative session after publication of the U.S. decennial census. Since 1950, census data has been published in the spring of the year following each census, and the Legislature has drawn, or attempted to draw, legislative districts in each of those regular sessions (1951, 1961, 1971, 1981, 1991, and 2001). Indeed, in 1971 the Texas Supreme Court ruled that the Legislature had to act during that regular session, even if only a few days remained following publication of the census (Mauzy v. Legislative Redistricting Board). If the Legislature fails to act (as it did, for instance, in 1971, 1981, and 2001), the Legislative Redistricting Board, created by constitutional amendment in 1948 and composed of the Lieutenant Governor, Speaker, Attorney General, Comptroller, and Land Commissioner, takes over the responsibility for fashioning new state senate and house districts. The LRB has no jurisdiction over congressional redistricting.

Texas redistricting has a history of complex litigation and political fireworks. Probably the most famous (or notorious) event occurred in 2003, when Democrats in both the Texas House and Senate (the so-called “Killer D’s”) fled the state in order to break a quorum and prevent enactment of a revised congressional map. Democrats objected particularly to the overtly partisan nature of revising congressional districts not in the usual course of a legislative session following the census or in response to a court order, but in one immediately following an election that changed the make-up of the Legislature itself (specifically, the House, which turned from a Democratic to a Republican majority in the 2002 election). To make matters even more contentious, U.S. House Majority Leader Tom DeLay (R-Sugar Land) played a prominent role in raising money for Texas House candidates and then pushing state leaders to undertake redistricting “out of order.” The Democrats’ parliamentary tactics, while spectacular political theater, did not ultimately succeed and have had no lasting impact on redistricting law or the legislative redistricting process (though they may have contributed to Democratic electoral gains in the Texas House in 2005, 2007, and 2009). The redrawn map did, however, produce a six-seat swing in favor of Texas Republicans in the 2004 election: Texas’ congressional delegation went from a 17-15 Democratic edge in 2002 to a 21-11 Republican bulge in 2004 (in the wake of November’s election, the GOP majority has grown to 23-9).

The Legislature’s 1991 congressional redistricting bill, however, was a different story. In Bush v. Vera, 517 U.S. 952 (1996), the court determined that the Legislature’s map (which included the famous fish-hook shaped District 30 in Dallas County) deliberately used race as a proxy for, in this case, Democratic voters. Although the Justice Department had approved the Texas map, the high court struck it down as an unconstitutional racial gerrymander. Federal courts also rejected the Legislative Redistricting Board’s Texas house maps in both 1971 and 1981, and have ordered changes in congressional maps on numerous occasions (1973, 1982, 1996, 2001, and, most recently, 2006). Additionally, the Texas Supreme Court has at least twice rejected legislative redistricting for violating the Texas Constitution’s “county line rule” (districts may only split counties with compelling justification). In 1971 and 1981, the Legislature’s maps split 33 and 34 counties, respectively. The state claimed that it had to divide these counties in order to comply with the Voting Rights Act, but in each case the Texas Supreme Court found that alternative maps could be drawn that both complied with the Act and maintained constitutionally required county lines. Ironically, State Rep. Tom Craddick of Midland, who served as Texas House Speaker during the bitter 2003 redistricting battle, initiated the successful 1971 challenge that invalidated the Texas House map (see Smith v. Craddick, 471 S.W.2d 375 (Tex. 1971)).

Texas is thus no stranger to redistricting controversy. We can expect the same this year, as the Texas Legislature sharpens its map-making tools once more. The 2010 census data is expected to come out next month, and we already know that extraordinary population growth will entitle Texas to four new congressional districts—the biggest jump since 1991, when the state picked up three new members of Congress. In both cases, the Texas population grew nearly 20% over the ten-year period preceding the census and, in both cases, the great majority of the growth occurred in suburban areas along the I-35 corridor and in the Rio Grande Valley. In 1991 the Democratically controlled Legislature, with Democrat Ann Richards in the Governor’s Mansion, responded by trying unsuccessfully to draw three new minority-majority districts. It appears likely that this year, with the GOP firmly in control of the Governor’s office and the Legislature, Republicans will try to fashion at least three, and perhaps four, Republican-leaning districts. Whatever happens this spring in Austin, the ultimate decision will likely rest with the federal judiciary. That, at least, is the common denominator in Texas redistricting history. And, indeed, as The Texas Tribune reported last week, “Redistricting doesn’t start until next week, but the first lawsuit has already been filed.”

Sources:

Texas Legislative Council, Texas Redistricting

Texas Legislative Council, State and Federal Law Governing Redistricting in Texas (March, 2001)

For more on redistricting as it occurs, see The Texas Tribune, “Redistricting Reality,” and “Texas Redistricting Lawsuit.” 

Photo: Wing-Chi Poon  [CC-BY-SA-2.5] via Wikimedia Commons


The views and opinions expressed in this article or video are those of the individual author(s) or presenter(s) and do not necessarily reflect the policy or views of the editors at Not Even Past, the UT Department of History, the University of Texas at Austin, or the UT System Board of Regents. Not Even Past is an online public history magazine rather than a peer-reviewed academic journal. While we make efforts to ensure that factual information in articles was obtained from reliable sources, Not Even Past is not responsible for any errors or omissions.

Filed Under: 1900s, 2000s, Crime/Law, Features, Politics, Texas, United States

“Not Like Baghdad” – The Looting and Protection of Egypt’s Treasures

by Christopher Heaney

This weekend, as Cairo’s protestors struck their tents and tidied up Tahrir Square, a clean-up operation of another sort was underway nearby: in the Egyptian Museum, home to King Tutankhamen and countless other archaeological treasures.


The museum had haunted the protests since they began in late January.  In the first few days of the unrest that toppled President Hosni Mubarak on Friday, newspapers were quick to note that a group of looters had broken through a skylight, apparently searching for gold. Although Tut’s famous mask was thankfully under lock-and-key, the intruders knocked over or damaged approximately 70 artifacts. In one of the many exciting turns of the last several weeks, however, Egyptian neighborhood patrols surrounded the museum and caught the would-be thieves.

“I’m standing here to defend and to protect our national treasure,” one man told the AP. “We are not like Baghdad!” shouted another. The looting of Iraq’s National Museum in 2003 was on every observer’s mind.

The Egyptian military arrived soon after, took the looters into custody and made the museum a base of operations. Just outside, Tahrir Square became the protests’ center. It seemed to be a victory for Egypt’s people and its military.

“If Cairo museum and the pyramids are safe, Egypt will be safe,” said Zahi Hawass, Egypt’s Minister of Antiquities, according to National Geographic. “Inspectors, young archaeologists, and administrators are calling me from sites and museums all over Egypt to tell me that they will give their life to protect our antiquities,” he further claimed. Countless political cartoonists, meanwhile, depicted a sphinx-like Mubarak losing his nose, or tumbling from the pyramids.

On Saturday, however, Hawass revealed that not all in the house of Tut was as well as he had claimed. Confirming what many Egyptologists suspected, Hawass reported that 18 objects had disappeared, including two gilded statues of the boy pharaoh, as well as an image of Nefertiti. A Twitter campaign is now calling for an investigation of Hawass, and a march for his resignation is apparently being planned for Wednesday.

Before commentators perpetuate the inevitable comparisons to the National Museum of Iraq, it is worth remembering the difference in scale – upwards of 15,000 artifacts lost in 2003.  But, more importantly, Egypt has a long and complicated history of the protection of  ruins and artifacts. In times of transition, that history is felt keenly. For millennia, new governments in Egypt – dynastic, foreign and national – have had to re-build the country’s monuments and mummies, often aware that the rest of the world was waiting to collect the pieces.

Of Strongmen and Mummy Powder

Egypt’s fear of such loss is as ancient as the pyramids. Egyptians of the Old Kingdom, the third millennium BCE, spent much effort protecting their monuments and tombs from the malice of future regimes. Pharaohs built “dummy chambers” to hide their mummies, and nobles engraved their statues with bawdy warnings such as: “Whosoever should deface my statue and put his name on it … let Enlil, the lord of this statue, and Shamash tear out his genitals and drain out his semen. Let them not give him any heir.”

Millennia later, Arab, Mamluk and Ottoman rule saw a healthy trade in books that claimed to reveal the treasures of forgotten tombs. Digging up mummies proved more profitable, however. To feed European hunger for “Oriental” secrets and medicine, traders shipped the ancient dead to Italy, where they were bought and sold as mumia – ground-up mummy – used as a medicine and sometimes as a color in painting.

Antiquarianism during the Renaissance and Enlightenment again made Egypt an object of desire. When Napoleon invaded Egypt in 1798, he brought along a bumptious team of French savants who sketched ruins and collected artifacts for the Louvre. France capitulated to England, however, and the British Museum got the vast majority of Napoleon’s spoils, including the famed Rosetta Stone. (The French savants, for their part, first threatened to throw it all in the sea, rather than turn the prize over to their enemies.)

In 1835, Egyptian authorities established the Antiquities Service of Egypt to staunch rampant looting by locals and Europeans like “The Great Belzoni,” a former circus strongman from Venice who hauled out busts and obelisks by the ton – a tale laid out in rollicking, lacerating form by archaeologist Brian Fagan in The Rape of the Nile: Tomb Robbers, Tourists and Archaeologists in Egypt. Under the system that developed, only scientific experts could excavate and half of the artifacts they found would stay in Egypt – a system called “partage.”

As in all archaeologically-rich regions, however, enforcement proved hard. Locals continued looting to supply growing tourist demand, British archaeologists saved key artifacts for their patrons, and French and Egyptian employees of the Antiquities Service sold pieces from the museum on weekends. Egyptian train conductors used mummies for fuel, Mark Twain claimed. Somewhat hypocritically, Europeans and Americans cast the dysfunction of Egypt’s antiquity system as a further rationale for England and France’s informal rule: if Egyptians couldn’t protect the treasures of their past, the logic went, they weren’t ready to govern themselves.

National Foundations

Yet this wasn’t the entire story. Kept out of Egyptology by the British, Egypt’s Muslim intellectuals engaged in an Arab humanism ignored by historians until only recently. They found work in the Cairo Museum – opened in 1902 – and led the study of Coptic and Islamic architecture and art. “It is not fitting for us to remain in ignorance of our country or neglect the monuments of our ancestors,” Egyptian scholar Ali Mubarak wrote in 1887, as quoted by Donald Malcolm Reid in Whose Pharaohs?: Archaeology, Museums, and Egyptian National Identity from Napoleon to World War I. “For what our ancestors have left behind stirs in us the desire to follow in their footsteps, and to produce for our times what they produced for theirs; to strive to be useful even as they strove.”

When the country achieved independence in February 1922, nationalists used Cairo’s Museum and foreign archaeologists to debate continuing British influence. In November of that year, American Howard Carter, funded by England’s Lord Carnarvon, made his famed discovery of King Tut’s tomb. In the media circus that followed, newly independent Egyptians protested that the English and American team was being too secretive and perhaps moving artifacts to England and New York. Egypt eliminated the “partage” system and claimed all excavated objects, to Carter’s lasting fury.

Foreigners still loomed large over the next half-century of research, but in the years after Hosni Mubarak took power in 1981, the Egyptian Museum and its flamboyant spokesman, Zahi Hawass, oversaw a period of national consolidation. Seen as a cultural hero by some and a one-man showboat by others, Hawass focused world attention on Egypt’s past with a series of blockbuster discoveries, profitable traveling exhibits, and noisy campaigns for the return of the country’s artifacts from American and European museums. In some cases, he was successful. In November 2010, the Metropolitan Museum agreed to return its share of treasures that Howard Carter had stolen from the young pharaoh’s tomb (as former Metropolitan Museum of Art director Thomas Hoving had claimed in his 1978 book, Tutankhamun: The Untold Story).

A New Start?

To use a metaphor as old as Tut, will Mubarak’s fall, and the museum’s loss of 18 objects, now shake Hawass’s own dynasty? As in past transitions, the politics of archaeology are running hot. Hawass has confirmed outside claims that looters broke into another deposit of artifacts in Egypt’s south; further losses are sure to emerge. Hawass’s affiliations with Mubarak may also cause him problems. While protestors assailed the administration in the streets, Mubarak promoted Hawass to his cabinet. Like their fellow Egyptians in Tahrir Square, the Cairo Museum’s employees are demanding higher wages and less corruption. Meanwhile, human rights advocates say that the army has detained and abused protestors either behind the museum or inside the museum itself.

Although an investigation of the Museum break-in should proceed, it is too soon to call for Hawass’s resignation. As Ian Parker of the New Yorker recently noted, Hawass is too central to Egypt’s antiquity system to lose during the transition. More importantly, it would be unfair to blame the losses on the protests themselves or to claim that Egypt’s museums have “failed.” It has been speculated that Mubarak’s secret police themselves – perhaps working with the museum’s guards – may have been behind the museum’s robbery. Hawass has vehemently denied the charge, but it provokes the question: no matter the identity of the looters, for whom were they collecting? The 18 stolen pieces were true treasures, suggesting, like the looting of Baghdad’s museum, that the job was done to fulfill the “wishlist” of a private collector, perhaps foreign.

The robbery of the Egyptian Museum and the loss of other objects from archaeological deposits around Egypt is  a reminder of the shared responsibility of cultural patrimony. The protection of monuments and mummies like Egypt’s requires vigilance, not only from Egypt’s people but also from the international community whose collectors have made looting lucrative for centuries.

As the behavior of the heroic neighborhood patrols who detained the looters attests, Egypt seems to be holding up its side of the bargain; it is time that the rest of us did the same, respecting just what Cairo’s museum and its history represent to the Egyptian people. It is a small comfort, but given past losses, perhaps the theft of only 18 of the museum’s objects is a victory. A former Arab diplomat has characterized the last month as “Facebook meets the Egyptian Museum.” As we read newspapers online and scrutinize photos of celebrations in Tahrir Square, we should cheer the fact that Cairo’s museum, and the history it represents, is literally that victory’s background.


Sources: The pharaoh’s ancient curse is cited in Z. Bahrani, “Assault and Abduction: The Fate of the Royal Image in the Ancient Near East,” Art History 18 (1995).

Further reading:

An online exhibit on the Napoleonic scientific expedition in Egypt by the Linda Hall Library.

Another AP article on the museum’s looting.

Mummies as medicine via Wikipedia, Scientia Curiosa and Res Obscura.

Former diplomat Charles Wolfson on Egypt’s future, via CBS news.

Photos of the Egyptian Museum interior and exterior by Kristoferb (Own work) [CC-BY-SA-3.0 (Wiki Commons)]; of Zahi Hawass in Northern Egypt, 8 May 2010, VOA News; Rosetta Stone: public domain; Napoleon Before the Great Sphinx, by Jean-Léon Gérôme; Howard Carter at the tomb of Tutankhamen, by Harry Burton via Wiki Commons.

Filed Under: 2000s, Crime/Law, Features, Ideas/Intellectual History, Middle East, Museums Tagged With: Arab Spring, archeological museum, archeology, Egypt, Egyptian Museum, Howard Carter, mummy, Rosetta Stone, Tahrir Sq, Tutankhamun, Zahi Hawass

The Great Partition: The Making of India and Pakistan by Yasmin Khan (2008)

by Gail Minault

Reading this compelling account of the partition of India in 1947, one is moved to ask: What were they thinking? Early accounts of the end of British rule in India concentrate on the high politics of the negotiations between the leaders of the Indian National Congress, the Muslim League, and a succession of Viceroys—ending with the striking and decisive Lord Mountbatten. The British were concerned to leave a legacy of which they could be proud and hence to avoid an unseemly civil war in the wake of their departure. Both the Congress, led by the future Indian Prime Minister, Jawaharlal Nehru, and the Muslim League, headed by Muhammad Ali Jinnah, the father of Pakistan, were concerned to inherit a state or states that would be governable and would guarantee civil rights and economic improvement to their people. The negotiations had reached a stalemate by late 1946. Mountbatten arrived in Delhi in early 1947, rapidly decided that partition was the only solution, convinced the leaders to agree to the plan by early June, and then announced, to everyone’s amazement, that the transfer of power would take place in mid-August.

This grand narrative, with Mountbatten as the master of ceremonies, has remained dominant for far too long.  Questions about Mountbatten’s judgment have emerged in recent years, but the politics of the partition settlement have generally remained at the center of historical focus.  In this book, Yasmin Khan endeavors to change that focus by bringing together a range of voices that reveal the human toll of those hasty political decisions.  The Great Partition, in other words, listens to “the Indian street,” the stories of ordinary men and women, hapless and displaced by decisions over which they had no control.  Hindus, Muslims, and Sikhs fled for their lives to opposite sides of a frontier that was not defined until after Independence Day and its attendant celebrations.  Hundreds of thousands lost their lives in the violence.  The solidarity of neighborhoods and villages disintegrated.

Khan assembles an impressive array of sources from all levels of the social and political spectrum to paint a convincing picture of official incompetence and unseemly haste. The British were more concerned with withdrawing than with maintaining order as they did so. The political leadership, who should have been better prepared for the possibility of violent mass migration, remained remarkably insouciant, convinced that once power was transferred, all would be well. No such luck.

This dramatic account brings in much needed ground-level detail and opens up partition’s stories to more varied interpretations. It is accessibly written and I recommend it as a much-needed revision of the official partition histories of decades past.

 

Related recommendations:

Judith M. Brown, Gandhi: Prisoner of Hope (1989)

Penderel Moon, Divide and Quit: An Eye-Witness Account of the Partition of India (1998)

Gyanendra Pandey, Remembering Partition: Violence, Nationalism, and History in India (2001)

 

Filed Under: 1900s, Asia, Ideas/Intellectual History, Periods, Politics, Regions, Religion, Reviews, Topics, Transnational Tagged With: Asia & Middle East, British Empire, India, Pakistan

Racing the Enemy: Stalin, Truman, and the Surrender of Japan by Tsuyoshi Hasegawa (2006)

On August 6, 1945, the United States of America became the first (and so far only) nation to use atomic weapons against an enemy.  Since then, the world has wrestled with questions about the bombings of Hiroshima and Nagasaki: Did the A-bombs save American and Japanese lives by hastening Japan’s surrender?  Could the war have ended differently, with less loss of life?  Did the U.S. hope to strike terror in Stalin’s heart as much as in Emperor Hirohito’s?  How much did President Truman know about the bombs, and what did he think about their ethical implications?

These questions do not belong only to armchair generals and academics.  They belong to all inhabitants of this world of nuclear proliferation.  How do we find the answers to these important, emotional, divisive questions?  How do we separate facts from convictions and recreate the events that led to the destruction of two Japanese cities, the loss of hundreds of thousands of lives, and the start of the Cold War?

For starters, we can read Tsuyoshi Hasegawa’s magnificent page-turner, Racing the Enemy.  Hasegawa gives us a riveting, month-by-month, day-by-day, and even minute-by-minute narrative of the end of the war in the Pacific.  The story that emerges is very much a global one.  Hasegawa gracefully moves us from Washington to the Crimea, from Moscow to occupied Manchuria, from Tokyo to the Kuril Islands.  His assiduous attention to detail—to the treachery of time zones, to diplomacy lost in translation, to treaties made and violated—puts us in the thick of momentous and terrifying events.  Rival factions in the Japanese bureaucracy debate how to end the war with honor, Stalin plots territorial acquisition, and the U.S. government rejects diplomatic overtures in favor of an unprecedented show of force.


And what of the A-bombs?  According to Hasegawa, Japan surrendered primarily due to the threat of Soviet invasion; the shared fate of Hiroshima and Nagasaki merely provided a palatable public rationale.  The bombs signaled to Stalin that time was not on his side, and led him to hurriedly instigate the almost-forgotten final land battle of World War II.  Truman could have profitably pursued a negotiated, conditional peace, but for political and personal reasons he tacitly supported the use of the weapons of mass destruction.

Racing the Enemy is high political drama grounded in Japanese, Soviet, and U.S. archival material.  It will not be the final word on the end of the war, but it is a powerful and authoritative volume that all subsequent debate on the subject should reference.  No one with even a passing interest in the Pacific theater, atomic warfare, or the early Cold War can afford to miss it.

Filed Under: 1900s, Periods, Regions, Reviews, Topics, Transnational, War Tagged With: pacific history, Transnational, World War II

Contending Visions of the Middle East: The History and Politics of Orientalism by Zachary Lockman (2004)

by Kristin Tassin

In this work, Zachary Lockman seeks to introduce a general audience to the history of the study of Islam and the Middle East in the United States and Europe, with particular attention to US studies from the mid-twentieth century. The importance of this book lies in Lockman’s attempt to reach the general public with information about the history, politics, and culture of the Middle East. Lockman’s concern is that certain kinds of knowledge about the Middle East and Islam have been used to shape and justify dangerous policies without the consent of an informed public.

The first half of the book surveys how the “West” has imagined the Middle East from ancient times to the twentieth century. Lockman begins with the early history of different formulations of the east/west divide from the Greeks onwards. He argues the images of the “East” created by the Greeks and Romans had little to do with reality and were influenced by early conflicts with Persia and the concocted opposition between “civilized” Greeks and Romans on the one hand and “barbarians” on the other. These early theories were then adapted by western political theorists to serve particular political and military goals. In an argument reminiscent of Edward Said’s in his book Orientalism, Lockman argues that representations of Islam can be deployed for contemporary political purposes because they still have public emotional resonance.

In the second half of the book, Lockman focuses on the American and British image of the Middle East over the past fifty years. The bulk of the book is devoted to the modern development of the field of Middle Eastern Studies through the influences of empire, the Cold War, and the rise of area studies after World War II. Lockman argues that images of Eastern cultures in the West have been linked with the growth of European and American power over Muslim territories. In the final chapter, Lockman lays out what he sees to be the threats posed to Middle Eastern studies by politically-motivated anti-Eastern policy, manifested mainly in think-tanks and the media. Lockman argues that, particularly after the events of September 11th, there has been a serious effort to censor opinions deemed too liberal or too supportive of Arab or Islamic causes.

Lockman’s work has been criticized for giving undue attention to debates in the public sphere and neglecting work produced inside academia. But this is Lockman’s point exactly. Political debates within public policy and the media have overtaken genuinely scholarly interest in the Middle East, and have influenced the types of questions addressed in colleges and universities. For this reason, Lockman’s book arrived at a perfect time to give a clear history of the study of the Middle East and Islam. The book calls attention to the social and political interests that have been served by the adoption of a certain type of one-sided scholarship.

Filed Under: 1400s to 1700s, 1800s, 1900s, Empire, Ideas/Intellectual History, Middle East, Periods, Politics, Regions, Reviews, Topics, Transnational Tagged With: 20th Century, Asia & Middle East, Islam, Orientalism

The Hour of Our Death by Philippe Ariès (1982)

by Zach Doleshal

History is full of dying, but before this book was published historians rarely concerned themselves with how a society thinks about death. We have lists upon lists of casualty counts in all manner of battles throughout the ages, but we have little understanding of the ways the idea of death has changed over time. Philippe Ariès’ monumental work, The Hour of Our Death, was something of an exception, as it offers rare insight into European representations of death from the eleventh century to the twentieth. Beautifully written and admirably translated, the work takes the reader through a dizzying array of cemeteries, epic poems, and deathbeds to provide a view of the ever-evolving place of death in European society. After reading it, one will never look at death the same way again.

The book rests on the Jungian concept of the collective unconscious, which means that Ariès is attempting to find similar artifacts across a range of sources and regions that show the “unconscious expression of the sensibility of the Age.” In order to do so, he follows three themes through one thousand years: individual versus collective memorials, ideas of transitions between life and death (or “tame” death versus “wild” death), and the physical proximity between the dead and the living. The result is an eloquent description that conveys the mutability of western attitudes toward death.

And yet, Ariès anchors his continually evolving coffins with commonalities. For example, “the idea of sleep is the most ancient, the most popular, and the most constant image of the beyond.” Death as sleep, as a neutral state of repose, uninterrupted calm and peace, is a belief that survived despite the best efforts of the Catholic Church, which sought to persuade society of the soul’s mobility after death. But Ariès finds recumbent figures in literature and statuary from the eleventh to the seventeenth centuries to document the persistence of the idea of death as sleep. Likewise, Ariès finds that the monuments themselves, from the Mediterranean to England and from Spain to Germany, share a “genetic unity of forms,” which suggests an ongoing tension between the Christian liturgy and an older, pan-European idea of death as the big slumber.

The second half of the book concentrates on a different tension, as Ariès presents a dramatic break in attitudes toward death with the advent of the twentieth-century hospital death. For Ariès, the modern hospital death is an “absolutely new type of dying” because, quite simply, people lost control over their own deaths. For thousands of years death was a moment that could be foreseen by the soon-to-be deceased, and considered best spent in the company of friends and family, who were to provide a serene point of departure while the dying person gave orders and advice. In the modern era, Ariès argues, it became a moment of machines and professionals who kept the dying in ignorance of their condition, in sterile rooms far from the public’s attention. This loss of mastery, for Ariès, meant that death became a moment of diminished consequence. In the past, each death was a critical event in an ongoing, communal struggle against evil spirits, where each person’s salvation was a social project. Now, in a society striving to avoid it, death has become a hollow, empty affair. Ultimately, Ariès’ book makes a case against the modern hospital death. For him, writing in 1977, dying had been increasingly marginalized in proportion to society’s loss of faith.

A pioneering work on the study of death when it was first published, Ariès’ book remains an important read today because it was at the forefront of the natural death movement. His condemnation of contemporary practices pushed social activists to develop more humane approaches to dying and historians to think about death as an important subject, with its own questions of rights and social justice. Anyone curious about evolving conceptions of dying would do well to start with The Hour of Our Death.

Filed Under: 1400s to 1700s, 1800s, 1900s, Europe, Ideas/Intellectual History, Periods, Regions, Reviews, Topics, Transnational Tagged With: european history, Jung, Transnational

Rise of the Vulcans: The History of Bush’s War Cabinet by James Mann (2004)

by Aragorn Storm Miller

James Mann provides a lively and comprehensive study of the advisers who would guide George W. Bush as he sought to make the world safer for U.S. interests. Mann argues that Bush’s inexperience led him to rely on—as well as greatly empower—a cohort including some of the most experienced and respected members of the conservative foreign policy making community. This cohort—Cheney, Rumsfeld, Powell, Wolfowitz, Armitage, and Rice—came up in the ranks together, devoting much thought to altering the means and ends of U.S. foreign policy in order to avoid what they considered to be the errors of previous liberal and conservative administrations. The Bush administration’s abandonment of realpolitik and the judicious use of force in favor of ambitious unilateralism and the export of U.S. institutions was thus not as sudden or unprecedented as many contemporary observers suggested. Such a new way forward had been brewing for decades.

As a journalist, Mr. Mann performs his role of “providing the first draft of history” admirably. Professional historians will appreciate his effort to position Bush’s “Vulcans”—the nickname his advisors used to convey their devotion to toughness and power—as the intellectual fulcrum between U.S. Cold War and post-Cold War conceptions of foreign relations. The more casual reader will also appreciate Mann’s ability to make these policymakers come alive as human beings who, like anyone, carry a lifetime of personal dreams, disappointments, goals, and agendas. Indeed, as historians are becoming increasingly aware, men and women who toil on a specific problem or issue for decades often come to exercise far more influence on national policy than a given president, who simply does not have the time to master every foreign policy question.

Filed Under: 2000s, Periods, Politics, Regions, Reviews, Topics, Transnational, United States, War Tagged With: Bush Administration, foreign relations, journalism, United States

City Lights: Austin’s Historic Moonlight Towers

Austin’s moonlight towers have long been a distinctive part of the city’s landscape, their lights casting a gentle glow on the streets 150 feet below.* Though Austin’s fifteen surviving towers are now the last of their kind, this form of street lighting was once common across the United States. Many cities erected tower lights in the 1880s and 1890s, and Austin’s system was modeled closely on Detroit’s, then the most extensive in the world.**

Image of modern moonlight tower in a residential neighborhood in Austin, Texas.

The first practical source of electric light was essentially a sustained spark, or arc, between two carbon rods.  Though highly efficient, such arc lights had a serious drawback: their glare was too intense to be endured at close range, yet there was no way to make the arc smaller without extinguishing it altogether.   The carbon rods also burned down quickly; those in the first arc lights lasted just an hour or two before they had to be replaced, though later models could last through the night.

Paris and other European cities began mounting arc lights on ordinary streetlight poles in the late 1870s, but the shielding required to reduce the glare meant that much of their light was wasted. Boosters in San Jose, California, tried to solve the problem in 1881 by putting the arc lights atop a 237-foot-tall tower that straddled a main intersection. Although it never really lit up the whole city as initially hoped, and climbing up to change the rods was a challenge, the San Jose tower inspired others (including, some said, the builders of the Eiffel Tower), and more practical lighting towers, most 100 to 180 feet tall, soon went up in other American cities. Detroit erected over a hundred such towers in the 1880s, and in 1894 Austin purchased thirty-one new towers of its own from the Fort Wayne Electric Company.**

In the 1890s Austin was in the middle of an electrical boom.  Hoping to spur industrial development, the city built an expensive granite dam across the Colorado River (where Tom Miller Dam now sits), to provide power for new cotton mills.  Though the mills never materialized, dynamos installed in the dam’s powerhouse supplied current for an electric streetcar system, a municipal electric utility, and the new moonlight towers.

The first tower was lit in May 1895 in the heart of Hyde Park, a “streetcar suburb” then on the north edge of town, and the other towers were lit soon afterward.  The dam collapsed in a flood in 1900, and Austin’s dreams of economic development went with it.  But the city-owned electric utility survived, and while Detroit and other cities abandoned their arc light towers in the 1910s in favor of simpler street-level incandescent lights, Austin stuck with its moonlight towers long enough for them to become much-loved curiosities.

Image of the First Electric tower erected in Austin, 41st & Speedway, 1895

In the 1920s the city replaced the arc lights atop the towers with dimmer but less troublesome incandescent bulbs. In 1936 these gave way to mercury vapor lamps, and later to the metal-halide bulbs that are still used on most of the towers today, though they are gradually being replaced with new LED bulbs — all brighter than the old incandescent bulbs, but no match for the original arc lights.**

About half of the original thirty-one towers have come down over the years, and several have been relocated; two that were recently removed to make way for downtown construction are now awaiting new homes.  Since 1976, all have been listed in the National Register of Historic Places.

Picture of a modern moonlight tower protruding above the treeline in a residential neighborhood in Austin, Texas

The spread of artificial lighting in the nineteenth century had enormous effects on modern life, a story well told by Wolfgang Schivelbusch in Disenchanted Night: The Industrialization of Light in the Nineteenth Century.  When Schivelbusch visited Austin in 2007, he met with my undergraduate seminar on the history of electric technology and its social impacts, and we walked over to look at the moonlight tower at 22nd and Nueces Street.  Schivelbusch was, he said, very glad to see that this piece of electrical history was still standing over Austin’s streets.

*corrected Tuesday, November 12, 2013, with updated information about the remodeling done in the 1990s.
**corrected Tuesday, September 20, 2016, with updated information about the addition of LED bulbs to Austin’s bulb replacement project and with new information about the purchase of the original towers. Although numerous sources reported that Austin bought its towers second-hand from Detroit, Catherine Cordeiro, a graduate student in the UT Austin School of Architecture’s Historic Preservation program, discovered that Austin bought its towers new from the Fort Wayne Electric Company, and they were slightly improved over the type sold a few years earlier to Detroit.

Photo credits:

West 41st St and Speedway, 2008, by Andy Mattern
“First Electric tower erected in Austin, 41st & Speedway, 1895”, The Austin History Center
Moon Tower by Carlos Lowry
Moonlight Tower by with an eye



Filed Under: 1800s, 1900s, 2000s, Art/Architecture, Features, Science/Medicine/Technology, Texas, United States, Urban Tagged With: Austin, city lighting, electricity, moon towers, moonlight towers, Texas History, urban history


NOT EVEN PAST is produced by

The Department of History

The University of Texas at Austin

We are supported by the College of Liberal Arts
And our Readers


All content © 2010-present NOT EVEN PAST and the authors, unless otherwise noted

