
Does Anybody Really Know What Time It Is?

by Michele Callaghan, Manuscript Editing

“Does anybody really care?” is the next line of the classic rock song by Chicago. I do. Maybe it is because I have a history degree in addition to being an editor. This means I am fated to have an obsession with details about writing and that I also get annoyed about inaccuracies in showing the passage of time.

Once a month my local newspaper—which will remain nameless because I don’t think this odd twisting of the order of events is its error exclusively—mentions the number of people seeking unemployment benefits and the new unemployment rate. This is how they phrased it in early March: “The February unemployment rate is expected to rise to 7.9 percent.” To my mind, it should read “is expected to have risen to 7.9 percent.” Sometimes rather than moving the present to the past, they go even further and make the past into the future, saying things like, “The February unemployment rate will be higher than in January.” This February is in the past, so how can its unemployment rate be in the future? (To NPR’s credit, they used the past tense—“the rate did go down”—in the same context.)

Authors sometimes fall victim to other variants of this odd thinking about time. It makes a certain sense to use the present tense (the here and now) for describing the narrative of a story (Scrooge wakes up on Christmas morning and realizes that he hasn’t missed the holiday after all). It also makes sense to use the present tense for theories that reflect current thinking (Darwin’s theory of evolution holds that humans are descended from other life-forms). Authors go off track when they extrapolate from this approach and begin to use the present tense for everything about Charles Dickens or Charles Darwin. We can find the men writing and acting decades after their deaths.

In the 1960s, French avant-garde filmmakers and writers, such as Alain Robbe-Grillet and Jean-Luc Godard, experimented with the narrative form. The audience was expected to follow the action forward and backward, eventually learning what occurred in the plot. Similarly, Anglo-Indian writer Rumer Godden sprinkled her excellent prose with verb phrases like “as John was to have said later.” But let’s not put our readers of academic nonfiction—or even the daily paper!—through this exercise. Let’s proceed with time as it flows in real life: from past to present to the great unknown of the future.


Filed under Uncategorized

What’s my line?

by Michele Callaghan, manuscript editor

Like an actor assuming a role, we editors need to inhabit the voice and the knowledge base of our authors. In recent months, I have been a precise medieval historian, a statistics-spewing football fan, a physicist with a flair for describing science for a lay audience, and a political science professor from the Big Apple.

The stereotypical actor asks, “What’s my motivation?” We need to ask similar questions to translate what is in the author’s mind for the intended reader. And like the actor, we can ad-lib somewhat and use our knowledge to tease out the meaning and add to the original script. But there are two things we cannot do: break character and speak as ourselves or go off script to create our own story.

Longtime readers of the blog know that at one time I was an aspiring author and that in my youth my unreachable goal was to be the next James Joyce. I know I am not the only person out there with a file cabinet full of good and bad writing. So, why the shift from auteur to actor? Perhaps it is the teacher in my blood; both my parents and half of my grandparents were educators.

The joy of seeing someone through the gamut of emotions as their magnum opus goes from raw material to a real book, and the satisfaction of solving the puzzle of smoothing out their prose, can’t be beat. And while I enjoy taking a bow in the acknowledgments from time to time, what I really like is hearing the cries of “Author, author!” from the balcony.


Filed under Behind the Scenes, Editing, Writing

Jazz Noir

Guest Post by Mark Osteen

A sharply creased fedora rests atop the oiled hair of a smart-talking detective, whose steely eyes gaze at a seductive blonde smoking a cigarette. When they kiss, a slinky jazz saxophone plays. Hat, blonde, smoke, jazz: these are the signature tropes of classic film noir. But there’s a problem: the jazz wasn’t really there. In fact, not a single 1940s noir and only a few from the ’50s featured a jazz soundtrack. Nevertheless, as I argue in Nightmare Alley, noir filmmakers used jazz to explore America’s shifting attitudes and anxieties about race, gender, sexuality, and violence, and to register the dissonances of a changing postwar world.

Film noir’s many nightclub scenes introduced viewers to an underground world of racial mixing, louche behavior, and unorthodox gender roles and sexual orientations. These associations color noir’s portrayals of white jazz musicians, who are typically depicted as sexually suspect and prone to madness and violence. Yet the films also betray a fascination with these figures who enact viewers’ repressed attraction to blackness and its (often stereotyped) tropes. In other words, white jazz musicians are “noired”—transformed into surrogate African Americans—in films like Phantom Lady (1944) and Black Angel (1946). However, later noirs, such as The Strip (1951) and Sweet Smell of Success (1957), present jazz not merely as a respectable way to make a living, but as an island of integrity in a continent of corruption.

Some films even find that jazz trumpets progressive ideals such as hybridity, emotional liberation, equality, and self-creation. The best examples of this strain are two movies in which the multi-talented Ida Lupino plays world-weary, resilient torch singers. Petey Brown, in The Man I Love (1947), belies the title song’s lyrics: instead of submitting to her man, she leaves him, in order to emerge with her integrity and artistry intact. In Road House (1948), Lily Stevens overcomes professional and personal obstacles by employing wit and husbanding her emotional resources.

These singers exemplify how jazz can be not merely a way of playing, but a way of living, in which improvisation serves as a survival technique suited for the modern world.

Although few noir films featured jazz scores, many used tunes from the Great American Songbook to evoke moods and reveal characters. Songs like “I’ll Remember April” (Phantom Lady), “I Hear a Rhapsody” (Clash by Night), “Your Red Wagon” (They Live by Night), “Never Let Me Go” (The Scarlet Hour), and “One for My Baby” (Road House), as well as the classic title themes from Laura and Body and Soul, comment powerfully on the action. In the 1950s and afterward, jazz soundtracks evoked urbanity and menace in films such as The Big Combo, in TV crime series such as Peter Gunn and Mike Hammer, and in neo-noir films like Chinatown.

If you’d like to hear these tunes and learn more about film noir, you’ll want to attend “Night Songs: The Music of Film Noir” on January 24th, at Germano’s Piattini. At this concert, I will show brief film clips and introduce the tunes; then my group, Cold Spring Jazz Quartet (CSJQ), will perform them. The show begins at 7:30 pm. Tickets are $10. To purchase a ticket and make a reservation, please call 410-752-4515. Afterward, you’ll be ready to walk those mean streets again.

To learn more about jazz and other themes in film noir, pick up a copy of Nightmare Alley. Click here to find out more about CSJQ. For the lowdown on what’s happening in Baltimore jazz, check out the Baltimore Jazz Alliance.

Mark Osteen is a professor of English, chair of the English Department, and founder of the Film Studies Program at Loyola University Maryland. His latest book, Nightmare Alley: Film Noir and the American Dream, is now available from JHU Press.


Filed under Book talks, For Everyone, Popular Culture

2014: The Year the UN Could Make Its Troops Fully Accountable for Their Actions?

guest post by Arturo C. Sotomayor

Since the 1990s, UN peacekeeping has experienced a sea change in the frequency, nature, and purposes of its missions. During the Cold War era, for example, there were never more than five missions operating at any one time, while after the first Gulf War there were twelve. Troop levels have also increased from 78,000 soldiers in 1990 to over 97,000 blue helmets in 2013. Moreover, peacekeepers have been given increased responsibilities in maintaining peace abroad and tasked with highly complex functions. In places like the Democratic Republic of Congo (DRC) and the Central African Republic, the UN Security Council has authorized robust mandates to its peacekeepers, providing them with the authority to use force in offensive campaigns against belligerent forces, relying on drones for surveillance and patrols, and cooperating with heavily armored French and African troops.

Not only has there been a dramatic increase in the demand for peacekeepers, there has also been a radical change in the number and quality of the blue helmets supplied by troop-lending countries. The so-called middle powers (Canada, Sweden, Norway, and Denmark), for example, no longer provide the bulk of the UN’s peacekeeping contingents. Instead, more than half of the top eighteen UN troop contributors are newcomers, and almost two-thirds come from the third or developing world. The majority of these countries have poorly professionalized armies.

Unfortunately, the record number of UN peacekeeping operations in the world has not always been accompanied by increased measures to ensure troop accountability and transparency. As The New York Times recently pointed out, the UN has been charged with gross negligence, corruption, and serious oversight lapses in its own peacekeeping practices. In Haiti, for example, blue helmets have been accused of spreading cholera and committing sexual abuses. In Sudan and the DRC, UN soldiers have been found to be responsible for institutional corruption and bribery.

What is wrong with the UN peacekeeping system? Why are so many UN peacekeepers misbehaving? At the most basic level, there is a serious lapse in the current training system. The proliferation of peacekeeping training programs around the globe and the flexible system in which UN member states prepare and train their own troops for peacekeeping have not produced consistent results. Military organizations have trained troops according to their own standards, which has led to variations in levels of professionalism and, on occasion, to poor performance in the field. Uniform quality standards are needed. The UN could start by certifying training centers that implement changes and apply good standards (inclusion of civilian instructors, training in humanitarian law, language instruction, etc.) while decertifying those that continue to rely on outdated, perverse peacekeeping doctrines. Reputational costs—prestige or the fear of being decertified—might compel member states to take training more seriously.

Moreover, UN peacekeepers are rarely punished and prosecuted when they misbehave. Currently, the UN is unable to hold peacekeepers legally accountable. The Status of Forces Agreements governing peacekeepers, which were agreed upon by most UN troop-lending states, allow for civil action to proceed only with the mission commander’s approval, which is virtually impossible to secure. Consequently, the UN itself offers few incentives to modify military behavior and norms of conduct. If peacekeeping is to become more credible, it will need to become more accountable for its actions. Just as the UN increases its role in promoting world democracy and civilian oversight, its peacekeepers too have to become more democratic and transparent. The time has thus come to modify the current system and allow the UN to prosecute the “bad apples” in international tribunals.


The views expressed in this blog entry are those of the author and do not reflect the official policy or position of the Naval Postgraduate School, Department of Defense, or the U.S. government.

Arturo C. Sotomayor is an assistant professor in the National Security Affairs Department at the Naval Postgraduate School in Monterey, California. He is the author of The Myth of the Democratic Peacekeeper: Civil-Military Relations and the United Nations, now available from JHU Press.


Filed under Current Affairs, Foreign Policy, Politics

The Doctor Is In: Holiday Expectations

The Doctor Is In is an occasional series where JHU Press authors discuss the latest developments and news in health and medicine.

guest post by Susan J. Noonan, M.D., M.P.H.

At this time of the year, many of us are surrounded by people and environments that are wrapped up in the joy and chaos of the holiday season. You can’t seem to go anywhere without seeing festive decorations and feeling the energy of others running around. If you are suffering from depression or bipolar depression, this can be a more stressful, burdensome, and irritating time than usual. When your mood and energy levels are down, it is often difficult to muster the effort to participate in the activities of the season, especially since you may have no interest in doing so. That is part of the illness. But at the same time you may feel pressure to participate, either from within or from family members. Pressure to put on a cheery disposition around others. Pressure to think of gift-giving ideas and then to actually go out and buy the items at a crowded shopping mall—a real challenge! Pressure to prepare an elaborate holiday meal for your family. Pressure to attend the many holiday functions at work or school or with friends or family members. You may also feel a false sense of competition with in-laws or neighbors to prepare a more celebratory holiday experience. In addition, you may carry around a set of expectations for what you “should” do at this time of year and berate yourself for not following through.

Expectations are tricky. At the holiday time, they often appear as an artificial set of standards that you impose upon yourself, based upon some unreachable ideal in a magazine, on television, or in what your great-grandmother was said to have embodied. Trying to reach these unrealistic expectations will only bring you disappointment and more stress, not pleasure. Instead, think about where you are with your depression, and what you can realistically do now for yourself and your family. Set out small goals for your holiday season, ones that are attainable. Break each one down into small steps. Keep it all very simple, and you and others will enjoy the holidays more. Remember to remove the word “should” from your vocabulary—it often gets you into trouble. Instead of saying “I should” do this or that, replace it with “I would like” to do this or that. And then, if possible, aim for your more realistic goal and don’t be upset if you cannot reach it today.

Expectations from family members are also difficult. They often have history associated with them, based on years past or long-held family traditions. Just because these demands and routines existed in the past does not mean that you have to be held to them this particular year. Again, consider where you are now with your depression, what you can do realistically at this moment, and hold yourself to that. Say “no” at times if you need to. Explain your present situation, and your loved ones will understand.

You can get through the stress of the holiday season by following the basics of mental health to take care of yourself and by using sound coping skills, all of which I describe in my book Managing Your Depression: What You Can Do To Feel Better. Make sure that you keep up with a regular sleep pattern, follow a healthy diet instead of relying on heavy holiday buffets, hold alcohol to a minimum, and get daily exercise. Keep up with your daily routine and structure, maintain healthy social contacts, and avoid isolation. Use effective coping skills to manage the additional stressors of the season. Stay well!

Susan J. Noonan, M.D., M.P.H., is a board-certified physician who currently works as a consultant to Massachusetts General Hospital and CliGnosis, Inc. Managing Your Depression: What You Can Do to Feel Better is available from JHU Press.


Filed under Emotional Health, Health and Medicine, Mental Health, Psychiatry and Psychology, The Doctor Is In

Grandparenting a “Mixed Bag” of Children with and without Physical Disabilities

guest post by Kay Harris Kriegsman, Ph.D. and Sara Palmer, Ph.D.

“I look at her waking up of a morning. I sit right outside her bedroom door so she’ll see me there and feel safe knowing her granddad is there. We’ll build tents and take our flashlight inside. It’s given me an appreciation for what we have each day.”—Ned, a grandfather

Becoming a grandparent is most often a joyful event. Grandparents look forward to re-experiencing the wonder of new life, to playing with baby and watching him grow through the stages of childhood—especially fun when he can be given back to his parents at the end of the day! But grandparenting comes with responsibilities, too. Single parents and two-career couples may rely on grandparents to pick up the slack, and older adults who are generally healthier and live longer than in past generations may have more energy and time to be part of their grandchildren’s lives. In fact, the American Association of Retired Persons announced in 2011 that about 70% of grandparents help with their grandchildren, providing emotional support, practical help, or financial assistance.

Even a grandparent who doesn’t help on a regular basis can be a parent’s “ace in the hole,” stepping in to soothe frayed nerves or care for children when the family experiences unusual demands such as moving, divorce, or changing jobs. When a child is born with a physical disability, or acquires a disability through injury or disease, the family requires a period of readjustment, and grandparents are often eager to help. Their support can be critical to their children’s success in meeting the needs of all their children.

Grandparents can enrich the family by helping in numerous ways, but “help” should not demean or undermine parental authority. Grandparents can be a sounding board for decision making, without trying to make the decisions for their adult children. They need to be sensitive to what their children and grandchildren need from them at different moments—is this a time to be “waiting-in-the-wings” or a “you’re needed now!” time? If grandparents miss their cue, parents can redirect them; flexibility and communication are critical to managing parent-grandparent relationships.

Grandparents want to be in on the highs and lows of their children’s and grandchildren’s lives, but their location may influence the ways they can be involved. When grandparents live far from their grandchildren and travel is not possible, technology is the answer; grandparents can connect with their grandkids through telephone calls, e-mail, Skype, Twitter, or Internet video chats. Increasingly, tech-savvy grandparents have even joined the world of texting! Grandparents who live near, but not with, their adult child’s family can more easily balance regular time with the grandkids. But grandparents who live in the same house with their child and grandchildren are privy to the workings of their child’s nuclear family system, to some extent becoming part of it. This can be rewarding for everyone if roles and boundaries are clearly defined. The family will run more smoothly if grandparents remember to support the parents’ roles and rules.

When grandparents are able to give it, practical support (housework, babysitting, chores, etc.) is invaluable to parents. When Susie, a single mother of four, including twins with physical disabilities, had a ruptured disc, her parents kept the children at their house for six weeks. This gave Susie time to recuperate, and gave her parents some “real, 24-7” time with their grandsons, enabling them to get to know each other better. Other grandparents make play dates with their grandchildren or take dinner to their children’s family; Helen often babysits overnight so her son and his wife can have a break from parenting. This type of help for parents also doubles as time for grandparents to enjoy and get closer to their grandkids.

When a grandchild has a physical disability, he may have additional needs related strictly to his disability. As a grandparent, learning the new skills involved in caring for that grandchild—such as catheterizing and bracing—can be both challenging and exhilarating. Learning these hands-on skills opens up new possibilities for spending time with grandchildren, for example, overnight visits or trips together. Grandparents handy with tools can help make their child’s home more accessible for the grandchild with a physical disability by putting in ramps or lowering light switches, building a “desk” that fits across his grandchild’s wheelchair, or making custom built-ins in her room so she can reach her toys and books.

Grandparents can also play a role in facilitating the family’s attitudinal adjustments. Typically inclined to love their grandkids unconditionally, grandparents may initially be more accepting of a grandchild with a physical disability. They can help the family focus on their grandchild’s abilities without losing sight of the fact that disability may add some challenges. Some grandparents follow the parents’ lead as they expand and redefine their vision of a “normal family,” but other grandparents are a step ahead in seeing the family in a new light.

Through their relationship with a “mixed bag” of grandchildren—those with and without physical disabilities—grandparents can enrich the life of grandchildren, give parents a break, and encourage the creation of an inclusive family. These grandparents have a unique opportunity to develop personal strengths—patience, compassion, self-confidence, optimism, a sense of gratitude for the small things in life—and to reach their own potential as human beings.

Kay Harris Kriegsman, Ph.D., is a practicing psychologist and consultant on disability issues. Sara Palmer, Ph.D., is an assistant professor in the Department of Physical Medicine and Rehabilitation at the Johns Hopkins University School of Medicine. Together they wrote Just One of the Kids: Raising a Resilient Family When One of Your Children Has a Physical Disability and, with Jeffrey B. Palmer, M.D., are coauthors of Spinal Cord Injury: A Guide for Living.


Filed under Health and Medicine

Remembering Pearl Harbor

guest post by John Bodnar

Americans will soon be reminded again of the significance of December 7, 1941. For the past seventy-two years, the Japanese attack on Pearl Harbor has been recalled not only as the event that pushed America into World War II, but as a personal milestone for many who were alive on that date. Most Americans would never forget where they were when they heard about the raid on the Hawaiian naval base. Ever since the war, thousands of tourists have traveled to the site of the battle and stood above the decaying hulk of the USS Arizona, where the remains of American sailors still rest.

Pearl Harbor was evoked again in the aftermath of the 2001 terrorist attacks. On the sixtieth anniversary of the Japanese attack—just months after the collapse of the World Trade Center towers in New York—President George W. Bush proclaimed that September 11, 2001, would now stand alongside December 7, 1941, as a moment in which “our way of life was brutally and suddenly attacked.” The chief executive urged citizens to remember the sacrifices of the “greatest generation who defeated tyranny” as they embarked upon another struggle to “defend freedom” and “secure civilization.” In 1991, on the occasion of Pearl Harbor’s fiftieth anniversary, the president’s father, a World War II vet, used the memory of Pearl Harbor as an event that justified the nation’s need to remain vigilant against any form of aggression that might threaten the homeland. In both instances, spectacularly violent incidents were used to quickly mobilize sentiments for war.

The fact that Pearl Harbor has become entangled in our times with the American war against terrorism raises a number of questions about the way citizens understand both the world war of the 1940s and the struggles of our age. One obvious controversy has broken out regarding the American intrusion into Iraq. Pearl Harbor evoked widespread anger in the United States. Many men volunteered for the service as a form of retaliation for what the Japanese did. Americans could not identify a particular nation as a perpetrator in 2001, only a terrorist organization. Part of their response, however, did single out the potential damage Iraq might inflict on the American homeland. Thus, to the extent Pearl Harbor was invoked to justify American military action, the attack on Iraq revised the historical image by making an American war campaign preemptive rather than reactive.

I have seen another significant difference between our understanding of World War II and the global struggle against terrorism. In looking at the outpouring of memoirs by American soldiers who have served in Iraq and Afghanistan, I have been struck by the celebration of American heroism, a narrative strategy that has been widely accepted by the public. Books on expert snipers in Iraq and Navy SEAL teams carrying out daring raids—including the killing of Osama bin Laden—have captivated American audiences and attained best-seller status. In many ways, these stories have continued the revival of the warrior hero in American culture that began in the 1990s with the celebration of the “Greatest Generation.” All of this has come, in part, as a response to the more sordid legacy of Vietnam.

The commemoration of the War on Terror through the heroic exploits of special forces is powerful, but really at odds with the way many GIs wrote about their experiences in the immediate aftermath of World War II. Certainly there was pride and heroism in the public discussion of the war in the 1940s—before it was known as a “Good War.” Yet the most prominent literary achievements of that era by veterans were stories that raised many questions about the war, such as its legacy of extreme violence. This was the point of Norman Mailer’s great World War II novel, The Naked and the Dead. And the monumental trilogy of the war authored by James Jones was filled with references to the inequality and brutality he saw in the military itself. Notably, Jones began his famous series with a narrative set at Pearl Harbor—From Here to Eternity. Even Audie Murphy’s memoir, To Hell and Back, failed to present a celebration of American fighters. In his account of the battles against the Germans, he stressed that the men felt they were in a state of constant peril and that, while some fought bravely, others fought only because they sensed they had no other choice.

We cannot know how the War on Terror will be recalled decades from now. Yet, there may be a pattern at work currently that is the reverse of the “Good War.” That contest became more mythical over time. Perhaps the heroic tales of American forces today will seem less resonant in the time to come.

John Bodnar is the Chancellor’s Professor of History and the director of the Institute for Advanced Study at Indiana University. He is author or editor of a number of books, including The “Good War” in American Memory and Blue-Collar Hollywood: Liberalism, Democracy, and Working People in American Film, both published by Johns Hopkins.


Filed under Uncategorized

Wild Thing: Of Crazy Ants, Kudzu, and West Nile Virus

Wild Thing is an occasional series where JHU Press authors write about the flora and fauna of the natural world—from the rarest flower to the most magnificent beast. 

guest post by Russell F. Reidinger, Jr.

Raccoons work hard to get into attics, sometimes destroying siding or roofing materials along the way. Once inside, raccoons may damage electrical boxes, wiring, or plumbing vents, or spread disease. Most raccoons strongly resist eviction, especially if they have young. Trying to get raccoons out, especially whole families, can take planning, resilience, and work. And, even with success, raccoons may try aggressively to get back in. Given that raccoons can bite and scratch, and that they may carry disease, the better strategy might be to get professional help.

In retrospect, most homeowners who have experienced raccoon invasions would probably agree that preventing access to their homes would have been preferable to removing the raccoons. Regardless, raccoons in attics are mental images that often come to mind when thinking of wildlife damage. So are images of squirrels in attics or skunks under porches or deer jumping in front of cars.

But images of some animals—the crazy ant, for example—do not typically pop up when we think of wildlife damage. Yet crazy ants cause extensive damage to island ecosystems. Called “crazy” because of their erratic movement, the ants can form colonies in tree canopies and tolerate multiple queens. Supercolonies with 300 queens have been discovered. The ants are voracious omnivores that eat grains, seeds, and detritus. They “farm” scale insects and aphids. So, where is the problem? The ants spray red land crabs with lethal amounts of formic acid, then eat the protein-laden crab carcasses. Crazy ants have killed 15 to 20 million crabs since the late 1980s on Christmas Island alone. The absence of the crab, formerly a keystone species for the island, has caused dramatic changes in litter cover and species richness, along with a concomitant decline in some endemic species.

Kudzu, also called “the vine that ate the South,” is another example of wildlife that causes damage. Brought to the United States from Japan for the 1876 Philadelphia Centennial Exposition, the vine was planted widely in the eastern United States for erosion control. Kudzu is now prominent in many southern landscapes, where its vines can cover entire canopies, telephone poles, or abandoned houses. The covered landscape presents a green, ghostly appearance from a vantage point above the canopy, but suffocated native vegetation lies underneath. Kudzu carries soybean rust fungus. While efforts have been made to use kudzu for products such as soaps and jellies, and it has even been the subject of poems, kudzu remains a serious southern ecosystem problem.

Hailing from the West Nile Province of Uganda, West Nile virus was first identified in 1937. It appeared in New York City in 1999. The disease, transmitted by mosquitoes, infects many vertebrate species, but most are asymptomatic. The movement of the virus in the United States closely tracked that of some migratory birds. Species such as blue jays seem particularly sensitive and serve as indicators of the disease. While many people infected with the virus show no symptoms, a few will get meningitis or encephalitis. In the United States from 2009 to 2010, the Centers for Disease Control reported about 1,700 human cases, with 69 fatalities.

One can question whether problems such as these are part of wildlife damage management. Are the species domesticated or wild? Do they affront humans or their interests? The answers can be complex. In fact, it is the principles and concepts underlying answers to broad questions such as these that are part of the real substance of Wildlife Damage Management. If you are looking for a step-by-step manual on how to remove raccoons from an attic, this book is not for you. If, however, you want to understand the biological, ecological, and human-dimensional concepts underlying wildlife damage management as it is currently practiced (and, we believe, how it will be practiced into the foreseeable future), this is the book for you. We review characteristics of damaging plant and animal species in North America and around the globe; summarize physical, pesticidal, and biological control methods; and emphasize traditional vertebrate pests with abundant examples. But we take the position that today’s wildlife damage management also includes invasive plants and animals and wildlife diseases and zoonoses. And we include some speculation on how wildlife damage started in the first place, beginning with Australopithecus afarensis, a prehuman that served more as prey than predator. I encourage you to read our book.

Russell F. Reidinger, Jr., is a former director of the National Wildlife Research Center, USDA APHIS / Wildlife Services, and an adjunct professor in the Department of Agriculture and Environmental Sciences at Lincoln University in Jefferson City, Missouri, and in the School of Natural Resources, University of Missouri, Columbia. With James E. Miller he is coauthor of Wildlife Damage Management, published by JHU Press.


Filed under Biology, Conservation, Wild Thing

Clara Barton and Mr. Jones: From Gettysburg to I Street

guest post by Marian Moser Jones

Last week, a man identifying himself as George Jones from Chicago left a cryptic voicemail on my office phone: “I have some information for you about Clara Barton. Please call.” In the months since the publication of my book, The American Red Cross from Clara Barton to the New Deal, I had encountered people who recounted stories of relatives who worked for the Red Cross, but nobody promising an inside scoop on its iconic founder. Barton, who gained wide renown for her Civil War aid work, organized the American Red Cross in 1881. Perhaps Mr. Jones had uncovered some new evidence about her—a blood-encrusted battlefield diary or a trove of steamy love letters.

The truth, of course, proved more mundane. It nevertheless raised profound questions about the historical importance of place—questions that have become especially relevant as we commemorate the 150th anniversaries of major Civil War events, including, today, the Gettysburg Address.

While researching the history of the site on which his daughter’s house in Washington, D.C., was built, Mr. Jones told me he had learned that Clara Barton had stayed at the original house on that site in 1877. He suspected that this might have been where the American Red Cross held its inaugural meeting on May 21, 1881, but he had been unable to confirm this. Did I know the meeting’s location?

I was embarrassed to say I did not, but a quick search in a database of historical newspapers led me to a May 23, 1881, article in the Washington Evening Star, which listed the meeting’s address as 1326 I Street. Mr. Jones told me this made sense, as Barton had been living at this address in 1881 with her friends Rev. William Merritt Ferguson and his wife, according to material Mr. Jones later supplied me from diary entries and an obscure biography.

Is the I Street location marked with a plaque or something similar? Again, I didn’t know. I drove to downtown Washington, D.C., to investigate, and quickly found that the row of houses that once stood on the site, according to a 1903 architect’s map that Jones supplied me, had been razed and replaced by a 12-story monolithic office building. I got out of my car and walked into the building’s imposing, columned lobby. But I found no historic markers—only a directory of law firms and bank offices.

Perhaps it is more honest to let the geographic past be overwritten by the present. The high-ceilinged lobby of 1300 I Street, with its massive, tree-trunk Greco-Roman columns and its shiny marble floor, serves as an apt monument to the power of large moneyed interests in twenty-first-century American politics (as do similar buildings on K Street, across a small park from this one). Then again, a plaque to commemorate the founding of the American Red Cross might remind passersby that it has not always been this way—that a single woman, armed with little more than reputation, friends, and uncommon tenacity, founded on this very site an organization that later became a powerful global force for humanitarian aid and even convinced the U.S. to sign the Geneva Conventions. Such a plaque would both recognize and utilize the historic value of place. Commemorating the space where a world-changing event occurred with a marker or ceremony (or both) acknowledges the importance of that past event to humanity while imbuing the space with new meaning in the present.

This type of transformation is exactly what Lincoln effected when he gave the Gettysburg Address. While the ostensible purpose of the ceremony at which he spoke was to honor the dead, his speech served primarily to galvanize the living by utilizing the audience's immediate sensory experience of fresh burial sites. "It is rather for us to be here dedicated to the great task remaining before us," he stated, "that we here highly resolve that these dead [emphasis added] shall not have died in vain—that this nation, under God, shall have a new birth of freedom." In imposing this new living meaning on the death-site of Gettysburg, Lincoln transformed a place of senseless internecine carnage into the birthplace of a new America, and reconceived the war deaths as meaningful sacrifices in a struggle for the fulfillment of our national ideals.

Since the delivery of the Gettysburg Address, many battlefields and other sites of mass death have been commemorated in a similar manner. But why don't we also commemorate sites of creation, of uncommon generativity? If blood has been spilled on a piece of ground, does this render the place more sacred than if that ground has held the sweat (even metaphorically) of those who labored there to create a new organization or movement?

I would say no. And yet there are far too few commemorative markers for such peacetime events. A museum slated to open earlier this year on the site where Barton organized her campaign to find missing soldiers after the Civil War, at 437 7th Street Northwest, remains unfinished, and everyone seems to have forgotten about the I Street location. Everyone, that is, except Mr. Jones. He may not have discovered the secret Clara Barton diary, but in acknowledging the importance of the past by working painstakingly to trace the physical site on which Barton founded the American Red Cross, he is doing something important.

Marian Moser Jones is an assistant professor of family science at the University of Maryland School of Public Health and a former DeWitt Stetten Fellow at the National Institutes of Health, Office of History. Her book, The American Red Cross from Clara Barton to the New Deal, is available from JHU Press.

1 Comment

Filed under American History, Civil War, Current Affairs, For Everyone, Gettysburg, History, Regional-Chesapeake Bay, Uncategorized, Women's History

The Health Crisis of the Civil War

guest post by Margaret Humphreys

In a recent article in the New England Journal of Medicine, Jennifer Leaning and Debarati Guha-Sapir explore the public health implications of natural disasters. At first the fact that wars and disasters kill people may provoke an eye-roll response—“Oh, gee, I didn’t know that”—but a closer reading evokes a broader perspective on the common disruptions of such events and the usefulness of pattern prediction for planning humanitarian response.

Some details will help make this approach clearer. A public health view of war predicts, for example, that food supplies will be disrupted, infectious diseases will flourish, and displaced persons, especially women and children, will be at risk. Whether the war in question happens in Virginia in 1862 or Mali in 2012, these are factors that are likely to take lives; in an ideal world, such factors may be areas in which humanitarian intervention can make a difference.

The public health analysis of warfare points out that wars often reduce the available food supply. Agricultural workers are drawn from the fields, either through military enlistment or because they flee servitude or the threat of arms. No one has recorded, to my knowledge, whether the famous wheat field and peach orchard at Gettysburg yielded any harvest in the fall of 1863. Hungry soldiers raided barns and fields, and then used rails from fences to roast their trophies. With the fences gone, the remaining livestock wandered off, and deer grazed on liberated crops. By 1864, southern civilians were rioting for bread and salt, especially in war-ravaged Virginia. And, as in the novel Cold Mountain, desperate Confederate patients deserted from hospitals where they were slowly starving.

Infectious diseases did very well in the Civil War. Early on, the military camp was an impromptu city with little infrastructure for clean water, sewage, or the provision of food. The number one disease category of the war, dysentery and diarrhea, flourished in this setting. Diseases that spread from one breath to another moved easily in the crowded camp conditions, leading to outbreaks of measles, smallpox, and tuberculosis. Malnutrition, especially vitamin C deficiency, meant that wounds did not heal well and gangrene sprouted where arms and legs had been amputated. Diseases were amplified in civilian populations as well, when soldiers disabled by disease took their fevers home with them.

The Civil War was no stranger to the displaced persons who are such a major part of modern civil conflicts, whether in central Africa or southeast Asia. Most prominent in the U.S. case were the tens of thousands of refugee slaves who fled to the Union army in hope of freedom. The men were often put to work by the northern leadership, as menial laborers or, by 1863, as enlisted troops. This often left women and children to fend for themselves, with no income, no food, no clothing or shelter. Estimates of mortality in slave refugee camps range from 25 to 50 percent; victims died of want or of the diseases that spread like fire through this weakened population.

The American Civil War was not without its humanitarian response, led by state, regional, and national sanitary commissions that tried to supply soldiers with adequate food and clothing, and did what they could to help refugees. As is the case in modern warfare, it was impossible to reach some victims due to the persistence of conflict or the deliberate interference of contending forces. The United States Sanitary Commission tried to feed the Union troops imprisoned in Confederate prisoner-of-war camps, but it could not reliably get food into the prisons, as the southern government refused or confiscated the supplies.

Walt Whitman famously noted that the “real war” would never get in the books. He wrote of the “unending, universal mourning-wail of women, parents, orphans,” a marrow of tragedy that ran through American life by the end of the war. Seven score and ten years ago, Abraham Lincoln asked “that these dead shall not have died in vain.”

Margaret Humphreys is the Josiah Charles Trent Professor in the History of Medicine, a professor of history, and a professor of medicine at Duke University. She is the author of Marrow of Tragedy: The Health Crisis of the American Civil War and Intensely Human: The Health of the Black Soldier in the American Civil War, both published by Johns Hopkins.

1 Comment

Filed under American History, Civil War, For Everyone, Gettysburg, Health and Medicine, History, History of Medicine