Today, as the "sociocultural imperialism" of the United States seeks to impose itself on the Central American academic world through the financing of research agendas and the promotion of new academic NGOs and researchers, it is important to read U.S. scholarly production more closely. Indeed, there is a U.S. historiography that goes beyond cultural history, game theory, and the so-called new political history.
Recently, Eric Foner's La historia de la libertad en EE.UU. was published in Spanish by the publisher Península. This book joins the now-classic translations of U.S. Marxist historians such as Howard Zinn (La otra historia de los Estados Unidos, Hondarribia, 1997), which should form part of the library of every Central American historian and of all those who enjoy U.S. historiography.
For those interested, we attach two works by Eric Foner:
Forgotten Step Toward Freedom
Taken from http://www.ericfoner.com/articles/123008nytimes
From The New York Times, December 30, 2008
We Americans live in a society awash in historical celebrations. The last few years have witnessed commemorations of the bicentennial of the Louisiana Purchase (2003) and the 60th anniversary of the end of World War II (2005). Looming on the horizon are the bicentennial of Abraham Lincoln’s birth (2009) and the sesquicentennial of the outbreak of the Civil War (2011). But one significant milestone has gone strangely unnoticed: the 200th anniversary of Jan. 1, 1808, when the importation of slaves into the United States was prohibited.
This neglect stands in striking contrast to the many scholarly and public events in Britain that marked the 2007 bicentennial of that country’s banning of the slave trade. There were historical conferences, museum exhibits, even a high-budget film, “Amazing Grace,” about William Wilberforce, the leader of the parliamentary crusade that resulted in abolition.
What explains this divergence? Throughout the 1780s, the horrors of the Middle Passage were widely publicized on both sides of the Atlantic and by 1792 the British Parliament stood on the verge of banning the trade. But when war broke out with revolutionary France, the idea was shelved. Final prohibition came in 1807 and it proved a major step toward the abolition of slavery in the empire.
The British campaign against the African slave trade not only launched the modern concern for human rights as an international principle, but today offers a usable past for a society increasingly aware of its multiracial character. It remains a historic chapter of which Britons of all origins can be proud.
In the United States, however, slavery not only survived the end of the African trade but embarked on an era of unprecedented expansion. Americans have had to look elsewhere for memories that ameliorate our racial discontents, which helps explain our recent focus on the 19th-century Underground Railroad as an example (widely commemorated and often exaggerated) of blacks and whites working together in a common cause.
Nonetheless, the abolition of the slave trade to the United States is well worth remembering. Only a small fraction (perhaps 5 percent) of the estimated 11 million Africans brought to the New World in the four centuries of the slave trade were destined for the area that became the United States. But in the Colonial era, Southern planters regularly purchased imported slaves, and merchants in New York and New England profited handsomely from the trade.
The American Revolution threw the slave trade and slavery itself into crisis. In the run-up to war, Congress banned the importation of slaves as part of a broader nonimportation policy. During the War of Independence, tens of thousands of slaves escaped to British lines. Many accompanied the British out of the country when peace arrived.
Inspired by the ideals of the Revolution, most of the newly independent American states banned the slave trade. But importation resumed to South Carolina and Georgia, which had been occupied by the British during the war and lost the largest number of slaves.
The slave trade was a major source of disagreement at the Constitutional Convention of 1787. South Carolina’s delegates were determined to protect slavery, and they had a powerful impact on the final document. They originated the three-fifths clause (giving the South extra representation in Congress by counting part of its slave population) and threatened disunion if the slave trade were banned, as other states demanded.
The result was a compromise barring Congress from prohibiting the importation of slaves until 1808. Some Anti-Federalists, as opponents of ratification were called, cited the slave trade clause as a reason why the Constitution should be rejected, claiming it brought shame upon the new nation.
The outbreak of the slave revolution in Haiti in the early 1790s sent shock waves of fear throughout the American South and led to new state laws barring the importation of slaves. But in 1803, as cotton cultivation spread, South Carolina reopened the trade. The Legislature of the newly acquired Louisiana Territory also allowed the importation of slaves. From 1803 to 1808, between 75,000 and 100,000 Africans entered the United States.
By this time, the international slave trade was widely recognized as a crime against humanity. In 1807, Congress prohibited the importation of slaves from abroad, to take effect the next New Year’s Day, the first date allowed by the Constitution.
For years thereafter, free African-Americans celebrated Jan. 1 as an alternative to July 4, when, in their view, patriotic orators hypocritically proclaimed the slave-owning United States a land of liberty.
It is easy to understand, however, why the trade’s abolition appears so anticlimactic. Banning American participation in the slave trade did not end the shipment of Africans to the Western Hemisphere. Some three million more slaves were brought to Brazil and Spanish America before the trade finally ended. With Southerners dominating the federal government for most of the period before the Civil War, enforcement was lax and the smuggling of slaves into the United States continued.
Those who hoped that ending American participation in the slave trade would weaken or destroy slavery were acutely disappointed. In the United States, unlike the West Indies, the slave population grew by natural increase. This was not because American owners were especially humane, but because most of the South lies outside the tropical environment where diseases like yellow fever and malaria exacted a huge toll on whites and blacks alike.
As slavery expanded into the Deep South, a flourishing internal slave trade replaced importation from Africa. Between 1808 and 1860, the economies of older states like Virginia came increasingly to rely on the sale of slaves to the cotton fields of Alabama, Mississippi and Louisiana. But demand far outstripped supply, and the price of slaves rose inexorably, placing ownership outside the reach of poorer Southerners.
Let us imagine that the African slave trade had continued in a legal and open manner well into the 19th century. It is plausible to assume that hundreds of thousands if not millions of Africans would have been brought into the country.
This most likely would have resulted in the “democratization” of slavery as prices fell and more and more whites could afford to purchase slaves, along with a further increase in Southern political power thanks to the Constitution’s three-fifths clause. These were the very reasons advanced by South Carolina’s political leaders when they tried, unsuccessfully, to reopen the African slave trade in the 1850s.
More slaves would also have meant heightened fear of revolt and ever more stringent controls on the slave population. It would have reinforced Southerners’ demands to annex to the United States areas suitable for plantation slavery in the Caribbean and Central America. Had the importation of slaves continued unchecked, the United States could well have become the hemispheric slave-based empire of which many Southerners dreamed.
Jan. 1, 1808, is worth commemorating not only for what it directly accomplished, but for helping to save the United States from a history even more terrible than the Civil War that eventually rid our country of slavery.
Historians Today: Pleasures, Prospects, and Predicaments
Taken from http://www.historians.org/Perspectives/Issues/2000/0001/0001pre1.cfm
By Eric Foner
The presidency of the American Historical Association is the greatest honor that a historian in this country can receive, and I am profoundly grateful to my colleagues for bestowing it upon me. Thanks to the dedication and enlightened leadership of my predecessors, Joseph Miller and Robert Darnton, the AHA enters the new century stronger than ever, well prepared to continue the work of promoting the study of history and the dissemination of historical knowledge so critical to a democratic society.
I assume this office at an exciting time to be a historian. The 1990s have been a decade of unprecedented public interest in history. The History Channel is among the most successful enterprises on cable television, and attendance at historical museums and other sites is at a record high. Works of history (sometimes by professional historians) regularly appear on best-seller lists, and Hollywood, for better or worse, continues to churn out historically oriented films. If my own university is any indication, student interest in history as evidenced by course enrollments has never been greater.
Despite a proliferation of partisan jeremiads lamenting the decline of historical scholarship, I believe that overall, the study of history is in a healthy state. It hardly needs reiteration that the past two generations have witnessed a remarkable expansion of the subject matter of history, as new methods and concerns have vastly expanded the cast of characters included in historical narratives and the methods employed in historical analysis. The professorate itself has changed so that it more fully reflects the composition of our society. In eight years of undergraduate and graduate study at Columbia University in the 1960s, I never encountered a single female or nonwhite teacher. Such an experience would be virtually impossible today.
Of course, recent changes in the study of history have produced their own concerns, about the fragmentation of scholarship, the difficulty of constructing coherent narratives (or whether narrative itself is ultimately a form of fiction), and many other issues. Rather than a sign of weakness, I see today's debates as evidence of the strength of our profession. The study of history is so immense and varied that it is less susceptible than other disciplines to radical swings of outlook or the sudden triumph of new fads. No single method or point of view can ever completely dominate it. The very clash of approaches and interpretations is what gives our enterprise vitality and advances historical understanding. I believe that in the sheer output of works of excellence, the 1990s can compare with any previous period. The rediscovery of the centrality of history in other disciplines (the "new historicism" in literary studies and anthropology, for example) provides further evidence of the vitality of historical scholarship.
Of course, the 1990s also saw history emerge as a political "wedge issue," with public officials and private pressure groups seizing upon developments like the proposed National History Standards or the Enola Gay exhibition at the Smithsonian Institution to score points by blaming "revisionist" historians for many of the ills, real and imagined, of American society. No one was more surprised by being suddenly thrust into the public spotlight than historians themselves. Among other things, these debates revealed that aspects of the study of the past that we take for granted, especially the conviction that the constant search for new perspectives is the lifeblood of historical understanding, are viewed with suspicion by many outside of academe.
Ironically, even as popular interest in history has burgeoned, widespread ignorance flourishes about both historical methodology and historical knowledge. The recent impeachment of President Clinton revealed that large numbers of journalists and political commentators possess an appalling lack of information about basic elements of our constitutional structure and key moments in our national past. I vividly recall that a year ago, as a scholar of the Reconstruction period, I was inundated with calls from journalists who had just discovered that a previous president, Andrew Johnson, had been impeached. Not one possessed any real knowledge of Johnson's presidency, or, more important perhaps, of the era during which he served. If they did harbor thoughts about Reconstruction, they derived these from the old Dunning school, which viewed Johnson as a courageous defender of the Constitution sabotaged by vindictive Radical Republicans bent upon punishing the South after the Civil War, a point of view that has been rejected by most historians for at least three decades. My intention here is not so much to chide journalists and "pundits" for failing to do their homework, but to remind ourselves that we continue to face a daunting task of historical education.
Another serious problem confronting the profession is the rapid growth of part-time employment among historians. Judging by the rising number of listings in Perspectives, the job market has taken a turn for the better of late. Yet part-time and temporary employment continues to proliferate. As president, I hope to devote a considerable part of my energies to investigating this problem and devising ways for the AHA to help combat it. The first step is to gather accurate information about the extent of part-time employment and the working conditions of such historians. The most recent national survey, conducted by the U.S. Department of Education in 1993, found that part-time and adjunct faculty had increased from 22 percent of faculty appointments in 1970 to more than 40 percent. The proportion was far higher (64 percent) at community colleges, but colleges and universities of all sizes and types were also found to rely extensively on part-time and temporary instructors.
Impressionistic evidence suggests that the numbers are even higher today, but facts are hard to come by. Thanks to a Chairman's Grant from William R. Ferris, head of the National Endowment for the Humanities and a devoted friend of historical scholarship, the AHA, the Modern Language Association, and a number of other learned societies are undertaking a national survey that should provide up-to-date data about the extent and working conditions of part-time employment in undergraduate education.
The AHA has also initiated an e-mail survey of part-time and temporary faculty asking them to convey their own experiences, conditions, and frustrations.
I hasten to add that it would be quite wrong to assume that adjuncts are less able scholars and teachers than full-time employees. The point is that we must insist that all historians have a right to work under dignified conditions, with adequate compensation and benefits, a voice in academic decision-making, and decent prospects for promotion.
This issue will come before the AHA Council in January. In a subsequent column, I hope to detail what steps the AHA can take to try to curb the proliferation of part-time employment in the teaching of history and to bring such historians more fully into the life of the profession.
—Eric Foner (Columbia Univ.) is president-elect of the AHA. He will assume the office of president at the AHA Business Meeting on January 8, 2000.
Copyright © American Historical Association