All materials on this page are written by Byron Kho and are his sole property. For permission to quote or a list of citations for each essay, please contact him at

Assorted Essays

Section I:

Brief: Modern Iraqi Affairs (circa 2003)
Struggles in Palestine
Romantic Love and the Individual
Individuality in The Red and The Black
Obstacles to Unity
A Crisis in Europe
Economic Imperialism, Globalization and Threads of Resistance
Colonialism and Power Relations in Houseboy
War and the Destruction of the Politic
Reformations From Within
Social Objectives and the Usage of the Noble Savage
Echoing Empire: Frances Eden's Tigers, Durbars and Kings
Erasmus on Folly
Folk Music and the Economics of Musical Heritage
Section II:

• Thoughts on Blowback
• The Census as Imperialism
• When Corporations Rule The World
• Prospectus on American Resistance to Imperialism
• American Resistance to Imperialism: Public Response to the Spanish–American War
• The Legacy of Maria Callas
• The Everlasting Bohemian
• The Origins of the Scientific Revolution
• Technology Policy and the Economic Development of South Korea and Taiwan
• Russian Music in the Nineteenth and Twentieth Century
• Nationalism and the Protestant Reformation
• Literary Points
Section III:

• Cultures of United States Imperialism Before 1945
• Doukas
• Talking Points
• History of the Middle East: After the 1800s
• International Politics
• Capitalism in Asia
• The Fountainhead
• Clemens on Slavery
• Mismatched
• Life in Art
• A Modernist Tragedy
• Thoreau on Nature

Brief: Modern Iraqi Affairs (circa 2003)

The beginnings of the state of Iraq laid the groundwork for its problems in the modern era. Under a British mandate, Iraq was established out of the provinces of Basra, Baghdad and Mosul. A further settling of Iraqi borders in 1923 decreed that Kuwait was to have most of the coastline territory, effectively preventing Iraq from ever becoming a great naval power in the Gulf. Until very recently, Iraqi policy continually refused to acknowledge Kuwait as a separate state. The British-installed monarchy ruled over the ethnically diverse country until 1958, when an army officer overthrew the government because of its pro-West stance. The new government had to deal with the problems left from the previous era, including the under-representation of the majority Shi’a Muslims, the independence movement of the northern Kurdish population, and what was seen as Western appropriation of Iraqi resources through the Iraq Petroleum Company.

The destitution of 80% of the population led to the popularity of political parties that catered to the lower classes, like the Ba’ath party. American involvement began with monetary support to these dissenting groups as a counter to the regime’s arms buildup and its military threat to oil interests in the area, as shown by its resumption of hostilities with Kuwait. A Ba’athist coup in 1968 led to the solidification of power under Ahmad Hasan al-Bakr and then, in 1979, Saddam Hussein. Their repressive methods saw the gradual removal of the party/state distinction in all areas of Iraqi life and brought stability to Iraqi politics by crushing all opposition, including the Kurdish rebellion and the Shi’a rebellion of 1979, led by the ulama-supported Islamic Call. Wealth and international recognition came to Iraq in 1972 with the nationalization of the Iraq Petroleum Company, whose revenues helped to make industrialization and social welfare schemes a reality. Politically, Iraq aligned with the Soviet Union, signing a 15-year friendship treaty and acquiring arms from it.

Relations with Iran, bad to begin with, were momentarily patched up with the Algiers Agreement, which conceded territory to Iran in return for the discontinuation of Iranian support to the Kurdish rebels. However, the rise of the Ayatollah Khomeini’s universal Islam movement in Iran posed too great a threat to the secular nationalism inherent in the Ba’ath regime. In 1980, the decision to invade Iran was made with the moral and financial support of Kuwait and Saudi Arabia, who felt they too were being threatened. A protracted war followed that began to affect oil shipping in the region. Alarmed at the menace to the Middle Eastern oil supply, the United States reestablished diplomatic relations with Iraq and supplied intelligence, monetary aid and military support to Iraq and Kuwait to bolster the effort against Iran. The war ended in 1988 with an agreed ceasefire. Significantly, the Iraqi Shi’a majority had chosen to side with the Ba’athist regime rather than submit to the religious calling inherent in the Iranian message.

Saddam Hussein’s tenure was marked by the building of a vast personality cult and an ambition to make Iraq the foremost country in the Arab world. After the failure of the Palestinian intifada, he was the only Arab leader to champion the Palestinian cause in the face of growing Israeli power, and he was able to justify his enormous rearmament expenditures by arguing that Israel would only recognize Palestinian rights if Arabs could achieve military parity. At the same time, he began to resume hostilities with Kuwait, for several reasons. The destruction of Basra during the Iran-Iraq war had left Iraq effectively landlocked, and the continual refusal of Kuwait to cede offshore islands to Iraq for a naval base had prompted Iraqi aggression in 1973 and again after 1988. The $60 billion war debt owed to Kuwait was an annoyance, as was Kuwaiti production of oil over OPEC limits, which cut Iraqi oil revenues by billions as oil prices fell. If Iraq were to gain control over Kuwait, the debt would be removed, oil production would be normalized and its economic problems would recede; moreover, the Iraqis mistakenly expected no reprisal from Britain or the United States, which had been their staunch supporters throughout the Iran-Iraq War. The combination of these pressures and benefits led to the Gulf War.

Iraq invaded Kuwait in August of 1990, followed promptly by the beginning of Operation Desert Shield. The United States, alarmed at the danger to oil production in the Persian Gulf, sent a large military force to the Gulf following a formal request from Saudi Arabia. The troops had been ordered to protect Saudi borders and enforce a trade embargo on Iraq, which was supported by the Western nations, the Soviet Union and the Arab League. By November, though this defensive mission had succeeded, President Bush had decided to go to war to protect Kuwaiti oil interests and to guard against the loss of the large and vital Kuwaiti investment in the Western economies. The UN Security Council set a January 15, 1991 deadline for Iraqi withdrawal, which was ignored; American aggression then began under Operation Desert Storm. After air raids quickly eliminated the civilian infrastructure and Iraqi air defenses, Hussein sought to confuse the issue by sending Scud missiles into Israel and Saudi Arabia. During Desert Shield, he had brought attention to the US double standard in its treatment of the Israeli occupation of the West Bank and Gaza Strip versus the Iraqi occupation of Kuwait. With this aggressive move, he again raised the Palestinian issue and aroused Arab resentment by daring the Israelis to retaliate. Had Israel done so, the US’s Arab allies would have been forced to withdraw rather than be caught supporting an alliance with Israel against another Arab nation. US pressure persuaded Israel to stand down. The ground war began on February 24, and Iraqi troops scrambled in retreat back to Basra; US air strikes killed thousands before President Bush controversially declared the war over on February 27, without ensuring the overthrow of Saddam Hussein’s regime.

The aftermath of the war proved disastrous: a southern Shi’a rebellion and a short-lived Kurdish rebellion in the north were quickly crushed. The eventual death toll was horrendous, with 100,000 killed during the war, 6,000 during the southern uprising, and 20,000 during the Kurdish flight into the mountains. Under the terms of the ceasefire agreed upon by Iraq and the UN Security Council, Iraq was required to return Kuwaiti property, pay damage claims, accept newly set boundaries and provide the locations and amounts of all chemical and biological weapons caches to the UN Special Commission on Iraq (UNSCOM). If Iraq complied, economic sanctions would be lifted and Iraq would regain oil revenues. Due to American suspicion that Iraq was not being truthful with UNSCOM, the embargo stayed. Without oil revenues, Iraq was unable to pay reparations or provide humanitarian aid to the Iraqi people. As a concession, an oil-for-food agreement was implemented in 1996, whereby a set amount of oil was sold in order to buy humanitarian supplies. It provided little relief, as the majority of the monies were taken by the UN to pay for reparations, Kurdish supplies and UN operations in Iraq, leaving a remainder too minuscule to provide for the destitute Iraqi population. Civilian infrastructure had been completely ruined, and food and medical supply shortages caused high child mortality, malnutrition and disease, worsened by the inoperability of water purification systems and sewage plants. High inflation and currency devaluation reduced the Iraqi middle class to poverty, and the destruction of industry led to rampant unemployment, which inevitably led to rises in crime and prostitution. The embargo had also isolated the intellectuals of Iraq from outside influences and prevented the introduction of new technologies like computers and the Internet. Meanwhile, Saddam Hussein was able to stay in firm control. The obvious corruption of his family amid such hard times did not make him any more loved, but his ruthlessness in wiping out opposition discouraged all dissent.

The humanitarian disaster in Iraq began to persuade Arab and Western nations that it was time to end sanctions; this left the United States alone in its insistence on retaining them. The American policy of dual containment replaced diplomacy with force. In December 1998, Hussein demanded that sanctions be ended before UNSCOM would be allowed to continue with its weapons search. Almost immediately, Britain and the United States started Operation Desert Fox, bombing the country for three days as a show of might. This did not frighten Hussein; he ordered his air defenses to fire on allied planes, which returned fire. The bombings destroyed the established UNSCOM monitoring system and attracted heated denunciations, terminating the Security Council consensus needed to maintain economic sanctions. Arab allies were no more pleased at supporting what seemed like a greedy superpower. However, this did not stop American and British aggression. Air raids and maintenance of no-fly zones continued into 2001, as did proposals, rejections and counterproposals for sending weapons inspectors into the country. In 2002, Iraq offered several proposals for the entrance of weapons inspectors. These were refused by US officials, but an agreement was finally reached upon the acceptance of a UN proposal. After the discovery of several inconsistencies in Iraqi information, President George W. Bush prepared the United States for war with his January 2003 State of the Union address. In March 2003, during a concentrated US effort to go to war against Iraq, other Security Council members threatened to veto any war resolution the United States brought before the UN. Due to unseemly behavior and misguided attempts at diplomacy, the United States was unable to find sufficient support for the war within the UN and decided to push on alone, declaring war on Iraq on March 19 under Operation Iraqi Freedom. The short war ended in April, when Saddam Hussein’s regime fled Baghdad. Currently, US troops are maintaining order in Iraq until a provisional government can be stabilized and maintain power over all of Iraq.

The Iraqi people still remain mostly destitute and in need of a strong infrastructure that can provide the populace with jobs, social services and food. Under Saddam Hussein’s agrarian reform laws, land was redistributed among the peasant farmers, but agricultural upgrades received no assistance, so production fell while imports increased. The Gulf War and sanctions on exports did not help matters, as Iraq could not pay for food. Social reforms also could not be paid for, as most of these services had been provided free of charge by the social welfare state sustained by oil wealth. The many social construction projects that provided jobs for the people had strong ties to industry and oil export, but the destruction and unrest within Iraq had eliminated the projects and the job prospects of a nation. These problems continued through the 1990s, though the war was long over. To combat these problems, as well as to contain the continuing fragmentation of the various ethnic identities within Iraq, strong leadership must be established in Iraq. In control of these efforts is the US military, which has not been able to find a suitable person to head the country and relieve the US-appointed interim governing council of its duty. The Iraqi people seem jubilant at escaping the tyranny of Saddam Hussein’s rule, but few are happy about the US occupation of the country and its involvement in Iraqi politics, and they would like to rule and express their freedoms for themselves.

Nevertheless, the instability of the situation seems to call for a strong presence in the area, so it would be wise for the United States to stay. As with British practice after World War I, the US presence should be minimized and rule of the country left as much as possible to the Iraqis themselves. Though unfortunate, the current revenge killings of former members of the Hussein regime and his widely feared mukhabarat, or intelligence services, will have to be tolerated to some degree, lest intervention be misperceived as overt antagonism toward Iraqis not involved in actions against Americans. The cruelties of the fedayeen expeditions – Hussein’s paramilitary forces – must also be addressed with some measure of force, but only as a last option. After what the United States has gone through to topple Saddam Hussein’s regime, it would be detrimental to leave before a stable and favorable government could be installed in Iraq. Though US involvement has been foolhardy, starting with its appearance in the Iran-Iraq War, and though it has already been condemned internationally for what was seen as a reckless and unpopular war, the United States would draw harsher criticism should it fail to provide adequate leadership for the Iraqis. US influence should be applied to ensure fair political treatment of the majority Shi’a Muslims, who should also hold a majority of the power. The safest route to containing the various ethnic identities and power struggles would be a coalition government that mixes identities within the cabinet and bureaucracy without allowing for extended political polarization. Again, with both Afghanistan and Somalia in mind, it is important for the United States to establish stable power and sovereignty over the whole country before leaving a fledgling government to manage on its own.

Iraq should be persuaded to accept Kuwaiti sovereignty and borders without having to be issued the dark threat of US retaliation; otherwise, this could inflame anti-American and pro-Arabist sentiment amidst the existing grievances against the ruling families of the Gulf states. Instead, it would be prudent to let Iraqi international relations run their own course while quietly applying pressure to limit arms sales to the new Iraqi government, slowing military rearmament and the reestablishment of guerrilla forces. Iran, as another point of contention, could be a problem for the Iraqi state through the negative influence of pro-Iranian Shiite Muslim groups that could vie for power. However, the repression exercised by the Islamist regime has prompted some recent anti-clerical opinion within Iran, alongside the power struggle between the president and chief ayatollah over religious influence, so the appeal of a pro-Islamist message to the Iraqi population would most probably meet some sort of counter within Iran. Also, the negative impact on public opinion of the continued casualty rate in Iraq – 439 currently – could prompt a reversal in current policy in order to save political careers, but a sufficient deterrent should be awareness of the possible detrimental effects on Iraqi and Middle Eastern political structure and stability. President Bush is already experiencing this pressure, as shown by his declining standings in popularity polls. In any case, a wise policy would be to stand down, but not leave Iraq, to ensure that the transition runs smoothly.

Struggles in Palestine

The increasingly frustrating reality of the Jewish-Arab situation during the British Mandate over Palestine (1922-1948) proved to be fertile ground for passionate discourse on the nature of the co-existence of the two races in the region. From this intensely zealous debate came many forums for political expression in the form of scholarly dissertations, inflamed rhetoric and even novels, such as Amos Oz’s Panther in the Basement and Yahya Yakhlif’s A Lake Beyond the Wind. Both authors lived through those troubled times, as citizens of the state of Israel and of displaced Arab Palestine respectively. Central to this multifaceted conflict was the question of a right to existence - a particularly sensitive question in a time of growing anti-Semitism and political uncertainty. From afar, most of the struggles seemed to center on religious and economic rights; at base, however, it was a fight against political helplessness and for a right to live that was deeply rooted in history.

The quest for survival was not a new one for the Jewish people, whose history was pockmarked with turmoil and retribution. Though they had existed in Palestine for centuries, their numbers were relatively few compared to the Arabs, who made up the bulk of the region’s population. The first large-scale immigration to Palestine came in 1882, after the Russian state-sponsored pogroms, or massacres, drove many Jews to safer environs. These waves of immigrants would continue well after the establishment of the Israeli state in 1948, and they were to be the source of much of the friction between the Arab and Jewish peoples. In 1906, the Zionist Congress declared Palestine to be the new Jewish homeland, and just three years later, Tel Aviv was founded to house the immigrant population that was overcrowding nearby Jaffa.

The involvement of the Allies, and Britain in particular, complicated the already tense situation. The end of World War I signaled a general redistribution of the Middle East beyond the previous allocations by the colonial powers and the Ottoman Empire. Palestine became a wild card when the Sykes-Picot agreement of 1916 left it to international administration. During the same year, the British had enlisted the support of the powerful emir of Mecca in return for the subsequent establishment of an Arab state in and around the area; the Arabs felt betrayed when the British publicly announced their support for ”a national home for the Jewish people” in the Balfour Declaration of 1917.

It wasn’t until 1922 that the Palestine question was settled. The Mandate for Palestine, issued by the League of Nations, provided the British with full administrative powers over the Palestinian territory to move it toward self-government and to protect the civil and religious rights of all the inhabitants of Palestine, irrespective of race and religion. The agreement allotted 75% of the area to a new country, Transjordan, and the rest to what was to become the state of Israel, in recognition of the ”historical connection of the Jewish people with Palestine and to the grounds for reconstituting their national home in that country.”

Very rapidly, relations between the Arab and Jewish peoples began to break down. As more Jewish immigrants flooded into Palestine, the Arab population grew disgruntled and fearful of the growing Zionist influence. Racial troubles began in earnest with the 1929 anti-Jewish riots in Hebron, where 133 Jews were killed before the riot was crushed by British forces; the riots were rooted in Arab mistrust of the Jewish influx. To the further chagrin of the Arab community, the British paid little attention to the founding of a Jewish defense organization, the Haganah, which was quickly integrated into the Jewish political structure. The Arabs lost trust in the British administration after the repudiation of the Passfield White Paper in 1931. This document had supported the setting aside of lands for the many Arabs displaced from their homes by Jewish settlement. To regain some of their rights and their land, an Arab High Committee was founded in 1936, and a general strike was called against Jewish products. This led to increased violence on both sides, culminating in the dismissal of the strike after a thousand Arabs had been killed.

An analysis of the Jewish-Arab situation led the Peel Commission of 1937 to conclude that the Mandate was no longer workable and that cooperation was impossible. It suggested more direct control of Jewish immigration. Upon publication, Arab violence began again and eventually forced the authorities to actually limit immigration. As World War II started and the Holocaust began the systematic murder of Jews in Europe, the harsh enforcement of British decrees against immigration quickly alienated the Jewish people in Palestine, many of whom had family members who were unable to enter the area and were deported back to Europe and almost certain death. Jewish protest led to the British decision to begin withdrawal in 1947.

In Panther in the Basement, Amos Oz describes a Jerusalem during the final months of the British occupation under the Mandate. As residents prepare for the establishment of the first Jewish state and expected animosity from their Arab neighbors, fear is rampant and the enemy - both the Arabs and the British - is roundly hated. The community is composed mainly of refugees: ”not called ’refugees’, nor were they termed ’pioneers’ or ’citizens’: they were described as the ’organized community’” (18). This organized community is united in a common front by their hatred of the ”bloodthirsty Arabs” and the British, whose actions ”were casting a deep shadow, and the Hebrew nation was called upon to withstand the test” (5). It was understood by all that ”the excuses for hatred change, but the hatred itself continues forever” (20). To Oz’s protagonist Proffi - a little boy indoctrinated with the ideals of Eretz Israel - the reasoning that the Jews are hated because ”we have always been right” does not seem to make being right a worthwhile option, for, as he realizes, ”because we are the few and we are in the right... surrounded on all sides and without a friend in the world” (106). His parents do not clarify the matter much, for his mother tells him that ”we should try not to hate.” His father, the objective one, tells him with authority that ”we must not be weak. To be weak is a sin” (20).

It is this attitude of pride and strength in the face of adversity that attests both to the considerable achievement of Israel’s survival and to its inability to come to a qualified peace. Like his countrymen, Proffi felt it was his duty to be a part of the struggle and to show a strong face to his enemies: ”from now on a new age will start: the age of the panther” (20). He was meek no longer; his dreams were filled with ”the possibility of a coordinated lightning strike on the British naval bases...” (27). As founder and second-in-command of the Freedom or Death organization, he took on the juvenile responsibility of ensuring the survival of the Jewish state through imaginary battles. It is interesting, then, that his main targets - like the main targets of the Underground and his father’s radical slogans - were the British rather than the Arabs, their discontented neighbors.

However, he comes to encounter a deep problem central to the situation in which his entire nation is caught up. With a certain naivete and innocence, plus a liberal dose of kindness, he strikes up a friendship with a British soldier who escorts him home after he is caught outside after curfew. For this, he is punished by his father. Much more painful to him is the reaction of his friends after he is caught continuing this friendship, with weekly meetings in a cafe to learn English. He excuses himself: ”by doing this I would be a thousand times more useful to the Underground...henceforth I was a spy” (43). He styles himself after Tyrone Power in the imaginary movie ”Panther in the Basement” - a man able to disappear into the fog, continually changing his identity. At his trial, Chita Reznik hysterically refers to Proffi as a ”low-down traitor and a liar” (66). Ben-Hur, the compassionate but ultimately objective general of this ragtag army, calls Chita’s behaviour that of a ”little Nazi” (66). In a parallel with the ideological positions of the ’grownups’, Chita is the militant, while Ben-Hur represents the majority that seeks to distance itself from the cruelties of the militant without subjecting itself to the weakness of those who cannot hate. Ben-Hur goes on: ”it’s because you love the enemy. [It] is the height of treachery” (69).

Proffi comes to a genuine impasse here, as his mother has told him that ”anyone who loves isn’t a traitor” (2). As the novel closes, Proffi begins to understand that this is impossible for Israel. To love at such a level would mean a sacrifice of the body for an abstract ideal; the idea of survival cannot co-exist with such a heretic doctrine. He realizes that this everlasting suspicion will remain for Israel: ”the opposite of what has happened is what might have happened if it weren’t for lies and fear” (147). He doesn’t resent his role as friend and listener, or spy, for he thinks that ”there are other secrets in the world apart from liberating the Homeland... were they perhaps really brothers who were pretending, for some reason of their own, to be strangers and enemies? One must observe and keep quiet” (142). There is hope, he seems to say, that the Jewish people and their enemies will stop pretending, and live as brothers.

On the final day of the Palestine Mandate, May 14, 1948, Israel proclaimed its independence: ”we hereby proclaim the establishment of the Jewish State in Palestine to be called ’Medinat Israel’... [we are] open to the immigration of Jews from all the countries of their dispersion.” Since April 1948, the Israelis had utilized Plan D, which allowed field officers to conquer and level Arab settlements within the Israeli state to protect its borders. This announcement of superiority and their groundbreaking independence sparked the First Arab-Israeli War, which began on May 15, 1948. The Israeli forces won an upset over the larger Arab army, made up of units from Egypt, Transjordan, Syria, Lebanon and Iraq. Amidst the violence, at least 600,000 Arabs fled Palestine for neighboring countries, ending the Arab majority in Israel.

Yahya Yakhlif’s A Lake Beyond the Wind is set during this war. He portrays Samakh, a town fearfully awaiting the arrival of the war: ”there was nothing to fill the space of the small town except anxiety; nothing, any more, to evoke a sense of security” (1). As opposed to the people of Oz’s Jerusalem, the Arabs of Samakh focus their hatred on the Jews and not on the British, who were withdrawing from the area – though they were roundly cursed for bringing reinforcements to the Jewish armies. Their war, too, hinged on survival, but with a muted sense that their existence depended on it. ”Scores of men were going east...enlisting in the Arab Liberation Army, seeking out arms and khaki uniforms, dreaming of heroism and courage and medals” (7). They looked to regain and liberate the lands that they had lost.

Many of the characters - like Haj Mahmoud, leader of some of the riots during the 1936 general strike; Abd al-Karim, a world-weary shopowner; and Mansour, a ticket seller who takes great delight in observation - seem to view the entire war with a sense of grimness, as if expecting defeat at the hands of their enemies. ”Tiberias fell after bitter fighting... there was no break in the wailing and crying... Mansour nodded sadly, as if knowing that it would be Samakh’s turn next” (179-180). For these people, losing their homes, their livelihoods and their world, each retreat was another blow to their heroic stances, dealt by the pervasive force of the Jewish onslaught. Yakhlif’s outlook is depressing: ”I realized then that everything had been lost, and that all paths led to exile and dispersion. Such a melancholy prospect. Such a lonely road” (214). His view does not allow for the possibility of a reconciliation – his creative image of the Arab nation is one that has given in and must attempt its existence elsewhere, alone and without friends. It is poignant, then, that from this same loneliness the Jewish people draw the strength to continue their fight.

However hopeful or dejected these novels may be, there is no escaping the utter impasse at which the two cultures have stood since early in the 20th century. To the fearful Jew, there are wolves a-plenty, even among their allies and protectors; to the displaced Arab, there is only betrayal. With this sense of mistrust – toward the Arabs who would kill Proffi’s relatives, and toward the Jews who would shoot Abd al-Karim’s friends – there is no answer but death, and the mounting frustrations have found outlets only in bloody riots, of which a character in A Lake Beyond the Wind says: ”there are dark days ahead” (1). There is always the possibility that things will change. As Proffi realizes as he approaches the time of his trial for treason in Panther in the Basement, ”maybe the natural laws of our own time were also temporary laws, and would soon be replaced by new ones?” (62).

Romantic Love and the Individual

The idea of a romantic love existing between individuals presupposes the existence of art within people. Art, by its nature, draws heavily on the emotional life of the individual and demands that the inner aspects of the soul be revealed and given over to the medium in which they will be expressed – whether the canvas, the screen, or physical love. Romantic love sought to communicate art through a deeper expression, made possible through greatly idealized sex. Sexuality had become more important; beauty and love were interpreted mainly through the erotic experience. The convergence of two bodies marked the emotional union of two souls, and thus the communication of art. However, the idea that a union was needed to transmit art lessened the need for individuality. With the rise of romantic love, courtship based on individual traits lost its place in social interactions. No longer did the beloved have to be good or beautiful, for the emphasis on emotion had emancipated people from this age-old definition of the desirable. What was desired was the unknown of eroticism: a mixture of emotional dependence, a desire for physical and emotional union, and an increased empathy for the target of one’s affections.

Romantic love, with its basis in eroticism, found many different idealized forms, from the innocent and pious virgin that von Innstetten desires in Effie Briest to the reckless femme fatale that Freud’s Dora wishes to be. The relationship of Effie and von Innstetten left much to be desired; von Innstetten truly believed in the romanticism of their dynamic and believed that his idea of a woman could be brought about within Effie. However, his view of the romantic life clashed with hers. In one scene, Effie tells him that he is ”an affectionate... just don’t want to show it, you think it’s not proper...” (89). Later, she asks him if he was ”aware that’s what I’ve always wanted to be. We have to be seductive, otherwise we are nothing” (90). Even Effie’s father notices: ”[she feels] a love of pleasure and ambition...her ambition will be satisfied [by his ambition], but what about her desire for fun...he won’t be much fun...and the worst of it is, he won’t even seriously address the problem” (28–9). For her, love needed to be reckless and inspired, and so she found outlets for it, including an affair with Crampas and a secretive reserve under which she hid her turbulent inner feelings.

The reactions to Effi's desertion remarkably illustrate von Innstetten's capitulation to the old social standards and her championing of Romantic love. His decision to shoot her lover comes from his fear of being a social outcast: "...going against it is unacceptable; society despises you for it, and in the end, you despise yourself...I've no choice" (173). In the end, he could not pretend to be a Romantic any longer; his ambition, his pride, and his unwillingness to sacrifice himself on any level to Effi meant that he had failed to achieve any emotional transcendence with her. Effi is angered by this. She believes in freedom of expression: "You don't have to be allowed. Honour, honour, honour...And me to blame...I'm disgusted at the thought of what I did; but I'm even more disgusted when I think of how virtuous you both are" (202). Later, she understands: "Everything he did was right...there was a lot of good in his nature, and he was as noble as anyone can be who lacks the real capacity for love" (216). After Effi's death, Mrs. Briest reflects that "I wonder if perhaps she wasn't too young" (217). Effi's inability to share her emotional experience with her husband was key to the destruction of their marriage – it opened her eyes to his dependence on his ambition rather than on her. Their love failed; she wanted their marriage to be a perfect union while he merely wanted her to increase his own standing as an individual in society.

Sigmund Freud, in his preface to Dora, acknowledges that sex is a strong driving force in mental analyses, but reminds his readers that "[it is] the spirit of the age, owing to which we have reached a happy state of things in which no serious book can any longer be sure of its existence" (Dora 3–4). In his descriptions, the sexual drive plays a large role in the development of romantic relationships throughout the life of a person – in this case, a young girl. In the complex mental associations of a pubescent mind, many preconceptions are developed and left to affect the emotional expression of the developing child. In Dora's case, her relationships devolved into a perverted expression of her inner desires. Her romantic attraction to Herr K was masked and transferred into a feeling of possessiveness for her father. Her jealousy was directed first at her mother, who competed with her for her father's attentions, and then at Frau K, who was his lover and had been the homosexual object of Dora's affections in the past.

She wanted to be loved, and thus came her desire to be a different kind of Romantic ideal, the femme fatale. "The dream confirms...that you are summoning up your old love for your father in order to protect yourself against your love for Herr K" (62). She is afraid of accepting her desires, because "not only [are you] afraid of Herr K, [you] are still more afraid of yourself, and of the temptation that you feel to yield to him...these efforts prove...how deeply you loved him" (62). Her love for Herr K had no basis in his individual traits; she merely sees the sexual void that he experiences and so desperately searches for in her: "so you are ready to give Herr K. what his wife withholds from him" (52). Her frantic displays of illness at home during her father's absences can be read as a longing for love and attention from her beloved – which, again, is her father in lieu of Herr K. "Her sympathetic imitation of her father [by taking over his cough as her own]...showed itself capable of representing her relations with Herr K; it could express her regret at his absence and her wish to make him a better wife" (75). The cough which she had taken on was another significant realization of her quest for union: the irritation of her throat "concerned a part of the body which in Dora had to a high degree retained its significance as an erotogenic zone" (74). In this psychological analysis, the trappings of love concerned only sexuality and inner erotic desires – a perfect fit to the Romantic definition of love.

Neither of these two literary works describes an ideal Romantic relationship; both, however, illustrate the hardships associated with developing romantic love. A successful, though physically unconsummated, love occurs in Thomas Mann's Death in Venice. The homosexual Aschenbach is able to observe the child Tadzio from afar and admire him; Mann's words evoke an animal sexuality and tension that perfectly describe the erotic experience. "He was more beautiful than any words could say, and [he] felt, as so often before, that language can only praise, but not reproduce, the beauty that appeals to the senses" (Mann 340). Here, Aschenbach is able to worship from afar, and even to experience a quasi-emotional union with the child: "Tadzio smiled; smiled at him; his lips slowly opened into an eloquent, intimate, charming and candid smile. It was the smile of Narcissus leaning over the mirroring water..." (340). The expression of such a wonderful emotion, likening the receiver of the smile to the smiler himself, is an incredibly deep communication of the desires of the child's soul, and of Aschenbach's as well. In this relationship, built on observances from afar, individuality is not even brought into question; both understand that a further relationship cannot be – it would ruin the apex of romantic love that they had reached. Aschenbach finally finds this out for himself: "Beauty alone is divine and visible at once, and so it is the path of the...spirit...can [he] ever find wisdom and true manhood if he takes the path of the senses to reach the spiritual?...passion is our exaltation, and our longing must remain love – that is our bliss and our shame" (363).

For a Romantic relationship to succeed, both partners must be emotionally dependent on each other to the same degree, and the erotic experience must be shared; obviously, Effi and von Innstetten were never able to conjure up that intimacy and never progressed beyond feeling that the other was merely good. Individuality and the importance of the self were disciplined and diminished – devotion to the other was instead the common characteristic. Dora was hysterical, and her dependence on love and emotional involvement displayed itself as a range of symptoms that she used to control her relationships and erotic desires. The expression of art from a common intimacy came from a feeling that the great unknowns present in the act of physical love could bring emotional satisfaction – to achieve union, both partners must be willing to bare their sexual desires to each other. For Aschenbach, this is easy enough; by watching and being aware of his passion, he has achieved Romantic love – the knowing smile given him by Tadzio only confirms that his passion is received and reflected. The idea of romantic love relied on a heightened sense of mutual eroticism and sexuality, not on an individualist philosophy.

Individuality in The Red and The Black

The response of society to individuality in the years after Napoleon was extremely limiting. The French Revolution and later Napoleon had turned previous social dogmas into more egalitarian systems, which destroyed the self-definition of many of the wealthy upper class. Those who merely opposed the new dogma were also turned into scapegoats; these became an embittered middle class that had scores to settle with those in power. With release from such a contradictory system, people were allowed to recover a sense of identity, which meant a return to the old trappings of money and power. This quest to be more than just a Revolutionary involved suppressing Napoleonism in all its guises – the social caste system became more defined, presumptuousness from the lower classes was frowned upon, and hero-worship became illegal. In a race to regain the impression of wealth, society tried desperately to hold on to its presumptions of class and turned a blind eye to the spirit of the "individual" inherent in the old Revolutionaries and Napoleonists. This usually involved a pretense of sophisticated urbanity and education, with a keen sense of superiority and snobbery and no allowance for deviation.

Stendhal's hero, Julien Sorel, in The Red and the Black, aimed to enter such a society. Starting at such a comparatively low level, he would have to work his way into hypocritical social circles by seduction, a method he carefully learned from Napoleon's example. The only pathway open to him was through the clergy, an institution reactionary toward the army, the monarchy and the social caste system – the Black, as opposed to the Red, of the title. Napoleon himself entered the military and achieved his ambitions through military victory, winning over Europe within a few short years. Julien understood the driving ambition that took Napoleon from obscurity to virtual rule of France, as "Julien had never let an hour of his life pass without telling himself that Bonaparte, an obscure lieutenant without fortune, had made himself master of the globe with his sword." However, he understood that "when Bonaparte first made a name for himself...military prowess was necessary and in fashion" (26). For his own career, "the army [meant] all my ambition, the priesthood – that fine profession which opens all doors" (22). In resolving to be a priest, he took into account the difference in social mores and thus allowed himself the chance to enter high society, something a poor peasant's son might never hope to achieve.

Napoleon was not only a great military strategist but a skilled propagandist as well. His stranglehold over information within France, and over news flowing back to France from the battlefield, was legendary; only what he wanted published was published. This stranglehold also covered the theater and public education. Bonaparte once famously remarked that "education must impart the same knowledge to all individuals living in the same society." He was also skilled at shaping human nature to his needs – the Pope agreed to come and crown him at his coronation ceremony in 1804, allowing Napoleon to enhance his position further by symbolically taking the crown from the Pope and placing it on his own head. One of his greatest public-relations moves was to commission great paintings of himself. In David's brilliant renderings of the Emperor, he is displayed as an authority above the Pope during the coronation scene, as a hard-working statesman and diplomat in his study at the Tuileries, and as a conqueror in Hannibal's footsteps on the Alps. For Julien Sorel, these were large footsteps to fill.

Like Napoleon, Julien was very well prepared. He had no illusions about what he had to do: "to win over old Father Chelan, on whom...his future depended, he had learnt off by heart the whole...New Testament in Latin" (22). He planned out the different personas he would present to others: to the priests, a devoted scholar; to the Mayor, a gifted tutor; to Mme. de Renal, a sensitive lover. Had he commissioned a painting, he would have solidified his image as Napoleon did. He took advantage of any situation and ably exploited those around him. His control over the people in his life was almost complete. Once he had worked his charm, all whom he encountered endowed him with trust, love and devotion.

However, his ability to change his nature to suit others implied that he really had no individuality. He seemed to be whatever everyone else needed him to be. Even his goals were transparent and hypocritical: though he scorned society, he still tried to gain power and status in a society that worshipped money with a herd mentality. His drive was relentless but plagued with doubt. He continuously and strenuously compared his love affairs and social maneuverings to military advances; his idol, Napoleon, once said that to "live defeated and without glory is to die every day." To Napoleon's generation, that martial rhetoric was revolutionary and radical; to later generations, it was merely a social disruption, and not allowed. The very fact that Julien was trying to follow in Napoleon's footsteps increased the folly of his ambition – in aiming to become a successful individual, he was merely taking over someone else's role.

The individual was now defined less by enthusiasm and feverish ideas than by the appearance of having ideas – and even that appearance was more or less scorned. The French Revolution had shown people the folly of having ideas. Liberty had failed, and a new kind of repression was back in style. The new man, in France, was one who was bored. According to Julien's friend Prince Korasov, "looking miserable is never in good taste: looking bored is the done thing. If you're miserable, there must be something...[you are] showing yourself to be inferior. If you're [bored, it]'s what tried in vain to please you that is inferior" (407–8). The narrator of The Red and the Black made it even clearer: "since the fall of Napoleon, any appearance of gallantry had been strictly banned from provincial mores. People are afraid of being deprived of office...hypocrisy has made great strides even among the liberal classes. Boredom has become acute" (47).

Julien took this new lesson to heart. He had yet another face to show to society: that of the bored fop. His use of this device in his love affair with Mlle. de la Mole ruined her and made her bitter. She had "ceased to be bored over the past two months...Julien had lost his chief...this moment in life which gives some tender illusions...she was at the mercy of the most bitter reflections." She knew that, to him, "her whole existence was staked on the toss of a coin" (358). However popular this facade was with the rest of society, his Napoleonic ambition could not withstand such easily visible insincerity.

Julien's empire, like Napoleon's, was built on complete control and total victory: at the first sign of weakness, both were liable to crumble and fall. For Napoleon, this meant losing control over his ministers, then the countryside, then Spain, and finally capitulating to the combined might of Austria, Prussia and Russia. For Julien, it meant losing control over Mlle. de la Mole, the Marquis, and his own goals, and finally capitulating to the court after the shooting of Mme. de Renal. Not only does Mlle. de la Mole realize that Julien's insincerity can ruin them, but his plans for future success are thwarted by his inability to stay firmly in a persona. Mme. de Fervaques was distressed by the fact that "M. Sorel isn't a proper priest! One could admit him to some sort of intimacy, but with that cross, and the almost bourgeois suit he wears, one lays oneself open to cruel questions" (432). His military bearing was not enough to disguise his strange and nervous love affair with Mlle. de la Mole, leading to its discovery by the Marquis, who was "beside himself with rage" (450).

The moment that most revealed Julien's loss of control over himself occurred when he supported a counter-revolution, knowing full well that putting more authority into the hands of the nobles and priests would fully undermine the work of his idol, Napoleon. He understood exactly what was going on, but nevertheless allowed his social ambition to overcome the principles that had been the primary support for his work. He had, by his own hand, invalidated Napoleon's primary contribution to French society and thus made himself a pure tool. He had lost his philosophy, and thus had no more self-definition in a world where he needed it to rise above the bland masses.

The individual in the nineteenth century was defined by a total anti-Napoleonism. Society was encouraged to reject personality and spirit in favor of an encompassing hypocrisy and a return to all the old ways. History's course in the post-Napoleon years was to remove fully any effect he had made on Europe; even Julien Sorel, who wanted to succeed as Napoleon did, was forced to reject Napoleon in order to achieve anything within this new society. In doing so, he lost whatever individuality he had. Just as society held on desperately to its pretensions, Julien clung to his vision of himself – a vision that, pathetically, had never "looked so poetic as at the moment [his head] was about to fall" (527).

Obstacles to Unity

Until the last decade of the twentieth century, there seemed to be little evidence of a common integrated Europe, though this had been an omnipresent goal since the end of World War Two. Many of the obstacles had to do with differences in political agendas. The polarizing suspicion – and reality – of attempts at global hegemony and domination by each of the two superpowers, America and the USSR, colored European relations and aided in the estrangement of populations during the Cold War and afterwards. Even within each sphere of influence, there were problems with unity. In Western Europe, the internal political agendas of nations frequently got in the way of proposals and partnerships for the common good. The Communist satellite states encountered increasing degrees of friction with the USSR as they attempted to apply socialism in their own countries; by the time of the breakup of the Soviet Union, many of the Eastern European nations were still floundering for adequate governance. However, most of these obstacles were overcome by the 1990s, a decade which saw the fall of the Berlin Wall, the reuniting of Germany, the founding of the European Union and growing stability after the fall of Communism.

The Cold War, which had started around the end of the Second World War, paralyzed the Western collective conscience in its symbolic battle of good vs. evil. The forces of democracy, and to a lesser extent of monarchy, attempted to halt the spread of virulent Communism, and the Russian behemoth attempted to return the favor. By the Seventies, as the age of detente began (with perestroika to follow in the Eighties), there was a general relaxation of the strict separations invoked during the Cold War. However, the ever-present doubt within both governments of the good intentions of the other provoked a continuous vacillation in position. The unreliability of superpower foreign policy – particularly American reversals from detente to containment and back again – frustrated much of Western Europe, which had already suffered losses in influence due to the overriding importance of the war against Soviet expansion. Catastrophes like the Vietnam War not only outraged European opinion but also led to a general questioning of the Truman Doctrine and the good intentions of democracy; many Europeans had come to accept the need for post-Atlanticism – an independent Europe without America.

Indeed, many had lost respect for the American model of freedom through democracy. Communism, meanwhile, was beginning to lose its negative image with the relative success of de-Stalinization and Eurocommunism within Western Europe. Increasingly, the new political aims of the younger generation were coming through in shows of terrorism and violent revolution all over Western and Eastern Europe. The world did not seem to be such a dangerous place any more. The ideologies of communism and liberalism could no longer unify entire populations; in fact, it was only the excesses of some of these revolutionary groups that prevented general acceptance of the revolutionists' modifications to political life, as in Italy, Germany, Spain and Northern Ireland. After all, dedication to either system had not prevented the economic disasters that characterized the Seventies: the energy crisis, ever-increasing inflation and extremely high unemployment. Thus, the wandering intellectual atmosphere became a great barrier to the effectiveness of European integration.

When the Communist system finally started to lose its stranglehold on its monolithic identity, many of the Eastern European satellites encountered problems of their own in trying to establish a national presence amidst crises in ideology and power politics. Detente and perestroika allowed for a legal re-evaluation of Communist theory; this gave the republics not only breathing room when deciding policy, but an atmosphere in which political discussion could be maintained at a dangerously high level. This began to erode the stability of rule from afar, especially after de-Stalinization, the first occurrence of the party recanting its own line. In places like Poland, the additional political and national pride in maintaining separate Communist ideologies increased tensions by being overly oppressive without the full backing of the central state apparatus. The main problem was that the entire structure had failed to promote the idea of one unified, multicultural state to its populations. Aggravated nationalistic tendencies, combined with such an accommodating atmosphere for political dissension, meant that calls for freedom and independence would only increase – which they did, in often violent ways. The Prague Spring marked a defining moment in the clash of freedom and Russian authority; though it failed, it outraged both the world and members of the Communist movement, allowing for splintering and disunity among the ranks. Into the late 1980's, Russia showed great evidence of a collapse in command: violent religious outbursts and independence movements in the satellite states could not be stopped, and it was only a matter of time before these states would break away, striking one more blow at European unity.

In order to salvage Western Europe from "Eurosclerosis," or permanent economic decline, various political leaders lobbied for the creation of a true common market within the confines of the European Economic Community, which had been in place since 1957. Though a "common market" had existed on some level since 1957, there was a general disunity in action, promoted by national pride. The reduction and elimination of tariffs, the overhaul of banking and taxation systems, and the creation of a common currency worried many, most notably Kohl in Germany and Thatcher in Britain. Britain itself, a major economic power important to the success of the EEC, had not been a member until 1973. France, worried about British challenges to French sovereignty within Europe, had continually vetoed Britain's inclusion in an organization that was attempting to truly integrate Europe and provide a better fate for each of the countries involved. More aggravatingly, differences in agricultural and cultural standards and production had made alignment of common economic goals extremely difficult. The process toward actual integration did not start until fears of an imminent European decline were publicized in the mid-1980's, giving willing politicians the political leverage with which to rework existing trade and taxation systems completely. Under the Single European Act, finally ratified in 1987, a fully "common market" would be instituted by 1992.

These obstacles, though destructive to general European integration, were overcome in the 1990s by political reconciliation. The vacillations of American and Russian superpower hegemony became much less dangerous to European stability as events like the fall of the Berlin Wall and the reunification of liberal and communist Germany allowed the public to see that the superpowers could agree upon a greater good. Renewed American support for liberalism and for the independence movements growing across Europe, as well as continued maintenance of "neutralism" in Germany, did much to improve the position of the US in the eyes of the visionaries leading political currents in Europe. With the coming advent of the European Union's common market in 1992 and the American willingness to adapt to and learn the new economic models, the success and acceptance of the new economic system was ensured, and another obstacle to integration surmounted. Finally, the continuous clash of liberalism and communism was dissipated by the collapse of the USSR and its dissolution into Russia and the independent satellite states. The difficult transition period for Eastern Europe was eased by successful examples of nation-building in other states and by broad support overseas for the stubborn independence movements that were able to remove dictators and oppressive governments, like Romania's Ceausescu. Across Europe, integration became easier because of the similarity of the problems faced; most countries were more or less liberalizing their rule. Even though problems continue to plague European nations, they tend to be more individualized, as the greater trend is toward cooperation within Europe, as shown by the expansion of the European Union this year and greater involvement in NATO and other cooperative agencies.

A Crisis in Europe

The decade after the end of the Great War was characterized not only by desperate attempts to stabilize the peace of Europe by the Allies and moderate elements in Germany, but by a deepening crisis – apparent in the frequent outbursts of radical sentiment and rebellion against the culture of the old. World War I had engendered a growing sense of despair in the state of mankind. Both the tattered civilian population and the army corps suffered a great disillusionment in the ability of European society to undo its wrongs and justify the decadence that had led to total war and what was seen as the decay of civilization. In response, popular movements took charge over archaic structures in all areas of human interaction. The subsequent disorder in transition created a distinct imbalance – between debtor and creditor nations, between victor and vanquished, as well as between supposed allies – that required resolutions not forthcoming until later decades.

Before the Great War, European citizens had taken it for granted that their nations and culture were at the height of civilization; afterwards, many were not so sure. Paul Valery noted in his essay "Crisis of the Mind" that although European domination had been the case for many centuries, this was actually "an extraordinary upset in equilibrium" which would "allow us to foresee a gradual change in the opposite direction" (Valery). However, it was not the gradual disappearance of European technical mastery that most pointed toward this end, but the general social apathy and mistrust in politics generated by the prolonging of such a senseless war. Many mourned the excessive waste of life during the war: "[Abraham] would not [offer the Ram of Pride], but slew his son, and half the seed of Europe, one by one" (Owen, "Parable"). Wilfred Owen, in another war poem, commented on the "old Lie" of dulce et decorum est, the patriotic slogan broadcast to the British under the governments of Herbert Asquith and later David Lloyd George (Owen, "Dulce"). On both sides of the war, people noted that the reasons for war had depended less on their needs than on foolish political games: "I see the keenest brains of the world invent weapons and words to make [the war] yet more refined and enduring" (Remarque). Perhaps it was more bitter for the soldiers who returned home to a population that did not understand the reasons for such a war; it was akin to a betrayal.

The Russian population had already responded to the inadequacy of monarchist excuses for continuing the war: a violent revolution provided them with an outlet to address the national losses in manpower, resources and land. Finding their provisional government unsuitable, Russians wholly turned to communism under the leadership of Lenin. Under the Bolsheviks, Russia was able to negotiate peace and appease its people using a philosophy of class equality. However, this growth in Russian stability created a critical tension in relations with other European states, their supposed allies. In 1920, the Communist International, which regulated Russian policy and that of communist parties in other nations, proclaimed that its task was to "liberate the working people of the entire world" (Gilbert 184). As the power of most Western European states was based on their colonial empires, this attempt to inspire and prolong native independence movements presented a serious attack on their positions. This was not an idle threat either: communist revolutions had broken out in parts of Eastern Europe and Germany, as many saw liberalism and democracy as failing forces. Though Russia was to abandon world revolution in favor of "socialism within one country," Russian influence was undeniable, and this conflict with Western systems would build in intensity until its resolution at the end of the Cold War, seven decades later (Gilbert 220).

The Paris Peace Conference introduced Western efforts to deal with the influence of Soviet Russia as well as with intransigent Germany. Its goals were two-fold: to prevent Germany from waging war again, and to establish "a cordon sanitaire which would separate Bolshevik Russia from the democratic states" (Gilbert 185). The settlement very frequently adopted stances that seemed not only spiteful and unfair, but which opposed the Wilsonian principles of national self-determination and open diplomacy (Keylor 481). The indignation of the American and British press at the veil of secrecy over the proceedings probably increased the overall negative effect of the conference. Often, secret negotiations came to light in embarrassing public episodes that in the end contributed to the treaty's rejection by the American Senate (Del Gallo). Though the diplomatic arrangements made during the conference evidently contradicted the moral basis on which the Western powers had entered and continued the war, they represented a necessary compromise, given the rising conflict between British and French aims. The British wanted to limit the terms of the treaty to prevent Germany from siding with Russia, but they had to make concessions to the French, who demanded harsher measures on account of their own economic hardship (Gilbert 168).

Internationally, the treaties appeared unjust, as they seemed to punish civilians more than they did their belligerent governments. To the Germans, they appeared even more so, for the high reparations, the limitation of army size, and the permanent demilitarization of the Rhineland and its occupation would extend crippling blows to an already stagnant economy. German anger was aimed at the Allies and at the Weimar government, which had signed the treaties on behalf of the German people; it became manifest in unrest and in assassinations of important German politicians involved in fulfilling the treaties' terms. Putsches attempted to wrest control of the government in favor of obstructionist policies, as many felt the international community would not be able to enforce any of the demands of the Versailles treaty (Gilbert 187). Though the military was successful in putting down the rebellions, the possibility still existed that enforcement of treaty demands would not be forthcoming, as Britain and France were feuding and most nations did not take Wilson's League of Nations seriously as a deterrent.

Germany continued a policy of fulfillment of the terms of the Versailles treaty under the chancellorship of Gustav Stresemann. A practical moderate, Stresemann was able to please the Allies while dealing with the recalcitrant German population, enabling him to negotiate brilliantly on Germany's behalf at Locarno in 1925. In return for agreeing to keep the western boundaries permanent, Stresemann was able to advance the Allied withdrawal date, further build up the army against treaty limitations, and win support for establishing eastern boundaries in Germany's favor (Thimme 77). However beneficial the Locarno treaty was for the political and military strength of the nation, it did not completely appease the German people's sense of insulted national pride. The seeds of discontent remained, leaving the way open for the rabid nationalism practiced by the Nazi party in the 1930's; this stood in opposition to the traditional balance of power within Europe.

The economic status of Europe moved onto even shakier ground as Germany was added to the list of nations indebted to the United States, which had become the world's largest creditor. The Dawes Plan, negotiated by American bankers, allowed Germany to organize the payment of reparations on a fixed scale with the help of large loans from New York banks. Though the Dawes Plan stabilized the German currency, lowered unemployment and brought inflation under control, many Germans were opposed to foreign management of the German economy ("Dawes Plan"). In any case, it did not matter for very long.

Rampant speculation in the confident, flush postwar rebuilding economy proved fatal in 1929, when the stock market crash brought down the American economy. As most European nations were tied financially to the United States, the crash proved disastrous for their economies as well. The shortage of money depressed trade, wages and employment and caused mass bankruptcies, breeding resentment as blame was spread around. It also caused great hardship to small nations that relied on agriculture; when no one could buy, there was no cash flow whatsoever. Unsurprisingly, Germany's staggering reparation debts were soon cancelled – no one expected repayment during such bleak times (Gilbert 226). The after-effects of the crash extended well into the next decade, leaving a trail of enmity against those who were seen as its cause; this became a focal point in Germany before the Second World War.

By the end of the 1920s, it was evident that Europe had not healed from the ravages of the Great War and was experiencing an intensifying crisis. The end of the war had generated an apathy toward government and politics that was evident in the writings of the soldiers and survivors returning from its atrocities. The combination of economic hardship, feelings of vengeance and fear of the communist ideology of erstwhile ally Russia settled a peace on Europe that not only undermined the moral basis on which the Allies had won the war, but created new resentments that could not be resolved. The enmity and suffering of the time were further exacerbated by the economic devastation wrought by the stock market crash. The 1920s seemed to end on a lower note than the war itself: this time, there were no victors, only vanquished.

Economic Imperialism, Globalization and Threads of Resistance

Though the American populace may profess ignorance of a common ideology of American imperialism, there can be no denying the extension of American hegemony over political, economic and cultural dialogues worldwide. Whether or not it is purposeful, or even knowingly supported by a majority of the US population, this imposition of national influence is nevertheless understood by various international audiences as a declaration of collective American intent. This claim to superiority has been propagandized in several contexts, including humanitarianism, defense-oriented militarism (e.g., against the forces of international terrorism) and pop culture. The overarching theme of these efforts is globalization, a structure used to implement American-style capitalism and thus American hegemony over the international economic scene. The acceptance of what is essentially an alien trend seems absolute in a world where America is understood to be the foremost superpower and American values, American companies and American pop icons are everywhere. Yet threads of resistance to globalization still exist in most, if not all, affected communities, even where local motives align perfectly with the external motives of American government or business. The dialogue maintained by these dissenters is innately conscious of its status as anti-imperialist; however, the general nature of such claims often overshadows the specific and localized arguments raised in such cases.

The importance of American-sponsored globalization in the 20th century and onwards owes much to a main tenet of Sun Tzu's approach to war: a successful war depends on successful deception, for the appearance of weakness or benign motives will provoke decisions that can then be exploited. In supporting efforts for globalization, Bill Clinton's administration was able to destabilize rivals it treated as enemies. Under the banner of free trade, the institution of American-style capitalism urged a free flow of capital, which meant a lessening of governments' influence over their own economies, arguably leaving them "significantly more defenseless in the international marketplace." In Asia, the United States moved this process along by forming the Asia-Pacific Economic Cooperation forum, which, among other things, promoted more open investment into its member countries. Though some of the capital entering these industrializing economies came from elsewhere overseas, a significant portion of investment originated in the coffers of American business and government. With the help of the International Monetary Fund, many Asian economies were essentially incapacitated during 1997 by economic destabilization and the deflationary plans subsequently forced upon them in return for a "bailout." The social fallout was deemed a consequence of 'bad finance', but it was American-influenced 'bad finance,' making any consequence "blowback" against American efforts. Regardless, the end result of this combination of benign involvement and later aggression was a solidification of American control over countries that could be seen as attempting to remove themselves from the American sphere of influence.

Observers and pundits asked: what is the purpose of globalization, if it results in so much destruction? Accepted economic thought argued that globalization would improve average incomes such that they would converge, as poor countries started growing more rapidly than richer ones. Instead, standards of living decreased all over the world, and the consequences of not being the most efficient part of a modern capitalist economy included unemployment and social exclusion. Wherever US influence had insisted on free trade and economic cooperation – including Latin America, East Asia and Russia – economies tended to collapse or become prone to corruption. Naturally, these countries wished to be among the affluent; otherwise there would have been no acceptance of what were basically alien trends in finance and economics. Their cooperation was further encouraged by the assertion that globalization was not controlled by anyone: one would be responsible for one's own destiny in the New Economy. What the silent majority accepted, especially given the compassionate face of American foreign relations in the 1990s, was that globalization existed for its own sake and was essentially good – but there were dissenters to this internationally accepted opinion.

Resistance to the supposedly positive and edifying effects of globalization extended to only a minority of individuals during the 1990s. Even today, most people probably view globalization in the same light as the world did when President Clinton propagandized it. Much of the dissent was portrayed as a general attack against democracy and sound economics, and international leaders usually denounced protesters as anarchists or, as Italian Prime Minister Silvio Berlusconi put it, the "Talibanized hordes." Neoliberalism had a large following in American politics and academia, and anti-imperialist arguments were met with insults and with contradictory economic logic designed to set the agenda in favor of the proponents of the free market. They were assisted in this pursuit by big media, which frequently under-reported dissent or portrayed the dissenters' arguments in a negative fashion. In such an environment, it is little wonder that the detailed arguments of global injustice were paid such scant attention.

The general impetus of the anti-imperialist message lay in exposing the undemocratic methods of ostensibly humanitarian global institutions, which were dominated by rich nations and by America, the 'sole remaining superpower.' Additionally, it was argued that globalization left little voice to the poor and arguably made them poorer. Though the antithesis of this statement suggested tacit support of communism, protesters and like-minded academics instead pointed out that the two systems were remarkably similar, with their projections of inevitability and a "search for happiness through the material progress disseminated by the Industrial Revolution" that required some sacrifice. However, inasmuch as it was a general attack against international economics – and against US hegemony over international economics – the complaint against globalization also had specific and localized aspects that emphasized patterns of paternalism.

The 1999 Seattle conference of the World Trade Organization earned a moment in the spotlight when protesters converged on the location, picketing against the overarching role of the United States in world affairs. Their arguments came from their experience: environmentalists protested the demise of environmental protection in the face of a soulless capitalism, economists denounced malfeasance at supposedly glowing American examples of globalization, and trade unionists bemoaned worker exploitation in factories mandated by the harsh realities of a hyper-capitalist global economy. Others argued that many national administrative structures were incapable of meeting the demands of globalization, opening the way to rampant corruption or deceptive finance in those countries. Much publicity came from specific complaints against manufacturers – especially clothing labels – that used outsourcing as a means of cutting costs. While this was true across the board, US companies were singled out because of the contradiction of a system that purported to serve general economic health but achieved it to the detriment of other countries. Academics later joined the attack on globalization: the Meltzer Report revealed IMF and World Bank incompetence, George Soros warned that it was folly to ignore the message of anti-globalization protesters, and numerous published papers noted that "there is no known case in which globalization has led to prosperity."

International acceptance of American hegemony, especially where globalization is concerned, is nearly universal, owing to the success of America's political and social influence overseas – a form of imperialism in itself. For foreign leaders, it is usually foolish to oppose such measures, since American influence in other aspects of politics and economics will invite other consequences. The current globalization trend may be compared to the state of economic reform in the 1950s: Marshall Plan activities in Europe during the Cold War were portrayed by Communist ideologues as a method of advancing American interests by keeping the Third World dependent on the First. While American political and economic theory accepts this function of foreign aid during that portion of American history, many are loath to accept that the same may be true of globalization. They will not deny that dependencies may be unintended consequences of capitalist business and evolution; as the philosophy goes, the strong get stronger and the weak weaker. The fundamental inequality of this kind of system is, experts would argue, part of its success: the threat of an underdeveloped economy forces governments either to move away from capitalism entirely or to industrialize their economies as soon as possible. At the same time, American aid to industrialization can be argued to be evidently self-interested when the United States claims vetoes at the IMF, the World Bank and the UN, and when it is allowed ultimate selection of the head of the World Bank. This is the general argument of most dissenters to this form of American imperialism; still, there are many other aspects of discontent with American-style economic ambitions that have tended to be clouded by the radicalism of dissent.

Colonialism and Power Relations in Houseboy

A typical understanding of imperialism during the heyday of European empire usually contained significant elements of moral paternalism. Its theoretical basis posited that the extension of national sovereignty over foreign domains would ultimately benefit the colonized by exerting the overwhelmingly positive effects of cultural and economic influence on morality and tradition. Whether or not this was true is a subjective matter, but the existence of this philosophy did serve to underpin the colonial power structure, whereby the "colonizers" maintained theoretical and practical control over most aspects of the lives of the "colonized." In descriptions of cross-cultural relations from within the colonies, as in Ferdinand Oyono's Houseboy, these power relations operated by means of an abstract 'line' along which the humanity of the races was divided.

On the one hand, colonization was seen as a humanitarian enterprise, for the natives were still viewed as primitively human and deserving of reform and sympathy. But to maintain political power in colonial relations, it was imperative that society reflect a permanent superiority, such that significant cultural reminders – in the form of degrading daily routines and public obeisance to the privileged classes – would keep the status quo. Beyond these limits there was no room for any real equality; the negative cultural outlook on racial closeness undermined any acceptance of the philosophical benefits of education and moral reform for the colonized. Even more undesirable was the passive knowledge of the colonizer's moral failings held by the menials who populated their daily lives. Not only did this deny the inherent rightness of paternalism, but it 'crossed the line' by implying some abstract semblance of equality between the races: by acknowledging imperfection, their states of humanity could be said to have equalized.

Toundi Ondoua, the main character of Houseboy, lived within the confines of French Cameroonian society, in the racialized atmosphere so common to the rest of colonized Africa. Cameroon itself was symbolic of the immense cultural energy that went into shaping the African nation to European ideals. Having been occupied by Germany before World War I, Cameroon was then split between Britain and France by diplomats intent on preserving a balance of political power rather than honoring any social sentiment of the African populations. Compared with the British method of indirect rule through divide-and-conquer, France was more obviously paternalistic and involved in the daily rehabilitation of its peoples, in that it ruled directly, with little concern for cultural identity. These aspects of political power were made plain in the persons of the Commandant and the police chief, individuals with whom Toundi had frequent contact and who were the major representatives of French opinion and influence in their area.

The French educational emphasis made it clear that they were trying to integrate the Cameroonian people solidly into French culture. While echoing the tenets of religious education in most colonies, Toundi's experience with French and Christian culture displayed the extent to which a thorough reformation was thought necessary: in their church, the nave was reserved for Africans, who were required to listen to the sermon and were watched closely and punished by the catechists. The spirit of forced education also existed in regular schools, like the Government School in Dangan. According to headmaster Jacques Sylvain, "young African children are just as intelligent as ours." But it was not African culture that could translate that intelligence into utility; Sylvain believed only immersion in the French education system from infancy could manage such a task. His wife agreed, mentioning that "except for the children Jacques is educating, all the other natives here are not worth the bother." This arrogant belief in the superiority of white culture only furthered the racial conceits the colonizers reflected in their dealings with the natives. The racist and humanitarian grounds for these attitudes stemmed from an intellectual movement in Europe that set the idea of a racially pure and advanced "whiteness" against the uncivilized black masses. As such, Europe was expected to "take up the White Man's Burden," as Rudyard Kipling so ably suggested.

The utility of such an education suggested to Toundi that there was the possibility of equality, that he could "learn about the city and white men and live like them." This hopefulness translated into a fondness for those willing to take him in, such that "everything I am I owe to Father Gilbert." The totality of his conversion promised great things, for "nothing can be more precious than [knowing how to read and write], even if I have to [go] badly dressed." At the same time, Toundi was aware of some of the limitations he might face. Serving the head European was a step up, but it did not necessarily raise his chances, for "the dog of the King is now King of the dogs." From the beginning, Toundi understood that their relationship was to be servile and, to some extent, he expected to be examined closely by his masters. When the Commandant threatened him should he ever steal, Toundi remarked, "I took that for granted." These positions were solidified once Toundi's first punishment was in the works. Before the mistreatment started, the Commandant told him, "I'm not a monster... but I wouldn't like to disappoint you." Even so, Toundi was able to maintain his dignity in the face of the white man – initially, in any case.

Others were much less hopeful of their chance at racial equality. To Sophie, the black mistress of a white man, there existed an invisible line that her blackness disallowed her from crossing. Though she may have had a fulfilling relationship with her lover, circumstances prescribed that she become invisible in the company of other white people. If this was how things were despite love, or friendship, it was clear that "whites haven't got what [blacks] can fall in love with" – that is, an appreciation of who and what a person may be for his or her own sake. Her color would always be a barrier to her free association with the white colonizers. Those who believed in equality were given gold rings as 'alliances' in political games and called friends of France, like chief Akoma of the Sos. His counterpart, chief Mengueme of the Yanyans, was "wise without traveling" and a political chameleon who understood the inequities behind diplomatic niceties, such that he was "the only one of the elders who has survived his own generation." Being treated well was not the same as being treated equally, and recognition was not the same as respect or survival. Their political power did not help them cross the line, either.

The contradictions of imperial culture are illustrated in the positions of Sylvain and the other Europeans in Dangan. While Sylvain's mission was in the Social Darwinist stream, with its own brand of paternalism, other Europeans saw it as an affront to their power. In their eyes, Sylvain was a "traitor... [for] stirring the natives up against us. You keep telling them that they are as good as we are – as if they hadn't got a high enough opinion of themselves already." This applied directly to Toundi, who sought approval from the Commandant, and also from the Commandant's wife, who forgot about Toundi's existence when he wasn't there. Toundi's presence underlined the colonial certainty in their own rightness: it didn't matter if servants were around listening to that kind of talk because, in their eyes, it was true.

It was more practical to keep the servants in their place; in other words, the civilizing mission was not one of true humanitarianism but one meant to keep the natives agreeable to their positions as colonized. The Commandant's wife makes this clear to Toundi when she points out that he is "doing a houseboy's job while waiting for something else to come along." Even with his big ideas, "everyone has their position in life." Toundi must be happy with this – and he is, replying that "we have to believe the white man's stories, more or less." Later, when he has knowledge of her love affairs, this reply is turned back on him after his liberties have threatened the power relations of master and servant. Toundi's knowledge of the 'white man's stories' and their errors in judgment has little to do with civilization and much to do with keeping the servant in his place, as "he knows now just how far he can go." Knowledge is more dangerous than anything else: Toundi has his life shattered and the shackles of oppression placed on him through his observance of errors in the private affairs of the most important white man in Dangan. This allowance of imperfection caused the Commandant to become self-aware of his relative equality to the ordinary man, such that he felt himself like "all the houseboys in Dangan."

The imperial philosophies of humanitarianism, reformation and civilization were contradicted by the realities of colonial relationships. While the idea of equality might be batted around, this ideal was certainly not desired within a colonial power structure. Equality would negate any further need for cultural reformation toward the European example and for the maintenance of a servile status in the population, two factors that guaranteed the colonizers the right to rule over the colonized. These cultural standards allowed the Commandant to be paternal but heavy-handed toward his houseboy, and gave the police chief leeway in his job as keeper of the peace. The invisible line or barrier of color prevented any cross-cultural relationship from approaching equality and preserved the status quo. In Toundi's case, his dreams of equality and his closeness to his benefactors were both instances of crossing the line, and prompted the Commandant to feel that he was being judged by Toundi's silence in the midst of his wife's love affairs. The ability to judge is identified with equals, and this offense in manner did not fit in well with imperial culture and the preservation of the status of the "colonizer" over the "colonized."

War and the Destruction of the Politic

War becomes an intensely personal journey for those involved in the fighting – the soldiers, their families and those who can view the carnage for themselves. While the overall motives for the start of such a perilous undertaking are never immediately obvious, war always comes to be a satiation of the national ego and belief in the system, as well as a stimulation of a sense of binding purpose for the millions of individuals who together comprise a nation. For some, the matter of life and death can instill a higher trust between the people and the national institutions created to protect them – it revives a sense of patriotism and a reason for existence in both parties. War can be a matter of ritual purification; it can, like the combined terrorist manhunt and war after September 11, cleanse the national psyche of its mistaken belief in divine protection as the emblem of freedom around the world. It can also be a purification in the Greek sense – a revenge killing of the attackers can remove the source of pollution from both the attacker and defender states. However, when the personal and individual motivational and moral aspects are removed or defeated, the war becomes invalid, and thus becomes a national self-pollution; worse, the twin themes of violence and sacrifice lose their intended meanings for the political nature of a people. In such a case, the use of a memorial carries many more implications for the political nature of humanity – including the need to find a purpose for lives spent in a purposeless war littered with purposeless deaths.

This picture depicts a makeshift memorial for soldiers of the 1st Brigade, 101st Airborne who were killed during the Operation Wheeler offensive near Chu Lai in Vietnam between September 11th and November 25th, 1967. The Vietnam War became a watershed in the perception of the American state. It was a crisis in both national and international terms: the role of Big Brother recast America as a meddling pariah, not only in the eyes of the surrounding Asian nations directly affected by the violence, but also in the eyes of the American people and of nations trying to work out their loyalties in the conflicted atmosphere of the Cold War. It was not only one of the first failures of American foreign policy, but the largest and most involved American war in which the continued existence of the nation was not a major goal, or any goal at all.

Before the Vietnam War, the Revolutionary War, the Civil War and both World Wars had carried political meanings that attached directly to the people. A sacrifice of sons across the nation meant that the family threshold could be kept pure and unblemished from the soiling of the enemy presence, whether foreign invaders or brother citizens. Those wars were morally justifiable and relevant to the citizens of the nation. President Lincoln, in the beginning stages of the Civil War, justified the conflict breaking up his own nation by announcing that "the fiery trial through which we pass, will light us down, in honor or dishonor, to the latest generation. . . . In giving freedom to the slave, we assure freedom to the free." His affirmation of support for freedom for the free, a central political right guaranteed to all in the contract of citizenship in the United States, placed his use of violence in the war toward a higher moral purpose, involving both charity to the slave and longevity to the family line. Lincoln's statement also indicated that to be found guilty of dishonor would tarnish the national spirit forever. The Civil War proved to be a massive realignment of the individual's loyalty to the political presence of the American Republic. The individual was forced to choose sides; Lincoln's subtle use of Nietzsche's idea of the contractual relationship reminded the people that they owed a debt to the Presidency and the Union for their position as "free." Survival revived the collective belief in the divine protection of the Republic and in the ability of the American state to hold its promise of protection to its people.

The Vietnam War, in contrast, did not have a moral justification or collective guilt that could be understood by the average American citizen at home, the young soldier sent to die in the steamy jungles, or the thousands of hippies gathering in fiery criticism of the political decisions that led to the conflict. Application of the utilitarian Greatest Happiness principle left an undeniable vagueness as to which was better: an "evil" communist Vietnam, or millions of deaths to achieve a democratic Vietnam. The Kantian, with a deep understanding of duty and universal good, would be left confused – as was the public, who did not necessarily see any benefit, moral or physical, in endless bloodshed that, in the end, got nowhere. Instead of utilizing the spiritual force mentioned by Virilio, this war proved ineffective by going against military truth and remaining a purely brute-force war in an environment long inured to brute force. It should be remembered that the thousand years before the Vietnam War contained only two years of peace within Vietnam; a millennium of conflict had not settled any accounts.

The moral debacle that was the Vietnam War also foreclosed any way out of the conflict. The very idea of a surrender – though it would have been a surrender by the people the American nation was supporting – was unacceptable to a proud, relatively young nation that had never before suffered a defeat. To sue for peace would have required admitting that the war could not be won legally or morally. To lose would have meant many things: not only would the American nation be shown incapable of winning against "jungle bunnies," but it would also be forced to confront the lack of morality in starting the war. The obvious result would be a step down in prestige in the masochism contest of the Cold War, but the damage to the concept of the political in the American citizen was far worse than any loss of international acclaim.

The Vietnam War would have proved fodder for Schmitt's claim of politics as irrationality, had he turned his attention to the upheaval. To him, it was necessary to understand the friend-enemy distinction in politics in order to see who "we" are and to decide who will take up responsibility for our own existence. The political is finally defined when one can, with clarity, know that his enemy is his enemy. He thought that people can only be responsible for themselves if the realities of death and conflict remain present – if the existence of the enemy has been realized. The Vietnam War was strikingly absent of a concrete enemy: the generals and diplomats pointed to the Communist North as the enemy, but images of the burning monk and the murdered civilian in Saigon showed the public the existence of a terrible evil even on the "good" side. For all those watching the nightly news or tuned to the radio in the army camps in Vietnam, the absence of a true enemy eliminated any responsibility or moralized rationality that could have been felt. Visual and aural media, as Virilio points out, can do much more than words released by the President. Very quickly, the hazy state of the public politic turned into a strangely surreal anti-war sentiment and self-loathing, a state which was inescapable considering the unrealistic philosophical grandstanding of the American bureaucracy. Schmitt says that a claim to any moral good, which is essential in the political, will recognize no limits. This is plainly visible, as the American defense of liberalism has to date been a drastic failure after billions of dollars in foreign aid, mirrored in the tragic Afghanistan conflict and the Gulf Wars.

The feeling of disillusion and the loss of the moral high ground led to bitterness and an unwillingness to sacrifice in the American public; no one was very willing to step forward and die in so ambiguous a conflict. The sacrifice of so many soldiers in what was considered a pointless war seemed a negation of their humanity. The callous attitude of then-President Nixon toward what the public should know about the war was exemplified by the photograph of the shooting of anti-war protesters at Kent State. The dual message conveyed around the nation was that the public did not have a say in this war, and that their government did not care what they had to say. The war became purely for the government's sake rather than for the people's sake. It was a pollution of a different sort – it would seem that the government had perverted its duty to act for the people and instead acted for itself.

To many soldiers, the drive for memorials commemorating their deaths was imperative to restore some semblance of humanity to the lives that had passed. When soldiers returned home to ridicule, despair and bitterness, they had only their colleagues, alive or dead, to remember them as honorable people who died justly and served with conviction; and so they built more expensive memorials. Even in combat, the need for memorialization was strong; no one wanted to disappear into death as if he had never existed. Makeshift memorials were often the only way to respect and remember a comrade's death in the field before one was killed oneself. In this picture, the ritual cross as a remembrance of passing has been replaced by blatantly secular combat gear. The soldier's existence in a morally righteous war would have allowed for the existence of God and an ultimate purpose for each combatant's life; their participation in what they felt was an unjustified war led to a definition of their lives and, consequently, their deaths on a purely military and physical rather than spiritual or emotional level. They understood that they were just foot soldiers to be ordered around, rather than men fighting for their ideals, for their families or for the glory of their country. Even if the war had been won, what glory was there to be had? The critical spirit necessary for proper group bonding in a successful army was lost; Freud's premise that the neglect and omission of a leading idea within an army constituted a practical danger came true in the severe lowering of morale and rampant depression among the troops. Schmitt likewise recognized the unjustness of a demand for sacrifice on the basis of economic expediency; such a demand, he states, contradicts the principles of a liberal order and cannot be justified by the norms and ideals of an economy so conceived.
In the end, some lives were reduced to the empty symbols of what they had fought for: a lonely gun, helmet and boots for each soldier, a nameless death swallowed into the unchanging fields.

The American government painted the Vietnam War as a war against communism, and essentially, a war against the influence of the Soviet Union. At the same time, the Soviet Union and the North Vietnamese were described as the aggressors and regarded as the corruptors of nations – a source of pollution to be ritually discredited and removed by words and force. In the Greek sense, the idea of a pollution involved an act of revenge as a cleansing; this act of revenge was in return for a personal injury. In the case of the Vietnam War, only American policy dictated that the war was revenge for a ”personal” injury – an injury against democracy and freedom. It was said that any step forward for communism was one step back for democracy. Other nations, however, did not perceive it that way.

For them, it was a war that had been provoked by an attack that seemed suspiciously induced by the United States, similar to the Japanese cipher affair before Pearl Harbor in World War II, or the Hearst–arranged explosion as preliminary to the Spanish–American War. It was cleverly arranged, or so it was thought, to appear an aggression against the United States – in a situation that was clearly wholly Vietnamese. The United States seemed to think it was the invincible protector and bastion of freedom throughout the world; apparently, no one else even considered the idea that this could be a personal war. Citizens themselves asked the question: if we were dropping bombs on Hanoi, why weren’t they dropping bombs on Washington? The propaganda war was clearly lost, and the ritual purification of the United States and Vietnam implicit in American actions was clearly not valid. It was one more offense that the American public would not take. It was believed that America had soiled itself, and that the nation could only come clean by removing the source of the pollution: the President himself. The flower children and other angry citizens had already driven Johnson from office and voted Nixon in, on his promise to gradually withdraw from Vietnam. Nixon’s lies, first about military figures and actions, eventually caused his own removal from office – when the Watergate tapes were found, the revelation of more of his public treachery would have brought a quick impeachment and arrest, had he not quickly resigned and arranged for a mysterious pardon from President Ford. The ignominious loss in Vietnam then led to three more decades of American ambivalence toward its own political nature.

The attack on the United States on September 11 started out as a personal sacrifice. To thousands of Americans, the agony of being attacked in the heart of our greatest city became a symbol of the American national spirit. All of a sudden, Americans had found their long lost patriotism, after decades of disuse following the Vietnam War. The United States had been attacked and effectively, it was a declaration of war. The bonds of family and country were never stronger. It was quickly accepted and understood that violence and sacrifice were needed to make the country safe again; America was not invincible and it had to defend itself. A moral reasoning had again been restored, and the impetus for war was unstoppable. The average American again had a reason to bind himself with hundreds of millions of other Americans in a vow to remove the source of this pollution. As opposed to the Vietnam War, this war was morally justifiable, and purposeful in a way that would make any American proud to die for his country. Some would argue that Schmitt’s premise is still applicable: that the political is irrational. After all, the public had identified an enemy that is not and was not justifiably clear as the right enemy – however, this is for others to decide.

The picture provides a vivid reminder of the varying degrees of rightness in the political sense of a war. Knowing that these symbols of an army, standing at attention, are relics of men that had once fought in the Vietnam War and died for no visible purpose is enough to remind people of their humanity. The simple memorial strengthens our ties to our own lives by revealing moments when dying without a real purpose was so irrational. Having the September 11 disaster happen in the current generation has led to a drastic reversal of American attitudes toward war since the Vietnam War, but it cannot cover up the cynicism and bitterness felt by Americans who lived through that war, which rested on no morally justifiable issue or idea. War, supported by the themes of violence and sacrifice, can serve as a purifier or as a stimulation of belief in one’s sense of country and family – when these reasons are removed, the war becomes invalid and humanity annulled in the destruction of the politic.

Reformations From Within

As a theological behemoth with powers ranging much beyond the spiritual, the Roman Catholic Church has been the focus of various forms of criticism throughout its long existence. Particularly vehement opposition existed during the Renaissance, parallel to the advent of strong extra–Catholic sects under Martin Luther and other leading theologians. Though their accusations ranged from hostility toward science to outright corruption, the crux of their collective argument was the Church’s opposition to freedoms that threatened its ability to maintain a theocracy in Europe. Not all of the critics supported religious schisms that would weaken Catholic spiritual dominance; in fact, many actually advocated a reformation from within the Church, working in various capacities inside and outside the Church establishment. Some, like Desiderius Erasmus through In Praise of Folly, aimed to influence religious practices and moral positions of the Church as a whole – in essence, challenging the Church to return to morality. Others emphasized reform on the local level, targeting selected individuals as having gone astray without implicating the institution as a whole, like Bartolome de las Casas in An Account, Much Abbreviated, of the Destruction of the Indies.

A lifelong scholar, Desiderius Erasmus (1466–1536) dedicated his life to frank analyses of public affairs without any national, academic or religious affiliations that would interfere with his freedom of expression. Of particular note to him was the academic accuracy of Christian theologies and texts, leading him to produce systematic translations and paraphrases of the New Testament in order to ascertain their ‘true’ meaning. These ‘truths’ would allow a purification of doctrine, necessitating great change from within the Church if they were to be applied. Erasmus refrained from extended discussion of doctrine; his emphasis was on academic dissertation. Erasmus’ other writings, like his seminal work In Praise of Folly, were considered by him to be leisure writing and not treated as seriously. Written in 1509, In Praise of Folly criticizes the increasing vanity of the clergy and the monastic orders as well as the wayward beliefs of the citizenry that did not fit with his conception of basic Christian values. Considered to be one of the most important works in Western literature, it was written after a trip to Rome from which he had returned disillusioned at the corruption of Julius II, having turned down a curial post and possible career advancement under a monastic hood.

In Praise of Folly did not contain any hostility against the actual doctrine as taught by the Church, but rather against the worldliness exemplified in many of its officers, several of whom were supposed to be models of the good Christian life. Regarding the popes that ”supply the place of Christ,” Erasmus asks that ”if they should endeavor to imitate His life... who would live more disconsolate than themselves?” He points out that ”scarce any kind of men live more voluptuously or with less trouble,” and yet the Church hierarchy is established such that some ”will sooner endure the broadest scoffs even against Christ himself than hear the Pope... be touched in the least.” Evidently, Erasmus felt that the Church – encompassing the clergy and the monastic orders as well – was beginning to become corrupt instead of upright: ”he that so taxes the lives of men, without naming anyone in particular, whither, I pray, may he be said to bite, or rather to teach and admonish?” The religion they stood for was beginning to get lost in what he termed an attempt to ”murder [Christ] by the evil example of their pestilent life.” He warns monks in particular that they must reclaim their piety, because ”they forget that Christ will condemn all of this and will call for a reckoning of that which He has prescribed, namely, charity.” Their vows of poverty mean little if they do not at least ”admonish them that a priest should be free from all worldly desires and think of nothing but heavenly things.”

Much of the Church–centered polemic within In Praise of Folly stressed the corruption present within the system. The corruption fostered and excused greed, though it wasn’t an inherent flaw within the system. It was purely a moral lapse on the part of ”wicked prelates, who not only suffer Christ to run out of request for want of preaching him, but hinder his spreading by their multitudes of laws merely contrived for their own profit [and] corrupt him by their forced expositions.” There were those who would ”willingly expend not only their wealth but their very lives for the flock of Christ,” but yet, ”what need at all of wealth to them that supply the room of the poor apostles?” However, though the work was immensely popular, the response of the clergy and other reformers likely disappointed Erasmus. Pope Leo X thought it was funny, even though prelates like him were directly targeted. As successor to a pope Erasmus did not admire, Leo X was not much better, passing to posterity the reported phrase ”since God has given us the papacy, let us enjoy it.” Martin Luther was inspired by the work and would later criticize Erasmus for refusing to take sides even after having expressed his opinion on the corruption of the Church.

Erasmus’ idea of reform only extended to falling in line with basic Christian principles, a vision he was to elaborate on for the rest of his life in such works as the Manual of the Christian Gentleman and Institutio Principis Christiani. He was particularly concerned with formalism as an evil, meaning the respect for traditions without consideration of the true teaching of Christ.

Within In Praise of Folly, Erasmus pointed out many examples of the public following religious routines that weren’t really religious. He declares that people will ”promise themselves anything and everything” but will not ”enter the joys of the saints” until ”when the pleasures of this life, to which they cling with all their might, have finally slipped through their fingers.” The common practice of ‘purchasing’ one’s sins away as well as the habit of purchasing reliquaries and other religious paraphernalia Erasmus dismissed as ”fictitious pardons” and ”self–delusions.” Rather pointedly, he asks the reader to ”imagine [a merchant] who thinks that if he throws into the collection basket one coin from all his plunder, the whole cesspool of his sinful life will be immediately wiped out.” He notes that praying to the saints and to the Virgin Mary is essentially useless, as there is never ”the least acknowledgement from anyone that had left his folly, or grown a hair’s breadth the wiser.” Indeed, ”some saints have a variety of powers, especially the virgin mother of God, to whom the ordinary run of men attribute more almost than to her son.”

What he proposed was no less than an extremely critical self–evaluation to be done by everyone – as his satire did not exclude any group from consideration. Erasmus’ method of reformation was daring in that he was an outsider who could not be pressured, unlike would–be do–gooders from within the system. That he was respected by scholars the world over allowed him to more widely disseminate his theories, increasing the effectiveness of his message.

Unlike Erasmus, Bartolome de las Casas (1474?–1566) was forced to pursue reformation from within the Church hierarchy, a position that required much more politicking and finesse than was necessary for Erasmus. De las Casas, rather than rejecting the monastic life, embraced it by converting to the Dominican brotherhood after witnessing injustices during his years as a Spanish colonist. His proselytization on behalf of the Church in the Indies eventually won him the title of ”Protector of the Indians,” an honor conferred by King Ferdinand. His main intention was to create a less unjust society in which the natives were subjects rather than slaves. De las Casas did not outwardly advocate changing class distinctions, though there is evidence he may have wanted to; instead, he argued that senseless butchering was unnecessary and that a more just approach was in order. In doing so, he argued that the violence of the colonists was preventing the Church from a successful missionary effort in the Indies, a project that could net millions of souls for the Church’s spiritual jurisdictions. As Spain was taking on the burden of being the Catholic Church’s main defender, during the assaults of liberal theology from the Protestant Reformation, de las Casas was careful not to alienate royalty in his spiritual and moral quest – such that he had their backing and support until his death, even as his controversial ideas scandalized the court.

While de las Casas was constantly critical of actions committed in Spain’s name under the banner of the Church, he worked directly with the clergy to fix the problems. In a major coup in his efforts for the indigenous American population, de las Casas was able to influence Pope Paul III to release the papal bull Sublimis Deus in 1537. The bull condemns ”many of his lackeys who, desiring to fill the measure of their greed, dare to assert indiscriminately that the Indians...on the pretext that they have no share in the Catholic faith, are to be reduced to our service like brute animals.” In essence, the Indians were declared to be rational, free people with inherent sovereignty who needed help to be converted to the Catholic faith. The order reduced the ability of local administrations to conduct indiscriminate slavery and ”just wars,” proclaimed when corrupt clergymen pronounced that Indians were not spontaneously receptive to conversion or trade. In effect, de las Casas was purging immoral practices within the Church’s representation in the New World just as much as outside of it.

An Account, Much Abbreviated, of the Destruction of the Indies was written in 1542 as a plea to its dedicatee, Prince Philip (the future Philip II), to correct wrongs done by Spanish violence. In his presentation to the prince, he notes the ”perditions of infinite souls” who died, unconverted to Christianity and thus sent to hell, at the hands of ”those who think nothing of spilling such immense quantities of human blood...and stealing incomparable treasures... this is a thing, my most high lord, which is most sorely needful and necessary so that God might make the... royal crown of Castile prosper.” He lamented atrocities committed against individuals who had never ”done any wrong or evil to any Christian without first having received wrongs and thefts and acts of treachery from them.” Furthermore, a mass killing ”on account of the tyrannical actions and infernal works of Christians” against creatures that were ”tractable for all fair doctrine, excellently fit to receive our holy Catholic faith and to be imbued with virtuous customs” was obviously immoral. That these were Christians acting so injudiciously was supposed to horrify readers at the waywardness of Spanish citizens in the colonies. This was no doubt an embarrassment to Spain, which had been caught not enforcing Christian doctrine during its role as defender of the Church. In response, the crown released the New Laws in 1542, declaring that ”no cause of war or other reason... shall justify making a slave of any Indian whatsoever, and that it is our will that they be treated as subjects of the Crown.” Due to threats of colonial rebellion, the laws were largely retracted by 1550.

While de las Casas’ work indeed indicted hundreds of people in the brutalities, it in no way reflected badly upon the Church back home. In fact, it appealed to the Church for cooperation, and informed it that its presence was necessary. The abuse of natives who were ”altogether without subtility, malice, or duplicity, excellent in obedience, most loyal to their native lords and to the Christians whom they serve” was intolerable. De las Casas was focusing his attention on the colonists, who had seemed to lay aside their Christian beliefs in capturing the New World for their own. Though the resulting social struggle was a large problem for the crown, it left open the possibility of reform from within the Church. After all, the Church had been first to respond, though it had less power to require obedience in the New World, whose inhabitants had always been understood as on a lower level than Europeans.

Although both de las Casas and Erasmus attempted to reform the immoralities practiced by a supposedly Christian population, their attempts differed on many levels. Their audiences differed, as Erasmus targeted everybody, while de las Casas was more concerned with royal attention to his pleas. Their targets were different as well: Erasmus threatened the power structure, while de las Casas chose a population that had less power in the ongoing political struggle, and thus was less likely to be able to suppress or retaliate against his message. Unlike many would–be reformers from within the Church, de las Casas was able to get much done almost immediately, surely due to the respect both Church officials and royalty offered him. Though he experienced some suppression during his time at Philip II’s court, he was never prevented from speaking his mind. Erasmus too was free, but to a much greater extent. As an independent scholar desired by academic communities across Europe, he was in no danger of suppression. At the same time, he was confident of reformation from within the Church structure, holding that all the problems were matters of personal failings in officials rather than flaws in the system. Neither wanted schisms and thus, neither joined or publicly sided with popular movements as led by such savants as Martin Luther, Savonarola, Huss and others. Whether it was for ideological purity or for social justice, these voices from within were able to submit their criticisms of Church failings effectively without having to advocate schisms that would have weakened the power bases and structures of the day. This achievement is all the more impressive in that neither had to dilute his message – though de las Casas, by diplomatic necessity, would have had to choose his words carefully – to be able to increase the moral righteousness of the oldest power in Europe.

Social Objectives and the Usage of the Noble Savage

The literary depiction of other civilizations, whether anathema or friendly to the current regime, can serve to create a standard against which one’s own civilization can be compared. In some ways, the work may act as a moral guide for its intended audience, reflecting social objectives that the writer feels are important in relations between the civilizations. Such is a possible motivation for Tacitus’ Germania and Bartolome de las Casas’ An Account, Much Abbreviated, of the Destruction of the Indies. Both works tend toward a presentation of the ”noble savage” in these relatively unknown communities, the moral superiority these foreigners exude as a result of their comparative innocence, and the depravity present in the writers’ own civilizations. Included in this representation is an attack on certain social values of the writer’s home civilization and an appeal to readers to change their way of thinking, in order to reclaim dominance and self–respect.

Tacitus, a Roman senator and historian of the first century, wrote the ethnographic work Germania in an age when Romans were quite aware of the growing importance of the assorted Germanic tribes within the Roman Empire. Throughout Germania, Tacitus seems to have a respect for most of the cultural values of the Germans, as they are similar to those of the Romans. He describes bravery and loyalty as inherent in these individuals, who believe that ”to abandon your shield is the basest of crimes; nor may a man thus disgraced... enter their council.” As Rome was built on the strength of its military machine, this had particular resonance with Roman values – military service to one’s country was necessary, but loyal and brave service was an even more necessary quality. Similarly, the descriptions of state structures served to provoke respect for, or identification with, their way of life. Like the Roman system, where assemblies composed of citizens deliberated domestic issues and elected the chief magistrates, the Germans ”also elect the chief magistrates, who administer law in the cantons and the towns. Each of these has a hundred associates chosen from the people, who support him with their advice and influence.” This reminder of military capability and organized deliberative government served as an identification of common ideals between the two civilizations.

At the same time, Tacitus issued an implicit warning against moral corruption in the Roman populace. As a criticism against the current state of Roman values, Tacitus relates how the Germans prized the institution of marriage and the family: ”their marriage code, however, is strict, and indeed no part of their manners is more praise–worthy.” To Tacitus, it is a sign of their virtue that vice and adultery are considered taboo, and that marriage is so important. Indeed, he remarks that this allows them to – unlike the Romans during this period – ”live uncorrupted by the allurements of public shows or the stimulant of feastings.” Furthermore, he notes that ”no one in Germany laughs at vice, nor do they call it the fashion to corrupt and to be corrupted.” By explicitly mentioning these various treatments of vice, Tacitus points to personal experience with such behavior – which one can assume to be from Rome itself.

Despite Tacitus’ assertion of a decadent civilization, he still allows Rome the benefit of being the superior contestant in the race. Though she has allowed her moral dignity to relapse somewhat, Rome still outweighs these ”barbarians” – Julius Caesar often used this term to refer to the warring German and Gaulish tribes – by virtue of being the more civilized of the two. Though the Germans ”care but little to possess or use” silver and gold, ”we have now taught them to accept money also,” referring to the Roman overseers. The Germans are criticized for thinking it ”tame and stupid to acquire by the sweat of toil what they might win by their blood,” as well as for the ”strange combination in their nature that the same men should be so fond of idleness, so averse to peace.” Tacitus then mentions how passing ”an entire day and night in drinking disgraces no one,” contrasting a Roman social stigma with an everyday occurrence among the Germans. Binge drinking, he says, allows them to ”be overcome by their own vices as easily as by the arms of an enemy.”

This belittling of the foreigners amidst the moral lessons allows his intended readers, presumably Roman, to learn from the Germanic civilization without having to lose their claim to moral superiority or self–respect – after all, he could expect that any of his intended audience would be ashamed if found drinking all day and subsequently losing all inhibition. As a tool of rhetoric, the contrasted moralities can be quite effective in ensuring the non–alienation of a target audience while allowing the message to hit home. Since Tacitus was a Roman senator, it was not in his political interest to be entirely respectful when writing about non–Roman folk, yet he still wished to present a message for change that would have been unpopular had he perceptibly diminished the worth of the Roman Empire.

This idea of the ”noble savage” – the unfettered creature defined mainly by childlike innocence but also nobility and virtue – grew as a notion that could legitimize unequal political and economic treatment of foreign peoples by counterbalancing it with an arbitrary designation of moral superiority. This form of discourse was able to justify imperialism and, in Tacitus, allowed his readers to be chastised while still realizing that the barbarians would require their guidance to attain a higher level of civilization. Even in later centuries, the idea was still in circulation. John Dryden, who coined the term in his 1670 play The Conquest of Granada, regarded the noble savage ”as nature first made man / Ere the base laws of servitude began.” Jean–Jacques Rousseau celebrated the savage as the ideal state of nature in his Discourse on Inequality; he argued that any step away from this blank state was a loss of virtue, alluding to the inequality of man in current civil society.

The term came to more controversial usage in the morally ambiguous colonial expansions throughout the New World by the sixteenth century. As Spanish and Portuguese explorers opened up an entire continent, the ambitions of colonists and traders as well as religious leaders collided in their rampant manipulation of indigenous peoples. On the one hand, the natives were portrayed as corrupt to justify indiscriminate slavery and the butchering of peoples. Juan Gines de Sepulveda, one of the influential jurists of the sixteenth century, remarked that it was ”lawful not only to subject these barbarians, polluted with heinous acts... to our dominion in order to bring them to spiritual health and the true religion... but also to castigate them with yet more severe war.” The belief was that the influence of laws and religion – and liberal doses of violence – would somehow counteract their savage depravity. A much more acceptable argument regarded the savage as not oppositional but amenable to civilization and religious conversion by their very innocent disposition. New World explorer Christopher Columbus, in a December 1492 letter to King Ferdinand and Queen Isabella of Spain, remarked that ”in all the world there is no better people nor better country. They love their neighbors as themselves, and they have the sweetest talk in the world, and are gentle and always laughing.” This kind of reasoning made it much easier for explorers such as Columbus to persuade the crown to support their voyages, as violent campaigns tended to sound much more expensive.

Bartolome de las Casas, a Spanish bishop and colonist, took the image of the noble savage a step farther in An Account, Much Abbreviated, of the Destruction of the Indies. According to de las Casas, the native Indians were naturally good and had never ”done any wrong or evil to any Christian without first having received wrongs and thefts and acts of treachery from them.” Though once a slave–owner and recipient of a royal encomienda, de las Casas had given up his laborers after being horrified at the injustices he had witnessed during his travels in the New World. He eventually converted to the Dominican brotherhood and proselytized on behalf of the Catholic church in his role as ”Protector of the Indians,” a title conferred on him by King Ferdinand.

His message was not popular with settlers, as de las Casas’ intention was to create a less unjust society in which the natives were subjects rather than what amounted to slaves. Taking the example of certain Indian peoples in Mexico, de las Casas related how ”persuaded by the friars, the Indians did a thing that never before in the Indies had been done... twelve or fifteen lords... subjected themselves of their own will to the rule of the monarchs of Castile... for their supreme and universal lord.” Combining his religious motives with his understanding of the crown’s political and economic goals, he sought to persuade the crown that the power of the Church had won more for Spain than had the cruelties and overt greediness of the conquistadores. The settlers believed this message to be no less than a call for termination of the encomienda system, which relegated to each settler a designated number of Indians to be used for labor. Their usage – and ill–usage – had brought the settlers much wealth from working the gold mines. If the Indians became subjects on equal footing, the colonists would have no source for cheap labor other than working the land for themselves, which none were willing to do.

De las Casas crafted a narrative in which the peoples were ”altogether without subtility, malice, or duplicity, excellent in obedience, most loyal to their native lords and to the Christians whom they serve,” laying the groundwork for his argument that the innocence of these people was being taken advantage of by the supposedly Christian and civilized colonists. He painted a picture of the Indians as impoverished and weak and thus easily dominated, as ”they well knew that being not just unarmed, but naked, on foot, and weak, against people so fierce... they could not prevail.” He was also careful to state that they maintained structures of government similar to those of the Europeans, with kings and lords classed higher than the peasantry. Thus described, his intended audience would simultaneously lose fear of this ”competing” civilization while also viewing the Indians with some measure of respect, as a common bond had been established. As if to embarrass Spanish actions, de las Casas related that most of the Spaniards were ”served... with all that they had, especially giving them victuals as was meet, and all that they were able.” This was an entirely different image than the idea of the ”brute” savage that had been bandied around. These Indians displayed nobility in their hospitable treatment of strangers, as they rose above their so–called reputation with their comparatively morally upright actions.

Most importantly, the Indians were ”tractable for all fair doctrine, excellently fit to receive our holy Catholic faith and to be imbued with virtuous customs.” To Spanish society and to the crown – who were regarded as protectors of the faith by the Catholic church at this point in European history – the ability to proselytize widely was a moral boon, especially in a situation where they would no doubt understand themselves to be clearly superior. De las Casas does warn that much of this promise will be lost, as ”there have been above twelve million souls... killed, tyrannically and unjustly, on account of the tyrannical actions and infernal works of Christians.” Furthermore, he regrets the ”perditions of these infinite souls.” Unbaptized and unconverted as they were, these Indians were dying without the benefits of Christianity – a travesty to the Catholic church, whose doctrine stated that unsaved souls would go to hell.

The political and economic implications of this rampant greed were not lost on others in the New World. According to de las Casas, the judges being sent to the colonies merely provided financial reasons for which the cruelty should be stopped: ”none have taken the pains to examine the crimes and perditions... save for saying that because so–and–so has perpetrated cruelties upon the Indians, the king has lost of his rents and tributes so many thousand castellanos.” No doubt this reasoning was in some way effective, as the expense of maintaining the faith around Europe and protecting colonial interests was a heavy burden on royal coffers.

In the end, de las Casas was effective in promoting his social agenda by contrasting the “noble savage” with the mean spirits and cruelty of his own civilization. Both the Catholic church and the Spanish crown took his moral lessons to heart. Pope Paul III issued the 1537 papal bull Sublimis Deus, in which the pope ascribes to the devil the inciting of “many of his lackeys who, desiring to fill the measure of their greed, dare to assert indiscriminately that the Indians... on the pretext that they have no share in the Catholic faith, are to be reduced to our service like brute animals.” This condemns the men who believe in the depravity of the Indian soul, who – according to de las Casas – think that “all those victories are given them by God because their evil wars are just.” To justify the defense tactics employed by these “helpless” Indians, de las Casas also suggests to the crown that “no man is or may be called a rebel if he is not first a subject.” Perhaps this reasoning – or the lesson that these peoples could be subjugated if taught with Christian love, or even the practical reasons – finally persuaded the crown to issue the New Laws of the Indies. In them, it was commanded that “no cause... shall justify making a slave of any Indian whatsoever, and that it is our will that they be treated as subjects of the Crown of Castile, for so they are.”

While An Account, Much Abbreviated, of the Destruction of the Indies may have had a different intended audience than Germania, their utility was the same in determining social values. Both were used to magnify the morally superior positions of civilizations otherwise described as far inferior; this way, lessons could be learned without denigrating the writer’s home civilization. The image of the “noble savage” was used to paint an obvious contrast to the moral corruption being practiced by both the Romans and the Spanish. In both cases, neither population is attacked directly: the enthymematic assumption in Germania allows an implicit conclusion of Roman guilt, and de las Casas attacks only the colonists, though by logical extension they are still Spanish. By appealing to resident ideals – bravery and loyalty in the case of Rome, religion and piety for the Spanish – both Tacitus and de las Casas were able to draw some form of identification with these foreigners for their readers, though the foreigners were also adversaries of each respective civilization. The standard created by each work was in essence a low bar to rise above: as these civilizations were depicted as simpler and more barbaric, the intended audiences needed only adjust their social attitudes very slightly to regain what they regarded as their wholly superior position.

Echoing Empire: Frances Eden’s Tigers, Durbars and Kings

Across the boundaries of empire from colony to home and hearth exists a general delineation between those who set out to expand the ontological sense of empire, and those who merely observe such machinations. The romantic and mystical bent of colonization may have seduced those less disheartened by the practicalities of civil strife in foreign lands, but it does not decrease an acute awareness of the inherent racialism and paternalism within such an imperial scheme, whether it be thought moral or immoral. The triumph of the British Empire over India in the 19th century invited no less of a distinction; the casual or serious visitor could not be expected to miss the enormous difference in alien understandings and levels of so-called civilization, visible in British policy and doctrine as well as in the daily observances of native life. One visitor to the Raj gently noted this tension in letters home from 1837–8: as a gentlewoman attached to the Governor-General of India’s party, Frances Eden would experience firsthand the attentions of the entire country, the great inequalities of Indian kingdoms and rajah-hoods, and a growing sense of moral maturity in her attitudes toward British treatment of the native population. Though Frances maintained an air of pleasure throughout her meanderings, the vicissitudes of Indian life and customs became a reproach to certain themes in the psyche of empire – the pursuit of destruction, a respect for cultural violence and the prevalence of form above humanity. But beyond this, her hope overrode any doubts; like a good servant of empire, Frances Eden was able to accept the inevitable.

In her initial letters compiled within Tigers, Durbars and Kings, Frances Eden displayed a significant sense of apprehension about the unknowns of the wild to which she was heading. To her friend Eleanor Grosvenor, she writes that “William won’t hear reason as to the horrible dangers he is going to take me into.” By all accounts bored with society life in Calcutta, she claimed that she would not make it safely back “because the wild beasts in this country are real wild beasts who will not listen to reason.” Though it is a sarcastic note, it hints at her expectancy of primitivism and the uncivilized; the transfer of meaning from animal to human has an appreciably racialist basis. Instead of being afraid, Eden was hopeful that the “journey is to make me strong again.” Indeed, she treated it rather like a game: “if we get back without any calamities from tigers or jungle fevers, this journey will have been a most successful experiment.” Her desire to escape the rigidity of her life in the intensely traditional domain of Indian imperial offices echoed that of the redoubtable Miss Quested in A Passage to India: “In front, like a shutter, fell a vision of her married life... she would see India always as a frieze, never as a spirit.” It cannot be said that she only desired to escape, for upon leaving for the Upper Provinces, she wistfully remarked that “we, stepping gracefully aboard, ‘clad in paradoxical emotion,’ the suit in which the immortal author of San Sebastiano always clothed his heroes” – for as much as she wanted to go, there was so much she was leaving behind.

Unmarried and attached to the household of the most powerful man in India, Eden had a right to expect much out of her Indian encounter, for their trek would bring her across the totality of the Indian royalty’s subservience to the British Empire. She was puzzled, however, by their quiescent acceptance of these positions. She was curious as to what would make “the Nawaub of Morshadabad fancy he likes the Order of the Guelf,” for after all, he had received the royal award in exchange for costly gifts he had given the King. Scathingly, she observed that the munificence of the Rajah of Benares was likely to be passed off as inconsequential by the military men: “I dare say in return for all his presents he will be generously presented with something worth 3s & 6d.” Though she does not dwell on the fact, Eden noted the magnificence of past empires – sometimes over the current one – as she viewed their ruins. Without irony, she remarked that “in the hall of audience where the pavement and pillars are still inlaid with precious stones, there is the Arabic original of Moore’s line: ‘if there’s bliss upon earth, it is this, it is this.’” Passing by an old battlefield where Hindu and Muslim had clashed in ages past, she termed it “the real field of Waterloo,” in a disparagement of the original. For all her attention to these details, her anger never surpassed her excitement; indeed, she was always able to move on. As a result, she effectively accepted the consequences of benevolent assimilation in the Indian provinces, even as she cynically commented on its dark side – that there would be a paternalistic form of leadership over the country that would pay little attention to the respect due to its cultures from either past accomplishments or the status of their nobility.

Their arrival at Amritsar and the land of the Sikhs signaled a change in the general outlook of their party. No longer were the British officers fretting over getting the Indian nobles’ shoes off in time before they stepped on the white cloth; they were suddenly polite and preparing far more ostentatious and lavish gifts than ever before, including a battalion of horses. Eden was impressed by the status of the Sikh leader Runjeet Singh: “if you could but have seen the splendor of the sight we saw this morning, you would simply have died of it... [it] was the finest thing I ever saw anywhere.” More interestingly, she never challenged Singh’s ability to rule in any of her letters; in all respects, she was awed by his political control, such that “we are all left with a great respect for Runjeet’s talents in the peculiar line of governing Sikhs.” In this feeling, she may have been coached, as British aspirations in Afghanistan required Runjeet Singh’s political cooperation and thus he was to be treated with respect regardless. There was no doubt that she respected the Sikhs’ ability to uphold their wealth and prestige, as well as maintain their populace – observing that in some places, “the mere refuse of their jewels would be a fortune” while the natives under British rule in Calcutta “are all too poor to have any valuable [jewels].”

The cultural component of Eden’s letters is not often judgmental, but on rare occasion it offers up a troubled morality which can be taken as a general humanist critique of British ideals. Initially, she took joy in hunting, reporting that “there dear, I have killed my first tiger,” and justifying the act by declaring that “one stroke from such a claw would not leave one to linger.” But her joy in the happiness of the more militaristic persons in the company was tempered by her observation that the animals they were continually shooting “seemed to be such a fine strong beast so exactly fitted to its own jungles – I do not see our right to take our love of destruction there.” Her imagery as to the love of destruction extended to the Thuggees – the bandits who murdered travelers as an art form – whom English society routinely pictured as evil. However, she saw discontinuities in attitudes toward these supposedly morally repugnant individuals, and the propensity for supposed “gentlemen” to “go and see a regular Thug exhibition.” Even in capital cases, “all the Europeans who have had much to do with their examinations view them in a most romantic light, and look upon those who are hanged almost as martyrs.” Even trade, the lifeblood of the British nation, did not escape her secret glimpses. Eden confessed upon seeing an opium godown that “it was a curious sight – first to see there is so much opium in the world, and next that there should be consumption for it,” knowing full well that this one station was a mere link in the chain of British hegemony over the opium trade.

As entranced as Eden was by the trappings of power – the great silver sticks, peacock-feather chouries, and gold-embroidered punkahs [large fans] carried by their servants – she also noticed, with some dismay, the mechanisms of power which brought her the luxury she was experiencing. While “there is no country in which people may be left to dream with greater impunity,” she lamented that the junior civil servants of the East India Company “are quite alone, no other European within reach... and why they do not go melancholy mad I cannot conceive.” Their domain extended to over a million persons per civil servant, and their responsibilities included the outlawing of customs deemed barbaric by British standards. Eden noted that they were limited in many fashions, for “government will take what measures they can that are safe to stop such horrors, but such customs can only be done away with by degrees.” In a country of wallahs – those employed to do something: a beast of burden, in most imperialist senses – Eden realized that these sorry officials were only a white version of the wallah, taking the burden of empire onto their broad shoulders. In frustration, she wondered why her brother George, the Governor-General, “should not take on like a Maharajah and turn magistrate for himself.” Even more distressing was her observation of the military forces required to keep control in India: “hardly any of the men who came out in it are left... most had died under 20.”

Like most other white visitors to the Indian nation – indeed, to any other colonized area of the world – the imperial dialogue of racialism and paternalism created expectations that Frances Eden echoed when first departing on her trip. She was as much attracted to its foreignness as repelled by it, and thus the “paradoxical emotion” that she experienced. However, the combination of conscience, her critical eye for cultural detail and a position in the highest circles of British power in India allowed her to meet both the best and worst that India had to offer, by way of meetings with royalty and chance encounters along the road well traveled. Troubled by moral contradictions between the British language of colonialism and the realities of colonized life, and especially their predilection to destruction in the cultural and physical sense, she nevertheless ‘got over it.’ In the end, “I am glad to see – and shall be more glad to have seen India – but it is fearful to think how entirely even five years will break short of our English lives.” Military men would call this an allegiance to a higher sense of duty, rather than a fall for mere humanist sentiment. But this was important in the greater meaning of empire: without both a tacit understanding of the necessity of British discipline and reform, and a paternalistic hope in cultivating lower individuals to a better position, the project of an imperial destiny was pointless. In the end, Frances Eden was merely echoing empire.

Erasmus on Folly

As a leading evangelical humanist and Christian reformer, Erasmus hoped to influence for the better many of the religious practices and moral positions taken by the populace. By writing In Praise of Folly, Erasmus extended his aim of reformation not only to the institutions of the clergy and the monastic orders and what he saw as their increasing vanity, but also to the wayward beliefs of the citizenry that did not fit with his conception of basic Christian values. This spiritual programme was utilized by other leading contemporary theologians and intellectuals as well, albeit in a more aggressive and extensive form. On the grand scale, they advocated religious schisms that would weaken the spiritual dominance of the Catholic Church, whereas Erasmus looked to inspire reformation from within.

Much of the virulent satire in this work – exemplified in the obviously satiric title – centers on the difference between what men say and what men do. This pretentiousness in action, according to Erasmus, should force people from all walks of life to question themselves. The philosophers “announce that they alone are wise,” but instead, “the fact that they can never explain why they constantly disagree with each other is sufficient proof that they do not know the truth about anything.” To the theologians, Erasmus announces that “they are full of big words and newly invented terms,” that “they behave as if they were already in heaven,” and that they “explicate the most hidden mysteries according to their own fancy.” Merchants lie and cheat and are still respected because they are rich. Yet as biting as his words are against all these classes of people, most of his venom is reserved for the clergy.

Erasmus argues that the clergy do not act as spiritual leaders should. He puts the question to their vanity more precisely when he asks “and for popes, that supply the place of Christ, if they should endeavor to imitate His life... who would live more disconsolate than themselves?” More scathingly, Erasmus mentions that “scarce any kind of men live more voluptuously or with less trouble.” All this action serves to “murder [Christ] by the evil example of their pestilent life.” The monks, who take vows of poverty, are not spared – they are described as taking “extreme pains, not in order to be like Christ, but to be unlike each other.” Erasmus then emphasizes that what is being preached is not what is being practiced: “they forget that Christ will condemn all of this and will call for a reckoning of that which He has prescribed, namely, charity.”

Most of this wrongful action points to some form of greediness, be it for money, as with the popes and princes of the realm, or pride, as with the monks. However, this does not lessen the brunt of responsibility borne by the public, who buy into their claims. Erasmus declares that people will “promise themselves anything and everything” but will not “enter the joys of the saints” until “the pleasures of this life, to which they cling with all their might, have finally slipped through their fingers.” Rather than solely admonishing the clergy who promise these “fictitious pardons,” or the “pious imposters” who create fake relics, Erasmus spreads the blame by labeling as ridiculous these “self-delusions” of holiness bought with money. He asks his reader – assumed to be a thinking and logical individual – to “imagine [some merchant] who thinks that if he throws into the collection basket one coin from all his plunder, the whole cesspool of his sinful life will be immediately wiped out.” The common practice of praying to saints or the Virgin Mary is also satirized: Erasmus notes that “some saints have a variety of powers, especially the virgin mother of God, to whom the ordinary run of men attribute more almost than to her son.”

In the end, however, Erasmus sees hope where folly has arisen – in the churches and among the peoples which have erred. In these places, “Folly is so gracious above that her errors are only pardoned, those of wise men never.” As evidence, he lists the biblical wise men Aaron, Saul, David and Paul, who do not act as wise men but instead admit their folly and are then forgiven by the Lord. This view holds that wise men can change, indicating that the leaders of the Church can reform it by admitting to their folly. Unlike violent reformers like Martin Luther, Erasmus did not see these grievous problems as unsolvable through the Church bureaucracy; instead, he notes that the recognition of folly has always been and will always be the beginning of true knowledge and change: “and now judge yourselves what an excellent thing this folly is, whose very counterfeit and semblance only has got such praise from the learned.”

Folk Music and the Economics of Musical Heritage

The heritage of a nation is both a precious and a precarious thing. On the one hand, it is visible through the daily activities of the ordinary man as folk culture, but time flows on, obscuring the past and the ideas that make a people who they are. Even if fleeting, the understanding of a cultural heritage is necessary to understand the present. A veritable source of lore on this commonwealth of culture within a country is the collective song of the common people, known as folk music. By its definition, it is necessarily diverse and expresses everything from the political ambitions of an age to the private desires of the individual. Its preservation can thus capture the spirit of a generation in time and in space, and allow its influence to continue through the present, into the future. However, the cultural value of a music can be extended into an economic value; an interest in the past comes at a price. Folk artists need an audience to tell their stories to in order to survive, requiring money for publicity and distribution. The artist’s relationship with the hand that feeds – from corporations to public funding sources – has a direct impact on the atmosphere and perpetuation of one’s own cultural heritage, and affects how people come to terms with what they understand themselves to be.

Folk culture exists as a collection of traditions that are disseminated orally and behaviorally, which make up the diverse and enduring identities of a group (American Folklife Center). Everything that a people does comes from centuries of repetition of a cycle, wherein a pattern which has value to the group as a whole is continually absorbed and assimilated, generation by generation. By discovering one’s own sense of cultural expression, one discovers the basis of a common humanity that exists to bind all individuals in the group together (Greenway 21). However, it is often hard to understand the essence of folk culture by viewing the disparate elements of text and art within an age; it is often necessary to find the common threads that are most clearly articulated in music.

Originally, folk music was understood as the song of the common people and was thus supposed to use colloquial language, be indigenous to a region, be vaguely older, and have very simple forms that could be performed with little accompaniment (Neff, “Definition”). Its source of lore stemmed from the various lyrics that were written to be sung along with the melodies. Today, the lyrics still possess the emotional and political charge associated with folk music, but the defining qualities of the music itself are no longer what they once were. Musical dilution, blending, cross-pollination, or simply expansion, depending on which way one looks at it, has allowed what is known as folk music to become too expansive (Spalding). Where in the past, folk would have been considered independent of blues, Cajun music, Gaelic music and other such indigenous styles, the folk umbrella now covers all of those categories as well as influences such as rock and roll, pop and jazz. Gradually, the music community has moved away from using the word “folk” to label its music; contextual differences and a lack of understanding of the term in modern society have usually provoked confusion and a distaste for the label, attaching a negative connotation to the word. Furthermore, artists and producers seem to agree that the folk music label no longer has any meaning musically; they find that it is better suited as a label for the community that prepares and performs the music (Neff, “Definition”).

The existence of folk music, or any music in general, is supported by a community that actively creates a relationship with the artists and their messages by paying to see performances, buying their music, and supporting the survival of the message so it can take its place in the accepted musical heritage of the nation (Greenway 9). The utility of music as a publicity tool serves as its own reason for its presence – it exists in every human society and mirrors the desires and ambitions of a community through its strong ties to region, racial identity and even time (Lornell 7). An endearing song from one region or race may not induce the same sentiment in others and so earns its own unique identity, an important quality a prospective purchaser would look for. The sometimes important distinction of time can also affect the community’s willingness to provide financial support for certain artists; certain songs have lasting power and can thus invoke nostalgia, and sampling of such a song by other artists increases that lasting power. The development of mass media, industrialization and urbanization have also affected the bond between a music and its supporting community. The ability of technology to allow split-second distribution of music has allowed it to gain a value somewhat higher than that of textual and artistic ideas, which require more time for analysis in an increasingly impatient world (Pfeffer). Changes in music wrought by the continuing creativity of folk artists are able to spread rapidly, as are changes to a music’s dependence on geography through a process called creolization, in which industrialization and urbanization in effect cause a cross-pollination of the music of two different cultures (Lornell 11).

A folk artist’s worth to the community and to the national cultural anthology which he serves is assessed through the message that he carries. The mission of the folk artist is to depict life as it is; consequently, the words of the songs are very important, as they store references to ideas that can be understood in full or in part by those who encounter them later. One producer put it well: “to me, what distinguishes folk are the words. In all the genres, there's not any better writing, communicating thoughts and feeling, than in folk music…” (Neff, “Definition”). The ideas that must be expressed can change, but the essential nature of folk music means that it continues to represent the common man. “The subject matter and the musical style have changed with the changing times, but the fundamental principle of folk song and its relation to the people have remained the same” (Rhodes 15). However, a commitment to truth does not negate the need for a unique stance; folk artists must establish a voice from which to project a thoughtful message, or risk being hard-pressed to obtain and deserve the respect accorded to an individual who, by action, represents the entire generation. Unlike the aesthetic and often bland pop singer, folk artists must be differentiated by some sort of social commentary evident in the words of their ballads. Folk artists, unlike pop artists, usually aren’t built into legends by their producers, and thus the responsibility for obtaining their own cultural identity rests on them alone. This separate cultural identity would ensure not only that the folk artist would be remembered and respected as an individual within folk circles, but that they would be successful when exposed to the world market.

The folk artists’ choice of theme often dictated their economic and political survival. Some chose to be protestors: Woody Guthrie, of “This Land Is Your Land” fame, was a noted union supporter whose trademark song was shorn of later verses for their often wildly angry tone. His earlier ballad “The Ludlow Massacre” recalled the early-century wanton labor atrocities as a commentary on the state of unionism in the Forties and Fifties (Greenway). His renowned connection with Communist politics severely limited his popularity and well-being during his lifetime; the House Un-American Activities Committee made it possible for Guthrie, Seeger and other folk artists to be implicated as conspirators in an imaginary Communist plot to “ensnare and capture young minds” (Jackson, Reuss 2). However, these artists, in return for a lifetime of annoyance and constant watching by the government, eventually earned wide acclaim and enormous respect after their deaths and were finally included in the pantheon of great folk artists. Others chose to be entertainers: there have always existed unabashed representatives of filth and licentiousness – though their audiences adored the smutty lyrics of the bawdy song, the performers rapidly disappeared from sight, never to be heard from again, or hid their authorship well (Cray xvi).

By its definition, folk music has been around as long as there has been music. Its linkage to the economic traditions of cultural heritage is just as old; however, the rapid advance of technology, governmental support for the arts and public ideological movements within the 20th century have strongly affected our relationship to our “own” music. Starting in the mid-19th century, a great folk revival commenced, which, through the expending of much effort, concluded with the collecting and publishing of folk songs stretching back to the 17th century (Hyde). It wasn’t until the 20th century that contemporary styles of folk music were taken very seriously. Even though minstrel shows, vaudeville and medicine shows illustrated that the common man would pay to be entertained with contemporary folk music, their elements of racism and condescension presented a distortion of the mainly black music that was performed. The artists were often white men in blackface; however, many black artists were able to start their careers through the medicine shows, which were similar to circuses (Lornell 32). Though the traveling show was economically successful, the rise of jazz and blues proved to be a better musical model for economic health and cultural integration.

The Great Migration after World War I brought many black traditions into the North, starting with a form of music called the blues (PBS). Its similarity to ragtime, its easy syncopation, and its verbal illustration of hard times allowed its audience to sympathize with, identify with and enjoy the music and culture being offered through it. Many blues groups chose royal names in this period of economic musical success to point out their self-importance, betraying the commercial nature of their music, but also their pride in being representatives of the black race in such racist times (Lornell 35-36). After the advent of fascism and Nazism in the early Thirties, interest in folk music grew accordingly, especially in left-wing circles. The support of the Communist Party for democratic New Deal programs to counter the rise of fascism and Nazism as opposing political forces resulted in a change in outlook for many people, and a subsequent gain of acceptance and respectability in American life. The move away from the ideals of the proletariat toward a collective security founded on strengthening the “homeland” revitalized interest in native American traditions, as opposed to more ethnic traditions. Many folk music groups became acutely interested in American forms (country, Western, as well as the lilting bleakness of the Delta blues) and started to perform in the workers’ chorus style, which had its roots and influences in prison labor songs and union calls (Reuss 116). These groups earned their living by playing at the numerous rallies held around the nation, as well as on radio shows and at outdoor festivals. The “Lomax” singers, who performed in the left-wing tradition of the urban milieu, included such famous folk artists as Pete Seeger, Leadbelly, Jelly Roll Morton, Burl Ives, and the Golden Gate Quartet (Australian). However, the combined effect of political suppression, limited interest in such politically charged music and the Depression meant that folk artists were often poor.

After WWII, the disillusionment and depression of having been involved in violence of such magnitude found expression in an extroverted interest in the trappings of a material culture that could easily slip away. Few people sought solace in folk music, gravitating instead to popular styles such as early rock and roll. The growth and popularity of black gospel music differed in that it had a strong religiosity to its message that black people took to heart, even with the lushness characteristic of music during this time (Lornell 223). It wasn’t until the stunning release of Harry Smith’s “Anthology of American Folk Music” in 1952 that the nation would become introverted and self-studying once more (Good Music Guide). A major folk revival started at this time, with the founding of the Old Town School of Folk Music in Chicago to develop and support up-and-coming folk artists, such as members of the Byrds and Steve Goodman (Moore). The discovery of folk music by rock and roll artists through the “Anthology” created a whole new kind of artist interested in singing about the world around them, with a whole new cross-section of the population as potential supporters of the music.

Interestingly enough, folk music began to be commercially oriented once more as a result of the “Anthology”. Harry Smith had chosen the songs on the recordings in a purposeful manner: he stuck to songs that had been recorded and sold for profit (Good Music Guide). The usage of this anthology to build up the repertoires of the fusion artists started a trend of key songs being used as appropriated covers by most of the artists, with Bob Dylan the most prolific borrower. This precedent was repeated in the later usage of other collected songs as thinly disguised versions of the originals. Fans readily accepted the new music and celebrated the political tensions inherent in the lyrics. The growing popularity of the hippie movement and rebel culture increased the sense of alienation from the rest of the country during this period of unnecessary war, sexual freedom, racial issues and the modernist/traditionalist clash; the folk music that criticized and lamented the situation of the new generation became naturally popular, as it ensured remembrance for these people as well as sympathy with their problems during their lifetime. Woodstock ’69 celebrated some of the most popular folk artists, such as Joan Baez, the Band, Joe Cocker and Crosby, Stills & Nash, as well as folk/rock and roll fusion artists Jerry Garcia & the Grateful Dead, Jefferson Airplane, and Janis Joplin (Barnes). Other folk artists such as Tom Lehrer, Peter, Paul & Mary and Simon and Garfunkel started their careers during this time (Stambler). The extensive support that these artists received from fans who bought their records, attended their concerts and frequently followed them around on tour gave the folk artist something of a super-stardom that wouldn’t have been possible in an earlier age, thanks to improved technology in musical distribution.

In the Seventies, folk artists began to move even farther away from the traditional folk music paths and chart their own; their lyrics were still cynically or broodingly descriptive of life in America, but their musical styles were more ingrained with popular culture. Bruce Springsteen, Neil Young, Cat Stevens and James Taylor were folk artists who were also soft rockers, a distinction that seemed too thin to mean anything solid (MSN). Joni Mitchell, unlike the rest, retained a more traditional feel in her music, though her topics were radically different from anything that had come before (Stambler). The line between popular music and folk music was blurred once more; folk styles had developed well beyond their grassroots status and were now functioning as a manifestation of popular culture. Consequently, a psychological distinction began to develop between those making good money as a tool of popular music and those who remained unknown. The support of a folk artist by a recording corporation seemed to imply that the artist was a sellout; with the passing of the seriously politicized atmosphere of the previous two decades, the importance of the message these folk/popular artists were projecting decreased. As it lost its profitability and identification with public ideology, folk music began to recede once more into the world of the bohemian.

These days, the general public takes folk music for granted. Folk music has returned to being an underground art form, in part because of capitalism and a vicious cycle in the economics of the music business. Courtney Love, in a speech to the Digital Hollywood online entertainment conference, commented on the supreme position of the profit motive in the music business – more specifically, she discussed how the entertainment companies degrade the quality of music throughout the country by essentially ripping off the artists (Love). Her main point was that capitalistic ideals had replaced musical and cultural ones in the music industry, and that the removal of incentive (when the true value of goods and services is not reimbursed) creates a lack of production in “pure” music, thus allowing more opportunity for “trash” to take its place. Though she presented the case in a mildly denigrating way, “pure” and “trash” music represented folk and pop music respectively. The vicious cycle appears when the national depression in “pure” music translates into lower demand for it, signaling a concurrent lowering of incentive.

Unlike in past decades, folk music has lost most of the active participation that had made it so exciting and sympathetic to public currents. The folk community, however, has stuck together in the face of such desperation in its industry, even taking an absurd pride in remaining folk artists when others are afraid of the label. One writer in the Folk Alliance newsletter commented that “to most popular media writers, folk music was a ‘60’s fad that almost caught on” (Mabus). Some popular artists retain a small devoted following, like the Indigo Girls, Ani DiFranco, Tracy Chapman, Dar Williams, Pete Yorn, John Gorka, Shawn Colvin and Sarah McLachlan. These artists make up the fringe of what is usually identified with folk music in the United States: the singer/songwriter. Their music appeals to crosscurrents in the national diaspora – feminists, avant-gardes in the Village, aging bohemians, and teenagers who look for beauty as a measure of musical skill – though the majority of listeners are ages 35-44 (Neff, “Business”). Sometimes their music dabbles in the mainstream for a little while. It is of note that these artists don’t use the term “folk” to represent themselves; more commonly, they and the media use the terms “indie” and “indie rock” to describe that particular musical community. However, the majority of listeners refrain from listening to these modern folk artists: “the biggest players in folk music make barely a blip on the sonar of popular music” (Mabus).

It is at this point that the shift from active public support of folk music to federal financing of culture becomes apparent. A little over thirty years ago, the government started supporting programs for the arts, including the Public Broadcasting Service (PBS), National Public Radio (NPR) and an organization known as the National Endowment for the Arts (NEA). PBS and NPR, with shows like World Café, air the music and thoughts of folk musicians to the rest of the world, acting as both guide and marketer. Even so, folk music is usually relegated to late-night or off-peak hours when considerably fewer people are viewing or listening. In recognition of the lack of a central body of support for individual folk artists struggling to make a living, the National Foundation on the Arts and the Humanities Act of 1965 established the NEA (NEA). Along with the National Council on the Arts, which advises and makes recommendations to the NEA, the organization provides arts groups with grants and national recognition every year. In effect, it is responsible for supporting the work of folk musicians and artists throughout the United States, in recognition of their cultural importance, when the public provides little or no support. In the Midwest, the establishment of the Western Folklife Center in 1980 and the donation of seed money by the NEA in 1985 for the Cowboy Poetry Gatherings (which include many folk musicians as performers) rejuvenated the dying local support for folk artists in that region (Seemann). Even though the public indirectly supports all of these programs through taxes, many are ambivalent or even negative toward them. Since 1995, funding for the NEA and for the Corporation for Public Broadcasting, which supports NPR and PBS, has been slashed in half by Congress under pressure from conservative Republicans and lobbyists.
A quick comparison with Finland leaves a sense of embarrassment at our disdain for our own culture: in 1998, the US spent $6 per citizen on the arts with a GDP of $8.5 trillion, while Finland spent $91 per citizen on the arts with a GDP of $100 billion (Bunk).
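
The disparity is even starker when spending is normalized by the size of each economy. A minimal sketch of that arithmetic, assuming approximate 1998 populations of roughly 270 million for the US and 5.2 million for Finland (the population figures are my assumptions, not given in the source):

```python
# Per-citizen arts spending, population, and GDP (populations are approximate assumptions).
us_spend_per_citizen, us_population, us_gdp = 6, 270e6, 8.5e12
fi_spend_per_citizen, fi_population, fi_gdp = 91, 5.2e6, 100e9

# Total national arts spending expressed as a share of GDP.
us_share = us_spend_per_citizen * us_population / us_gdp
fi_share = fi_spend_per_citizen * fi_population / fi_gdp

print(f"US:      {us_share:.4%} of GDP")   # about 0.019% of GDP
print(f"Finland: {fi_share:.4%} of GDP")   # about 0.47% of GDP
print(f"Relative to GDP, Finland outspends the US roughly {fi_share / us_share:.0f}-fold")
```

On these rough assumptions, Finland's commitment relative to its economy works out to roughly twenty-five times that of the US.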

The spread of mass media has also played a large part in current attitudes toward folk music. The computer revolution has made it possible for anyone to become a creator of their own brand of folk music without knowing how to play a note – music editing, theory and performance can all be handled through software, and musical intercommunication and interaction are as varied, multidirectional and meaningful as the user would like (Spiegel). The folk artist is thus no longer as special when anyone can design their own versions, and demand is consequently lower. Distribution has also changed. In 1998, telephone companies began sending more digital data than analog voice information (Pfeffer). Much of this data is in the form of music and video files; at colleges nationwide, the majority of bandwidth is taken up by file sharing. When music can travel as “perfect copies” independent of physical objects, the relationship of the artist to the consumer is degraded (Spiegel). A lot of the mystique surrounding the folk singer stemmed from the closeness of the listener to the expression of the “truth” during live performances; however, packaged music can be exposed to much wider audiences by digital and online distribution.

Though it is agreed that folk music exists on a smaller scale in today’s society, it still retains quite a presence. Folk music sales totaled more than 12 billion dollars in 1994, according to the RIAA (“Study”). Its success in comparison to the rest of the music industry is hard to gauge, because the radio play and large-scale touring enjoyed by popular music differ wildly from the methods of distribution in folk music. Folk artists often support themselves through touring, as the business as a whole does not make much money. One modern commentator, Alan Rowoth, observed that “the genre isn’t about the money because it’s not making much, so we immediately eliminate a whole breed of sharks” (Neff, “Business II”). John Gorka, a famous modern singer, averages 130-150 gigs a year before audiences of 200-400 people, at venues ranging from coffeehouses to music festivals (Morris). Music festivals are among the most influential concert locations, as exposure at a festival links artists to a possible fan base and to a region, and also builds the sense of community within the folk music world (Neff, “Business II”). While the number of folk artists in the US is unknown, one folk music magazine has a distribution of 13,000 people, 98% of whom describe themselves as musicians (Sing Out!). These musicians often start with small fan bases in local communities and work to build their influence; they are helped by national folk musician associations that attempt to raise public awareness of folk music and of individual artists.

While mainly traditional folk music has been discussed, other styles of music fall under the category of folk music. Rap and hip hop, two of the most influential styles in popular music today, resulted from a creolization of folk and popular styles – a combination of the Jamaican “toast” and folk percussion in black music. Economically and culturally, rap has been wildly successful. Its use of syncopation, the protest and insult songs of black folk culture, call and response, and sampling have heightened the cultural heritage present in this lush and rapidly growing style of music (Lornell 232). Sampling involves taking a short selection from another piece of music and inserting it into one’s own creation. In traditional folk, samples are usually taken from older folk songs (such as the Counting Crows and Vanessa Carlton remake of Joni Mitchell’s “Big Yellow Taxi”); in rap and hip hop, samples are taken from funk, soul, blues and other black folk music predecessors (such as Sugar Hill Gang’s “Rapper’s Delight”, built on “Good Times” by Chic) (Psycho Jello). The hip hop phenomenon has brought in remixing as a method of sampling and creating new music. This usage of technology to add new elements to existing music has proved immensely popular and breathes new life into older songs, making them at once culturally respectable again and immensely profitable.
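
In digital terms, sampling amounts to lifting a contiguous stretch of audio data out of one recording and splicing it into another. A minimal sketch of the idea, using plain Python lists of invented numbers as stand-ins for audio sample values (the track contents are purely illustrative):

```python
def take_sample(source, start, length):
    """Lift a contiguous excerpt from a source track."""
    return source[start:start + length]

def splice(track, sample, position):
    """Insert the excerpt into another track at the given position."""
    return track[:position] + sample + track[position:]

# Hypothetical "audio" represented as numeric sample values.
source_groove = [0.1, 0.4, 0.9, 0.4, 0.1, -0.2, -0.5]  # stand-in for the sampled record
new_track = [0.0, 0.0, 0.0, 0.0]

hook = take_sample(source_groove, 2, 3)  # -> [0.9, 0.4, 0.1]
remix = splice(new_track, hook, 2)       # -> [0.0, 0.0, 0.9, 0.4, 0.1, 0.0, 0.0]
```

Real sampling works on waveforms rather than toy lists, of course, but the operation is the same: an excerpt is extracted and recontextualized inside a new composition.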

While folk music’s position as cultural phenomenon and representative of a generation is continuous and long-lasting, the public understanding of its present cultural heritage is directly related to the economic status of folk music. As a politically and emotionally sensitive musical style, the relationship of the musician to history and current events is that of a mirror to a person, directly reflecting the frustrations and ambitions of each age – labor disputes, the Vietnam War, drugs on the streets and sexual freedoms. Public involvement with its own identity is low during times of less contention; concurrently, folk music as an industry suffers, as it does in current times. The expansion of mass media, the loss of personal responsibility for folk music and the growing importance of capitalistic ideals to society have also lowered the importance once placed on folk music as cultural representation. Folk music is no longer needed so much as a tool of expression, because the message is no longer needed. However, folk music will always exist because it is the music of the people; the factors that contribute to lowering its importance can also contribute to greater musical exposure and spread. Our cultural heritage will live on.

Discussions on Colonialism

In an overall sense, the creation and extension of empire has served to aid the growth of national economic power. Though the rationalization offered by colonizing nations has usually been of a paternalistic nature, there is little doubt that the economic returns of such a large-scale investment have been the major source, if not the sole reason, for such a militaristic approach to power relations. This selfish reasoning is hard to dispel, especially among the occupied, as a mass removal of individual liberties and independence can hardly be understood otherwise. Why should a land be occupied by foreigners if not for their gain? This economic imperative cannot co-exist comfortably with the Social Darwinist argument of reforming the ‘uncivilized.’ The moral conscience may ask how long the ‘uncivilized’ will remain as such, and what hope civilization may bring to their lifestyle. These issues go unstated: for the colonizers, any doubt of their own ideology would be a fatal blow to their claim to rightness and legitimate public representation. Rather, the issue they must confront is how exactly to achieve both ends without disrupting their assertions of power. This is challenging in a democracy, as institutions for popular support must somehow validate or compromise ideals that are not necessarily democratic in order to run the business of empire.

For practicality’s sake, the imperialists must justify the use of large armies abroad, with the associated monetary and social expense, by assessing economic and political gain (e.g. resources, markets for excess home goods, opportunities for gaining ascendancy in the European balance of power). On the other hand, they must appear paternalistic; that is, there must be a semblance of “benevolent assimilation,” the apologetic term used by President William McKinley during the 1898 American annexation of the Philippines. This is not only for the benefit of a home audience, but also for a colonial audience which may balk at the absence of equality, justice and fair play. The consequences of failing to do so may be dire, a situation brought about more by the unintended consequences of empire-building than by an administrative failure to live up to its promises. Blowback comes in many forms: military recruiting may sap men from trade in the home country; forceful economic restructuring may cripple perfectly adapted native economies; political repression in one area may snowball into an empire-wide resistance; and colonial alliances may destabilize delicate political balances between the local elite and their mass constituencies. All of these have economic and political consequences that may make the project of empire unprofitable and unpalatable after the fact; overextension, to any military strategist, is the first step to destruction.

The driving force for empire may come from enthusiastic individuals, but permission is supposed to be handed down through a process less authoritarian than a colonial regime. All the countries supporting colonial objectives in the past practiced some degree of democracy – Britain, Germany, France and even modern-day America – requiring some modicum of public approval before major decisions such as war or colonization could be taken. Unlike in the lands of the colonized, public approval at home could be easily manipulated by the airing of politically and socially respected voices in the press. These commentators can generally be placed into the camp of either the imperialist or the anti-imperialist. The major impetus behind imperialist propaganda naturally came from the government, which initiated most foreign policy in the balance-of-power game of the 19th century, and whose planners would eventually have to do the brunt of the work of empire. In the British Empire, citizens were cleverly handled by their beloved Queen Victoria, who declared it the mission of the Empire “to protect the poor natives and to advance civilization.” Civilization meant not only justice, but the benefits of “a better government; more complete security of property; a more permanent… tenure of land,” according to the 19th-century thinker John Stuart Mill. The argument of the Liberals proposed that, lest anyone forget, empire meant not subjugation but a prelude to greater independence and ability to function – at some unspecified point in the future. With the political specter of slavery gone through its de-legalization, the immediate moral problems of imperial domination could more easily be forgotten. In its place was the humanitarian glory of the New Imperialism, enshrined in the economic and political hegemony that world power brought with it.
The fact that colonies had come into British possession rather easily, without much social investment, favored the imperialists in public opinion.

The anti-imperialist opposition found expression in many forms, both aggressive and non-aggressive. Possibly the most effective campaigns against empire worked through the force of words on public opinion. Public opinion would have an indirect effect on the checkbook – a very persuasive reason to change political direction. Colonial administrators could not ignore anti-imperialist arguments if they were being broadcast far and wide. During the brutalities inflicted by Belgium on the Congo in the late 19th century, the British journalist Edmond Morel ran a humanitarian crusade that echoed arguments used in the anti-slavery and other anti-imperial movements. With newspapers, books and public speeches, Morel and like-minded social reformers informed the Belgian public and the world that King Leopold was maintaining an abusive relationship with the colonized. Similarly, the British reformers Emily Hobhouse and John Hobson decried British atrocities in South Africa. This kind of social reform was similar in tone to, yet different from, reform movements coming from within colonized countries – an ‘outside’ movement as compared to an ‘inside’ movement. During the Indian independence movement of the 1930s – in the last days of the British Empire – Mohandas Gandhi supported a reform of British policy that would return Indian politics to Indian hands. Both Morel and Gandhi succeeded in getting their sentiments into print, allowing much wider audiences to understand the situation. Though colonial authorities would understandably try to curtail such communication, it was almost impossible to prevent word of mouth and foreign sympathizers from spreading similar messages.

Non-aggressive in tone, Gandhi’s movement was nonetheless more radical, in that this rebellion not only threatened the status quo, as Morel’s campaign did, but risked provoking increased brutality toward the colonized population from colonial administrators. The ‘inside’ movements, like Gandhi’s, were possibly more dangerous, as their non-aggression invited foreign sympathy by pointing out the difference between the colonial authorities and the population. In a simple contest for colonial independence, the strategists of empire could be seen visibly abusing the rights of their peoples while the rebels were merely voicing their concerns and nothing more. This positioning required observers to question exactly what rights the colonized should have. If anything, would the colonized not deserve at least freedom of speech? These peoples were ostensibly being civilized, giving them the ability to think independently and possibly vote. By what right, then, could the citizens of Europe be separable from a native population desiring the same status for themselves?

Colonial authorities could deal better with plain aggression. While aggression also reflected a desire for independence, it was not necessarily seen as the mark of a civilized population and could be dismissed as the acts of a barbaric and simple people that had not yet realized its full potential. Moreover, the threat of violence against the colonizers could not possibly invoke sympathy from home populations who would have the power to change the focus of foreign policy; a crime against one countryman could be generalized as an act against the nation, sparking a patriotic backlash against anti-imperialist movements. By rebelling during the Indian Mutiny of 1857-8, the unfortunate Indian soldiers and insurgents rallied British sentiment to the side of Empire and prompted harsher tactics against any independence or anti-imperialist movements. The actions of the mutineers could be seen, on a psychological level, as a fall into primitiveness, distancing any possible comparison with home populations. Aggression could bring about a perception shift, such that the plight of many suffering individuals was consolidated into the ungrateful rumblings of a mass enemy adversarial to all things European rather than merely abused.

The maintenance of empire would necessitate the interplay of many differing arguments that, taken as a whole, would justify the economic and social expense of occupying another country. More often than not, the underlying economic motive clashed with the overtly publicized motive of paternalism, an explanation discredited by resistance movements at home and inside the colonies. In response to either aggressive or non-aggressive movements, the colonial authorities had to deal with the fundamental social inequality between colonized and colonizer; the economic bounty promised by the occupation; the amount and tone of anti-imperial communications that could possibly sway democratic decisions regarding empire; and the psychological consequences of attempting to pass on ‘civilization’ but never attempting to allow its expression.

Some proponents of empire, like King Leopold of Belgium, practiced an entirely exploitative colonial rule that derived maximum benefit with little return to native populations, such that their colonies ended up in extremely degraded states. Other European colonizers, however, took a more democratic approach that accounted for public opinion. While also exploitative, the British occupation of India contained elements of semi-autonomy and social reform for the Indian population. Rather than be solely responsible for the country, the British groomed a bureaucracy in which Indians policed Indians, decreasing the possibility that the British could be solely accused of injustices. As in other empires, the policy of divide-and-rule let the occupier play the good guy while the negative effects of colonialism were passed on to local tensions between neighboring populations. If necessary, the military could crush resistance. Aggressive movements would be much easier to handle than non-aggressive ones, as the latter appealed strongly to sympathizers rather than alienating them.

The very inconsistency of the reasons for empire makes any appeal to common sense or human rights shaky, as more powerful motivating factors will always be around. No appeal for justice will last long if pitted against a selfish movement for resources or power that has the impetus of a large government and strong military behind it. Dealing with these problems of morality and fair governance within an imperial government – and a democracy – is almost hopeless unless there exists enough power in the home democracy to reverse foreign policy actions. Even with world opinion against it, Belgium only relinquished political control over the Congo in 1960, when independence movements were sweeping across Africa. Britain only gave up its Empire after World War II, when it found that it no longer had the power to control it. Both countries had strong movements that were repulsed by the attitudes of empire but had only middling political effectiveness. To me, the kind of democratic power necessary to combat social wrongs within an imperial government does not exist and is unrealistic to expect. Instead, I feel that violence will always be the response of empire toward any movement for change, and that little can be done within an inherently unequal political structure.

‘Colonial culture’ is the general term for the outgrowth of the sociological juxtaposition of colonized and colonizer. The relationship is complex, with an unspoken tradition relating the narrative of the rights each party has, and will have, for the near and indefinite future. Initial violence marks the possessive right of the colonizer to the land of the colonized, and it is this possession which allows some claim of a higher and more universal right than may be attributable to the colonized. The psychological divide between colonized and colonizer delineates an alienation between the two cultures, one subservient and the other assertive. This line may not limit mutual understanding, but it promotes a mutual suspicion that prevents any sense of comfort or trust. These cultural representations can be seen, directly or symbolically, in all the vestiges of empire.

Though the European powers may have attempted to sell colonialism as a philanthropic and civilizing adventure, Aime Cesaire argues in Discourse on Colonialism that this is not the case. On the contrary, he argues, “colonization works to decivilize the colonizer”; that is, it is not his subjects who may be regressing, but the colonizer himself. Cesaire’s understanding of the imperial struggle reveals his belief that the work of Western civilization in the colonies served only to undermine African civilization in order to build a theoretical basis for its benevolent missions. The invention of “whiteness” as opposed to the Negro created a dialectic in which a racially pure Europe was posed against an uncivilized black mass, which existed together only in its blackness, or negritude, to use Cesaire’s terminology. In such a construction, it would become almost a spiritual fulfillment to pass along civilization to a ‘blank slate.’ The entry of Western reformers with such aims generated what A. Adu Boahen describes as a “deep feeling of inferiority as well as the loss of human dignity among Africans.” This inferiority may be deepened by the need of the colonizers to ‘win’ by maintaining the status quo and their position as the more civilized of the two. Tension exists, according to Cesaire, because “it is the colonized man who wants to move forward, and the colonizer who holds things back.”

The fabrication of a state of equality encouraged many of the colonized to engage in active service with colonial governments as go-betweens or so-called representatives of the people. Similar methods of state education provided those in the Indian civil service, or even the French civil service in Africa, with an educational background that could rival that of their white counterparts, though very late in the game. Interestingly enough, British and French education lacked significant mention of colonial enterprises during the 19th century. However, the racial barrier still prompted a latent racism in the colonizers, such that these ‘turncoats’ to African identity came to lose their own identity. In Ferdinand Oyono’s Houseboy, the educated Cameroonian Toundi finds himself alienated from his colonial overlords though he has come much farther than the ordinary African. For his assumptions of equality – which can clearly never hold in the imperial relationship – he is discarded by black and white society alike and finds himself alone, asking “what are we black men who are called French?” By getting ahead of his place in colonial culture, Toundi disrupts the order and balance between the colonized and colonizer, and reveals the social “limits of the possible.” A topic often covered in colonial literature, the social barrier of class and race often prevented any possibility of friendship across the races. Much of this was because a friendship would subvert the understanding of the colonizer as power broker and money man; once the colonized felt themselves on the same level, even as individuals, the spell was broken.

Thereafter, the empire would degrade in its symbolism to a mere collection of average people running a country instead of an institution of high and lofty ideals. Indeed, the acceptance of another race as equal was tantamount to a rejection of one’s own race: in Paul Scott’s Day of the Scorpion, one character surmises that “you English all felt that she didn’t want you, want any of you, and of course among exiles that is a serious breach of faith. It amounts to treachery, really.” Similarly, by withdrawing accusations against an Indian, Miss Quested of E.M. Forster’s A Passage to India had “renounced her own people.”

Though accepted as the order of things, temporary or not, the inequality between the races promoted a mutual suspicion that was accounted for by a general silence in discourse. Forster alludes to it in describing “suspicion in the Oriental” as a “sort of… mental malady, that makes him self-conscious and unfriendly suddenly.” In this context, the colonized exudes suspicion of the good intentions of the white man, extending to individual promises but also to the general support of their rights and justice under the imperial system. As a mainly defensive political structure, colonialism and its administrators must naturally be suspicious of their populace as well. This wariness of divided loyalties and treasonous activity was to be expected in such an oppositional situation, and the sentiment extended to civilians, who would be in equal danger if the colonized were to rebel. The suspicion spilled over into education, where revisionism in the service of consistent racial or political ideologies commonly prevented subjects from acknowledging their history as they would like. To allow freedom of words would be to admit the possibility that the colonizers’ moral positioning could be wrong. Oyono remarks that “because you know all their business… they can never forget about it altogether… as far as they are concerned you are the one who has told everybody and they can’t help feeling you are sitting in judgment on them.” To an extent, that is true: given the opportunity, any man would judge.

Open expression of a colonial culture in the terms of the colonized is impossible. The allowance of any tendency approaching an ancient nationalism could be seen as a breeding ground for rebellion, and colonialism would not be very tolerant of that kind of attitude. In fact, the establishment of an independent culture could be seen on any level as a rebellion, no matter how small or insignificant the feeling. As such, it is a matter of cultural innovation that radical sentiments can be portrayed while masking overt signs of opposition. The strangely subversive act of the colonized ‘knowing’ the actions of the colonizer is a passive mode of revolution. Silent defiance was the mark of all the colonized who suffered injustices, and this attitude became a powerful symbol when re-introduced as a non-aggressive movement against imperialism under the helm of such powerful leaders as Mohandas Gandhi. But silent defiance in person did not mean a nullification of written rebellion: literature was a rich source of independent cultural claims, with a legacy stretching from Aime Cesaire, Frantz Fanon and Leopold Senghor in Africa to Subhas Chandra Bose, Jawaharlal Nehru and Gandhi himself in India.

The establishment of a colonial culture meant the parallel growth of a discourse maintaining European hegemony over colonial civilization, and the undercurrent of rebellion in the colonized, who desired their own independence. In Hind Swaraj, Gandhi paints this active or passive rebellion as an obligation, as “what we want to do should be done, not because we object to the English [or any other colonizer] or because we want to retaliate but because it is our duty to do so.” The dialectic nature of colonial society promised a lasting psychological divide, demarcated by latent inequalities, strict limitations and a lasting suspicion between the colonized and the colonizer. The sum total of these positions developed into a delicate balance called ‘colonial culture.’