
How common was it for Americans to visit Europe in the late 19th century?


According to this article, in 2009, around 13 million people from the United States travelled overseas, of which 35% visited Europe. Given the United States' population of 320 million, we can estimate that currently, around 1.4% of Americans visit Europe annually.
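As a back-of-the-envelope check on that estimate (my arithmetic, using only the figures just quoted):

$$ \frac{13\ \text{million} \times 0.35}{320\ \text{million}} = \frac{4.55\ \text{million}}{320\ \text{million}} \approx 1.4\% $$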

What would that percentage have been in the mid- to late 19th century? All transatlantic travel back then was by ship, and likely considerably more expensive, so presumably, the fraction of Americans able to afford a trip to the Old World was much smaller than it is today. However, out of those who could afford to travel overseas today, only a relatively small fraction actually do, so that does not necessarily mean much.

Note: In the period that I am asking about, millions of people immigrated to the U.S. from Europe, and had thus "visited" Europe before they ever reached American soil. Those people should not count towards the total unless they returned to Europe afterwards, with the intent to return to America later. A visit is a trip with the intention of returning.


The percentage of Americans traveling overseas doubled between 1860 and 1900, but overseas tourism was still very rare at the end of the century (only 0.16% of the population per annum). Americans in 2009 were around 10 times as likely to visit Europe as were Americans in 1900.
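For reference, the ratio implied by the two figures above (my arithmetic, so treat the rounding as approximate) is

$$ \frac{1.4\%}{0.16\%} \approx 8.75 $$

so "around 10 times" is the right order of magnitude, though the precise ratio is closer to 9.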

The Historical Statistics of the United States records how many Americans were "Ocean-Bound Tourists" each year from 1820 to the present. H.W. Brands asserts that for the late 19th century, most of these tourists were headed to Europe (American Colossus, 608). The percentage of Americans who traveled on the ocean in a given year is therefore a decent estimate of the percentage of Americans who visited Europe.



More Italians have migrated to the United States than any other Europeans. Poverty, overpopulation, and natural disaster all spurred Italian emigration. Beginning in the 1870s, Italian birthrates rose and death rates fell. Population pressure became severe, especially in Il Mezzogiorno, the southern and poorest provinces of Italy. As late as 1900, the illiteracy rate in southern Italy was 70 percent, ten times the rate in England, France, or Germany. The Italian government was dominated by northerners, and southerners were hurt by high taxes and high protective tariffs on northern industrial goods. Southerners also suffered from a scarcity of cultivatable land, soil erosion and deforestation, and a lack of coal and iron ore needed by industry.

Unlike the Irish Catholics, southern Italians suffered from exploitation by people of the same nationality and religion. Rather than leading to group solidarity, this situation led to a reliance on family, kin, and village ties. Life in the South revolved around la famiglia (the family) and l'ordine della famiglia (the rules of family behavior and responsibility).

Natural disasters rocked southern Italy during the early 20th century. Mount Vesuvius erupted and buried a town near Naples; Mount Etna erupted as well. Then, in 1908, an earthquake and tidal wave swept through the Strait of Messina between Sicily and the Italian mainland, killing more than 100,000 people in the city of Messina alone.

Italians had a long history of migrating to foreign countries as a way of coping with poverty and dislocation. During the 19th century, more Italians migrated to South America than to North America. The earliest Italian immigrants to the United States were northern Italians, who became prominent as fruit merchants in New York and wine growers in California. Later, more and more migrants came from the south, and the communities and institutions they formed reflected the region's fragmentation. Italian immigrants established hundreds of mutual aid societies, based on kinship and place of birth.

Many Italian immigrants never planned to stay in the United States permanently. The proportion returning to Italy varied between 11 percent and 73 percent. Unlike most earlier immigrants to America, they did not want to farm, which implied a permanence that did not figure in their plans. Instead, they headed for cities, where labor was needed and wages were relatively high. Expecting their stay in America to be brief, Italian immigrants lived as inexpensively as possible under conditions that native-born families considered intolerable.

Italian immigrants were particularly likely to take heavy construction jobs. About half of all late 19th century Italian immigrants were manual laborers, compared to a third of their Irish and a seventh of their German counterparts. Contracted out by a professional labor broker known as a padrone, Italians dug tunnels, laid railroad tracks, constructed bridges and roads, and erected the first skyscrapers. As early as 1890, 90 percent of New York City's public works employees and 99 percent of Chicago's street workers were Italian. Many Italian immigrant women worked, but almost never as domestic servants. Many took piece work into their homes as a way of reconciling the conflicting needs to earn money and maintain a strong family life.

For Italians, like other immigrant groups, politics, entertainment, sports, crime, and especially small business served as ladders for upward mobility. Italian American politicians, however, were hindered by a lack of ethnic cohesiveness. Italian Americans achieved notable success in both classical and popular music. Italian Americans were particularly successful in areas that did not require extensive formal education such as sales and small business ownership. They tended to be under-represented in professional occupations requiring extensive education.

For many Italian immigrants, migration to the United States could not be interpreted as a rejection of Italy. In reality, it was a defense of the Italian way of life, for the money sent home helped to preserve the traditional order. Rather than seeking permanent homes, they desired an opportunity to work for a living, hoping to save enough money to return to a better life in the country of their birth.

Historians use the phrase "birds of passage" to describe immigrants who never intended to make the United States their permanent home. Unable to earn a livelihood in their home countries, they were migratory laborers. Most were young men in their teens and twenties who planned to work, save money, and return home. They left behind their parents, young wives, and children, indications that their absence would not be long. Before 1900, an estimated 78 percent of Italian immigrants were men. Many of them traveled to America in the early spring, worked until late fall, and then returned to the warmer climates of their southern European homes for the winter. Overall, 20 to 30 percent of Italian immigrants returned to Italy permanently.

The same forces of population pressure, unemployment, and the breakdown of agrarian societies sent Chinese, French Canadians, Greeks, Japanese, Mexicans, and Slavs to the United States. Yet while these migrants tended to view themselves as "sojourners," as temporary migrants, most would stay in the United States permanently.


Late 19th & Early 20th Century Revival Period 1880 - 1940

The Late 19th Century and Early 20th Century Revival period is sometimes described as the Eclectic Movement in American architecture. The building designs of this era were intended to be more exact versions of earlier architectural styles and traditions. In the preceding architectural periods, elements of various European-inspired styles had been combined and arranged to create new styles such as the Gothic Revival, Italianate, or Second Empire styles. In the Late 19th Century Eclectic or Revival period, there was a desire to create buildings that were more closely modeled after the original forms that inspired them. Most significantly, for the first time the old buildings of early America were included as inspiration for architectural style. Interest in American history and a sense of pride in the country's heritage were spurred by the nation's one hundredth birthday, celebrated at the Philadelphia Centennial of 1876. This focus on American tradition continued at the Chicago Columbian Exposition of 1893.

The two most prevalent styles of this period were the Colonial Revival and the Classical Revival, which were inspired by early American buildings of the Georgian, Federal, Greek Revival, or Roman Revival styles. Of course, those earlier styles had themselves been designed to incorporate stylistic elements of ancient Greece and Rome, so many of the same architectural details are common to all. The larger size and scale, and the arrangement of details, set the buildings of the later Colonial Revival and Classical Revival apart. The Spanish Revival style, and to some extent the Tudor Revival style, also looked back to the buildings of America's colonial period. The Collegiate Gothic style was developed from the earlier Gothic Revival style and the original Gothic buildings of Europe. The Beaux Arts style and the Italian Renaissance Revival style were both based on historic European design. This period of architecture was the last to focus on the recreation of past forms; in the architectural periods to come, the desire to make a new architectural statement took precedence.


Jewish Immigration to Pre-State Israel



One of the fundamental changes in Jewish life in the period under review [the 19th century] was the enormous movement, mainly from Eastern to Western Europe and overseas, and above all to the United States of America. This migration was the consequence of demographic, economic, and political developments. The high rate of natural increase created population surpluses that could not be absorbed in the traditional Jewish occupations. Capitalist development, which commenced at a rapid pace in Russia after the liberation of the serfs in 1861 and also reached Galicia and Austria at about the same time, opened up new sources of livelihood for a small number of Jews, but caused deprivation to greater numbers, as it had eradicated many of the traditional occupations.

This development was exacerbated by the expulsion of the Jews from the villages and their eviction from occupations connected with the rural economy. Many Jews became artisans and there was fierce competition among them, while others became day-labourers and, in fact, remained without livelihood. These two groups, the artisans and the hired labourers, provided the main candidates for emigration. Under the backward conditions of Galicia, the increase in sources of livelihood could not catch up with the growth of the Jewish population, particularly when the Poles began to organize rural cooperatives and other economic institutions in order to exclude the Jews from economic life. In Rumania, the government and population conducted an economic war on the Jews, the declared aim of which was to drive them out of the country, while in Russia, oppression and harsh decrees were the official method of "solving the Jewish problem."

Persecution was no less effective a factor than the economic causes. The great wave of Jewish migration commenced with the flight from pogroms. In 1881, thousands of Jews fled the towns of the Pale of Settlement in Russia and concentrated in the Austrian border town of Brody, in overcrowded conditions and deprivation. With the aid of Jewish communities and organizations, some of these refugees were sent to the United States, while the majority were returned to their homes. Jewish organizations to a large extent later lost control over migration, and it became based on individual initiative, as family members who had established themselves in the New World brought over their relatives. A factor of considerable importance in encouraging emigration, even after the first panic of the pogroms had died down, was the disillusionment of the Jews of Russia and Rumania with the hope of obtaining legal equality or at least ameliorating their condition. This emigration movement was largely a "flight to emancipation."

The effect of political discrimination on migration is attested to by the increase in the number of emigrants after each new wave of pogroms. Migration from Russia increased greatly after the expulsion from Moscow in 1891 (in 1891 some 111,000 Jews entered the United States, and in 1892, 137,000, as against 50,000–60,000 in previous years). In the worst pogrom year, from mid-1905 to mid-1906, more than 200,000 Jews emigrated from Russia (154,000 to the United States, 13,500 to Argentina, 7,000 to Canada, 3,500 to Palestine, and the remainder to South America and several West and Central European countries). Between 1881 and 1914 some 350,000 Jews left Galicia.

Members of other nationalities, particularly from Southern and Eastern Europe, also emigrated in large numbers in this period to the United States and other overseas countries, but Jewish migration was different, both in dimension and in nature. From 1881 to 1914, more than 2.5 million Jews migrated from Eastern Europe, i.e. some 80,000 each year. Of these, some two million reached the United States, some 300,000 went to other overseas countries (including Palestine), while approximately 350,000 chose Western Europe. In the first 15 years of the twentieth century, until the outbreak of the First World War, an average of 17.3 per 1,000 Jews emigrated from Russia each year, 19.6 from Rumania, and 9.6 from Galicia; this rate was several times higher than the average for the non-Jewish population.

The characteristic feature of Jewish migration was the migration of whole families. The percentage of children among Jewish immigrants to the United States was double the average, a fact which demonstrated that the uprooting was permanent. And in fact, in the last few years before the First World War, only 5.75 percent of Jewish immigrants returned to their countries of origin, while among other immigrants about one-third went back. Nearly half of the Jewish immigrants had no defined occupation, i.e., no permanent source of livelihood, as against some 25 percent of the other immigrants; but of the other half, about two-thirds were skilled artisans (mainly tailors), as against only one-fifth of the general immigrant population.

A further distinguishing feature of Jewish migration was that from the outset it displayed clearly ideological tendencies. A considerable number of the younger immigrants, members of the intelligentsia, were motivated not only by the desire to find a new refuge or a place in which there were greater chances of success. Their departure constituted a protest against the discrimination and injustices they had suffered in their old homes and reflected their ardent desire for a place in which they could live independent and free lives.

From the beginning, controversy existed between the "Palestinians" (Hovevei Zion, Lovers of Zion), who believed that independent existence of the people was only possible in their ancient homeland, and the "Americans" (above all the Am Olam group), who hoped to establish a Jewish state as one of the states of the Union to serve as the background for an autonomous, territorial, national experience, or who claimed that the "Land of Freedom" was the most suited to the free development of the Jews, even without an autonomous framework. It was not the ideological argument but the conditions of absorption that determined the direction of migration for the great majority of those forced to flee their countries of residence.


Early Research and Treatment of Tuberculosis in the 19th Century

The American Lung Association is dedicated to the cure and control of all lung diseases, but its formation in 1904 was in response to only one: tuberculosis. During the nineteenth and early twentieth centuries, tuberculosis (TB) was the leading cause of death in the United States, and one of the most feared diseases in the world.

Formerly called “consumption,” tuberculosis is characterized externally by fatigue, night sweats, and a general “wasting away” of the victim. Typically but not exclusively a disease of the lungs, TB is also marked by a persistent coughing-up of thick white phlegm, sometimes blood.

There was no reliable treatment for tuberculosis. Some physicians prescribed bleedings and purgings, but most often, doctors simply advised their patients to rest, eat well, and exercise outdoors.[1] Very few recovered. Those who survived their first bout with the disease were haunted by severe recurrences that destroyed any hope for an active life.

Kentucky TB Association Ad, ca. 1945

It was estimated that, at the turn of the century, 450 Americans died of tuberculosis every day, most between ages 15 and 44.[2] The disease was so common and so terrible that it was often equated with death itself.

Tuberculosis was primarily a disease of the city, where crowded and often filthy living conditions provided an ideal environment for the spread of the disease. The urban poor represented the vast majority of TB victims.

Villemin, Koch & Contagion

Jean-Antoine Villemin (1827-1892)

Science took its first real step toward the control of tuberculosis in 1868, when Frenchman Jean-Antoine Villemin proved that TB was in fact contagious. Before Villemin, many scientists believed that tuberculosis was hereditary. In fact, some stubbornly held on to this belief even after Villemin published his results.[3]

In 1882, German microbiologist Robert Koch converted most of the remaining skeptics when he isolated the causative agent of the disease, a rod-shaped bacterium now called Mycobacterium tuberculosis, or simply, the tubercle bacillus.

The work of Villemin and Koch did not immediately lead to a cure, but their discoveries helped revolutionize the popular view of the disease. They had demonstrated that the tubercle bacillus was present in the victim’s sputum. A single cough or sneeze might contain hundreds of bacilli. The message seemed clear: stay away from people with tuberculosis.

This new rule of behavior was sensible, but it made the tubercular invalid an “untouchable,” a complete outcast. Many lost their jobs because of the panic they created among co-workers. Many landlords refused to house them. Hotel proprietors, forced to consider the safety of other guests, turned them away.[4] Rejected by society, tuberculosis victims gathered in secluded tuberculosis hospitals to die.

Trudeau & the Sanatorium

Edward Livingston Trudeau (1848-1915)

Dr. Edward Livingston Trudeau (1848-1915) was the first American to promote isolation as a means not only to spare the healthy, but to heal the sick. Trudeau believed that a period of rest and moderate exercise in the cool, fresh air of the mountains was a cure for tuberculosis. In 1885, he opened the Adirondack Cottage Sanatorium (often called “the Little Red Cottage”) at Saranac Lake, New York, the first rest home for tuberculosis patients in the United States.

Dr. Trudeau’s sanatorium plan was based on personal experience. When he was nineteen, Trudeau watched his older brother die of TB, an experience that convinced him to become a physician. In 1872, just a year after leaving medical school, he, too, contracted tuberculosis. Faced with what he believed to be a sure and speedy death, Trudeau left his medical practice in New York City and set off for his favorite resort in the Adirondacks to die.[5] There, instead of wasting away, he steadily regained his strength, due entirely, he believed, to healthy diet and outdoor exercise. Experiments on tubercular rabbits in his lab at the cottage seemed to verify his belief. In February of 1885, Trudeau welcomed the first group of hopeful patients to his sanatorium in the woods.

Child Memorial Infirmary with open-air porches for tuberculosis patients at Adirondack Cottage Sanatorium, Saranac Lake, N.Y. Library of Congress.

Trudeau required his guests to follow a strict regimen of diet and exercise. They were given three meals every day, and a glass of milk every four hours. Trudeau and his staff encouraged their patients to spend as much time as possible outdoors. At first, this meant extended periods of sitting on the sanatorium veranda (the open-air porch was a standard feature of Trudeau-style sanatoriums). Gradually, patients spent more time walking than sitting, until they were able to spend 8 to 10 hours per day exercising outdoors, regardless of weather.[6] Trudeau made his rest home available to the poor by setting a very low rent and providing free medical service. By 1900, what started as a single red cottage was a small village, a 22-building complex that included a library, a chapel, and an infirmary.




Food is much more than a mere means of subsistence. It is filled with cultural, psychological, emotional, and even religious significance. It defines shared identities and embodies religious and group traditions. In Europe in the 17th and 18th centuries, food served as a class marker. A distinctive court tradition of haute cuisine and elaborate table manners arose, distinguishing the social elite from the hoi polloi. During the 19th century, food became a defining symbol of national identity. It is a remarkable fact that many dishes that we associate with particular countries--such as the tomato-based Italian spaghetti sauce or the American hamburger--are 19th or even 20th century inventions.

The European discovery of the New World represented a momentous turning point in the history of food. Foods previously unknown in Europe and Africa, such as tomatoes, potatoes, corn, yams, cassava, manioc, and a vast variety of beans migrated eastward, while other sources of food, unknown in the Americas--including pigs, sheep, and cattle--moved westward. Sugar, coffee, and chocolate grown in the New World became the basis for the world's first truly multinational consumer-oriented industries.

Until the late 19th century, the history of food in America was a story of fairly distinct regional traditions that stemmed largely from England. The country's earliest English, Scottish, and Irish Protestant migrants tended to cling strongly to older food traditions. Yet the presence of new ingredients, and especially contact among diverse ethnic groups, would eventually encourage experimentation and innovation. Nevertheless, for more than two centuries, English food traditions dominated American cuisine.

Before the Civil War, there were four major food traditions in the United States, each with English roots. These included a New England tradition that associated plain cooking with religious piety. Hostile toward fancy or highly seasoned foods, which they regarded as a form of sensual indulgence, New Englanders adopted an austere diet stressing boiled and baked meats, boiled vegetables, and baked breads and pies. A Southern tradition, with its high seasonings and emphasis on frying and simmering, was an amalgam of African, English, French, Spanish, and Indian foodways. In the middle Atlantic areas influenced by Quakerism, the diet tended to be plain and simple and emphasized boiling, including boiled puddings and dumplings. In frontier areas of the backcountry, the diet included many ingredients that English settlers elsewhere used as animal feed, including potatoes, corn, and various greens. The backcountry diet stressed griddle cakes, grits, greens, and pork.

One unique feature of the American diet from an early period was the abundance of meat--and distilled liquor. Abundant and fertile lands allowed settlers to raise corn and feed it to livestock as fodder, and convert much of the rest into whiskey. By the early nineteenth century, adult men were drinking more than 7 gallons of pure alcohol a year.

One of the first major forces for dietary change came from German immigrants, whose distinctive emphasis on beer, marinated meats, sour flavors, wursts, and pastries was gradually assimilated into the mainstream American diet in the form of barbecue, cole slaw, hot dogs, donuts, and hamburgers. The German association of food with celebrations also encouraged other Americans to make meals the centerpiece of holiday festivities.

An even greater engine of change came from industrialization. Beginning in the late nineteenth century, food began to be mass produced, mass marketed, and standardized. Factories processed, preserved, canned, and packaged a wide variety of foods. Processed cereals, which were originally promoted as one of the first health foods, quickly became a defining feature of the American breakfast. During the 1920s, a new industrial technique--freezing--emerged, as did some of the earliest cafeterias and chains of lunch counters and fast food establishments. Increasingly processed and nationally distributed foods began to dominate the nation's diet. Nevertheless, distinct regional and ethnic cuisines persisted.

During the early twentieth century, food became a major cultural battleground. The influx of large numbers of immigrants from Southern and Eastern Europe during the Progressive Era brought new foods to the United States. Settlement house workers, food nutritionists, and domestic scientists tried to "Americanize" immigrant diets and teach immigrant wives and mothers "American" ways of cooking and shopping. Meanwhile, muckraking journalists and reformers raised questions about the health, purity, and wholesomeness of food, leading to the passage of the first federal laws banning unsafe food additives and mandating meat inspection.

During the nineteenth and early twentieth centuries, change in American foodways took place slowly, despite a steady influx of immigrants. Since World War II, and especially since the 1970s, shifts in eating patterns have greatly accelerated. World War II played a key role in making the American diet more cosmopolitan. Overseas service introduced soldiers to a variety of foreign cuisines, while population movements at home exposed Americans to a wider variety of American foodways. The postwar expansion of international trade also made American diets more diverse, making fresh fruits and vegetables available year round.

Today, food tends to play a less distinctive role in defining ethnic or religious identity. Americans, regardless of religion or region, eat bagels, curry, egg rolls, and salsa--and a Thanksgiving turkey. Still, food has become--as it was for European aristocrats--a class marker. For the wealthier segments of the population, dining often involves fine wines and artistically prepared foods made up of expensive ingredients. Expensive dining has been highly subject to fads and shifts in taste. Less likely to eat German or even French cuisine, wealthier Americans have become more likely to dine on foods influenced by Asian or Latin American cooking.

Food also has assumed a heightened political significance. The decision to adopt a vegetarian diet or to eat only natural foods has become a conscious way to express resistance to corporate foods. At the same time, the decision to eat particular foods has become a conscious way to assert one's ethnic identity.


European immigrants to America in early 20th century assimilated successfully, Stanford economist says

In the late 19th and early 20th centuries, an "open borders" United States absorbed millions of European immigrants in one of the largest mass migrations ever. New research by Stanford economist Ran Abramitzky challenges the perception that immigrants lagged behind native-born Americans in job pay and career growth.

European immigrants to America during the country's largest migration wave in the late 19th and early 20th centuries had earnings comparable to native-born Americans, contrary to the popular perception, according to new Stanford research.

"Our paper challenges conventional wisdom and prior research about immigrant assimilation during this period," said Ran Abramitzky, an associate professor of economics at Stanford and author of the research paper in the Journal of Political Economy.

New research challenges conventional wisdom about immigrant assimilation during the bygone era of open borders and mass migration.

Abramitzky and his colleagues found the average immigrant in that period did not face a substantial "earnings penalty" – lower pay than native-born workers – upon their arrival.

"The initial earnings penalty is overstated," said Abramitzky.

He said the conventional view is that average European immigrants held substantially lower-paying jobs than native-born Americans upon first arrival and caught up with natives' earnings after spending some time in the United States. But that perception does not hold up to the facts, he said.

Abramitzky's co-authors include Leah Platt Boustan from the University of California, Los Angeles, and Katherine Eriksson from California Polytechnic State University.

The researchers examined records on 21,000 natives and immigrants from 16 European countries in U.S. Census Bureau data from 1900, 1910, and 1920.

"Even when U.S. borders were open, the average immigrant who ended up settling in the United States over the long term held occupations that commanded pay similar to that of U.S. natives upon first arrival," Abramitzky said.

In that bygone era of "open borders," Abramitzky said, native-born Americans were concerned that immigrants were not assimilating properly into society – yet, on the whole, this concern appears to be unfounded. "Such concerns are echoed in today's debate over immigration policy," he added.

At the same time, Abramitzky said that immigrants from poorer countries started out with lower-paid occupations relative to natives and did not manage to close this gap over time.

"This pattern casts doubt on the conventional view that, in the past, immigrants who arrived with few skills were able to invest in themselves and succeed in the U.S. economy within a single generation," Abramitzky and his colleagues wrote.

Age of migration

America took in more than 30 million immigrants during the Age of Mass Migration (1850-1913), a period when the country had open borders. By 1910, 22 percent of the U.S. labor force – and 38 percent of workers in non-southern cities – were foreign-born (compared with 17 percent today).

As the research showed, immigrants then were more likely than natives to settle in states with a high-paying mix of occupations. Location choice was an important strategy they used to achieve occupational parity with native-born Americans.

"This Age of Mass Migration not only is of interest in itself, as one of the largest migration waves in modern history, but also is informative about the process of immigrant assimilation in a world without migration restrictions," Abramitzky said.

Over time, many of the immigrants came from the poorer regions of southern and eastern Europe.

Abramitzky pointed out that native-born Americans in the late 19th and early 20th centuries were concerned about poverty in immigrant neighborhoods and low levels of education among children, many of whom left school early to work in industry.

Consequently, American political progressives championed a series of reforms, including U.S. child labor laws and compulsory schooling requirements.

Still, some natives believed that new arrivals would never fit into American society. And so, in 1924, Congress set a strict quota of 150,000 immigrant arrivals per year, with more slots allocated to immigrants from northern and western European countries than those from southern and eastern Europe.

But those early-20th-century fears of unassimilated immigrants were baseless, according to Abramitzky.

"Our results indicate that these concerns were unfounded: The average long-term immigrants in this era arrived with skills similar to those of natives and experienced identical rates of occupational upgrading over their life cycle," he wrote.

How does this lesson apply to today's immigration policy discussion? Should the numbers of immigrants and their countries of origin be limited and those with higher skills be given more entry slots?

Abramitzky said stereotyping immigrants has affected the political nature of the contemporary debate.

"These successful outcomes suggest that migration restrictions are not always necessary to ensure strong migrants' performance in the labor market," he said.


Health & Medicine in the 19th Century

In the early Victorian period disease transmission was largely understood as a matter of inherited susceptibility (today's 'genetic' component) and individual intemperance ('lifestyle'), abetted by climate and location, which were deemed productive of noxious exhalations (a version of environmental causation). Water- and air-borne infection was not generally accepted.

Thus the 1848 edition of Buchan's Domestic Medicine, with its coloured frontispiece showing the symptoms of smallpox, scarlet fever and measles, listed among the general causes of illness 'diseased parents', night air, sedentary habits, anger, wet feet and abrupt changes of temperature. The causes of fever included injury, bad air, violent emotion, irregular bowels and extremes of heat and cold. Cholera, shortly to be epidemic in many British cities, was said to be caused by rancid or putrid food, by 'cold fruits' such as cucumbers and melons, and by passionate fear or rage.

Treatments relied heavily on a 'change of air' (to the coast, for example), together with emetic and laxative purgation and bleeding by cup or leech (a traditional remedy only abandoned in mid-century) to clear 'impurities' from the body. A limited range of medication was employed, and the power of prayer was regularly invoked.

Diseases such as pulmonary tuberculosis (often called consumption) were endemic; others, such as cholera, were frighteningly epidemic. In the morbidity statistics, infectious and respiratory causes predominated (the latter owing much to the sulphurous fogs known as pea-soupers). Male death rates were aggravated by occupational injury and toxic substances, those for women by childbirth and violence. Work-related conditions were often specific: young women match-makers suffered 'phossy jaw', an incurable necrosis caused by exposure to phosphorus.

In Britain, epidemiological measuring and mapping of mortality and morbidity was one of the first fruits of the Victorian passion for taxonomy, leading to the clear association of pollution and disease, followed by appropriate environmental health measures. A major breakthrough came during the 1854 cholera outbreak, when Dr John Snow demonstrated that infection was spread not by miasmas but by contaminated water from a public pump in crowded Soho. When the pump handle was removed, cholera subsided. It was then possible for public health officials such as Sir John Simon to push forward projects to provide clean water, separate sewage systems and rubbish removal in urban areas, as well as to legislate for improved housing - one goal being to reduce overcrowding. The number of inhabitants per house in Scotland, for example, fell from 7.6 in 1861 to 4.7 in 1901. Between 1847 and 1900 there were 50 new statutes on housing, ranging from the major Public Health Acts of 1848 and 1872 to the 1866 Lodging Houses and Dwellings (Ireland) Act, the 1885 Housing of the Working Classes Act and the 1888 Local Government Act. On a household basis, the indoor water-closet began to replace the traditional outdoor privy.

Scientific developments in the 19th century had a major impact on understanding health and disease, as experimental research resulted in new knowledge in histology, pathology and microbiology. Few of these advances took place in Britain, where medical practice was rarely linked to scientific work and there was public hostility to the animal vivisection on which many experiments relied. The biochemical understanding of physiology began in Germany in the 1850s, together with significant work on vision and the neuromuscular system, while in France Louis Pasteur laid the foundations of the germ theory of disease based on the identification of micro-bacterial organisms. By the end of the century a new understanding of biology was thus coming into being, ushering in a new emphasis on rigorous hygiene and fresh air, and a long-lasting fear of invisible contagion from the unwashed multitude, toilet seats and shared utensils. British patent applications around 1900 include devices for avoiding infection via the communion chalice and the new-fangled telephone.

Technological developments underpinned this process, from the ophthalmoscope and improved microscopes that revealed micro-organisms, to instruments like the kymograph, used to measure blood pressure and muscular contraction. By mid-century the stethoscope, invented in France in 1817 to aid diagnosis of respiratory and cardiac disorders, had become the symbolic icon of the medical profession. However, the most famous British visual image, Luke Fildes's The Doctor (exhibited at the Royal Academy in 1891), shows a medical man with virtually no 'modern' equipment.

Surgery advanced - or at least increased - owing largely to the invention of anaesthesia in the late 1840s. Significant events include a notable public demonstration of the effects of ether in London in October 1846 and the use of chloroform for the queen's eighth confinement in 1853. Anaesthetics enabled surgeons to perform more sophisticated operations in addition to the traditional amputations. Specialised surgical instruments and techniques followed, for some time with mixed results, as unsterile equipment frequently led to fatal infection.

Antiseptic surgical procedures based on the practical application of Pasteur's laboratory work were developed by Joseph Lister (1827-1912) using carbolic acid (phenol) from 1869 in Edinburgh and in 1877 in London. Aseptic procedures followed, involving sterilisation of whole environments. Successful outcomes, such as Edward VII's appendicitis operation on the eve of his scheduled coronation, helped pave the way for the 20th-century era of heroic surgery.

In 1895, at the end of the era, came Wilhelm Roentgen's discovery of X-rays, and in due course the photo of Roentgen's wife's hand became a potent sign of medical advance through scientific instruments. But overall the 19th century is notable more for systematic monitoring of disease aetiology than for curative treatment.

A growing medical industry

Like other learned professions, medicine grew in size and regulation. In the early Victorian era it was dominated by the gentlemen physicians of the Royal College (founded 1518), with surgeons and apothecaries occupying lower positions. The British Medical Association was established in 1856, and from 1858 the General Medical Council (GMC) controlled entry through central registration. In the same spirit, the profession also resisted the admission of women, who struggled to have their qualifications recognised. Partly in response to population growth, however, numbers rose: for example, from a total of 14,415 physicians and surgeons in England and Wales in 1861 to 22,698 (of whom 212 were female) in 1901. At the turn of the century the GMC register held 35,650 names altogether, including 6,580 in military and imperial service. The number of dentists rose from 1,584 in 1861 to 5,309 (including 140 women) in 1901. A growing proportion of qualified personnel worked in public institutions, and a new hierarchy arose, headed by hospital consultants. This reflected the rise in hospital-based practice, for this was also the era of heroic hospital building in the major cities, accompanied by municipal and Poor Law infirmaries elsewhere. These were for working-class patients; those in higher economic groups received treatment at home.

A secondary aspect of growth and regulation was the steady medicalisation of childbirth, so that over this period traditional female midwives were superseded by male obstetricians, with all their 'modern' ideas and instruments. Under prevailing conditions, however, intervention through the use of forceps, for example, often caused puerperal fever and contributed to the high maternal mortality that was a mid-century concern.

Largely through the endeavours and energy of Florence Nightingale, whose nursing team at Scutari captured the public imagination amid military deficiencies in the Crimean War, hospital and home nursing was reformed, chiefly along sanitary lines. Rigorous nurse training also raised the social status of the profession and created a career structure largely occupied by women.

Despite these and other improvements, death rates remained relatively steady. Roughly one quarter of all children died in their first year at the end of Victoria's reign, as at the beginning, and maternal mortality showed no decline. In some fields, however, survival rates improved and mortality statistics slowly declined. Thus crude death rates fell from 21.6 per thousand in 1841 to 14.6 in 1901. Here, the main factors were public hygiene and better nutrition thanks to higher earnings - that is, prevention rather than cure. Although doctors made much of their medicines with Latin names and measured doses, effective remedies were few, and chemical pharmacology as it is known in 2001 only began at the end of the Victorian era. From the 1870s (animal) thyroid extract was used for various complaints including constipation and depression, while from 1889 animal testicular extracts were deployed in pursuit of rejuvenation and miracle cures. At the same date aspirin was developed to replace traditional opiate painkillers.

As a result, many conditions remained chronic or incurable. These limitations, together with the relatively high cost of medical attendance, led to the rise (or extension) of alternative therapies including homeopathy, naturopathy ('herbal remedies'), hydropathy (water cures), mesmerism (hypnotism) and galvanism (electric therapy) as well as blatant fraudulence through the promotion of useless pills, powders and coloured liquids. From 1866 notions that disease was caused and cured by mental or spiritual power alone were circulated by the Christian Science movement.

Treating mental illness

Another highly popular fashion was that of phrenology, which claimed to identify temperamental characteristics such as aggression or lust ('amativeness') by means of the lumps and bumps on the individual skull, and by facial physiognomy. Psychology itself retained largely traditional concepts such as 'melancholic' and 'choleric' tendencies, but in 1846 the term 'psychiatry' was coined to denote medical treatment of disabling mental conditions, which were generally held to have hereditary causes.

The Victorian period witnessed an impressive growth in the classification and isolation (or strictly the concentration) of the insane and mentally impaired in large, strictly regulated lunatic asylums outside major cities, where women and men were legally incarcerated, usually for life. Opened in 1851, the Colney Hatch Asylum in Middlesex housed 1250 patients. Wealthier families made use of private care, in smaller establishments.

Two major figures in the Victorian mental health field were John Conolly, author of The Construction and Government of Lunatic Asylums (1847), and Henry Maudsley, whose influential books included The Physiology and Pathology of Mind (1867).

Regarded at the time as progressive and humane, mental policies and asylum practices now seem almost as cruel as the earlier punitive regimes. Men and women were housed in separate wards and put to different work, most devoted to supply and service within the asylum. The use of mechanical restraints such as manacles and muzzles was steadily phased out in favour of 'moral management', although solitary confinement and straitjackets continued to be used. By the end of the era therapeutic hopes of restoring patients to sanity were largely replaced by programmes of control, where best practice was judged by inmates' docility. As part of the passion for measuring and classifying, patient records and photographs were kept, in order to 'illustrate' the physical evidence or effects of different types of derangement. Particular attention was paid to female patients, whose lack of approved feminine qualities was tautologically taken to 'prove' their madness. Over the period, sexualised theories of insanity were steadily imposed on mad women, in ways that were unmistakably manipulative. Towards the end of the 19th century, the term 'neurasthenia' came into use to describe milder or temporary nervous conditions, especially among the educated classes.

Throughout the era, since disorders of both body and mind were believed to be heritable conditions, the chronic sick, the mentally impaired and the deranged were vigorously urged against marriage and parenthood.

Jan Marsh is the author of The Pre-Raphaelite Sisterhood (1985) and biographies of Dante Gabriel Rossetti and Christina Rossetti. She has written widely on gender and society in the 19th century. She is currently a visiting professor at the Humanities Research Centre of the University of Sussex and is working on Victorian representations of ethnicity.



Establishing the exact place of origin

Due to the vagaries of record keeping in England there are relatively few records that give detailed information about the origins of immigrants in this period. The sources that do help are largely to be found at the National Archives at Kew.

TNA has published a very useful book by Roger Kershaw and Mark Pearsall called Immigrants and Aliens: A Guide to Sources on UK Immigration and Citizenship (The National Archives, 2004).

You might find the Moving Here website a good place to start: it looks at immigration in England over the last 200 years and currently focuses on Caribbean, Irish, South Asian and Jewish communities.


6 Jokes From 19th Century America

Comic actor Fanny Rice, sometimes billed as the Funniest Woman in America, in 1896.

There were some mighty funny folks in 19th century America: writers Mark Twain and Ambrose Bierce, for instance. And, by some accounts, stage comedians Fanny Rice and Marshall P. Wilder.

For a while, Rice was billed as the Funniest Woman in America. And Wilder, who specialized in mother-in-law jokes, was called the Funniest Man.

"It is one of the hardest things in the world to be funny," an aspiring comedian said in an 1887 reprint of a New York Journal story about Marshall Wilder and other comics, "because while what you are saying may be awfully comical, yet the fact that a lot of critical girls and fellows are looking at you makes you feel and look frightened."

Marshall P. Wilder, called by some the Funniest Man in America, in the late 19th century. Library of Congress.

That fear didn't stop Americans from telling jokes. Sometimes the quips were crude or cruel or racist or just plain humorless. Here are half a dozen from the 1800s, lightly edited, that may still play well to contemporary sensibilities:

1870: While passing a house on the road, two Virginia salesmen spotted a "very peculiar chimney, unfinished, and it attracting their attention, they asked a flaxen-haired urchin standing near the house if it 'drawed well' whereupon the aforementioned urchin gave them the stinging retort: 'Yes, it draws all the attention of all the d***** fools that pass this road.' " Daily Milwaukee News, May 21, 1870

1872: A man said to a preacher, "That was an excellent sermon, but it was not original." The preacher was taken aback. The man said he had a book at home containing every word the preacher used. The next day the man brought the preacher a dictionary. Daily Phoenix, April 4, 1872

1888: There was a man whose last name was Rose. As a lark, he named his daughter Wild, "with the happy conceit of having her called Wild Rose." But that sentiment was "knocked out" when the woman grew up to marry a man whose last name was Bull. Weekly Journal-Miner in Prescott, Ariz., May 23, 1888

1890: Whatever troubles Adam had / No man could make him sore / By saying when he told a jest / "I've heard that joke before." Philadelphia Times, Feb. 23, 1890

1896: A fellow tells his ma that there are two holes in his trousers — and then tells her that's where he puts his feet through. Cincinnati Enquirer, Nov. 1, 1896

1899: A man got up one morning and couldn't find his alarm clock, so he asked his wife what had become of it. She said, "It went off at 6 o'clock." Salt Lake Herald, April 27, 1899