social safety nets
The first federal social safety net program in the United States was Social Security, created in 1935 through the Social Security Act signed by President Franklin Roosevelt, as millions of Americans suffered homelessness, hunger, and job loss following the Wall Street stock market crash of 1929 and the Great Depression that resulted. The purpose of the Social Security Act was to provide for poor widowed mothers, the disabled, the unemployed, and the elderly, and to provide “some measure of protection, to the average citizen and his family.”
Learn more about social safety nets in the US:
social justice
"Social justice" owes its origin as a distinct concept (giustizia sociale) to the Italian Risorgimento of the nineteenth century; it was first used by the Jesuit philosopher Luigi Taparelli d'Azeglio in 1843. It is a concept of fair and just relations between the individual and society, measured by the explicit and implicit terms for the distribution of wealth, opportunities for personal activity, and social privileges. The concept of social justice has often referred to the process of ensuring that individuals fulfill their societal roles and receive what is their due from society. In modern global grassroots movements for social justice, the emphasis has been on breaking barriers to social mobility, creating safety nets and economic justice, especially for marginalized communities previously left out of decision making, and creating systems that govern their lives in public society.
third world country
Coined by French demographer Alfred Sauvy in the 1950s, "third world" refers to economically underdeveloped countries. Sauvy was drawing an analogy between pre-industrial nations and the poor of pre-Revolutionary France, who were considered part of the "third estate." The term emerged from the political and economic realignment after World War II and the creation of multilateral groups such as NATO: highly industrialized countries with democratic forms of government made up the “first world”; less industrialized countries with socialist and communist forms of government were considered the “second world”; and all other countries were the “third world” or even “fourth world” (a term that referred to Indigenous peoples living within the man-made borders of post-colonial nation-states). Over the years the term has fallen out of favor, primarily because of political changes. By the mid-1990s it had largely been replaced in academic usage by terms such as “developed country”, “developing country”, and “under-developed country” to describe a country’s stage of economic development. While these are the most common terms used today, others are gaining favor, including LIC (low-income country) and LMIC (lower-middle-income country), which classify a country by its gross national income per capita.
Read more about the conversation regarding this terminology:
poor
The word poor is derived from the Latin term pauper and the Old French term povre, and later the modern French term pauvre. The original Latin referred to “little” or “getting little”, and later took on the meaning of “inferior quality”. It wasn’t until the 1600s that the English use of the word took on the meaning of “a poor person”.
welfare queen
The term “welfare queen” is a derogatory term used in the United States to refer to women who allegedly misuse the welfare system or collect excessive welfare payments through fraud, child endangerment, or manipulation. Reporting on welfare fraud began during the early 1960s, appearing in general-interest magazines. The term "welfare queen" originates from media reporting in 1974 and the story of Linda Taylor in Chicago.
Since then, the phrase "welfare queen" has remained a stigmatizing label, most often directed toward Black single mothers. Although women in the U.S. could no longer stay on welfare indefinitely after the federal government launched the Temporary Assistance for Needy Families (TANF) program in 1996, the term remains a trope in the American dialogue on poverty.
The idea of welfare fraud goes back to the early 1960s, when the majority of known offenders were male. Despite this, many journalistic exposés published at the time focused on those who would come to be known as welfare queens: Reader's Digest and Look magazine published sensational stories about mothers gaming the system. Additionally, Ronald Reagan employed the trope of the "welfare queen" to rally support for reform of the welfare system. During his bids for the Republican nomination in 1976 and again in 1980, Reagan constantly made reference to the "welfare queen" at his campaign rallies, and the term was seared into the public lexicon.
Read more about the history of the term ‘welfare queen’ and the story of Linda Taylor:
minimum wage
A minimum wage is the lowest wage that employers may legally pay to workers. The first minimum wage law was enacted in 1894 in New Zealand, followed by the Australian state of Victoria in 1896. Factory inspector reports and newspaper reporting on the conditions of “sweated labor” in Melbourne, Victoria in 1895 (labor performed primarily by women and young children, a practice also common in the U.S. and other industrialized nations at the time) led to the formation of the National Anti-Sweating League, which aggressively pushed the government to deal legislatively with the problem of substandard wages. Following the recommendation of the Victorian Chief Secretary, Alexander Peacock, the government established wage boards tasked with setting minimum wages in the trades that suffered from unlivable wages. During the same period, campaigns against sweated labor were under way in the United States and England, in what is called the “Progressive Era”.
From the 1890s to the 1920s, during the Progressive Era, a time of social activism and political reform across the United States, progressive reformers, women's organizations, religious figures, academics, and politicians all played an important role in getting state minimum wage laws passed throughout the United States.
With the passage of the Fair Labor Standards Act of 1938 (FLSA) under President Franklin D. Roosevelt, as part of the New Deal, the U.S. federal minimum wage was initially set at $0.25 per hour for covered workers. It has since been raised 22 separate times, most recently in July 2009, to $7.25 an hour, and has not been raised in the decade since. As of January 2018, 29 states had a minimum wage higher than the federal minimum, and 5 states had no minimum wage at all. From 2017 to 2018, eight states increased their minimum wage levels through automatic adjustments, while increases in eleven other states occurred through referendum or legislative action.
There is a different federal minimum wage for workers in the tipped service industry: $2.13 an hour, with the expectation that tips will make up the difference between that base wage and the full minimum wage. Under the FLSA, if tips fall short of that difference, the employer is required to pay the remainder.
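The expectation that tips make up the difference between the tipped base wage and the full minimum wage is simple arithmetic, sketched below (a minimal illustration, not legal guidance; the function name and example figures are invented):

```python
def employer_topup(hours, tips, base_wage=2.13, full_minimum=7.25):
    """Extra pay an employer owes when base wage plus tips falls
    short of the full minimum wage for the hours worked."""
    earned = hours * base_wage + tips
    required = hours * full_minimum
    return max(0.0, required - earned)

# A 40-hour week with $150 in tips:
# base pay 40 * 2.13 = 85.20, required 40 * 7.25 = 290.00,
# so the employer owes 290.00 - 235.20 = 54.80 extra.
print(round(employer_topup(40, 150.0), 2))  # → 54.8
```

If tips alone cover the gap (here, $204.80 or more for the 40-hour week), the function returns 0.0 and no top-up is owed.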
Over the last few years, there has been an upswing in the grassroots movement to raise the minimum wage, city by city and state by state, to $15.00 an hour through local votes and the passage of legislative bills.
Minimum Wage History - UC Davis, Center for Poverty Research
bootstraps/pull yourself up by your bootstraps
The earliest identified use of the term ‘pull yourself up by your bootstraps’ in the United States dates to 1843, and was meant to ridicule an idea. The term is generally attributed to an 1843 snippet in the magazine Workingman’s Advocate ridiculing an advertisement in a Nashville newspaper (the Nashville Banner) announcing Nimrod Murphree’s alleged discovery of perpetual motion. The magazine mocked the obvious impossibility of what Mr. Murphree was claiming, comparing it to using one's bootstraps to pull oneself over a fence, an equally impossible feat. The term thus originally described a task that is impossible to achieve. It has also been attributed to a story in Rudolf Erich Raspe's The Surprising Adventures of Baron Munchausen, but in that story Baron Munchausen pulls himself (and his horse) out of a swamp by his hair (specifically, his pigtail), not by his bootstraps, and no explicit reference to bootstraps has been found elsewhere in the various versions of the Munchausen tales.
The term found its way into common language again in 1884, when a writer for the Chicago Tribune opposed a planned railroad from the US to Argentina, writing: “To attempt to construct or run [a railroad] now against the existing trade barriers will be as hopeless a task as the man undertook who tried to elevate himself by his bootstraps.” A few months later the term caught on in the political world, when a laborer stated that neither the Democratic nor the Republican party had done anything to assist laborers, and had so subjugated the laborer that “if he elevated himself at all, he must pull himself up by his boot-straps”, again noting that laborers weren’t actually making any gains.
The term found new life in political discourse in the 1930s, during the Great Depression, when it became a slogan for self-reliance and self-sufficiency. Later, in the 1960s during the Civil Rights movement, it found its way back to the forefront of American politics (and expanded to international usage in the 1980s) as socially conservative politicians used the term as a reason not to support civil rights or legislative attempts to ‘level the playing field’ and bring about equality.
homeless
Going back to the early 1600s, the term homeless was originally used as an adjective describing having no permanent place to reside (having no abode). It wasn’t until the mid-1800s (circa 1857) that the word became a noun used to describe a person (as in ‘a homeless person’). The term took the place of words such as vagrant and vagabond, used since the 1300s to refer to the unhoused and those sleeping on the streets. Beginning with the Peasants’ Revolt of 1381 in England, the lack of a permanent home became criminalized by the state. In 1383 the English Poor Laws were created in an effort to curb homelessness, but they instead resulted in people serving time in the stocks (a public restraint device) for not having a home and being branded with a “V” to alert the rest of society that the person was a vagrant (an early term for a person without a home). The Poor Laws prescribed a long sentence in the stocks for a first offense of homelessness, and a possible sentence of death for a second offense. In the US, homelessness has been treated partly as a social issue, with the first rescue mission created in 1872 in New York (the New York Rescue Mission, which still exists today), and partly as a criminal issue, with cities and municipalities creating laws against sleeping outside or spending too much time “idle” in one place in public during the day, leading to people being ticketed and oftentimes arrested and jailed.
middle class & middle class values
The term “middle class” has historically had different meanings, ranging from being in the middle of the social hierarchy to earning wages above the poverty level but below the upper (wealthy) class, and it is often used interchangeably with the term ‘working class’. In broad socio-economic terms, the middle class are those who are neither rich nor poor and reside in the middle. In narrower terms, middle class is defined largely by income level. Social scientists and economists in the US recognize three sub-categories of the middle class, while the UK has five. Generally in non-Western countries, the middle class is a fairly new concept, emerging in the last 30 to 40 years.
The concept of a middle class first rose to prominence in Europe during the late feudal era (between the 9th and 15th centuries). It referred to those who were neither the nobility residing in the countryside nor the peasants doing hard manual labor there: literally, those who lived in the towns (town-dwellers) and worked around the mercantile functions of the city. The term was first used in 1745 by James Bradshaw, in a pamphlet discussing the impact of the trade in wool and other fabrics on how the lower and middle classes, those not of the landed gentry, dressed.
In the US and other Western nations, the middle class grew beginning with the Industrial Revolution in the 18th century, expanded rapidly in the 1920s, and has been supported by various government interventions and measures from the 1930s to the current day. From the 1930s to the 1960s, the government policies that supported the growth of the middle class were aimed directly at benefiting white citizens of Western countries, leaving Indigenous communities and communities of color largely without access to the support needed to move into the middle class.
The term and concept of a middle class has always revolved around commerce and has always been based on economics and the way income is earned; in Western countries it has historically referred to white communities rather than communities of color.
The term ‘middle class values’ is newer, and is typically used by writers and politicians to erroneously reinforce the belief that those in the middle class are the owners of values such as hard work, self-discipline, thrift, honesty, ambition, and aspiration. The term derives from the concept of the “Protestant work ethic”.
food desert
The term ‘food desert’ is fairly new, entering official government and general usage in the 1990s. It is rooted in the planning term ‘desert’: by 1973, those in the urban and city planning sectors were using ‘desert’ to describe suburban areas lacking amenities important for community development. Cummins and Macintyre report that a resident of public housing in western Scotland supposedly coined the more specific phrase ‘food desert’ in the early 1990s, to talk specifically about the lack of access to grocery stores. The phrase was first officially used in a 1995 document from the Low Income Project Team of the UK's Nutrition Task Force.
In more recent years, the term has been redefined to specifically include rural and urban areas heavily affected by years of redlining and by government and private disinvestment. The term describes only a symptom of a larger problem: the lack of access to fresh food, and the resulting health disparities prevalent in under-resourced communities and communities of color.
ghetto
The word ghetto is traced back to 1516 Venice, Italy, and is taken from the Italian word ‘gheto’, a foundry where cannons were made, located in the part of the island where only those of Jewish ancestry could reside. This area was called the ‘gheto’, and the word came to mean the section of a city where Jews were forced to live by government policy. The term, and the idea of forced segregation by religion, specifically for those of Jewish ancestry, was then used in countries around the world, taking root primarily in European countries and later in the US, which also segregated white non-Protestant residents into ghettos based on their country of origin and language until they assimilated into English-speaking culture. Jewish ghettos were largely dismantled by the 19th century, until they were brought back by the Nazis in the lead-up to World War II and the Holocaust.
After World War II, Jewish ghettos were largely dismantled in Europe and the US, and were replaced, especially in the US, by race-based ghettos, due to the large migration of Black Americans from the southern states to the northern and western states. In the US, race-based ghettos thrived under government policies of racial segregation and redlining, which at first confined Black Americans to segregated neighborhoods. As legal segregation and redlining policies eased, income-based ghettos emerged that largely defined where low-income Black Americans lived. While the term historically has negative connotations, beginning in the 1990s those who live in these government-created ghettos began using the term colloquially (as slang) in a more positive light. The word still retains its negative connotations when used by those from outside the communities created by these policies.
income inequality
The concept of income inequality is not new. The word income dates to circa 1300, from the Old English verb “incuman” (to come in), and later came to mean money made through business or labor. Income inequality can be seen throughout history and is an outcome of economic policies that favor the highest income earners.
The term income inequality was brought to prominence in recent scholarship by economists Emmanuel Saez and Thomas Piketty, in a paper titled “Income Inequality in the United States, 1913–1998”, published in the February 2003 volume of the Quarterly Journal of Economics, a peer-reviewed academic journal.
In their paper they describe income inequality as the unequal distribution of income and wealth in a society, characterized by the majority of income (or wealth) being held in the hands of a few. Income inequality can be and is measured on a global scale, per country, and within the societies of a country, along the intersections of race and ethnicity, sex, gender, and class.
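Two standard tools for quantifying such unequal distributions are the Gini coefficient and top-income-share statistics (the latter being the kind of measure Piketty and Saez report). A minimal sketch, using invented sample incomes purely for illustration:

```python
def gini(incomes):
    """Gini coefficient: 0 means perfect equality, values near 1 mean
    almost all income is held by one person."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    # Standard formula: G = (2 * sum(i * x_i)) / (n * total) - (n + 1) / n,
    # with incomes sorted ascending and i running from 1 to n.
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

def top_share(incomes, fraction=0.10):
    """Share of total income held by the top `fraction` of earners."""
    xs = sorted(incomes, reverse=True)
    k = max(1, int(len(xs) * fraction))
    return sum(xs[:k]) / sum(xs)

# Ten invented household incomes, skewed toward one high earner:
sample = [20_000, 25_000, 30_000, 40_000, 50_000, 60_000,
          75_000, 90_000, 120_000, 500_000]
print(round(gini(sample), 3))       # → 0.535
print(round(top_share(sample), 3))  # → 0.495 (top 10% holds ~half)
```

Both measures capture the same idea stated in the paragraph above: the more income is concentrated in the hands of a few, the closer the Gini coefficient gets to 1 and the larger the top share becomes.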
opportunity gap
The term opportunity gap is new, coming into usage around 2013 in education-reform circles and largely replacing the term ‘achievement gap’. The opportunity gap refers to the ways in which race, ethnicity, socioeconomic status, English proficiency, community wealth, familial situations, or other factors contribute to or perpetuate lower educational aspirations, achievement, and attainment for certain groups of students.
The shift in terminology to ‘opportunity gap’ was made to acknowledge the systemic and social factors that help or hinder student achievement and educational outcomes, rather than focusing solely on individual effort.
Factors that may affect the opportunity gap are:
· Students from lower-income households may not have the financial resources that give students from higher-income households an advantage when it comes to performing well in school, scoring high on standardized tests, and aspiring to and succeeding in college. Poor nutrition, health problems resulting from a lack of healthcare, or an inability to pay for preschool education, tutoring, test-preparation services, and/or college tuition (in addition to a fear of taking on student-loan debt) may all contribute to lower educational achievement and attainment.
· Minority students may be subject to prejudice or bias that denies them equal and equitable access to learning opportunities. For example, students of color tend to be disproportionately represented in lower-level courses and special-education programs, and their academic achievement, graduation rates, and college-enrollment rates are typically lower than those of their white peers.
· Students raised by parents who have not earned a college degree or who may not value postsecondary education may lack the familial encouragement and support available to other students. These students may not be encouraged to take college-preparatory courses, for example, or their parents may struggle with the complexities of navigating the college-admissions and financial-aid process.
· Students raised in a non-English-speaking family or culture could experience limited educational opportunities if their acquisition of English proficiency, fluency, and literacy is delayed. If courses are taught exclusively in English, if educational materials are printed in English, or if enriching educational programs are conducted in English or require English fluency, students who are learning or struggling with English may be denied full participation in these opportunities.
· Economically disadvantaged schools and communities may suffer from less-effective teaching, overcrowded schools, dilapidated facilities, and inadequate educational resources, programs, and opportunities—all of which can contribute to lower educational performance or attainment.
· Small schools located in geographically isolated rural areas may not be able to offer the same diversity of educational opportunities (such as multiple world-language courses, or co-curricular programs like science fairs, debate competitions, robotics clubs, or theatrical performances) that are available to students in larger schools. Rural students may also have less access to libraries, cultural institutions, museums, internships, and other learning opportunities because such resources do not exist locally, are too far away, or are not reachable by free or low-cost public transportation.
· A lack of internet connectivity, computers, and new learning technologies in rural schools, inner-city schools, and lower-income communities can place students at a disadvantage when it comes to acquiring technological skills, taking computer-based tests, or accessing knowledge and learning opportunities online.
deserving poor
The concept of the deserving poor appears to go back to 1248 in England, when the poor were first forced to carry certificates of good character in order to receive food and other assistance from the nobles and local authorities. The concept was further codified beginning in the 1570s, and the actual term ‘deserving poor’ appears to have first been used in 1801, to refer explicitly to those who were poor through no fault of their own, specifically the elderly, the sick, and the disabled.
In most countries, particularly industrialized ones, the concept of the deserving poor depends on judging the person asking for assistance on their lifestyle and decision making, according to conservative Judeo-Christian religious beliefs. This has often shown itself in the creation of public policies that exclude poor people who do not fit conservative Judeo-Christian moral requirements or norms: non-white people, single mothers and fathers, and members of the LGBTQIA+ community. Throughout the history of public policy for the poor, there have often been requirements forcing the poor to conform to these standards, without taking into account the actual need of the person.
The concept of the ‘deserving poor’ is ultimately one of judgment: a judgment of those who are poor, and in particular of their moral character, as deserving of help or not.