The current resurgence in conservative ideas owes a great deal to the belief in an “expanding culture of dependency,” as Paul Ryan put it in his “A Roadmap to America's Future.” This cultural change is allegedly the result of increased government spending, often specifically government social spending, which is said to foster dependency on government and threaten American prosperity by changing the cultural beliefs of Americans. According to these dire warnings, Americans are coming to view “government... as their main source of support,” draining “individual initiative and personal responsibility.”
As a political belief, the culture of dependency argument suffers from a fatal flaw: there is little measurable evidence linking the changes occurring in America to government spending. If government spending were tightly related to cultural changes that determine economic and political outcomes, we should see easily detectable alarm signals growing worse in proportion to rising government spending over time. Likewise, if this spending were leading to a cultural shift throughout America, we should see clear signs of it in data tracking changes in American culture and opinion over time.
But surveys indicate no such shift. Even as government spending has increased, polls reported by the Pew Research Center for the People and the Press show that trust in government has declined from 73% in 1958 to 22% today, and that the proportion of people disagreeing with the statement that “success in life is pretty much determined by forces outside our control” has increased from 57% in 1988 (the earliest year reported) to 64% in 2009. This is hardly consistent with a mass shift in America's culture toward greater reliance on government; people don't rely much on things they don't trust. While it is true that government spending has grown over the past 60 years, there is no evidence that this has led to the cultural changes alleged by conservatives. If anything, the opposite has occurred: Americans seem to feel more responsible for their own success, not less.
Yet myths claiming that rising government spending is directly related to growing personal dependence on government and declining personal responsibility proliferate, and will doubtless be even more evident in the coming election campaign. The Heritage Foundation, for instance, breathlessly warns us of growing dependence on government: by its numbers, an index it created of dependence on government increased from 113 in 1988 to 179 in 2000. But it's unclear why we should worry about this. How is it impacting anything that actually matters to people? In this time frame, America saw the labor force participation rate peak at 67.1% between 1997 and 2000, compared to only 58.9% in 1965, when Heritage's index of government dependence was only 22. There is no correlation between indexes such as Heritage's and the many sources of data that can serve as measures of traits such as personal responsibility, aversion to risk, or individual initiative. People are still going out, getting jobs, and supporting their families as government spending increases; in fact, more people have been doing so every decade as spending rises.
Of course, those who believe in the idea of a culture of dependency don't think it applies to themselves. They won't be surprised that a majority of Americans fail to display the attitudes they allege. Instead, it must be some small but growing subset of Americans who are developing a culture separate from the majority due to the rise in government spending. The most obvious candidate is the poor, who are perceived as receiving more of their income from the government than other Americans and, under this logic, would be the first to display the cultural changes of dependency.
In this respect, the notion of a culture of dependency bears a great deal of similarity to a long-standing prejudice against the poor, one that seeks to link poverty to personal and moral failings in order to discredit the idea that social and economic forces, which can be addressed through policy and spending, play a role. Strong similarities exist between these concepts and the debates over assistance to the poor analyzed by scholars Margaret Somers and Fred Block in the paper “From Poverty to Perversity: Ideas, Markets, and Institutions over 200 Years of Welfare Debate.” They compare the remarkable similarities between the English debates over the 1834 Poor Law Reform and US debates over welfare in the 1980s and 90s. In both cases the notion of a culture of dependency was used to discredit state assistance to the poor by arguing that poverty was not the result of situational factors but rather of the “perverse” habits and attitudes of the poor. This “perverse” behavior was not blamed solely on the bad habits of individuals; it was attributed to the misguided attempts of reformers to alleviate the problems the poor faced. By trying to help them, reformers created a culture of dependency that prevented the poor from raising themselves out of poverty by their own efforts, which, the argument went, they could have done if left to themselves. This culture of dependency made people lazy, encouraged illegitimacy, and degraded them. It follows that if we want to help the poor, the best thing we can do is not give them a reliable means of support, since if we do they will become dependent, which makes them worse off in the long run.
The political success of these ideas is undeniable. A very large number of Americans have come to believe that government programs have done little to help the poor; many go further and believe these programs have done more harm than good. This is the key idea behind the notion of a culture of dependency: that by having the government do things, no matter how well intentioned, people inevitably end up worse off because they stop doing things for themselves.
However, it should be obvious that this argument is too good to be true. It succeeds because it tells people what they want to hear. The idea of a culture of dependency is so attractive because, by claiming that poverty results from negative personal traits created by attempts to help the poor, it flips traditional virtues on their head. Greed and selfishness are elevated to virtues that make others more moral, while traditional virtues such as compassion and generosity become vices that degrade individuals. Once this is done it becomes easy to advocate slashing social spending meant to benefit people: if I believe in a culture of dependency, not only am I looking out for my own personal interests, I am also acting in the best interests of those who would otherwise be led into a culture of dependency by this spending.
The culture of dependency viewpoint is so appealing because it absolves us of all responsibility for the society we live in and the fates of others. Precisely because these ideas are so seductive, they require extraordinary evidence to back them up. Otherwise, we fall into the trap of believing things because they are convenient rather than because they are best supported by evidence.
So is there any evidence that the culture of dependency describes how our culture is actually reacting to rising spending? I've heard many logical-sounding arguments in its favor, but a logical argument can be made to support just about anything; at best it is a second-best approach when data is lacking. There are no direct measures of whether individuals are becoming more dependent, but a lot of data is collected on poor people. We can look at behavior often described as linked to dependency, such as workforce participation, to see whether rising spending has been correlated with traits associated with the culture of dependency. Focusing on these measures, even if imperfect, is far less likely to inadvertently hurt people than letting our imaginations run wild with rousing tales.
One of the main tenets of the culture of dependency idea is that government assistance increases poverty rather than alleviating it, because people work less hard. It is true that we have not managed to eliminate poverty, and we must admit the possibility that we never will. But we have done a great deal to alleviate it. Since much of the basis for the dependency argument lies in what happened with poverty before the reforms of the 1980s, I'll summarize what happened in those years. Because official poverty statistics do not include in-kind assistance, such as Medicaid, housing, and food stamps, the impact of government spending on poverty is better represented by numbers from Christopher Jencks' Rethinking Social Policy, which show that poverty rates, counting non-cash as well as cash benefits, declined from 30% in 1950 to 10% in 1980. Over this period, government spending on social welfare increased from 8.2% of GDP in 1950 to 18.7% in 1980. These numbers of course vary with economic conditions, but the overall picture is quite clear.
The picture is even clearer if we compare groups the government focused money on with those that were less of a priority. Between 1960 and 1980, poverty rates for the elderly declined from 33% to 16%. Among families headed by women, poverty rates declined only from 45% in 1960 to 37% in 1980. Given the government's reluctance to spend on welfare outside a few targeted groups such as the elderly, means-tested cash benefits, food stamps, and housing subsidies never topped 2% of GDP over this period. It should be clear both that government can reduce poverty when it chooses to, and that when the politics are wrong it can choose to help people very little.
There are other problems with the culture of dependency thesis. Various studies of people receiving assistance from the pre-1996 welfare program, Aid to Families with Dependent Children (AFDC), showed that changing benefit levels did little to affect employment. This isn't surprising, since welfare payments have always been too small to provide for all the needs of a family. People's behavior on welfare doesn't match the simple picture painted by welfare critics of a choice solely between work and welfare. Rather, studies that examine actual behavior show that people pursue a mix of strategies and sources of income to survive; welfare is simply part of the package. Incentives not to report income can be high due to poor program design, making accurate data difficult to obtain. One study conducted in Chicago in 1988 by Kathryn Edin, reported in Rethinking Social Policy, which sought to gain respondents' trust in order to get better data, found that only 58% of income came from welfare and food stamps. The rest came from a variety of sources, frequently unreported. Often this was support from family members, friends, or significant others, but roughly half of the mothers reported having a legal job while working off the books. With benefits so low, people are forced to become creative to survive. Not all of these actions could be described as socially beneficial, but the corner-cutting and other expedients pursued are hardly consistent with a culture of dependency. If anything, they show too much individual initiative.
A further point against the culture of dependency argument comes out of this study. If a cultural change were occurring, one would expect differences in work habits between people receiving welfare for the first time and people whose families had received welfare when they were children. This isn't what shows up in the Edin study. According to its results, people who received welfare as children and first-generation welfare recipients showed very similar frequencies of holding some kind of job and had similar amounts of unreported income.
This isn't to deny that AFDC probably had some work disincentive effect. It needs to be remembered, however, that when the program was designed in 1965, people believed a mother's place was in the home, not at work. Over the course of the program, benefits were reduced by between 66 and 100 cents for every dollar earned. By giving single women some means of support, however, the program did what it was designed to do: keep families together. Before AFDC, if a woman couldn't rely on her family for support, her only option was often to abandon a child, usually at an orphanage, though in those times leaving a baby on someone's doorstep was not completely unheard of. By giving women an additional option, the program allowed them to take personal responsibility for their children rather than leaving them to be raised by others.
Of course, times change, and we no longer believe women can't both work and raise well-adjusted children. While it took some time, during the 1990s extensive reforms were instituted in how the Federal government assisted low-income families. Federal support for working families increased from $11 billion in 1988 to $66.7 billion in 1999, partially offset by a reduction in cash welfare spending from $24 billion in 1988 to $13 billion in 1999. Other non-cash work support aid increased from $9.5 billion in 1993 to $18 billion in 2000. In 1990, 19.8% of women who had received public assistance the previous year reported working the next; by 2000 this had risen to 44.3%. Among women receiving welfare, the share reporting earnings rose from 6.7% in 1990 to 28.1% in 1999. Overall, labor force participation among single mothers with children rose by 10% between 1994 and 1999. Poverty among single-mother families also fell, from 35.4% in 1992 (33.4% in 1988) to 24.7% in 2000 (27.8% in 1999).
While the strong economy helped create these results, changes in Federal spending during this period also deserve credit. This increase in Federal spending seems strongly correlated not with a decline in personal initiative and responsibility but with an increased willingness to improve one's situation through work. Increasing the rewards of work helped motivate women to pull their families out of the bleak situations they otherwise faced when they couldn't make ends meet on a minimum wage job alone. Not all of these are success stories, but some women found that once they could support themselves on work and subsidies, rather than scrambling for multiple sources of income, they could focus on work and pull themselves out of poverty. These subsidies didn't “drain” their character or turn their dependency on government into a virtue. Rather, they ended the cycle of having to beg for support from anyone in a position to give it. Although a small helping hand was necessary to get them started, it's simply a lie to call this dependency. They did all the hard work themselves.
Despite these successes, there remains more to be done. The recent recession and ongoing economic problems have exposed many flaws in earlier welfare reforms that still need to be fixed, including work disincentives defended as cost-cutting. But this isn't a problem of government spending; more often than not, it's a problem of insufficient spending. One of the things many programs do to save money is reduce benefits sharply as income rises. With the old AFDC program, sharp benefit reductions discouraged recipients from working and encouraged them not to report the work they did. Some Federal programs continue to reduce benefits at steep rates, as high as 50 cents per dollar earned. While this reduces government spending, it does little to encourage personal responsibility.
We also know from past attempts that cutting benefits does little to encourage work; falls in benefit levels have historically had little impact on employment. Given the option, people turn to friends, family, or community resources before turning to government. The role government plays is giving people who lack these options the choice of turning to government rather than to an abusive boyfriend, a pimp, or even more unsavory alternatives. Increasing spending, however, can allow us to reduce these benefit reductions and encourage work, especially when combined with better designed programs and eligibility rules, giving individuals a greater range of choice as well as responsibility for themselves and their children. After all, if losing 35 cents of every dollar to income taxes is believed to discourage work for people whose wages tend to increase in increments of thousands, as conservatives believe, imagine how much a 50-cent-on-the-dollar reduction discourages someone happy to get a raise of 25 cents an hour.
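For readers who want that last comparison spelled out, here is a small illustrative calculation. The 35% tax rate and the 50% benefit phase-out rate come from the discussion above; the $1,000 raise and the 2,000-hour work year are hypothetical figures chosen only to make the arithmetic concrete.

```python
def net_gain(raise_amount, clawback_rate):
    """How much of a raise a worker actually keeps when taxes or
    benefit reductions claw back part of each extra dollar earned."""
    return round(raise_amount * (1 - clawback_rate), 2)

# A salaried worker getting a $1,000 raise and losing 35 cents of
# each dollar to income taxes keeps $650 of it.
salaried = net_gain(1000, 0.35)      # 650.0

# A welfare recipient getting a 25-cent hourly raise (about $500
# over a hypothetical 2,000-hour work year) whose benefits phase
# out at 50 cents per dollar keeps only $250 of it.
recipient = net_gain(0.25 * 2000, 0.50)   # 250.0
```

The worker with far less income loses a larger share of each extra dollar earned, which is exactly the disincentive the paragraph above describes.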
“A Roadmap to America's Future” says that “if one exercises liberty irresponsibly – ignoring consequences, and refusing to accept them – that freedom eventually will be lost.” I agree, and this is the motivation behind this article. America faces many challenges today; if we are to meet them successfully, we can't afford to refuse to acknowledge, and accept, the consequences of our past actions. That means we have to study our own history, learn from our mistakes, and seek to do better next time. But to do this, we have to accept what our history shows rather than relying on convenient myths. We can't let ideology replace our actual history. We can't let an ideologue insult millions of Americans by telling them that the assistance they have received from government has drained their individual responsibility and personal initiative, given them an aversion to risk, sapped their entrepreneurial spirit, and suffocated their potential for prosperity. Two hundred years ago the dependency argument may have been plausible; we lacked the data to judge. Today, opinion polls, labor data, data on firm start-ups, and myriad other sources tell us that these contentions are flat-out wrong. Far more people participate in our workforce than before the New Deal, and our poverty rate is vastly below what it was then. Americans continue to display an unparalleled appetite for entrepreneurship and risk. Both as individuals and as a society we still have problems to overcome, but the way forward is not to look back toward an idealized past that doesn't measure up to our vision of it, or to the present. Rather, we must be honest about how far we've come and figure out where we want to go. Refusing to recognize our actual achievements while running as fast as we can from threats no proof exists for isn't a roadmap; it's a way to get lost.