Bernie’s soda stumble

Sen. Bernie Sanders has managed to run a wildly and unexpectedly successful presidential campaign not just as a proud democratic socialist, but as a democratic socialist who unabashedly wants to raise your taxes.  While he leans most heavily on raising revenue from the highest earners, he would also increase taxes on low-income and middle-class families to pay for a host of new government programs.

And a growing legion of Democratic primary voters have decided they are completely okay with this.  The Sanders campaign has insisted that voters consider both sides of the ledger, arguing that tax increases are worth it to pay for new government-provided goods like single-payer healthcare with reduced out-of-pocket costs, and like free college with no out-of-pocket tuition.  Millions of voters seem to agree.

This is a revolutionary political achievement.  Sanders has cracked the decades-old Democratic taboo that says it’s political suicide to even suggest middle-class tax hikes.  He has asked us to embrace not just Scandinavian-style policy, but a Scandinavian-style mindset that tolerates higher taxes in exchange for more collective goods and, in turn, more choice and freedom.

Which is why it is such a disappointment to see Sanders seemingly abandon these principles in the Pennsylvania Democratic Primary.  Philadelphia mayor Jim Kenney has proposed providing universal preschool for the city’s four-year-olds.  He has also proposed paying for this initiative with a soda tax, levying 3 cents per ounce on sugary drinks.  As Margot Sanger-Katz of the New York Times explains, “That means a tax of $4.32 on a 12-pack of soda, which typically costs between $3 and $6 at the grocery store. It would come to 60 cents of tax on a 20-ounce bottle, which usually retails for about $2.”
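For readers who want to check the arithmetic, a minimal back-of-the-envelope calculation confirms those figures for a 3-cents-per-ounce levy:

```python
# Back-of-the-envelope check of the quoted figures for a 3-cents-per-ounce tax.
TAX_PER_OUNCE = 0.03  # dollars

def soda_tax(total_ounces):
    """Tax owed on a purchase of the given volume, in dollars."""
    return round(total_ounces * TAX_PER_OUNCE, 2)

print(soda_tax(12 * 12))  # a 12-pack of 12-ounce cans -> 4.32
print(soda_tax(20))       # a single 20-ounce bottle   -> 0.6
```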

Hillary Clinton came out in favor of the plan, implicitly arguing that universal pre-K is sufficiently important to justify the tax increase.  But Sanders strongly opposed the mayor’s plan.  He argued that it’s a regressive tax that disproportionately hits low-income consumers, who tend to buy more sugary drinks. “Mayor Kenney deserves praise for emphasizing the importance of universal pre-kindergarten,” he wrote in a Philadelphia op-ed.  “But at a time of massive income and wealth inequality, it should be the people on top who see an increase in their taxes, not low-income and working people.”

There are a few curiosities about Sanders’s position.  For one, a tax on soda would hardly be the first sin tax to disproportionately impact the poor.  Mayor Kenney’s plan is reminiscent of the Clinton administration’s Children’s Health Insurance initiative, which provided healthcare for low-income children and was funded by an increase in the excise tax on cigarettes.  Yet low-income Americans tend to be heavier smokers than the wealthy, meaning that the poor bear the brunt of higher cigarette taxes.

In their recent legislative biography of 1990s-era Ted Kennedy, authors Nick Littlefield and David Nexon explain the political strategy behind this funding scheme for CHIP.  By paying for a new government program through a tax increase targeted at cigarette smokers and manufacturers, congressional liberals were able to catalyze public health groups as a constituency backing the proposal, a counterweight to conservative anti-tax interest groups.  When tax increases are more diffuse, the anti-tax groups often go unopposed during the legislative process.  In this way, liberals were able to ride the momentum of the anti-smoking push of the 1990s to finance an important new social program.

Today in Philadelphia, Mayor Kenney is attempting to replicate this strategy by garnering support from public health organizations to counter the anti-tax messaging of conservative groups and the American Beverage Association.  Like cigarette taxes, a soda tax can be pitched as not just a funding stream, but a good in itself as a health-improving deterrent against bad consumption habits.  Kenney is making the same calculation today that Kennedy, Clinton, and other liberals made in the 1990s: that any regressive impact of a sin tax is outweighed by both the health gains from the tax and the gains for poor children from the program the tax is funding.

This is quintessential Sanders-style analysis, which is why it’s so odd to see him dispensing with it at this stage of the game.  Take Sanders’s proposal for free college tuition.  Sanders has proposed treating public college the same way we treat K-12 education, where all students can attend for free.  It turns out that this is a fundamentally regressive proposal.  The rich reap most of the benefits from free college, largely because they tend to attend more expensive schools, and because colleges already impose a sort of private progressive redistribution via financial aid packages.

Sanders justifies his free college plan by arguing that any regressive impact is counteracted by his progressive tax plan—that the rich will more than pay their fair share through higher taxes.  Specifically, he plans to pay for free college through a tax on high-frequency financial transactions on Wall Street.  Here, Sanders asks us to consider both sides of the ledger, arguing that it’s worth enacting a somewhat regressive social program—one that disproportionately benefits the rich—coupled with a progressive funding scheme.

Philadelphia’s pre-K plan is the opposite: a progressive social program coupled with a somewhat regressive funding scheme.  But Sanders isn’t evaluating both sides of the ledger now.  It would be one thing if he were arguing that the value of universal pre-K is too uncertain to justify a regressive new tax.  Given the uneven findings around the long-term impact of pre-K, it’s a case he could certainly make.

But Sanders isn’t making that argument.  He agrees that pre-K is important, but nonetheless rejects Philadelphia’s plan out of hand because it relies on a “totally regressive tax.”

Bernie is better than this.  He has done wonders to shake the Democratic Party out of its fear-driven Tax Pledge Lite dogma.  Just five months after saying “I don’t see how you can be serious about raising working and middle-class families’ incomes if you also want to slap new taxes on them—no matter what the taxes will pay for,” Hillary Clinton herself has come around to embrace a tax that impacts low- and middle-income Philadelphians.  That quick evolution, however slight, is directly traceable to Sanders proving that it’s not a political death knell to raise taxes outside of the top 5 percent.

To get a robust social insurance system in the United States, our debate can’t just focus on taxes in isolation.  Instead, Americans must look at both a policy’s costs and its benefits, and decide if a given program is a good deal.  If they come to agree that new social insurance programs are worth paying for, that’s a far bigger achievement for liberalism than telling Americans they can have new benefits paid for entirely with the rich’s money.

That is the reasoned calculus the Sanders campaign has been offering throughout the 2016 primary.  While a progressive tax code should of course tilt toward taxing the rich, the scope of Sanders’s social democratic vision also requires broad-based buy-in from working Americans.  He shouldn’t abandon a message with massive long-term political import for the sake of a futile last-ditch effort to win cheap political points.

How charter schools can play the lottery to promote diversity

Matt Barnum of the education reform outlet The 74 recently wrote about the role charter schools should play in fostering racial and socioeconomic integration.  “If policies and the commitment of charter operators can expand effective, integrated charter options,” Barnum concludes, “these focused schools of choice could well help foster a durable system of classroom integration that, to date, both charter and district schools have largely failed to provide.”

Barnum points to three sets of policies that could increase integration in charters: district-wide controlled parental choice plans, “mirror image” laws that require charters to reflect the demographic mix of local public schools, and weighting charter school lotteries to give priority to underrepresented or disadvantaged students.  (The Century Foundation also has a good rundown on the different integrative methods schools can utilize.)

It’s worth focusing a bit on the last of these policies.  Charter schools are legally required to hold an entrance lottery if they receive more applicants than they have spaces for.  Structuring the entrance lottery to favor a certain group of applicants thus raises interesting legal and philosophical questions about how charters should craft their student bodies and conceive of their educational mission.

Until recently, weighted lotteries had been subject to significant constraints from the Department of Education.  These constraints have been modified largely through the advocacy of New York’s Success Academy, one of the most successful charter networks in the country.

By way of background, charter schools have historically been accused of failing to enroll their fair share of high-needs students.  In 2010, New York amended its charter school law to set quotas requiring charters to enroll a certain number of English language learners and special education students.  In order to meet its quota, Success Academy moved to weight its lottery to favor these groups.

Federal law, however, complicated Success Academy’s plan.  The federal government’s primary vehicle for supporting charter schools is the Charter School Program, which provides a number of funding sources to charter schools.  To be eligible for funding, oversubscribed charter schools like Success must hold a “lottery” to pick entrants.  The Department of Education had interpreted “lottery” to mean that there cannot be any preferences.  This meant that Success’s weighted lottery violated the CSP, jeopardizing $15 million in grants that the network had been awarded.

Shelving its plan, Success Academy continued lobbying the Department of Education to reconsider.  In January 2014, the Department changed course, deciding that “a charter school may weight its lottery to give slightly better chances for admission to all or a subset of educationally disadvantaged students if State law permits the use of weighted lotteries in favor of such students.”  For purposes of the CSP, “educationally disadvantaged students” include low-income students, students with disabilities, and English language learners, among others.

This cleared the way for charter schools to prioritize certain disadvantaged students without sacrificing federal funding.  Some education experts suggest going further, however.  Richard Kahlenberg, an education scholar at the Century Foundation, argues that charters should be allowed to weight their lotteries to favor any group that is underrepresented in a school.  This would allow “[h]igh-poverty schools [to] set aside seats for middle-income families, or all-black schools [to] make room for white, Latino and Asian students[.]”

In fact, some schools already do versions of this, despite becoming ineligible for federal funds.  To avoid running afoul of Supreme Court limitations on using individual students’ race in K-12 school assignments, these schools draw on neighborhoods or zip codes as proxies for race in constructing a lottery algorithm, relying on residential segregation patterns to yield a diverse entering class.  For instance, San Diego’s High Tech High aims to enroll a student body that mirrors the diverse demographics of its local area.  It therefore employs a weighted admission algorithm that yields a student body that is 65% minority and 38% low-income, while getting 100% of its students accepted into college.
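To make the mechanics concrete, here is a rough sketch of how a zip-code-weighted lottery might work.  The zip codes, weights, and applicant records below are invented for illustration; an actual school would calibrate its weights to its own diversity goals and local demographics, not to these numbers.

```python
import random

# Hypothetical weights favoring applicants from underrepresented zip codes.
# These values are illustrative only, not any school's actual algorithm.
ZIP_WEIGHTS = {"92113": 3.0, "92102": 2.0}
DEFAULT_WEIGHT = 1.0

def run_weighted_lottery(applicants, seats, seed=None):
    """Draw winners one at a time, without replacement, using zip-code weights."""
    rng = random.Random(seed)
    pool = list(applicants)
    winners = []
    while pool and len(winners) < seats:
        weights = [ZIP_WEIGHTS.get(a["zip"], DEFAULT_WEIGHT) for a in pool]
        pick = rng.choices(pool, weights=weights, k=1)[0]
        winners.append(pick)
        pool.remove(pick)
    return winners

applicants = [
    {"name": "Applicant A", "zip": "92113"},
    {"name": "Applicant B", "zip": "92101"},
    {"name": "Applicant C", "zip": "92102"},
    {"name": "Applicant D", "zip": "92101"},
]
print(run_weighted_lottery(applicants, seats=2, seed=0))
```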

So charter schools have tools at their disposal to promote integration.  But these tools may require grappling with, and evolving beyond, principles that some charter advocates hold dear.  For one thing, tinkering with the lottery is plainly at odds with the egalitarian ethic of charters’ current admission process.  Charter advocates seek equality of educational opportunity, be it at the time of admission—that everyone should have a fair shot at going to a good school—or throughout elementary and secondary school—that the poor should get the same caliber of education as the non-poor.  Tilting the odds in favor of an underrepresented group seemingly drifts away from this basic principle.

Some advocates also find it doubly offensive to give preference to students who don’t truly “need” to attend a charter school.  The core mission of charter advocates is to serve the underserved.  When charter schools award seats to middle-income students who have access to other high-quality public or private schools, these spots come at the expense of poor students with few other good options.  For instance, Marcia Aaron, the executive director of the 99 percent minority and 89 percent low-income KIPP Los Angeles schools, told the Huffington Post that any policy that reduces available seats for disadvantaged students communicates that charters aren’t sufficiently focused on providing high-quality options for the neediest children.  To some, cultivating a critical mass of white middle-class students in charters is just not a good use of scarce resources.

However, there is a bounty of research showing that learning in a socioeconomically diverse environment is hugely beneficial for all students, and especially for the very population of underserved students that charters aim to serve.  Studies show that students living in low-income housing who attend integrated schools perform better than those who attend high-poverty schools.  Other research shows that integrated education leads to better academic performance, college enrollment, and economic gains for all students.

The high performance of some well-established charter networks like KIPP and Success Academy shows that low-income students can thrive in a racially and economically isolated setting.  This is tremendous news for schools and children in deeply segregated neighborhood tracts, where integration may be virtually impossible.

But in schools where integration is possible, cultivating diversity is a boon for low-income students, yielding benefits that charters simply cannot provide in isolation.  Unlike traditional public schools, charters aren’t necessarily beholden to their immediate geography.  These schools should move beyond the admirable initial instinct to triage the K-12 academic opportunities for as many disadvantaged students as possible, taking a broader conception of what it means to give these students a quality education.  In the twenty-first century, any definition of a well-rounded education requires a meaningful degree of integration.

Our policy is slowly coming around to this view.  Once a key plank of civil rights policy, proactive school integration fell out of favor after the busing controversies of the 1970s.  Lately though, more and more schools across the country are actively attempting to provide their students with a socioeconomically diverse learning environment.  In his final budget, President Obama has proposed a $120 million fund to reward schools that come up with the best plans to foster diversity.  And the Department of Education is now seeking ideas about how federally-funded turnaround schools can promote voluntary socioeconomic diversity.

One of the ideas on the table for many charter schools should be structuring their lotteries to prioritize a diverse student body.  Federal and state policy, along with philanthropic funding, will need to catch up so as to not disadvantage charters that deliberately try to integrate.  And some charters will need to reassess what it means to provide an equal educational opportunity to disadvantaged students.

But these are steps worth taking.  The data have long confirmed that students of all classes and races benefit from going to school together.  A renaissance for integration would thus be remarkable progress for our education system.

The cost of kludge

There was a good piece yesterday at the New York Times’ Upshot by Susan Dynarski detailing how the $20 billion the government spends each year on tax credits for families paying college tuition is producing no rise in college enrollment. The squandering of this tax subsidy is a good case study and quantification of the consequences of kludgy public policy.

Each year, the government spends more than $20 billion on a trio of tax credits for families paying for college: the Hope Credit, the Lifetime Learning Credit, and the American Opportunity Tax Credit. The primary aim of these tax credits is to partially subsidize the cost of attending college, with the goal of encouraging more students to enroll in school.

How effective are these tax credits at boosting college enrollment? Not effective at all, it turns out. Dynarski describes the findings of a careful study conducted by economists George Bulman and Caroline Hoxby finding that the tax credits had no discernible impact on college enrollment. The economists pored over IRS data, studying the tax credits’ “phase-out” zone—the income range over which the credits are steadily reduced—to see if the declining value of the credits had any effect on household behavior. As Dynarski puts it, “If the tax credits help to increase college attendance, we should see this positive relationship between income and college attendance weaken where the tax credits phase out.”
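For readers unfamiliar with how a phase-out works, the sketch below shows a credit that declines linearly to zero across an income range. The dollar figures roughly track the American Opportunity Tax Credit for a single filer, but they are included only to illustrate the mechanism the study exploits—inside the phase-out zone the credit is worth less, so enrollment should dip there if the credit actually drives behavior—not as a substitute for the study’s specification.

```python
def credit_value(magi, max_credit=2_500, phase_start=80_000, phase_end=90_000):
    """Value of a tax credit that phases out linearly across an income range.

    Default parameters roughly track the American Opportunity Tax Credit for a
    single filer; they are illustrative assumptions, not the study's numbers.
    """
    if magi <= phase_start:
        return float(max_credit)
    if magi >= phase_end:
        return 0.0
    remaining_share = (phase_end - magi) / (phase_end - phase_start)
    return round(max_credit * remaining_share, 2)

for income in (75_000, 82_500, 85_000, 87_500, 92_000):
    print(income, credit_value(income))
# 75000 2500.0 | 82500 1875.0 | 85000 1250.0 | 87500 625.0 | 92000 0.0
```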

Bulman and Hoxby found that phasing out the tax credits had no impact on enrollment decisions. This is probably because (1) the tax credits disproportionately help middle-class and high-income families, for whom a $2,500 tax incentive is unlikely to be a determinative factor in enrollment; and (2) the timing of the tax credits is not aligned to enrollment and tuition payment.

The latter explanation—that these tax credits are delivered too late to have any impact on decision-making—is part of an increasingly common shortcoming in our public policy. We are essentially pumping $20 billion or more each year to enroll students in college each fall, but we aren’t paying families this subsidy until the following spring. This makes little sense if the goal of the policy is to make college accessible and to nudge toward enrollment those students for whom $2,500 matters.

This timing mismatch has become endemic as we’ve increasingly baked our social policy into the tax code, creating what Suzanne Mettler calls a “submerged state” of hidden subsidies and social programs. For instance, in order to curtail direct subsidies to the poor, we’ve replaced traditional welfare with a tax credit meant to increase the value of work. But this tax credit only comes in an annual tax refund, and is not aligned to the work someone puts in throughout the year. And rather than pay families a child allowance to help defray the cost of raising a family, the government instead gives families a single annual tax break that is also ill timed to meet year-round childrearing costs and needs.

In short, we’ve been constructing policy in the exact opposite of the maxim of Occam’s Razor: the simplest solution might be the best, but the most convoluted policy has become the one most likely to see enactment. Political scientist Steven Teles wrote in 2012 about our political system’s bias toward policy “kludges”—that is, “inelegant patch[es] put in place to be backward compatible with the rest of a system.” “For any particular problem,” Teles writes, “we have arrived at the most gerry-rigged, opaque and complicated response.”

The higher education tax credits are classic kludges. Rather than simply directly subsidize the cost of college, we’ve opted to bury this financial benefit in the tax code, where it is difficult to access and doesn’t arrive when families need it.

There are numerous institutional causes of this tendency toward kludging. Our constitutional structure—a presidential national government with divided powers, and “marble-cake” federalism with overlapping state and federal authority—creates endless veto points that legislation must pass through. During the political and legislative process, relatively simple policy is often compromised down to an unduly complex shadow of its former self in order to appease a consensus of interests. In divided government, a president cannot simply sweep away past programs in the same way a parliamentary system might; instead, presidents typically must layer new policy on top of old programs in recognition of societal path dependence.

Moreover, the ideological preferences of the parties sometimes converge around policy kludges. Liberals favor government spending, while conservatives favor tax relief. Increasingly, both sides have mutually agreed to funnel targeted government spending through the tax code in the guise of a tax cut.

Teles points out that this easy harmony is bad for both liberals and conservatives. For liberals, sneaking social spending through the tax code leaves people unaware that they are benefiting from government assistance, facilitating the conservative myth of rugged individualism and pure self-reliance. For conservatives, promulgating kludgy and hard-to-navigate tax programs creates a class of private actors benefiting from the complexity, turning them into lobbyists for entrenched government activity. This is why we see the tax prep industry fight so ardently against any attempt to simplify our uber-kludge tax code.

These costs of kludge, however, are nothing compared to the social costs. More than 20 percent of families eligible for the Earned Income Tax Credit fail to claim it. And now we know that the $20 billion spent to help families afford college has little impact. By relying on kludges, we create needless inefficiency and dilute the effectiveness of our policy programs.

This is not to say that these compromise policies aren’t worth enacting. They help inculcate the principle that education, work by low-income families, and children are worth investing in—a principle that hopefully can later be built upon with a more robust system of investment.

The shining example of what our policy could be is Social Security, where government simply collects a tax from our paychecks during our working years and automatically cuts us a regular check during retirement. If our higher education subsidy program operated like our retirement subsidy program, we would simply subsidize the cost of tuition at the time of payment, rather than asking families to front the money and be reimbursed come tax time.

Social Security is one of our most popular government programs precisely because of its simplicity. And there is a clear political hunger for dredging the kludge out of our public policy. Much of the appeal of Sen. Bernie Sanders’s call for single-payer healthcare seems to arise from a basic desire to simplify the experience of accessing and financing healthcare, and to streamline the current system’s exhausting byzantine complexity.

As tempting as it may be to root the kludge out of the system, the political and institutional forces toward kludge show little sign of ebbing. However, there is one solution to move us in the right direction that may be more politically palatable: fighting kludge with kludge.

ObamaCare is often held out as the ultimate kludge, hacked up by five congressional committees to finance quasi-universal health insurance through a fragmented and primarily private system. But it did subvert at least one kludge: it pays out its “tax credits” for those who purchase private insurance on health exchanges directly to the insurers at the time of premium payment. Had ObamaCare gone full kludge, it could have taken after our other submerged state policies and paid out its subsidy for insurance through a single lump sum tax refund, which would have drastically reduced its usefulness and effectiveness.

Instead, ObamaCare has helped establish the advance tax credit as a feature of our public policy. We could use the ObamaCare model to turn our higher ed subsidies into advance tax credits paid directly to colleges, as Bulman and Hoxby suggest. We could similarly make periodic payments of our childcare and earned income subsidies to families to better meet childrearing costs and work performed. (The Earned Income Tax Credit used to have an advance payment option, but the program failed to gain much traction because it was run through employers and suffered from other flaws.)

It’s worth noting that the term “advance tax credit” is somewhat disingenuous, as an advance tax credit is indistinguishable from a subsidy. But this structure capitalizes on our political system’s preference for tax spending while ironing out the timing inefficiencies inherent in traditional tax credit subsidies. In doing so, our policy more closely approximates the direct and simple tax and payment structure of Social Security or a child allowance.

Our policymaking instinct to settle upon the kludgiest common denominator is producing suboptimal outcomes, costing us in college enrollment and beyond. In a perfect world, we’d straighten out our roundabout tax code social programs. But in the world we live in, we may stand a better chance of bending backward from the status quo to improve the efficacy of our tax credit welfare state—in essence, doubling down on kludge.

Rainy Day Refunds

Tax Day is upon us, and Senators Cory Booker and Jerry Moran have teamed up on a bipartisan proposal aimed at using tax refunds to boost Americans’ savings. Known as the Refund to Rainy Day Savings Act, the bill would give tax filers the option of setting aside a slice of their tax refund in an emergency savings account, to be deposited directly into their checking account six months later.

Under Booker and Moran’s plan, tax filers could elect to set aside 20 percent of their refund in a savings account, where it would accrue interest for six months before being transferred into their checking account. The idea is to smooth out payment to make tax refunds last longer and give families a cushion if a financial emergency strikes later in the year. “This bipartisan legislation would allow Americans to utilize a rare moment of financial flexibility that accompanies a tax refund to plan for the future, set aside savings for a rainy day, and invest in their own financial stability,” says Senator Moran.
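To put rough numbers on the mechanism, here is a small illustration. The refund size and interest rate below are my own assumptions for the sake of the example, not figures from the bill:

```python
# Illustration of the Refund to Rainy Day Savings mechanism: 20 percent of a
# refund is deferred for six months and accrues interest before being paid out.
# The refund size and interest rate are assumptions chosen for illustration.
refund = 3_000.00        # hypothetical federal refund
set_aside_share = 0.20   # share deferred under the proposal
annual_rate = 0.02       # assumed interest rate

deferred = refund * set_aside_share
paid_at_tax_time = refund - deferred
paid_six_months_later = deferred * (1 + annual_rate / 2)  # six months, simple interest

print(f"Paid at tax time:      ${paid_at_tax_time:,.2f}")      # $2,400.00
print(f"Paid six months later: ${paid_six_months_later:,.2f}")  # $606.00
```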

On the whole, this is an encouraging idea. The brutal combination of rising financial pressure and stagnating incomes has left too many Americans treading water with their savings. More than 60 percent of Americans have less than $1,000 in their savings accounts, and one out of five doesn’t even have a savings account. Even when factoring in liquid assets like retirement accounts, 44 percent of Americans do not have sufficient funds readily accessible to stay afloat for three months. Booker and Moran hope to capitalize on the momentary financial breathing room following a tax refund to help Americans build up their savings.

Booker and Moran’s bill is particularly concerned with building savings among the most vulnerable Americans. “Families living paycheck to paycheck endure the persistent threat of sudden financial disaster,” Senator Booker notes.

To that end, their bill is largely an adaptation of a policy proposal from the Corporation for Enterprise Development (CFED) to reform the Earned Income Tax Credit in order to provide targeted help to these very families. The EITC is a refundable tax credit and one of our most important anti-poverty programs, providing up to $6,000 in additional income to working families annually. Under CFED’s Rainy Day EITC plan, families could shift 20 percent of their EITC refund into a savings account, getting it back six months later. (CFED’s plan also gave families a 50 percent match from the IRS to further incentivize saving.  Booker-Moran does not.) It’s one of several policy proposals attempting to spread payment of the EITC to alleviate the uneven “boom and bust” yearly budget cycle for low-income households getting a large portion of their income in a single tax refund.

I’m partial to a variation of this type of plan that advances half of a family’s EITC refund in four quarterly payments, rather than deferring it. This plan was authored by Brookings Institution economist Steve Holt and tested out in Chicago by the Center for Economic Progress. CEP found that families receiving periodic payments had greater financial stability, reduced their debt load, avoided predatory loans, and accumulated savings.

Poor families have a hard time saving because their resources are already stretched too thin to meet regular living expenses. One study found that 40 percent of EITC dollars were used to pay down debt, while only 39 percent of families put any of their refund toward savings, tucking away an average of 15 percent of their refunds. Because families live in the red year-round, their tax season windfalls are disproportionately spent paying down debts accrued by just keeping up with the year’s expenses. Paying off these debts from budget shortfalls thus crowds out savings.

If we advanced EITC refunds to families, they could draw on this money throughout the year to avoid sinking into costly debt. Freeing up resources that are currently spent clawing out of debt would give families the chance to create meaningful savings.

CFED considered structuring its Rainy Day EITC as a periodic payment, but evidently determined that “monthly payments would be too small and would simply be incorporated into monthly budgets,” and that other periodic deliveries would be unduly complex. However, a structure like CEP’s Chicago experiment, which pays half of a family’s EITC in four quarterly payments, would retain simplicity and make fairly sizable payments: a family due $4,000 in EITC would receive $2,000 at tax time and $500 every three months. Absorbing these $500 payments into their regular budget is precisely the point, allowing families to make ends meet without relying on debt.

In their study of EITC-recipient families in It’s Not Like I’m Poor, Kathryn Edin, Laura Tach—both of whom also co-authored the Rainy Day EITC proposal—and their co-authors found that these families approached tax season with tremendous anticipation and great relief when their refunds arrived. Tach and Edin laud the lump sum EITC for providing a rare opportunity for families to rise above scraping by to plan their finances, make investments, and indulge in middle-class treats. But the joy of these families at tax time is the joy of someone momentarily lifted above constant financial peril, resembling the desperate relief of a person starving for a year finally receiving a decent meal.

Indeed, Tach and Edin found that for these families, “[g]etting into debt and trying to dig out of it were near-universal experiences.” This financial whipsawing could be avoided by padding the budgets of these families year-round. Staving off debt is the first step to building the savings cushion needed to survive misfortune and financial shock.

The proposal from Senators Booker and Moran may well help do that. It’s a laudable effort to unlock the money due to American families from the strictures of the tax code. As a general matter, the resources owed to Americans should be made available to meet the actual timing of their needs, not tuned to the tax calendar.

But there might be more effective ways to help some of the most vulnerable families shore up their finances. When families wait on their tax refunds all year long as a chance to finally catch up, it’s hard to ask them to withhold this money for a rainy day. These families might benefit more from help today than from setting money aside for tomorrow.

Why we need Big Government in the twenty-first century

For years, it has been an article of faith in American politics that more government is bad for the economy.  Candidates for office run against Washington and condemn the overreach and toxicity of government, with hardly anyone batting an eye.

But the conventional wisdom that government has only negative effects on markets forgets the invaluable role that government played in creating our national wealth in the past.  Re-learning this history and its lessons is critical to fostering widespread prosperity in the twenty-first century.

Government has had few defenders in politics over the last 35 years.  On the right, beating up on government has been a sure-fire way to gain approval.  In the 1980s, Ronald Reagan issued a clarion call that government was the source of our problems, not its solution.  In the 1990s, House Speaker Newt Gingrich declared open warfare on government.  And in the Obama years, Senate Majority Leader Mitch McConnell ground the legislative process to a halt to breed frustration with government and reap electoral gains for his anti-government party.

Liberals have let the right’s government bashing go unchallenged, and at times have actively acquiesced to it.  Bill Clinton essentially conceded conservatives’ point when he pronounced the era of Big Government over in 1996.  The center-liberal position was that government needed to get smarter, more effective, and more efficient—not grow larger.  We see this today in Hillary Clinton’s campaign.  Though her policy platform implicitly recognizes that government can intervene constructively in the economy, she has criticized Bernie Sanders for wanting to grow the size of government.  And while Sanders unabashedly favors expanding government programs, he also sees our current government as overrun with special interests and big money.  This too darkens our confidence in our governing institutions.

The net effect has been a steady cultural decline in our collective faith in the capabilities of government.   The right has attacked government as inherently destructive; the left has ceded the point outright or through silence, while bemoaning government as hopelessly corrupted.

But what if this widespread cynicism about government has it all wrong?  That’s the message of Jacob Hacker and Paul Pierson’s important new book, American Amnesia.  According to the two political scientists, the American experience shows that “government and markets, working in tandem, have steadily increased human welfare.”

Hacker and Pierson demonstrate just how indispensable government was in creating twentieth-century prosperity in the United States.  Beginning just after the turn of the century, government interventions vastly improved public health, extended lifespans, stabilized financial markets, and laid the groundwork for robust economic growth.  Government fronted the research and development that led to revolutionary technologies years later, laying the platform upon which companies like Apple, Google, and countless others would later thrive.

The result was a century of phenomenal improvement in human wellbeing.  Because government interceded to craft a mixed economy, Americans became wealthier, healthier, lived longer, and enjoyed the fruits of innovation and technology.

For much of the twentieth century, government and markets worked in happy (if at times begrudging) harmony.  The heavy thumb of government stepped in when the market failed to account for harmful externalities like pollution.  Government produced public goods like roads, infrastructure, education, and scientific research that markets wouldn’t.  This laid the groundwork for nimble markets to build on public investment, innovating and improving quality of life.

The mixed economy became so institutionalized that we eventually lost sight of the role government played in providing these crucial market supports.  By the end of the century, government had become widely demonized while entrepreneurs were lionized.  Yet during this same period, the mixed economy stopped functioning as well as it once had.  Inequality steadily rose, wages stagnated, our financial sector once again periodically jeopardized the economy, growth and recovery slowed, and wealth flowed more and more to the highest earners.

The solution is for government to reclaim its place in creating healthy conditions for the mixed economy to operate.  The deterioration of the mixed economy has left “money on the table just waiting to be picked up,” write Hacker and Pierson.  Government action can create positive-sum outcomes, making “our already prosperous society much more prosperous.”

Two of the main ways government can cultivate a thriving twenty-first century mixed economy are to invest in public goods and to insure against common risks.  When government invests in physical and intellectual infrastructure that is collectively useful but no one firm would invest in on its own, our markets are made better off by this government intervention.  And when government protects us against risk, it greases the wheels of markets by giving entrepreneurs and workers the security to take chances.

Sometimes the social insurance and public good functions of government dovetail together.  Take our child poverty crisis, where one in five children live in poverty today.  A decent safety net would protect children from the ravages of poverty on humanitarian grounds alone.  But high childhood poverty is also a market failure that is bad for our mixed economy. Growing up in poverty stunts lifetime academic and professional achievement.  By one count, child poverty costs our economy some $672 billion annually.  It is impossible to know how many would-be innovators and leaders we have lost amid the squandered potential of children suffering in deprivation.  Investing in children to protect them from poverty is thus also a public good—and one that pays off handsomely in the long run.

Moreover, the absence of government action can often impose burdens on the private sector.  In the void left by government inaction to address the student loan and college tuition crisis, private firms are increasingly shouldering the burden of helping young workers repay their burdensome debts as another fringe benefit.  But when government tackles common social problems and provides access to benefits, like health care and pensions in old age, firms are relieved of some of the obligation to manage these extraneous benefit programs.  Government actions to take on the social insurance responsibilities that have traditionally (and bizarrely) been grafted on to our employers encourage a dynamic, nimble future for the American economy.

For too long, the virtues of government have gone unheralded in our politics.  To make the mixed economy work again, we must re-learn a tried and proven path to success that has largely been forgotten: that in the story of American prosperity, government has always been an integral force.

Cash: A poverty solution worth trying

As I’ve been writing lately, child poverty is a moral crisis in the United States. One in five children live below the poverty line today—a hardship that devastates their development and inflicts untold suffering.

This national plight—and in particular, our abysmal tolerance for child poverty relative to other countries—has gained some attention lately thanks to the prominence Sen. Bernie Sanders has given the issue in his presidential campaign. But how exactly can we best help children living in poverty? It’s worth exploring recent experiments importing successful poverty relief programs from abroad, and thinking about how we can best adapt them to meet the needs of children in the United States.

Between 2007 and 2010, New York City undertook a bold policy experiment to put more cash in the hands of low-income families. The pilot program, spearheaded by Mayor Bloomberg and funded by a consortium of philanthropic organizations, offered 4,800 poor families cash payments for certain beneficial activities. For instance, families could receive $200 for an annual doctor’s visit, or $150 per month for maintaining a full-time job. Students could earn $25 to $50 per month for good school attendance, and $600 for passing a high school Regents exam.

The experiment was a conditional cash transfer (“CCT”)—a policy that transfers cash benefits to low-income families upon their meeting set conditions, which has been implemented with tremendous success in Latin American countries like Mexico. New York’s program—known as “Opportunity NYC Family Rewards”—was the first of its kind in the United States or any developed nation.

Family Rewards had mixed results. All told, the program transferred an average of $8,700 to families during the three-year period. It reduced poverty, cut down hunger, and boosted family savings. But the program fell short of expectations in other metrics. Most notably, the program failed to improve educational outcomes for elementary or middle school students—a key policy goal for the Bloomberg administration.

Some speculate that the design of Family Rewards was flawed from the start. Lawrence Aber, a New York University professor who participated in the implementation of Family Rewards, thought that the program’s conditions were too many and payments were too infrequent and too small. The program may have overwhelmed families with its ambition, trying to test too many incentives at once.

New York has since simplified the program and rolled out Family Rewards 2.0 on a smaller scale and with more targeted conditions in an attempt to improve upon the first pilot. For students, version 2.0 limits its rewards to high schoolers, providing cash benefits for attending 95 percent of scheduled school days in a month, taking an SAT or ACT exam, receiving good grades on a report card, and passing a Regents exam. This program has been implemented in both the Bronx and in Memphis, Tennessee. The results are still under evaluation, but early reports showed that the payments increased family incomes on average by more than $2,000 a year.

While the impact of the original Family Rewards program’s educational incentives so far appears disappointing, the program was successful at keeping families out of poverty by padding their incomes. This in itself helps promote academic achievement. We know that higher family income tends to improve students’ outcomes in school. In low-income families that receive higher EITC and CTC refunds, children score higher on standardized tests, are more likely to graduate high school, and attend college in greater numbers.

This is because poverty impairs the ability of children to succeed in school. Material deprivation interferes with cognitive processes, clouds students with toxic stress, and reshapes their brain functioning. Providing enough income to reduce poverty and reverse these harmful processes thereby has the effect of making it easier for students to achieve in school.

So what’s needed for low-income children likely is not an incentive structure to succeed in school or a set of rewards to enhance the value of academic growth. Rather, what’s needed is basic household income security to provide the foundation of resources necessary to facilitate learning and development — something like a child allowance. Such a policy would provide families with a basic payment to cover the essentials of raising children and setting them on a path to success from the earliest years of life.

But it may still be worthwhile to consider coupling the support of a child allowance with the nudging instinct of Family Rewards. One could imagine an enterprising city, school district, or even a well-funded and ambitious charter school trying out an initiative that (1) provides each family with a monthly “scholar success stipend” for each of their children, and (2) conditions receipt of a full payment on children meeting certain basic expectations in school.

For instance, a school could offer each family $200 per child each month as a stipend meant to help with the cost of school supplies, clothes, food, and any other resource the family’s children need to succeed in school. But if a student missed too many school days, or failed to meet behavioral standards, or didn’t complete homework assignments, her family’s payment could be docked.
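A rough sketch of how that docking logic might be encoded is below. Like the stipend itself, every threshold and dollar amount here is hypothetical; the point is only that the rules could be simple and predictable for families.

```python
BASE_STIPEND = 200.00  # hypothetical monthly stipend per child

def monthly_stipend(days_missed, behavior_ok, homework_completion,
                    per_absence_dock=25.00, homework_floor=0.80):
    """Start from the full stipend and dock it for unmet expectations.

    Every parameter here is an invented placeholder, not a tested design.
    """
    payment = BASE_STIPEND
    payment -= per_absence_dock * max(0, days_missed - 1)  # one excused absence
    if not behavior_ok:
        payment -= 50.00
    if homework_completion < homework_floor:
        payment -= 25.00
    return max(payment, 0.0)

print(monthly_stipend(days_missed=0, behavior_ok=True, homework_completion=0.95))  # 200.0
print(monthly_stipend(days_missed=3, behavior_ok=True, homework_completion=0.70))  # 125.0
```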

This structure has a few things going for it. When families receive a periodic and stable child allowance, they can better plan, incorporating the benefit into their household budget and into their spending and saving decisions. It’s much harder to do so with an incentive-based cash transfer program that pays out irregularly upon taking certain prescribed benchmark actions.

Moreover, we know that most people tend to be driven by significant loss aversion. That is, they fear loss of existing income more than they value the opportunity to gain additional income of the same amount. This means that for some families, the prospect of a $25 loss for failing to attend school regularly might be more powerful than a $25 reward for good attendance.

We also know that when a benefit is earmarked for children, parents do in fact tend to spend it on their children. Sociologist Jane Waldfogel studied Britain’s war on child poverty, and found that low-income families receiving child benefits increased spending on goods like children’s clothing, books, and toys, and decreased spending on alcohol and tobacco. The evidence thus refutes the boogeyman myth that poor families can’t be trusted to spend cash benefits appropriately.

Children are better off when they live in homes with enough income to meet basic needs. Decent economic security saves children from a whole range of detrimental long-lasting disadvantages, and sets them on the road to success. One way or another, curbing child poverty is an investment we desperately need to make, and it’s worth experimenting with how we can deliver this investment to families in the most effective way possible.

Looking to Britain in fighting child poverty

The Century Foundation recently released an important report on how we can provide more support for American children by adopting a universal child allowance.  With nearly twenty percent of American children living in poverty, a regular cash benefit for all families would provide effective and efficient protection against hardship.

One of the lead authors on the TCF report is Jane Waldfogel, a sociologist who has studied Britain’s remarkably successful fight against child poverty. Over the last fifteen years, Britain has cut child poverty in half with a combination of child benefits, early childhood investments, and family-friendly work policies. In 2012, Waldfogel gave a lecture at Cornell University on “What the U.S. Can Learn From Britain’s War on Poverty,” detailing exactly how Britain halved its child poverty rate and drawing lessons for the United States.

During the 1980s and 1990s, Britain saw its child poverty rate rise rapidly. In 1999, Prime Minister Tony Blair committed Britain to ending child poverty in twenty years. To do so, he aimed to “reform the welfare state and build it around the needs of families and children.”

This ultimately became a three-pronged strategy: (1) Promoting work and making work pay, by incentivizing work and boosting take-home pay; (2) Raising incomes for families with children by subsidizing child-rearing costs; and (3) Investing in children via early childhood services and workplace reforms.

Britain modeled its efforts to promote and incentivize work on American welfare reform under President Clinton, with some key departures. Like the U.S., Britain adopted welfare-to-work programs and a working families tax credit similar to the U.S. Earned Income Tax Credit.

But Britain’s reform did not initially require single mothers to work. Under welfare reform in the U.S., single mothers must work 30 hours each week or risk losing benefits, and are only eligible for traditional welfare benefits for five years. Britain, on the other hand, only recently required single mothers to work once their oldest child turns twelve (a requirement that has since been lowered to seven under David Cameron).

Britain’s second front against child poverty was to raise incomes for families with children. Britain provides a universal child benefit to all families with children. Families receive about $30 per week for their first child, and $20 per week for each additional child. Britain provides an additional benefit to families with children less than ten years old, determining that it’s a good investment to provide extra support for children in their earliest years.
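Using the rough dollar figures above, the weekly benefit scales simply with family size:

```python
# Weekly child benefit using the approximate figures cited above:
# about $30/week for the first child and $20/week for each additional child.
def weekly_child_benefit(num_children):
    if num_children <= 0:
        return 0.0
    return 30.00 + 20.00 * (num_children - 1)

for kids in (1, 2, 3):
    print(kids, weekly_child_benefit(kids))  # 1 -> 30.0, 2 -> 50.0, 3 -> 70.0
```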

Britain also boosted family incomes by adopting a child tax credit, which is fully refundable for low-income families and has no work requirement. Low-income families with new children also received an additional tax credit. And all newborn British children received a child trust fund — a baby bond savings device where the government matched all family contributions.

Third, Britain invested in children by funding early childhood services and enacting family-friendly workplace reforms. New mothers are entitled to nine months of paid maternity leave. The first six weeks are paid at 90 percent of their average weekly earnings, and the remaining thirty-three weeks at a maximum of $207 per week. Fathers receive two weeks of paid paternity leave at a set flat amount (which Prime Minister Blair took while in office). Britain also provides maternity grants to low-income families.

Britain also enacted a so-called “right to request” law, which gives employees the right to request part-time or flexible working hours from their employers. Employers must consider the request seriously and may only decline it for a legitimate business reason. This policy has been highly successful, as 91 percent of requests are ultimately approved. The right to request was originally available for workers with young children, but has since been expanded first to cover workers with any children or elderly parents, and then to cover all workers.

Britain also provides universal pre-K for all three- and four-year-olds based on a voucher system that provides funds for parents to place their children in the school of their choice. This program also covers two-year-olds from disadvantaged families.

To navigate these programs, Britons rely on a series of community children’s centers. Though these centers were originally placed only in low-income areas, they proved to be incredibly popular and were expanded throughout the country. The centers coordinate child services for families and are tasked with locating sufficient childcare for working families, which is provided by the market, not directly from the government.

All told, the cost of these programs to combat child poverty amounted to 1 percent of Britain’s GDP annually. This was pitched to the public as “one percent for the kids.”

Waldfogel points out that, contrary to the claims of skeptics, there’s no evidence that this public expense was squandered. Studies show that families with young children used their income boost to spend more on their kids and less on alcohol and tobacco.

Some of Britain’s reforms were pared back during Prime Minister David Cameron’s austerity response to the Great Recession, but the vast majority of the anti-poverty effort carried forward. During austerity, Britain eliminated the new child trust funds and new baby tax credit. It capped the child tax credit for middle-income families and froze the value of the child benefit for three years. But it compensated for these cuts by increasing the value of the child tax credit for low-income families to prevent a subsequent rise in child poverty.

Waldfogel draws framing lessons for the U.S. from Britain’s experience. Unlike in the United States, there is little racialization to the perception of poverty in Britain. Being poor is not necessarily associated with any one ethnic group in the public’s mind, so the mission to fight child poverty wasn’t seen as disproportionately helping any one group. Not so in the U.S. Because of the divisive politics of race and poverty in the United States, Waldfogel suggests that an analogous American effort might be best framed in terms of “child hunger,” “child housing,” “child opportunity,” or “child investment.”

However it’s framed, the U.S. would be wise to take note of Britain’s success combating child poverty. Britain’s reforms are broadly consistent with American policy traditions of supporting working families. But Britain went further to provide paid family leave, child allowances, the right to flexible work scheduling, and universal pre-K. Britain is also more generous to low-income households by providing more refundable tax credits, and to non-working parents like single mothers by maintaining their benefit eligibility.

Some of these reforms are already being contemplated in the United States, and the others should be, too.   With more than one out of six American children living in poverty, we should look across the Atlantic for proven ways to end this national moral crisis.

The Lion and his long game

I’ve been enjoying Nick Littlefield and David Nexon’s excellent book The Lion of the Senate, which chronicles Sen. Ted Kennedy’s remarkable legislative success during the 1990s in spite of Newt Gingrich’s 1994 Republican revolution. It provides a number of lessons for liberals today chasing a policy vision in Kennedy’s absence.

Kennedy’s top legislative aspiration had long been universal healthcare. He first articulated his vision of comprehensive health reform in 1971, proposing a single-payer plan that would have insured all Americans, and covered at least 70 percent of all medical expenses.

With comprehensive reform out of reach, Kennedy turned to incremental piece-meal reforms. In 1986, he helped pass EMTALA, which prohibits hospitals from refusing emergency room patients because they lack insurance. That same year, he helped pass an omnibus budget bill that extended health insurance coverage to the unemployed (commonly known as “COBRA” coverage).

Comprehensive health reform seemed within the country’s grasp early on in the Clinton administration in 1993. But that effort ultimately went down in defeat and sparked the Republican takeover of Congress. Rather than give up, Kennedy looked to build a coalition around smaller consensus, no-brainer health initiatives.

To that end, he teamed up with conservative Sen. Orrin Hatch to pass the Children’s Health Insurance Program in 1997, which provides health coverage for more than six million low-income children. This achievement, Littlefield and Nexon write, marked “a giant step toward universal health insurance coverage for children and a milestone in Kennedy’s quest for the day when health care would be a right, not a privilege, for every American.”

And of course, in his final years, Kennedy was instrumental in constructing the legislative framework that ultimately became the Affordable Care Act, the most comprehensive effort at universal healthcare in the United States to date.

Kennedy spent his whole career calling for reform that would transform healthcare from a privilege to a right. By 2013, President Obama was trumpeting the Affordable Care Act as achieving just that, declaring, “In the United States, health care is not a privilege for the fortunate few, it is a right.”

Kennedy played the long game to get to this point. He settled for positive but incremental reforms that nudged the system in the right direction and helped millions of people.

Importantly, these reforms helped, little by little, deepen the norm that healthcare is a right and not a privilege. He built a consensus around codifying this principle for the most vulnerable for whom denial of care was the most morally indefensible: those with emergency illnesses (under EMTALA) and children born without access to care (CHIP). He extended this principle to those most consistent with traditional American notions of virtue and sympathy: working families who had suffered job loss (COBRA).

With these precedents under his belt, it was just a short leap to extend this principle to virtually all Americans under the ACA. If the most vulnerable have a claim to healthcare as a right, why not everyone else? By 2009, this sensibility had become commonplace, and the ethos behind Kennedy’s slew of small-scale healthcare achievements made it much harder to deny granting the right to healthcare to the rest of the country.

The arc of Kennedy’s achievements on healthcare has important lessons for liberals aspiring for greater social reform. Compromise is a virtue not just because the perfect shouldn’t be the enemy of the good, but because positive compromise today can help deepen the principles that will get you closer to the perfect tomorrow.

Take an issue like poverty or income security. The idea of a basic income that secures freedom from resource deprivation is resurgent on the left, with commentators at home and governments abroad increasingly exploring a guaranteed income.

Perhaps this is a worthy vision to aspire to. But we are far away from a basic income being a plausible political reality in the United States.

Instead, liberals should follow Kennedy’s lead and start by providing income security for the most vulnerable and politically sympathetic: to those whose lives are in imminent danger from their poverty, like the chronically homeless and hungry; to children who were born into poverty through no fault of their own; and to working families who did everything right but fell on hard times due to job loss.

To help poor children, we could enact a child allowance—a common social benefit in Canada and much of Europe. This benefit could start out by focusing on the youngest children, who are most severely harmed by spells of poverty, as some have already proposed.

For working families and those out of work, we could enact a host of policies to enhance their economic security: wage insurance to cushion them from a cut in pay or a change in jobs; periodic pay subsidies for low-income workers to boost their purchasing power and compensate for wage stagnation; and more robust unemployment insurance to keep both families afloat during bouts of joblessness and the economy stabilized during recessions.

And for the destitute, the homeless, and the hungry, we must provide a safety net adequate to keep them from going without shelter or food. This may be the most difficult reform to channel political energy behind, as the poorest of the poor have virtually no voice in our politics. Moreover, unlike being stricken with a medical emergency, few Americans see themselves as at risk of deep and severe poverty. Nonetheless, there is a clear moral imperative to provide more comprehensive protection from life-threatening poverty.

Some of these initiatives can build off of existing programs. The Child Tax Credit could be converted into a child allowance by making it fully refundable, dropping its income thresholds, and giving families a periodic payment option. Similarly, the Earned Income Tax Credit could be transformed into a wage subsidy, delivered as a periodic government check or direct deposit that acts as a work bonus.
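
For concreteness, here is a minimal sketch of how a “periodic payment option” might work in practice. The benefit amount and the simple divide-by-twelve schedule are illustrative assumptions on my part, not details drawn from any actual proposal.

```python
# Illustrative sketch of advancing an annual refundable child benefit in
# monthly installments. The $1,500 figure and the true-up rule are
# hypothetical, chosen only to show the mechanics.

ANNUAL_BENEFIT_PER_CHILD = 1500  # assumed annual benefit per eligible child

def monthly_advance(num_children: int) -> float:
    """Pay out one-twelfth of the estimated annual benefit each month."""
    return num_children * ANNUAL_BENEFIT_PER_CHILD / 12

def year_end_true_up(num_children: int, months_advanced: int) -> float:
    """At filing time, compare the full benefit to what was already advanced;
    a positive result is paid out as an additional refund."""
    total_due = num_children * ANNUAL_BENEFIT_PER_CHILD
    return total_due - monthly_advance(num_children) * months_advanced

print(monthly_advance(2))        # 250.0 per month for a family with two children
print(year_end_true_up(2, 12))   # 0.0 once all twelve payments have gone out
```

The design question is less about the arithmetic than about delivery: a benefit paid out monthly arrives when the bills do, rather than as a single refund in the spring.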

Each of these initiatives would be a tremendous achievement in its own right. Fewer children would grow up in deprivation; fewer families would be set back by job loss; and fewer people would go without basic food and shelter.

But collectively, they would also stitch together a series of programs reflecting a principle in favor of basic income security and freedom from need. Slowly but surely, we’d move closer both in practice and principle to a norm that demands a guaranteed level of basic resources for all.

Indeed, Kennedy’s health reform efforts—the “cause of [his] life”—fit into this narrative as well. Universal healthcare is ultimately about security in our lives, bodies, and resources. The right to healthcare is, at a fundamental level, the right to personal security. We can follow Kennedy’s long game and pick up where the Senate’s lion left off.

A universal young child allowance (almost) comes to Congress

Child poverty is a moral scourge in the United States. Nearly 20 percent of all children live in households below the poverty line. Poverty is most prevalent among young children, as 25 percent of children under the age of 3 live in poverty during some of their most developmentally formative years. These numbers are uniquely high among developed nations, most of which provide far greater government transfers to support families and children.

This has disastrous consequences for these young Americans. Poverty impedes physical and mental development; it impairs the ability to learn in school; and it even alters children’s brain composition. Years later, the effects of childhood poverty continue to reverberate, from depressing the value of a college degree to diminishing earnings in adulthood. To be raised in poverty is nothing short of a lifetime affliction.

New legislation introduced in Congress would begin to ease our child poverty crisis. Representative Rosa DeLauro (D-CT) has introduced a proposal for a Young Child Tax Credit—a fully refundable $1,500 tax credit for each child under 3 years old. Importantly, DeLauro’s tax credit would not have a minimum income threshold, meaning it would be available to all low-income families. These features make the YCTC distinct from our current Child Tax Credit, which is available to families with children of all ages, but is only partially refundable and cuts out the lowest earners entirely.
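
To see why the refundability rules matter so much at the bottom of the income scale, here is a rough sketch comparing the two credits. The current-law parameters (a 15 percent phase-in on earnings above $3,000, capped at $1,000 per child) are my approximation of the existing rules and omit higher-income phase-outs, so the numbers are illustrative rather than authoritative.

```python
# Simplified comparison of the existing, partially refundable Child Tax Credit
# with the proposed fully refundable Young Child Tax Credit. Parameters are
# approximations for illustration; phase-outs at higher incomes are ignored.

def current_ctc_refundable(earnings: float, num_children: int) -> float:
    """Refundable portion of the existing credit: 15% of earnings above
    $3,000, capped at $1,000 per child (approximate current-law rule)."""
    phase_in = max(0.0, earnings - 3000) * 0.15
    return min(phase_in, 1000 * num_children)

def young_child_tax_credit(num_children_under_3: int) -> float:
    """Proposed YCTC: $1,500 per child under 3, fully refundable,
    with no earnings requirement."""
    return 1500 * num_children_under_3

for earnings in (0, 3000, 10000, 20000):
    print(earnings,
          current_ctc_refundable(earnings, 1),   # 0.0, 0.0, 1000.0, 1000.0 across the loop
          young_child_tax_credit(1))             # 1500 at every earnings level
```

A family with no earnings receives nothing from the existing credit but the full $1,500 per young child under the YCTC, which is precisely the gap DeLauro’s proposal targets.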

DeLauro’s proposal would be a big step toward curtailing child poverty in the earliest years of life. Consistent with practice in other countries, it provides a supplemental benefit for families with young children, in recognition that a new child imposes a substantial economic burden.

DeLauro’s legislation is modeled off of a report released by the Center for American Progress last year. And while her proposal is an important progressive reform, it deviates from CAP’s version of the young child tax credit in one significant way. CAP structured its tax credit as a periodic advance tax credit—that is, families with young children would have the option of receiving a $125 payment per child every month via direct deposit or a government-issued debit card. DeLauro’s plan provides families with a lump sum of extra support through a tax refund; CAP’s provides families with regular year-round support.

By structuring the YCTC as a traditional tax credit, DeLauro continues our habit of submerging important social benefits in the tax code. This forces families to navigate the tax system and incur preparation and filing costs to claim these benefits. Some benefits thus go unclaimed, and others are squandered on tax prep fees.

The CAP proposal, on the other hand, is essentially a European child allowance dressed up in kludgy American nomenclature. It’s a “tax credit” in the same way that the Affordable Care Act’s subsidies for purchasing insurance are advance “tax credits.” These direct payments are common across the OECD, and there are proposals percolating to convert several important American tax credits into periodic direct benefits, including the earned income credit for low-income workers, the American Opportunity credit for families with children in college, and the Child Tax Credit itself.

Direct payment of social benefits has its obvious advantages. It’s simple, and the benefits are timed to meet family needs throughout the year. American journalist Russell Shorto discovered the refreshing simplicity of the child allowance upon relocating to the Netherlands: “Every quarter,” he explained, “the [Dutch Social Insurance Bank] quietly drops $665 into my account with the one-word explanation kinderbijslag, or child benefit.”

But direct payment is also a more efficient and fiscally responsible way to reduce poverty. Last week, the Century Foundation released an important report on the merits of a child allowance for the United States. TCF found that a child allowance that provides $2,500 per child annually for all families would lift 5.5 million kids out of poverty and reduce the child poverty rate by more than a third.

Importantly, TCF also found that a child allowance would have more anti-poverty impact than an equivalent increase in spending on our current Child Tax Credit. “A dollar invested in a universal child allowance,” the report finds, “would do more to reduce child poverty than a dollar spent on an expanded child tax credit.” For instance, TCF found that a $2,500 child allowance would cost an additional $109 billion per year and would cut the child poverty rate by 5.1 percentage points. A $4,000 Child Tax Credit, on the other hand, would cost an additional $101 billion per year while cutting child poverty by only 1.2 percentage points. This is because a child allowance would reach all families, including those with little or no income, and would thus rescue more children from deep poverty.
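
A quick back-of-the-envelope calculation using the TCF figures quoted above makes the efficiency gap concrete; the per-point cost ratios are my own arithmetic, not numbers taken from the report.

```python
# Cost per point of child-poverty reduction, computed from the TCF figures
# cited above; the derived ratios are back-of-the-envelope arithmetic.

allowance_cost, allowance_cut = 109e9, 5.1   # $2,500 universal child allowance
exp_ctc_cost, exp_ctc_cut = 101e9, 1.2       # $4,000 expanded Child Tax Credit

print(allowance_cost / allowance_cut / 1e9)  # ~21.4 billion dollars per point
print(exp_ctc_cost / exp_ctc_cut / 1e9)      # ~84.2 billion dollars per point
```

By that rough measure, each point of poverty reduction costs roughly $21 billion under the allowance and roughly $84 billion under the expanded credit, about a fourfold difference, which is what TCF means when it says a dollar goes further as an allowance.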

While child allowances are common overseas, they have not been widely debated in the United States. Nonetheless, the TCF report joins a small but growing chorus of adherents pushing for such a policy to combat insidious American childhood poverty.

Increasingly, policy thinkers are recognizing not just the damage done by child poverty, but also the merits of cash aid to families. Studies of children in families receiving the Earned Income Tax Credit, Cherokee casino payments, and other external or governmental windfalls all show clear benefits accruing to children when their households receive a boost in income.

Cash assistance helps relieve the deprivation of children in poverty. This is a moral imperative, and policy must act to address it. But child poverty also amounts to an annual $672 billion drag on the economy (nearly 4 percent of GDP), so relieving this poverty has positive effects for the broader economy as well. Investments in young children are thought to save massive sums of money over the long run, which makes the fiscal case for cash assistance to these children today look even better. (Note that TCF did not rely on the macroeconomic effects of a child allowance to offset the projected cost in its report.)

The United States currently tolerates an unconscionable level of hardship and suffering among children. We must act to protect children from poverty and the lifelong ravages it inflicts. Rep. DeLauro’s proposal is a bold step in the right direction. Though it falls short of a regularly timed child allowance, it does reach the poorest and most vulnerable families, and may be the best policy that our political system can bear right now.

Ultimately, we should consider a universal cash allowance for all American children. But providing urgently needed support for our youngest citizens is a good place to start.

Conservatism’s working class blues

In 2004, Thomas Frank set off a firestorm of debate with his book What’s the Matter with Kansas? He explored the question of why working-class white voters in America’s heartland insisted on supporting conservative candidates, even when such support might cut against their own economic interests. Frank’s answer was that conservatives were duping these voters by raising the salience of social wedge issues like abortion and same-sex marriage.

In the years since Frank’s book became a touchstone in liberal circles, conservatives have consistently blasted Frank’s thesis as a condescending and elitist anthropological dive into the heartland’s psychology. It exposed the “smug superiority on the left,” they said. Even today, they condemn Frank’s narrow vision of what’s good for the working class: “To Frank, the idea that voters might have interests beyond their economic status was unthinkable.”

Regardless of whether one accepted Frank’s theory, he was early in detecting a certain angst in the heartland. The seemingly curious voting patterns were simply an indicator that something was awry.

We are now learning that Frank may have had his thumb near the pulse of a deeper crisis than we knew. In November, economists Anne Case and Angus Deaton published a jarring study finding that the mortality rate for middle-aged white Americans had singularly and sharply increased over the last decade and a half. The authors found that “poisonings” and suicides among this population had spiraled to previously unseen heights. People were medicating themselves, abusing opioids, and, increasingly, ending their lives.

Deaton speculated that these Americans had “lost the narrative of their lives — meaning something like a loss of hope, a loss of expectations of progress.” Traditional working-class jobs like manufacturing had vanished. Economic opportunity for those with only a basic education, once the norm, had all but disappeared. Despair in the heartland and among the working class was producing tangible and terrifying human devastation, the economists found.

At the same time, the Trump phenomenon was sweeping through this very population. Trump was dominating in the counties with the highest middle-aged white mortality rates. He was winning county after county with the fewest college diplomas; the most people out of work; and the greatest loss of manufacturing employment. In short, Trump was cleaning up in Case-and-Deaton Country: the places without jobs, education, or hope—the places where people were quite literally dying from despair.

Establishment conservatives, of course, have been tearing their collective hair out over the rise of Trump. They’ve pleaded with voters to see through his con routine and reject his strongman show, marshaling all of the intellectual firepower in their arsenal against his march to the nomination. Suddenly, there was a test of the conservative elite’s allegiance to the white working class it had long professed to defend.

So what does the conservative elite think of the communities it used to lionize as “real America,” now that those communities insist on supporting a candidate it loathes? Enter the National Review’s Kevin Williamson:

“The truth about these dysfunctional, downscale communities is that they deserve to die. Economically, they are negative assets. Morally, they are indefensible. Forget all your cheap theatrical Bruce Springsteen crap. Forget your sanctimony about struggling Rust Belt factory towns and your conspiracy theories about the wily Orientals stealing our jobs. Forget your goddamned gypsum, and, if he has a problem with that, forget Ed Burke, too. The white American underclass is in thrall to a vicious, selfish culture whose main products are misery and used heroin needles. Donald Trump’s speeches make them feel good. So does OxyContin. What they need isn’t analgesics, literal or political. They need real opportunity, which means that they need real change, which means that they need U-Haul.”

When the Twittersphere collectively gasped in horror at Williamson’s denunciation, the National Review only doubled down, sneering at the “self-destructive moral failures” of “millions of Americans [that] aren’t doing their best. Indeed, they’re barely trying. [. . .] Simply put, Americans are killing themselves and destroying their families at an alarming rate. No one is making them do it. The economy isn’t putting a bottle in their hand. Immigrants aren’t making them cheat on their wives or snort OxyContin. Obama isn’t walking them into the lawyer’s office to force them to file a bogus disability claim.”

Paul Krugman rightly connects this sentiment to Mitt Romney’s contempt for the 47 percent of Americans who make too little to owe federal income taxes, and to Speaker Paul Ryan’s critique of our social safety net as a coddling “hammock that lulls able-bodied people to lives of dependency and complacency.” “Stripped down to its essence,” Krugman concludes, “the G.O.P. elite view is that working-class America faces a crisis, not of opportunity, but of values.”

Simply put, this is the go-to conservative diagnosis of widespread crisis among those caught in the lower rungs of the social ladder. When evaluating (predominantly black) urban poverty, Ryan once warned that “[w]e have got this tailspin of culture, in our inner cities in particular, of men not working and just generations of men not even thinking about working or learning the value and the culture of work.” And when assessing what ails the unemployed, former Speaker John Boehner said the jobless would “rather just sit around” and coast off of public benefits.

Conservatives have been tsk-tsk’ing the morals of the urban poor and the jobless for years. Never mind that disability rolls have swelled in close correlation to the exodus of blue-collar jobs. Never mind that slashing unemployment benefits does nothing to aid the job search process. And never mind that the supposed cultural rot conservatives detect in poor African-American communities can overwhelmingly be traced to pervasive systemic disadvantage. When a community is in need, conservatives can almost always find a moral failing lurking close behind.

Yet this raises the fundamental question of whether a community’s moral anguish is the cause or the effect of its suffering. To Williamson and others, the white working class has “lost the narrative of their lives” because they picked up heroin needles, crushed OxyContin, and pulled one over on the Social Security office. End of story.

But under a more sympathetic—and, to my mind, more compelling—view of these communities, something has caused them to lose the narrative of their lives, and in response, they have stood numb as work disappeared, have resorted to disability checks just to make ends meet, and have increasingly succumbed to drug abuse or worse. The sky-high rate of poisonings, the futile search for meaningful work, and the alarming frequency of self-inflicted harm are indicators of a deeper existential crisis—a loss of self-value from far-reaching systemic upheaval. The dispiriting data uncovered by Case and Deaton are the symptoms, not the underlying disease.

What Case and Deaton want to discover—and what liberal policymakers want to fix—is that something: the root cause of this despair and these unmoored bearings. It’s undoubtedly too late to return to the ’60s and bring back onshore a vast, job-intensive, well-paying manufacturing sector. But if nothing else, we can craft a modern social insurance system to match the volatility and realities of 21st-century capitalism. Indeed, if we want to reap the tremendous gains of such an economy, we have a moral obligation to cushion those whom it inevitably fails.

But if the determinative moral failing is the individual’s (or the community’s), then conservatives can rest easy while doing little to remedy the plight. And that’s the causation they’ve largely chosen: the cause of a community’s pathology is the community’s insistence on being pathologic. The fix is for the community to simply stop acting in pathologic ways.

It’s a diagnosis that confers agency and abhors dependency, or so it’s claimed. But how convenient that this theory of what ails the working class fits so neatly within the contours of conservatism’s laissez-faire, free-marketeering predisposition. It’s a theory that lets conservatism wash its hands of the struggles of the dispossessed. Why alleviate hardship when you can moralize as it festers?

Conservatism has long claimed to defend the working class and the rural heartland from the snobbery of self-styled liberal elites. Now we know that these communities are suffering immensely in the twenty-first century. And the suffering has grown so acute that these communities have latched on in great numbers to a duplicitous and vulgar anti-politician who gives uninhibited voice to their rage and sense of past greatness lost.

Rather than defend these communities, some conservatives are turning their fire on them. “They deserve to die,” Williamson snorts. Which suggests that Frank was onto something all along. The white working class believed that conservatism had its back. But if there was ever any doubt, now it’s becoming clear: when times grow tough, too often the first instinct of conservatism is to cast judgment rather than to extend a helping hand.