School of Nursing Stat Mavens Make Numbers Tell the Truth

By Andrew Schwartz on June 15, 2006
Thanks to the old saw about "lies, damn lies and statistics," most people recognize that statistics can deceive. But the puckish humor in that saying (variously attributed to Mark Twain, Benjamin Disraeli and others) masks a phenomenon that too often has serious consequences.

Consider prospective federal legislation that would help low-income people with disabilities purchase personal assistance services. Those advocating for the legislation believe that because these services can help people avoid a nursing home, the financial support would be in keeping with the spirit of the Americans with Disabilities Act (ADA) - and would be less costly than nursing home care. But the law has not made its way to the floor of Congress, in large part because the Congressional Budget Office (CBO) estimated it would ring up annual costs of $10 billion to $20 billion. Coming from an official, nonpartisan organization like the CBO, the number automatically carries an air of legitimacy.

School of Nursing researchers Mitchell LaPlante, Steve Kaye and Charlene Harrington - who directs the UCSF-based Center for Personal Assistance Services (PAS), where LaPlante and Kaye are co-principal investigators - were convinced, however, that there was something amiss in the CBO estimate. They ran their own study and found the likely annual costs to be $1.2 billion to $3.2 billion - between $9 billion and $17 billion less than what the CBO came up with. Their study not only challenges the CBO's assumptions, but also raises the question of just how people arrive at numbers that become gospel in public discussions.

Fugitive Statistics
LaPlante, a sociologist, calls numbers like those the CBO published "fugitive statistics." In a career in which he's achieved nationwide recognition as a researcher and scholar in the field of disability - as well as for his methodological skills in survey research - LaPlante frequently encounters such fugitives.

For example, a few years back, a federal regulatory board was mulling numbers supplied by a hotel industry group. The numbers indicated that the set-aside of rooms that the ADA mandated would impose an undue economic burden on the hotel industry. But when LaPlante and Kaye examined the data, they found that the hotel group had overestimated the economic impact because it had made assumptions that didn't match the reality of how the disabled community travels and, therefore, uses hotels. (This is one common reason for the emergence of fugitive statistics: incorrect assumptions that make their way into calculations.) Ultimately, the evidence LaPlante and Kaye supplied in that case won the day.

They're now trying to do the same for the PAS legislation, which is why they have spent much of the past year researching the costs. They are about to publish the results of their study and hope it can move the proposed legislation along.

Finding the $17 Billion Gap
How did LaPlante, Kaye and Harrington arrive at numbers so at odds with the CBO's? LaPlante explains that their study and the CBO estimate emerged from the same data, gleaned from two surveys of the entire US population: one from the Census Bureau and one from the National Center for Health Statistics. (The technique is known as secondary data analysis.)

The divergence began when LaPlante, Kaye and Harrington applied their first criterion for eligibility under the proposed law: people who in the surveys indicated a need for help with two or more activities of daily living (ADLs). (ADLs typically include taking a bath or a shower, dressing, eating, getting in and out of bed, toileting and getting around inside the house.) They chose that criterion because eligibility for a nursing home is the starting point for receiving personal assistance services under the proposed legislation. "And typically, it's help with two or more ADLs that makes someone institutionally eligible for a nursing home," says LaPlante. In contrast, the CBO included both ADLs and instrumental ADLs, which include such things as doing light housework, preparing meals, taking the right amount of prescribed medication, and keeping track of money and bills. In addition, the CBO assumed that needing help with any one of these activities would qualify under the legislation.

The second criterion is financial. To receive assistance, the disabled person must be Medicaid-eligible, a standard that varies from state to state and so can be complicated to work into calculations. This is another spot where differences may have emerged.

After applying both criteria, LaPlante, Kaye and Harrington calculated that between half a million and 1 million people would be eligible for assistance. (The CBO estimated 8 million.)
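The contrast between the two eligibility screens, and the simple arithmetic that follows from them, can be sketched in code. This is an illustrative sketch only: the record layout, the sample respondent, and the $10-per-hour rate are invented placeholders, and the CBO's actual model was surely more involved; only the criteria themselves (help with two or more ADLs plus Medicaid eligibility for the UCSF study, versus help with any one ADL or instrumental ADL for the CBO) come from the article.

```python
# Hypothetical sketch of the two eligibility screens described above.
# Record fields and the sample respondent are invented for illustration.

ADLS = ["bathing", "dressing", "eating", "transferring",
        "toileting", "getting_around_inside"]
IADLS = ["light_housework", "preparing_meals", "taking_medication",
         "managing_money"]

def ucsf_eligible(record):
    """UCSF-study criterion: needs help with 2+ ADLs and is Medicaid-eligible."""
    adl_count = sum(1 for a in ADLS if record["needs_help"].get(a, False))
    return adl_count >= 2 and record["medicaid_eligible"]

def cbo_style_eligible(record):
    """CBO-style screen, per the article: help with any one ADL or IADL."""
    return any(record["needs_help"].get(x, False) for x in ADLS + IADLS)

def annual_cost(eligible_people, hours_per_week=16.6, hourly_rate=10.0):
    """Annual cost: people x hours/week x 52 weeks x rate.
    The $10/hr default is a hypothetical placeholder, not the study's rate."""
    return eligible_people * hours_per_week * 52 * hourly_rate

# A respondent needing help only with light housework passes the broader
# CBO-style screen but not the stricter two-ADL criterion.
respondent = {
    "needs_help": {"light_housework": True},
    "medicaid_eligible": True,
}
print(ucsf_eligible(respondent), cbo_style_eligible(respondent))  # False True
```

Because eligibility multiplies directly into the cost total, widening the screen from "two or more ADLs" to "any one ADL or IADL" inflates every downstream dollar figure - which is the mechanism behind the gap the researchers describe.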
Finally, based on interviews, LaPlante, Kaye and Harrington found that on average, these people would need 16.6 hours of help per week, calculated the average hourly cost for the service and did the math. "It's actually pretty simple, pretty straightforward," says LaPlante. "The difference, really, is that we made better assumptions and so reduced the uncertainty."

Always Uncertainty
Though LaPlante believes the numbers they arrived at are accurate, he acknowledges, "No survey is perfect. In addition, policy work can be quite complex because it's multidisciplinary, multifactorial." He notes, for example, that projecting the impact of legislation sometimes must account for the "woodwork effect," whereby over time, more people learn about available benefits and begin applying for them. Researchers, therefore, must determine how the woodwork effect will change costs over time.

And, in fact, most numbers that people seize on during public debates have questions surrounding them; preventing fugitive statistics is all about shrinking the uncertainty and adjusting for likely measurement errors. "For example, survey interviews always have a social context," says LaPlante. How literate is the person being interviewed? Are there cultural or language differences that get in the way of receiving an honest or complete answer? "How you ask a question is particularly important," says LaPlante. "We've found that many disabled people don't like being asked what they can't do. But if you ask them if they do things differently because of their disability or ask how they adapt, they are more likely to give you a full answer."

Another twist can occur when researchers get results that are counterintuitive. LaPlante recalls a time when he was still a doctoral student and a faculty member was doing a study for a program that advocated cash payments for welfare recipients (as opposed to food stamps, etc.). To the researchers' surprise, the study revealed that the cash payments were associated with higher divorce and separation rates - not a result they were expecting (or hoping) to see. "When that happens, you check to make sure that you haven't done something wrong, but ultimately you have to stand by the results," says LaPlante.

The Importance of Policy Research
Despite the inherent frustrations and uncertainties, LaPlante is committed to policy research. He recognizes that without a steady supply of reliable numbers, fugitive statistics - call them lies or damn lies, if you will - can plant themselves in the public debate and shape thinking among key constituencies.

Consider a string of recent Supreme Court decisions that limited the scope of the ADA by restricting its coverage to people with disabilities in their "corrected state." As part of the basis for their decisions, justices cited Congress' statement in the ADA (which was passed in 1990) that 43 million Americans have one or more physical or mental disabilities. The court said that the number indicated "that Congress did not intend to bring under the ADA's protection all those whose uncorrected conditions amount to disabilities." But the "43 million" figure was just an estimate that disability groups used to illustrate the scope of the problem during congressional testimony. There's real question about whether it is rooted in legitimate research.

At this point, however, the die's been cast. Stamped as legitimate, the figure informed how the Supreme Court interpreted a law that affects millions of disabled Americans. For those denied rights or benefits because of fugitive statistics like these, that's no laughing matter.