
By Erica C. Barnett
Later this week, the King County Regional Homelessness Authority is expected to announce the results of the process it’s using this year in lieu of the traditional point-in-time count of the region’s unsheltered homeless population—historically, an in-person count in January whose results have always been considered an undercount, combined with interviews at homeless service providers and shelters to gather “qualitative” data about people’s day-to-day experience of homelessness.
Over the years, the count has incorporated various methods to estimate the unsheltered population (such as assumptions about the number of people occupying tents and cars) and has used statistical methods to extrapolate demographic and other information from interviews with 1,000 or more individuals.
The US Department of Housing and Urban Development requires agencies like KCRHA to conduct a “point in time count” of their unsheltered populations every two years. The KCRHA initially planned to opt out of the mandatory count this year, but announced in mid-December that HUD had granted it an exemption from the usual requirements, allowing the authority to replace an in-person count with a statistical extrapolation from interviews with unsheltered King County residents conducted over several weeks. As a result, the final “point in time count” number won’t come from a point in time, nor will it represent an actual count.
The process the KCRHA selected, called Respondent Driven Sampling, had two stages. First, volunteers and outreach workers went out to places where people are living unsheltered, such as encampments, to interview people and recruit them to distribute coupons to people in their networks. People who completed an interview received a prepaid $25 debit card and their own set of coupons, each redeemable for a $25 debit card for each recruit who participated in an interview. Those recruits, in turn, would get more coupons to distribute. Through successive waves of recruitment, the system is designed to reach people with no obvious connection to the initial group of recruits.
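In broad strokes, the recruitment works like a chain: each interviewee gets a handful of coupons, some fraction of the people they hand them to show up for their own interviews, and those recruits get coupons in turn. The sketch below is a purely illustrative simulation of that coupon-and-wave dynamic, not the KCRHA’s actual system; the toy network, coupon limit, and participation rate are assumptions made up for the example.

```python
# Illustrative-only simulation of coupon-based respondent-driven sampling.
# The network, coupon count, and participation rate are invented for the example.
import random

def simulate_rds(network, seeds, coupons_per_person=3, participation_rate=0.5, max_waves=6):
    """Simulate waves of recruitment: each participant hands coupons to contacts,
    and some fraction of those contacts joins the next wave of interviews."""
    interviewed = set(seeds)
    current_wave = list(seeds)
    for _ in range(max_waves):
        next_wave = []
        for person in current_wave:
            # Each participant can recruit up to `coupons_per_person` contacts
            # who haven't already been interviewed.
            contacts = [c for c in network.get(person, []) if c not in interviewed]
            random.shuffle(contacts)
            for contact in contacts[:coupons_per_person]:
                if random.random() < participation_rate:
                    interviewed.add(contact)
                    next_wave.append(contact)
        if not next_wave:
            break
        current_wave = next_wave
    return interviewed

# Toy social network as an adjacency list, with one "seed" recruit.
network = {
    "A": ["B", "C", "D"], "B": ["A", "E"], "C": ["A", "F", "G"],
    "D": ["A"], "E": ["B", "H"], "F": ["C"], "G": ["C", "H"], "H": ["E", "G"],
}
print(simulate_rds(network, seeds=["A"]))
```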
“Because interviewees identified the next set of interviewees, this helped us interview people we might not otherwise have been able to engage with, including people who are less service-connected,” KCRHA spokeswoman Anne Martens said. The approach, developed in the 1990s, has been used to reach people in “hidden” populations, such as male drug users who have sex with men, in sociological studies ever since.
The benefit of respondent-driven sampling, according to Zack Almquist, an assistant professor of sociology at the University of Washington who recommended RDS to the authority and helped develop its approach, is that it captures groups that don’t show up through traditional sampling methods, such as random-digit dialing.
The KCRHA’s researchers started with a large group of “seeds,” Almquist said—the original group of recruits who went out and recruited people in their social networks—with the goal of reaching a broad sample of unsheltered people. Interviewers then sat down with respondents for in-person interviews at nine designated “hubs” around the county, asking how they first became homeless, where they currently sleep, and other probing questions about their personal and family histories and experiences of homelessness. Two of the hubs were in Seattle—one in Georgetown and the other on Aurora Avenue N—and the other seven were scattered across South and East King County; the furthest east was in North Bend and the furthest south was in Auburn.
The questions are similar to, but far more detailed than, the VI-SPDAT—a staccato, yes/no list of questions that homelessness agencies across the country are phasing out because it leads to racially biased results. Earlier this year, the KCRHA began using COVID vulnerability criteria (a list of conditions the agency can verify without talking to a person directly, such as age, race, and pregnancy status) in lieu of the VI-SPDAT—in part, the agency said, because the VI-SPDAT’s questions were potentially retraumatizing and invasive.
Martens said the combination of RDS and in-depth interviews enabled KCRHA to “capture people’s stories the way they want them to be told. This is not a knock on previous methods; but it is a different approach that allows for rich data collection, honors people’s experiences, and builds relationship with community.”
Some members of the KCRHA’s governing board have raised concerns about the authority’s methods and called the process rushed and mysterious. Auburn Mayor Nancy Backus, who sits on the KCRHA’s governing board, recently said the KCRHA had given her “no clue what was going on” with the interviews, and Redmond Mayor Angela Birney, also on the board, questioned the authority’s choice of “hub” locations and the decision to limit interviews to business hours.
“In the past, the reason we did the point in time count the way we did is that we went to people where they were—we didn’t expect them to have to travel or get transported or find a location,” Birney said during a March board meeting, when the KCRHA was in the middle of doing interviews. “I’m a little curious about bringing people to a hub, what kind of disruption that creates.”
Respondent-driven sampling has its critics, who question whether this kind of statistical extrapolation, based on interviews with a small subset of the homeless population, can produce an accurate estimate of the number of people experiencing homelessness in an area or characterize the conditions under which they live.
Academic critiques of the approach have focused on the fact that people experiencing homelessness often have loose and transitory social ties, making their social networks unreliable; the likelihood that “coupons” redeemable for money or goods (like the $25 debit cards) will end up being used as a form of currency or in violation of the rules set for recruitment; and the concern that “hub” sites aren’t equally accessible to everyone, both because of physical distance and because more marginalized or vulnerable people are less likely to go to an official government interview site.
The list of questions the interviewers asked was extensive, and not everyone answered all of them; some people responded to most of the questions but wouldn’t answer questions about their social network, and were excluded from the data, according to Almquist. The result is that although the authority initially said it would base its count on around 1,000 initial interviews, it ended up with a usable sample of between 550 and 574 people. “People had to answer questions about their social network, because respondent driven sampling relies on us knowing about” the people survey respondents interact with and “weighting the population based on their network properties,” Almquist said—factors like the number of people a person says they know and how well they know them.
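The network questions matter because RDS estimators correct for the fact that well-connected people are more likely to be recruited. One common approach, roughly in the spirit of the Volz-Heckathorn estimator, weights each respondent by the inverse of their reported network size. The sketch below illustrates that kind of weighting with made-up respondents and a made-up trait; it is not the authority’s or Almquist’s actual estimator.

```python
# Illustrative sketch of degree-based weighting in respondent-driven sampling.
# Respondents, degrees, and the "veteran" trait are invented for the example.

def rds_weighted_proportion(respondents, trait):
    """Estimate the share of the population with `trait`, weighting each
    respondent by 1/degree so that well-connected people (who are more
    likely to be recruited) don't dominate the estimate."""
    usable = [r for r in respondents if r["degree"] > 0]
    weights = {r["id"]: 1.0 / r["degree"] for r in usable}
    total = sum(weights.values())
    with_trait = sum(weights[r["id"]] for r in usable if r.get(trait))
    return with_trait / total

respondents = [
    {"id": 1, "degree": 10, "veteran": True},   # knows many people -> lower weight
    {"id": 2, "degree": 2,  "veteran": False},  # knows few people -> higher weight
    {"id": 3, "degree": 5,  "veteran": False},
    {"id": 4, "degree": 1,  "veteran": True},
]
print(rds_weighted_proportion(respondents, "veteran"))
```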
At a meeting of the Seattle City Council’s homelessness committee earlier this month, KCRHA CEO Marc Dones said the authority was planning to do a “Phase 2” of the surveys, which would add to the qualitative data portion of the count—the kind of information the county used to gather by going to homeless service providers and talking to people who showed up to access services.