The Five-Year Plan for Homelessness Was Based Largely on 180 Interviews. Experts Say They Were Deeply Flawed.

Source: KCRHA Five-Year Plan

By Erica C. Barnett

In 2022, the King County Regional Homelessness Authority did away with the longstanding, but flawed, practice of physically counting people experiencing homelessness on a single night. By replacing the physical “point-in-time” count with a statistical model based on Department of Commerce data, combined with interviews with people recruited through the broad-ranging social networks that exist among unsheltered people, the KCRHA hoped to produce a more accurate picture of homelessness in King County.

The interviews, which contributed to the KCRHA’s estimate of more than 53,000 people experiencing homelessness in King County, also served a second, arguably more impactful, purpose: They formed the basis for an overarching plan that will guide the authority’s use of public dollars for the next five years. The Five-Year Plan, which includes recommendations for specific temporary housing types (and initially came with a $12 billion price tag), was based largely on 180 of these interviews, which researchers used to “identify specific temporary and permanent housing models directly from the voices of people living unsheltered, interpreted in partnership with people with lived experience,” according to the final five-year plan.

PubliCola has obtained the transcripts of more than 80 of these interviews, which took place in the early spring of 2022, through a records request. The interviews range from terse question-and-answer sessions to lengthy, discursive conversations in which interviewers abandon the Q&A format to offer opinions, give advice, and tell people they can help them access services—something qualitative researchers are generally cautioned not to do. We also consulted two experts on qualitative research about how interviews like the ones the KCRHA oversaw can best be used, and about best practices for the kind of evaluation the KCRHA was attempting.

Additionally, PubliCola talked to an experienced data analyst at the KCRHA, who explained how the project worked. Initially, the interviews (which former KCRHA CEO Marc Dones called “oral histories”) were the sole focus of the research project, which the KCRHA titled “Understanding Unsheltered Homelessness.” Later, after the US Department of Housing and Urban Development (HUD) rejected the KCRHA’s request to skip the point-in-time count altogether, Dones decided to “combine efforts between doing the Point in Time Count and this qualitative data collection,” Owen Kajfasz, the KCRHA’s acting chief community officer, said. “So we really merged two projects into one data collection.”


To conduct the interviews, KCRHA recruited members of the Lived Experience Coalition, a group that advocates for the inclusion of people with personal experience being homeless in policy and decision-making processes. (KCRHA staff also conducted some of the interviews.) Most interviewers received a two-part training led by Dones, who served as the “primary investigator,” or lead researcher, on the project. Those who couldn’t make the training or came on board later were instructed to read the training documents, which included a list of 31 questions, before starting work.

LEC members also held all three seats on the advisory board that oversaw the project, and later made up a majority of the team that “coded” the interviews in order to translate them into a set of recommendations for the five-year plan. As we’ve reported, the KCRHA has recently tried to distance itself from the LEC, but at the time—early 2022—the group was deeply integrated into the agency’s operations.

Although the researchers conducted more than 500 interviews, they ended up using just 180 transcripts. Some of the earliest interviews, which took place in South King County, didn’t include the proper consent forms or had transcripts that couldn’t be traced back to interview subjects. Overall, the interviews ended up oversampling straight white men and undersampling women, people of color, and LGBTQ+ people, forcing researchers to go back and add some incomplete interviews to the pool to correct the imbalance—overrepresenting marginalized groups because they are the least served by the current shelter and service system.

“This wasn’t a perfect process,” Kajfasz acknowledged. “We did have more of those [interviews] that we couldn’t use than I was anticipating.”

Once the interviews were complete, a group of LEC members and KCRHA staff, aided by technical assistance from a Washington, D.C.-based firm called the Cloudburst Group, read transcripts of the interviews and “coded” them to correspond with different shelter and housing types, using the codes “to identify specific temporary and permanent housing models directly from the voices of people living unsheltered, interpreted in partnership with people with lived experience,” according to the Five-Year Plan. Although the final plan no longer includes specific dollar figures or numeric recommendations (eliminating, for example, a chart that suggested building no new tiny house villages), it still represents a proposal that would, if implemented, reverse many longstanding policies and invest heavily in new approaches, like many more permanent parking spaces for people living in RVs and cars across the city.

The interview transcripts show many interviewers engaged in patient, compassionate attempts to elicit clear responses from people who were often discursive, rambling, and hard to follow. Interviewers from the Lived Experience Coalition used their own experiences to guide conversations and make their interview subjects comfortable—a key reason for including people with lived experience in data collection.

In one such conversation, the interviewer expresses concern and empathy when the person they’re talking to describes a series of traumatic situations, while still keeping the overall conversation on track. “I’m sorry you experienced that in such a tragic way. Thank you for just being vulnerable and open and sharing that because it’ll give me a glimpse of who you are and what you’ve been through,” the interviewer says, then moves on to the next question.

Researchers who use qualitative methods say it’s important to allow the conversation to flow and to use the questions as a guide rather than reading them word by word.

“In a qualitative interview, so much depends on the amount of trust and empathy that the interviewer can show,” said New York University School of Social Work professor Dr. Deborah Padgett, an expert on qualitative research who has written several books on the subject. “If you’re there in a trusting way, and you’re there as a researcher as opposed to a case worker or outreach worker or more official person, it gives you some legitimacy.”

Dr. Tyler Kincaid, a research assistant professor in the Department of Psychiatry and Behavioral Sciences at the University of New Mexico who has led qualitative research about people experiencing homelessness, said qualitative interviews can’t be scripted to the extent that an ordinary survey can. “There’s an art to making the participant comfortable enough to respond, to keep the conversation going,” Kincaid said, and going “off script” is just part of the process. “If you have, say, 10 semi-structured questions, hopefully there’s follow-up questions and side questions and things within those ten standard questions on a piece of paper to help to bring out more information,” Kincaid said.

But the transcripts also revealed troubling practices. In the transcripts, interviewers often cut people off, talk at length about themselves, or offer unsolicited advice. Several times, interviewers suggest they or someone else at the interview site can directly connect people with services, such as housing vouchers or a workaround for King County’s hated 211 system, or jump in with answers before the person has had time to respond.

The experts we spoke to said it’s important for researchers not to involve themselves in people’s lives or promise things they can’t deliver. Kajfasz said researchers were told that the point of the research was data collection, not problem-solving, but “folks with lived experience, when they know they have a solution for somebody, they offer it.” Some interview locations had housing navigators or other services on site, Kajfasz added.


In many cases, interviewers suggested answers to their own questions before people had a chance to speak. In one representative transcript, an interviewer repeatedly appears to cut their subject, a Native American man, off—suggesting, for example, that the reason the man is homeless is that he “prefer[s] to be in the woods” and doesn’t “want to be acclimated in society.” Although the man says “yeah” in response to both those statements, he objects when the interviewer continues, “You don’t want an apartment.” “Well, I do eventually,” he says.

Later in the transcript, a second interviewer offers their opinion about the man’s substance use, calling it “impressive” that “it’s just Everclear now” and adding, “you’ll wean yourself off that soon enough,” prompting the man to say he isn’t so sure. With drugs, the second interviewer continues, “when you wake up in the morning, you hate yourself.” “I don’t ever hate myself,” the man retorts. After a tangent about the concerns police have raised about encampment fires, the second interviewer tells the man he should join the military. “You can still do it. You’re young enough. You understand?”

In another transcript, the interviewer suggests that their subject, a Latino man who appears to struggle with English, find work as a day laborer—a stereotypical job for Spanish-speaking immigrants. None of the interviews PubliCola reviewed were conducted in a language other than English; Kajfasz said the KCRHA offered “language services,” but that “the majority of folks, even if English was not their first language, were choosing English.”

Padgett says that while good qualitative research requires an interviewer to be patient, “take a lot of time,” and build trust and empathy, interviewers should never weigh in with their own opinions or advice. “If you’re giving opinions about what they’re saying, you’re taking up valuable space in the conversation—and they may not want that level of pity,” Padgett said. “When I’m training people, the idea is to be empathic. In the moment, you might say, ‘I’m really sorry that happened to you,’ but [you shouldn’t] go down the rabbit hole of ‘tell me more about your trauma.'”

In an interview that appeared to cross this line, an interviewer jumped in when the man they were interviewing, whose race is not identified in the transcript, said he was probably homeless because he’d been in the foster care and prison systems. “Now you know that’s not true,” the interviewer said. The conversation continued:

Interviewer: You can’t tell me that, because you’re here for a reason. You got kids. We went through the pandemic. You got sick two times, you just said. Hell to the no, ain’t nobody in their life ever in my face will ever say that. Not while I’m standing there. I’m sorry. That just hurt. That just touched me.

Subject: No, I, I—

Interviewer: Don’t ever say that shit again to me.

Subject: My apologies. I will not say that again.

Interviewer: To anybody, because I want you to know, that’s something that’s instilled in you and that you’re going to instill every person that come through your life, especially your children. Because my kids, they already know like I’m their ride or die. They tell people, you don’t know my mama when they tell them, oh you know how your mom- No. You don’t know my mom. So, it’s a different generation. When we got to teach that generation, there is something to live for. You’re not here for nothing. You what I’m saying? I don’t know where we went off of. Let’s see. Where did you go to school?

Padgett, who looked over the interview questions before we spoke, said the questions themselves were “pretty good, but it’s qualitative, so what’s good on paper only comes out as good if the interviewer does it well so there’s a lot more onus put on the interviewer” to keep things on track. The KCRHA did not provide its training materials, but Padgett said she usually has trainees do mock interviews, then supervises them and provides feedback on their methods throughout a project so they can adjust and improve.

In addition to asking leading questions and interrupting, a number of transcripts include interviewers skipping past questions, making assumptions about people’s gender identity or sexual orientation, and speaking excessively about themselves. In one transcript, an interviewer provides a detailed roster of their own family members’ birthdays; in another, the interviewer tries to recruit the person they’re interviewing to join the Lived Experience Coalition and the KCRHA’s Vehicle Residency Policy Group.

“It’s not about the interviewer,” Padgett said. “You should think of yourself as wearing a hat that says ‘researcher’ on it, and if you take that hat off and become a comrade of lived experience, then you’re losing what qualitative [research] does best, which is having some distance but also empathy. It’s a juggling act.”

“You really don’t want to make some sort of big, generalized governmental or programmatic decisions just based off qualitative research.” —Dr. Tyler Kincaid, University of New Mexico

Once the interviews were complete, another team of researchers, which included several members of the LEC, translated them into housing types, using specific keywords and concepts that people brought up during their conversations to create a roster of shelter types that might be appropriate. People who are using drugs but want to get sober might end up in a box titled “recovery housing,” while those with medical problems might end up in another box labeled “medical respite.” Many of the 180 interviews were with people living in their vehicles or RVs, who often ended up in separate boxes for safe parking and RV safe lots.

“It wasn’t directly, ‘hey, I need medical respite,’ so these people get medical respite, or ‘hey, I need RV parking,’ so this person gets RV parking. It was looking at all of these types of challenges folks are facing,” Kajfasz said. “And for some of those, we’re having to take pieces of information across the interview.” People who were employed but couldn’t afford rent, for example, suggested a need for more housing with supported employment services, while people struggling to stay sober suggested a need for sober housing.

At a glance, some of these solutions can seem overly determinative—some people who want to quit drinking or using other drugs might do better living independently than moving into group recovery housing, for example. Others, like RV parking lots, are widely viewed as short-term solutions, not permanent homes. Although these may seem like minor issues—shouldn’t people trying to avoid drugs and alcohol jump at an opportunity for a room in sober living, even if they would prefer a private apartment?—they translate into real policy choices, and ultimately into real money.

The initial version of the Five-Year Plan called for nearly 4,000 medical respite beds ($2.7 billion over five years); 2,570 units of recovery housing ($1.8 billion); and nearly 5,000 permanent parking spots for passenger vehicles and RVs ($192 million). The specific numbers and dollar figures may have been excised from the final plan, but the mix of shelter, or “temporary housing,” types—based on an “analysis of PIT interviews and input of the Lived Experience Commission advisory group,” according to an internal memo—remains the same, so it seems important to get it right.

“You really don’t want to make some sort of big, generalized governmental or programmatic decisions just based off qualitative research,” Kincaid, from the University of New Mexico, said. For example, he said, it would be “so difficult to [use] any sort of qualitative research” as the basis for investing in one type of shelter over another, unless people consistently identified a specific type of shelter they wanted.

Padgett, from NYU, said she believes strongly that “well-done, rigorous qualitative research can play a strong and scientifically valid role, but it’s all in how you handle the information so that you’re not coming to conclusions with no basis in the data.”

In a memo from September 2022, consultants from the Cloudburst Group summarized some of the lessons the LEC and KCRHA learned from the Understanding Unsheltered Homelessness project. Among their conclusions: If the KCRHA does another series of interviews in the future, researchers need to identify the intent of the project before starting interviews, and trainers should emphasize the need to ask questions consistently, “as well as allowing participants to speak and not be interrupted.”

The memo also found that the way the researchers recruited participants—by identifying an initial “wave” of subjects who recruited new people through their social networks, a method known as snowball sampling—was flawed and contributed to an interview pool that was disproportionately made up of straight, white men.

Finally, the memo noted, it was hard to interpret some interviews because they included multiple people (interviewers as well as people who approached and started talking during interviews); in the future, the memo says, the protocol for interviews “should establish that these are individual interviews.”

Kajfasz said that without Dones’ “significant expertise” in qualitative research, the KCRHA isn’t planning an interview project of similar size and scope any time in the near future. Nor will in-person interviews form the basis for the next point-in-time count, which the KCRHA must conduct next year. The KCRHA is “currently conferring with HUD about what the next PIT count” will involve, Kajfasz said, but it probably won’t look like last year’s count. “Never do I ever want to do the point-in-time count and large-scale qualitative interviewing together,” Kajfasz said.

13 thoughts on “The Five-Year Plan for Homelessness Was Based Largely on 180 Interviews. Experts Say They Were Deeply Flawed.”

  1. Wow thanks for uncovering the complete incompetence of Marc Dones and the Lived Experience Coalition at doing population estimates like this. This adds to the other failures to pay contractors that were reported in the news and shows KCRHA needs new leadership! They should restructure and replace the Board of people with lived experience with a Board of people who have accurate data gathering and estimating experience plus contracting experience.

  2. I think about 2/3 of the way through you duplicate some paragraphs and a quote. Just FYI. 🙂

    1. If you’re viewing on mobile, it’s possible that WordPress published pullquotes in a way that makes them look like regular text. The story has a few quotes that are obvious on desktop but may look like duplicate paragraphs on a phone.

  3. KCRHA was formed by an Interlocal Agreement between King County and Seattle in December 2019. The agreement is binding for a period of five years. Since its inception KCRHA has underperformed and overspent at the expense of local taxpayers.

    It has failed to take full advantage of federal funding available through HUD and Title XIX (Medicaid). Funding KCRHA solely through local budgets is not sustainable. As a county agency they have access to the technical assistance to develop the infrastructure to bill Medicaid, and pursue federal grants. This is how government works.

    Federal dollars filter through and help fund state, county, and local governments. Federal funds are in the air we breathe, the water we drink, and the roads we roll & walk on. Those federal funds do, however, come with performance and data requirements. The point-in-time count is a perfect example.

    HUD has been working with and supplying funding to participating cities across the country dealing with homelessness. One of the requirements has been to conduct the (flawed as it may be) one-night point-in-time count. The feds don’t care, for this requirement, if KCRHA has a “better” way. Not doing the count misses a major point in the funding agreement and jeopardizes future funding. I am encouraged Kajfasz gets it.

  4. Where do I start? This King County homeless “counting” strategy devised by KCRHA, under the direction and influence of the LEC, is the perfect example of why Americans have lost faith in the ability of government to successfully accomplish anything and is, by the way, why Trump got elected, and may get re-elected. Government programs, including the KCRHA and, by extension, all homelessness nonprofits, invariably demonstrate an irrepressible aversion to valid metrics of productivity and success. In this case, the KCRHA decided to do everything in its power to avoid arriving at any real, objective measure of homeless numbers in the county. Instead of actually counting people, they opted to adopt a “qualitative” analysis strategy. This qualitative approach to arriving at a quantitative result (number of homeless people) is sort of like a realtor (Trump) telling you that what really matters is not how many actual square feet the condo he built you has, but instead “how big do you feel like it is.”
    I’m not going to get into a lengthy argument about the value of qualitative vs. quantitative analysis strategies, but I will tell you that one of the risks of the former is that it potentiates considerable hand-waving explanations for justifying methodologies (I still have no idea how the KCRHA counting was done), as well as interpreting results. But this is exactly how government programs work/fail. They make a living off of being as obfuscational as possible. They have no desire for you to know what they’re doing, why they’re doing it, or whether or not what they do works. They just want more money to do it.
    It’s my hope that the KCRHA can get its shit together and start looking at real quantitative metrics, like measuring the number of people getting into permanent housing, but I’m not holding my breath.

    Peace

    1. The actual counting was hit or miss as well. See a van with fogged up windows covered by blankets? Let’s count that as 3 homeless.

    2. Strong post Bruce! Of course this blog is run by a woman, Erica C. Barnett, who has publicly stated Seattle homelessness won’t be solved for decades, so this sort of bull crap can roll on for decades according to the Seattle Left.

      All the KCRHA did is add another level of (mis)management to a broken system. The way the Homeless Industrial Complex works is to have as many organizations involved as possible, with as much management as possible, with as many funding sources as possible. Then hire unqualified front-line workers as cheaply as possible and give them tasks without enough training or resources for success. The bottom line for the Homeless Industrial Complex is always to look out for the cash flow first… and never submit to any outside pressure for measurable progress.

    3. The LEC is the only part of the KCRHA experiment worth keeping. We need to give them more money and power, not less. Those obstructing them are the real problem.

  5. This is great, and I hope a lot of people read it. Basing the expenditure of billions of dollars on 180 interviews, even if well executed, doesn’t make sense. Add that to the notion that Dones had a bias against tiny houses, and results supporting his views are not surprising. The idea that these interviews had to be requested is a little crazy. I also wonder about how they are running the project under their ‘Theory of Change,’ which focuses on lived experiences. It seems like we should look first at what works and what we can afford before placing too much emphasis on their opinions, which are based on flawed interviews anyway.

Comments are closed.