Case Study: Designing a Self-guided Option for Showing Your Home

How Opendoor revamped home assessments to give customers more flexibility

Jill L. H. Reed
8 min read · Jun 24, 2022

Selling a home to Opendoor is designed to be simple, fast and certain. Traditionally, in-person home assessments can take hours and require extensive preparation, such as cleaning, staging and repairs. By contrast, Opendoor customers answer a few qualifying questions online, and then schedule a 30-minute call to show us their home by video. We use the video to determine the home’s market value and scope any repairs. Then we present the customer with a final offer to buy their home for cash.

Customers can schedule a video call while waiting to view their Preliminary Offer, an estimate calculated from publicly available information, such as comparable home sales.

Last year, we adopted the Jobs to Be Done framework, which proposes that customers don’t simply buy products or services, but rather “hire” them to do a job. One of our operating principles is to start and end with the customer, and the Jobs to Be Done framework does just that: it helps us understand customer needs so we can inform product improvements and innovation. This case study walks through how the Opendoor Design team discovered our customers’ jobs to be done, and then innovated to meet their “hiring criteria” by designing an alternative self-guided home walkthrough.

All hands on deck

At Opendoor, we believe in acting from ownership and keeping a startup mentality (a bias for action). So, while this case study is written in linear fashion, keep in mind that the following activities actually happened concurrently:

  • Customer Research
  • On-site Experiments
  • Design Explorations and Testing

Customer Research

To better understand our customers and to discover opportunities for product improvement and innovation, our research team conducted a foundational study of home sellers’ Jobs to Be Done. We spoke with customers at various stages of their selling process with Opendoor, and with homeowners who had initiated a sale to Opendoor but then sold elsewhere. The study was designed to be generative, so we interviewed home sellers about their experiences across a wide range of topics, including home assessments, video calls and home valuation.

Initial customer interviews yielded some interesting jobs to be done related to showing their homes, such as:

  • Convey an accurate “feel” and show off unique features of my home, so buyers will understand why my home is special and better than other homes nearby.
  • Get the highest offer for my home, so I can feel confident I’m making the best financial decision for myself and my family.
  • Show buyers my home in a manner that is reasonable and convenient for me, so I won’t burden myself and my family with the hassle.

Those customer sentiments all raised more questions for us, but we homed in on the last one. With respect to showing their homes, what do “reasonable” and “convenient” even mean to our customers? Was our video call process truly the best “hire” for their jobs? These questions inspired us to dig deeper, so we started planning more research.

On-site Experiments

Meanwhile, our customer product team wondered whether customers who sell their homes to Opendoor (Sell Direct customers) might appreciate a more flexible option to show us their homes, like the self-service model we offer to real estate agents. In that model, instead of scheduling a video call, the seller’s agent takes a video of their client’s home on their own schedule and uploads it to our secure webpage.

Sell Direct customers show us their homes via live video call (shown at left), while real estate agents follow a self-service model of uploading videos of their clients’ homes (shown at right).

For quick insights, our product team decided to test customer interest in self-service by simply reusing the existing agent flow. One set of customers, the control group, would experience the usual site flow to schedule a home assessment video call. Other groups would be assigned alternative self-service flows:

  • On-site, some customers would be defaulted to the self-service flow, with the ability to opt out and schedule a call instead.
  • Other customers would be offered the choice to either schedule a call or upload a video by self-service.
  • Customers who had missed their video call, or whose appointments were scheduled more than two days out, would be offered the choice by email.

Then we would compare overall conversion. That is, we would measure how many customers in each flow ended up completing a home assessment video, whether by live video call or by the self-guided walk-through previously offered only to agents.
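
To make “overall conversion” concrete, here is a minimal sketch of that comparison in Python. Everything in it is illustrative: the counts are hypothetical placeholders, not Opendoor data, and the helper functions are not our production tooling. Each arm’s conversion is simply the share of customers entering that flow who completed an assessment video, and a two-proportion z-test gives a rough sense of whether the difference between control and a treatment flow is likely to be real.

    # Minimal sketch: compare assessment completion ("conversion") between the
    # control flow and one treatment flow. All counts are hypothetical placeholders.
    from math import sqrt, erfc

    def conversion_rate(completed: int, entered: int) -> float:
        """Share of customers entering a flow who completed a home assessment video."""
        return completed / entered

    def two_proportion_z_test(c1: int, n1: int, c2: int, n2: int):
        """Two-sided z-test for a difference in conversion between two flows."""
        p1, p2 = c1 / n1, c2 / n2
        pooled = (c1 + c2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        return z, erfc(abs(z) / sqrt(2))  # (z statistic, two-sided p-value)

    # Hypothetical example: scheduled-call control vs. "defaulted to self-service"
    control = (420, 1000)    # (completed assessments, customers entering the flow)
    treatment = (370, 1000)

    z, p = two_proportion_z_test(*control, *treatment)
    print(f"control conversion:   {conversion_rate(*control):.1%}")
    print(f"treatment conversion: {conversion_rate(*treatment):.1%}")
    print(f"z = {z:.2f}, p = {p:.4f}")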

Experiment Results: Customers want self-service! But need more guidance

Throughout the experiment, we tracked the number of times a customer in one of the treatment flows showed interest in the self-service option. The results surprised us!

When shown both options, home sellers’ overall interest in completing a video assessment (by either means) increased significantly. However, fewer people completed an assessment. That’s right: overall conversion went down. It appeared the presence of a self-service option was enticing, but customers weren’t following through.

To dive deeper, the research team followed up with customers who had taken part in the experiment. Because they had actually experienced the two choices, we used a more targeted interview method from the Jobs to Be Done toolkit: switch interviews.

Additionally, we tracked down our Operations partners to understand exactly what parts of customers’ homes they needed to see in order to make a final offer. A targeted list of pricing criteria would help our design team refine the next iteration of self-service tools. We had to move fast, because our designers were already starting work on a new, more customer-centric design.

Design Explorations

Having learned about our customers’ need for a more convenient way to show their home, the design team began exploring the existing self-service flow from our direct customers’ point of view. A heuristic evaluation of the agent’s self-guided walk-through flow revealed usability problems that we imagined could be stopping our direct customers from completing it, such as:

  • I can’t see the instructions while I upload the video or photos
  • I can’t save my progress and come back to it later
  • Completed tasks take up the first part of the screen, with no indication to scroll down to complete the remaining tasks or hit Submit
  • After I think I’m done, there’s a new series of questions I have to answer

To address these issues, the new prototype provided guidance and support, such as:

  • A progress indicator
  • Grouped tasks with expected time commitments
  • Example videos to help people understand what to upload
  • Ability to see instructions while shooting the video
  • Ability to save and come back later

A heuristic evaluation of the original upload flow (at left) revealed usability problems, so our designer created a new customer-focused prototype (at right).

Concept Tests

To test the prototype, our researcher recruited home sellers who were not customers of Opendoor, but were open to our business model. We wanted to make sure they hadn’t been primed or biased by a previous experience on our site.

We started the study by asking participants some lifestyle questions, such as their goals for their upcoming home sale, their experience in real estate, and how they use technology. Their answers would help us understand each participant’s frame of reference during the session, and later help us segment and analyze the results. We wanted to learn: is there a certain customer type, or persona, that is more likely to choose self-service?
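
As a rough illustration of that segmentation step (not our actual study tooling, and with made-up records rather than participant data), a sketch like the following groups participants by each screening attribute and looks at the share who chose self-service:

    # Rough sketch of segmenting concept-test participants by screening attributes
    # and checking which segments chose self-service. Records are made up.
    import pandas as pd

    participants = pd.DataFrame([
        {"tech_savvy": True,  "real_estate_experience": False, "people_preference": "avoider", "chose_self_service": True},
        {"tech_savvy": False, "real_estate_experience": True,  "people_preference": "engager", "chose_self_service": False},
        {"tech_savvy": True,  "real_estate_experience": True,  "people_preference": "middle",  "chose_self_service": False},
        {"tech_savvy": False, "real_estate_experience": False, "people_preference": "avoider", "chose_self_service": True},
    ])

    # Share of each segment that opted for the self-guided walk-through
    for attribute in ["tech_savvy", "real_estate_experience", "people_preference"]:
        print(participants.groupby(attribute)["chose_self_service"].mean())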

Participants were then asked to complete the self-guided walk-through on the new prototype, while “thinking out loud” about the experience. That would expose any remaining usability problems in the flow.

What we learned

We had hypothesized that customers opting into self-service would be more tech-savvy than average. However, tech savviness turned out not to be an indicator of preference, and neither was experience in real estate. The most notable difference between sellers who wanted to try self-service and those who did not was their preference for either avoiding or engaging with people.

Seems intuitive enough. But the differences go deeper than that: the two groups define convenience differently. To people-avoiders, convenience means personal control over their own time, location and activities. For them, engaging with another person, such as by video call, is actually an inconvenience. It means less efficiency, more complexity and the social pressure of providing reciprocal accommodations.

Research participants who chose self-service preferred to avoid other people.

On the flip side, to the people-engagers, convenience means having access to another person for instantaneous help, reassurance and validation. To them, our completely autonomous, self-guided walk-through was the definition of inconvenience!

We also discovered a third group who could go either way, depending on their circumstances. These “middle-ground” home sellers fully expected that self-service would be both more convenient and faster than scheduling a video call with a rep. But they had anxiety about it. They just weren’t confident that their homemade video would bring as high an offer as a live video call with an Opendoor rep.

Prototype test results

The prototype proved functional, as participants were able to complete their video walk-throughs without error. But for many, being able to complete it wasn’t enough.

Participants expressed confusion about what criteria to capture. They were worried they would make a mistake, one that could be costly. While the most staunch people-avoiders were willing to put their faith in the process rather than schedule a video call, the “middle-ground” folks were not ready to make that trade-off. They chose the certainty of talking to an Opendoor rep in real time over the speed and convenience of self-service.

Usability tests confirmed that our prototype was functional and easy to use, but left some home sellers less than confident in the expected outcome.

Next steps

Between all of the experimentation, interviews and prototype testing, our design team amassed a ton of input to synthesize into a coherent and usable design. With each new set of insights, our product, research and design teams worked together to prioritize the problems the next iteration of the self-guided walk-through would need to solve in order for customers to hire us.

Here are the problems we prioritized for the next release, and how we addressed them.

  1. We knew from follow-up interviews that a number of customers had experienced technical problems attempting to upload videos. Large videos would stall or simply stop uploading. For the next iteration, we would no longer ask for one video walk-through of the entire property. Instead, we would request a few, shorter videos, and add more guidance about how long each video should take.
  2. Internal interviews had revealed a mismatch between our operations team’s needs and the content customers were currently submitting through self-service. For the next iteration, we would remove tasks that are no longer needed.
  3. User testing had validated the improvement of grouping tasks by completion status and highlighting only the tasks that remain unfinished, so we would carry it into the next iteration.
  4. To address customer uncertainty and lack of confidence, the next iteration would make it easier to contact support while in the self-service flow.
  5. Finally, for customers who start the self-service flow but don’t progress, our content designer created a whole new flow of email reminders to re-engage them.

This new iteration was recently released to a pilot group of customers. Watching closely for signs of success or trouble, we’re already planning improvements, based on insights from the switch interviews conducted during early product experiments. At Opendoor, where our goals are innovation and improvement, the iterations never stop.

Jill L. H. Reed

Qualitative researcher, former Hallmark writer and Duke MBA who regularly interrupts best-laid business plans with messy customer insights