
Jeff Gothelf – Axe Requirements-driven Product Design Live!

by Sean Carmichael



This is a sample of Jeff’s 90-minute talk from the User Interface 18 conference.

There’s a traditional way of building a product. Normally there’s a huge time investment made as you come up with the idea, then design, build, and rebuild until it’s released. At that point you’re hoping the solution solves the users’ problems, and also that it doesn’t crash and burn. And if it does fail, there’s going to be some hell to pay.

Jeff Gothelf considers this “the old way” of product development. He posits that there is an immense amount of risk involved with this approach, and suggests that design and product development should be viewed as a hypothesis. Using this method, you’re putting hypotheses out there, testing them, and even if they fail, you’re continuously learning.

With these “small bites” being taken, you can design with a comfort level, knowing you’re not putting the entire project at risk. You’re collecting data and therefore able to iterate based upon objective observations. If the data proves you’re heading down the wrong path, you can quickly kill the idea and move on to the next hypothesis.

Want to hear more from Jeff? The recordings of the User Interface 18 conference are now available as UI18 OnDemand. Relive (or experience for the first time) all eight featured talks and Jared Spool’s informative and entertaining keynote. Get all of the details at uiconf.com.

Recorded: December 2013

Full Transcript.


Jeff: Let me share another case study. This is a positive case study from TheLadders. I joined a team at TheLadders that was called the connections team. The connections team’s job was to ensure that job seekers, recruiters, and employers were connecting through our system. After all, that was the point of the service.

At the time, 14 percent of communications from recruiters to job seekers were getting a response from job seekers. 14 percent. That means fewer than two out of 10 were actually getting a response, which is absurd. You’ve got two parties paying to use a service, trying to find each other, communicating with each other, and then not responding.

Through eight years of analysis and data, we knew that job seeker response rate, which is what this is, was one of the outcomes that affected customer satisfaction. We tasked a team with moving this number. This was the outcome. Move this number from 14 percent to 75 percent.

We didn’t task them with improving customer satisfaction, because there are so many different things that play into customer satisfaction, right? You could call the call center one morning, and the person that you get is simply having a bad day. That could tank customer satisfaction. Move this number from 14 percent to 75 percent.

We put together a cross-functional team: designers, developers, product managers. They got together, and they brainstormed ideas together. Creativity in the generation of solutions. We visited recruiters to understand how they communicate with job seekers. We talked to job seekers on a weekly basis to understand them. We empathized with them.

Through that analysis, and through rapid iterative launches of product…remember the continuous launch of product. We could launch tests into the system on a weekly basis and get learnings. We started to figure out what moves that number up from 14 percent. After three months, we got that team to 38 percent. We got the number to 38 percent.

We went back to the organization, and we said, “Look, we’ve been working for three months. You tasked us to get to 75 percent. We’re at 38 percent.” They said, “That’s great. You get another three months. You’re making positive progress.”

We worked again for another three months. Same techniques: brainstorms, usability tests, prototyping, customer research, A/B tests, launching into production, learning what works, measuring everything against that response rate, and doubling down on the features that actually made a difference.
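To make that kind of measurement concrete, here is a minimal sketch, in Python, of how a team might check whether a variant actually moved a response rate. It illustrates a standard two-sided, two-proportion z-test; it is not code from the talk, and the counts are hypothetical:

    import math

    def normal_cdf(z):
        # Standard normal CDF via the error function.
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))

    def two_proportion_z_test(responses_a, sent_a, responses_b, sent_b):
        # Two-sided z-test on two response rates, using a pooled rate.
        p_a = responses_a / sent_a
        p_b = responses_b / sent_b
        pooled = (responses_a + responses_b) / (sent_a + sent_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
        z = (p_b - p_a) / se
        p_value = 2 * (1 - normal_cdf(abs(z)))
        return p_a, p_b, p_value

    # Hypothetical counts for illustration only; not TheLadders' data.
    rate_a, rate_b, p = two_proportion_z_test(140, 1000, 190, 1000)
    print(f"control: {rate_a:.1%}, variant: {rate_b:.1%}, p = {p:.4f}")

If p is small, the variant’s lift is unlikely to be noise, and the feature becomes a candidate for doubling down.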

We got to 68 percent at the end of six months. The organization said, “That’s good. 68 percent is close enough. You can move on to the next project.” The whole time this team worked, not a single executive told this team what to build. Nobody said, “Make the corners round, or make it pink, or have the buttons say these words, or make sure that the legal copy is there.” Whatever it is.

This team built whatever moved that needle forward. They worked together creatively. They tested together. They built that shared understanding, and they iterated quickly, continuously integrating their design into the real world and getting real market feedback, until they got to the outcome that they were looking for.

Now look, I’m a designer. I assume a lot of you in this room are designers, and we are hopelessly optimistic people. We could work on something for five minutes or five months, and we think it’s awesome. We love our ideas. It’s OK. After five minutes of solving a problem, I think it’s the greatest solution ever. I love it, and we think it’s going to be big.

Then we release it, and it’s actually not that big. Nobody clicked on it, nobody liked it, nobody used it. Our job as creative product people, as creative product designers, is to reduce the time between these two points from months to hours, and the only way to do that is to work collaboratively with our colleagues. This is the old way of building products.

This is the old way. This is time, and this is risk. The old way, you build, and you build, and you build, and you build, and you release, and you really hope that it’s going to work, when you’ve built up all this risk. And if it doesn’t work, someone’s head is going to roll at that point.

Now what we’re suggesting is that through this system built on continuous learning, we take small bites of risk, and we test them. Small bites, and you test, and you test, and you test. Even if you fail, because you’re taking such small bites, it’s not catastrophic, and you’ve learned something, and you’re moving closer to the most accurate solution for this particular problem.

As you’re doing that, you’re collecting data, and when you collect that data, it allows you to make decisions based on objective observations, because you’ve defined your success criteria when you wrote your hypothesis.

We will know this is true when we see X, some kind of a behavior. As you’re collecting this data, you get to a point where you have to make a decision. There are three decisions that you can make when you collect enough data.

The first decision: if the data that you’ve collected indicates that your hypothesis was wrong, kill that idea. This is a bad idea. All right? In this case, kill it before it kills you, right?

But seriously, if the data that you’ve collected around your experiment shows you that your assumptions were invalid, stop working, kill that idea and move on to the next thing. You don’t need to spend any more time on this idea. Any more time on this is waste. You’re wasting your time. Now you’ll get to a point where you’ve collected enough data that says, “You know what? We’re heading in the right direction, but we’re not getting there fast enough.”

That’s when you execute a pivot. Now, pivot is a used and abused word these days. Essentially what it means is you fix one foot in your strategy, and you change your tactics. We’re still trying to get to the same place, but we’re changing our tactic. We’re just not getting there fast enough.

Eventually, you’ll get to a point where the data’s telling you that you’re on the right track, that you’re moving the needle in the right direction, and when you get there, this is when you double down.

This is when you switch from learning to growth. You’ve found the right idea. It’s clearly moving the needle in the right direction. This is when you scale; this is when you double down. But you’ve got to collect that data objectively, through these rapid experiments, in order to know what decision to make at this point.
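The kill/pivot/double-down decision can be sketched as a simple rule. This is a minimal illustration under invented thresholds; the function and its cutoffs are hypothetical, and each team has to define its own when it writes its success criteria:

    def decide(baseline, target, measured, elapsed, planned):
        # Fraction of the gap between baseline and target closed so far.
        progress = (measured - baseline) / (target - baseline)
        # Fraction of the planned time already spent.
        pace = elapsed / planned

        if progress <= 0:
            return "kill"         # data invalidates the hypothesis
        if progress < 0.5 * pace:
            return "pivot"        # right direction, but far too slow
        return "double down"      # the needle is moving; scale it

    # TheLadders story: 14% baseline, 75% target, 38% after three months,
    # assuming (for illustration) a six-month horizon.
    print(decide(baseline=0.14, target=0.75, measured=0.38,
                 elapsed=3, planned=6))  # -> "double down"

On the case study’s numbers, the rule lands where the organization did: positive progress at the halfway point earned the team another three months.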

I love this quote. This is Kent Beck. Kent Beck was one of the authors of the “Agile Manifesto.” Among many other things, he works at Facebook these days. He tweeted this about five or six months ago: “Product road maps should be lists of questions, not lists of features.” The questions that he’s talking about are the hypotheses that we’re talking about here today.

Instead of saying, “We will build a new authentication system into the service,” we want to say, “We believe that building a new authentication system will drive new registrations, because it will be easier to sign up.” Let’s ask those questions, and let’s work as quickly as we can to answer them and move through that road map.

It makes the road map less certain, but to be quite honest with you, a product road map, especially one that stretches longer than six months, is wholly uncertain anyway. You might as well word it in a way that actually tells the truth. “We don’t know, and we need to find out.”
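As a minimal sketch, a question-shaped road map entry might be captured like this (the schema and field names are invented for illustration; the talk prescribes the phrasing, not a data structure):

    from dataclasses import dataclass

    @dataclass
    class RoadmapHypothesis:
        # One road map entry phrased as a question, not a feature.
        belief: str          # "We believe that ..."
        because: str         # "... because ..."
        success_signal: str  # "We will know this is true when we see ..."

    roadmap = [
        RoadmapHypothesis(
            belief="building a new authentication system will drive new registrations",
            because="it will be easier to sign up",
            success_signal="sign-up completion rising on the new flow",
        ),
    ]

    for h in roadmap:
        print(f"We believe {h.belief} because {h.because}. "
              f"We will know this is true when we see {h.success_signal}.")

Each entry carries its own success criterion, so the team knows in advance what data will answer the question.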

I’m going to share with you a quick case study, again, from work I did last year, with a company called Lenddo. Anybody heard of this company? Lenddo is a micro-lending company based in New York City.

They’re a start-up, and they do lending in the southern hemisphere. The way that they work is really interesting. What they do is they introduce the concept of a credit score where one doesn’t exist.

The way that they introduce that is through social capital. When you sign up for Lenddo as a potential borrower, you have to connect all of your social networks to the system: Facebook, Twitter, LinkedIn, Yahoo, Gmail, Google Plus, everything. You connect everything. After you connect enough social networks, it raises your credit score to a point where they will lend to you.

Now if you don’t pay that loan back, and we’re talking about $200, $300 loans, that type of thing, if you don’t pay that loan back, they shame you publicly to all of your social networks.

[laughter]

Jeff: In addition to that, anybody that’s connected to you and is using the Lenddo service, their score also goes down, so you’re heavily incentivized to pay that loan back. That’s what they do. Now they had an issue. Their issue was that people were coming and signing up for the service, but they weren’t filling up their address books with connections.

They’d sign up for the service and connect all their accounts, but they didn’t actually add friends in the network. That’s a big problem for them, because they need those friends in the Lenddo network so that they can publicly shame you, or at least threaten you with that, so that you’ll pay the money back.

All the usage of the product, when I was working with them last year, was happening in Colombia and the Philippines. The company’s based out of New York. They had a bunch of ideas about how to solve this problem. The engineer had an idea, I had an idea, the CEO had an idea, the product manager had an idea. We all had different ideas.

This may seem simple, but what we did was sketch up a series of wireframes in Balsamiq. It took an afternoon of brainstorming and a couple of hours of sketching in Balsamiq to get these ideas down.

Then, with me in New York and testers and facilitators on the ground in Colombia and the Philippines, we simply had conversations with the customers. We got out of the building, had conversations with customers, and showed them these ideas to get some market feedback about which tactics would encourage people to come back and add connections.

I know this sounds simple, but so many companies don’t do this. We spent half a day putting together the assets and another half a day running the tests. Within that full eight-hour cycle, we got enough information to know which track held the greatest chance of success.