
Andres Glusman of Do What Works

“It's a crime. It's such a waste of time.” That’s how Andres Glusman feels about so many of the tests product teams run. That’s why the former head of product at Meetup founded Do What Works, a company that enables you to see – and learn from – the A/B tests that other companies are running. Rather than test button color for the umpteenth time, Andres wants you to test something that will really move the needle.

After more than a decade of running product and strategy at Meetup.com, and a lifetime of being fascinated by experiments, Andres is passionate about helping others learn. In this episode, Andres shares what he’s discovered about testing and human behavior, and why running experiments is critical, but also when it’s not. 

Here’s Andres Glusman on Crafted, Artium’s new podcast.
Listen to all episodes and subscribe here


Andres Glusman: The reality is 80% of the experiments that people launch on their platform do not move the needle. And the reason why 80% of those experiments fail, in our opinion, is because everyone is learning in a vacuum. No one is learning from anyone else. Everyone is repeating the exact same mistakes.

Dan Blumberg: That's Andres Glusman, co-founder of Do What Works. He's a lean startup pioneer and a champion of testing and learning. After more than a decade of running product and strategy at Meetup and a lifetime of being fascinated by experiments, Andres is passionate about helping others learn. His latest company, Do What Works, is an ingenious product that lifts the veil on thousands of split test experiments being run by the world's top companies. So you can see what other companies have A/B tested and whether A or B was the winner. And then you can apply those findings to your own company so you don't reinvent the wheel.

In this episode of Crafted, Andres shares how Do What Works came to be, what he's learned about testing and human behavior, and why running experiments is critical, but also when it's not. Welcome to Crafted, a show about great products and the people who make them. I'm your host, Dan Blumberg. I'm a product and engagement leader at Artium where my colleagues and I help companies build fantastic software and recruit dynamic teams. We're going to talk a lot about experiments in this interview. And in the spirit of experimentation, I'm going to just launch with that experiment right here.

So I'm going to give you the headline of a blog post that you have written and then I'd like you to write that blog post. Here it goes. How selling peanuts at Comiskey Park taught me everything I needed to know about launching lean startups by Andres Glusman.

Andres Glusman: Yeah, that is not an actual blog post I've written, but maybe it should be one. So let's write it in real-time.

Dan Blumberg: So let's back up. As a college student living in Chicago, Andres's summer job was selling peanuts and popcorn at Comiskey Park and Wrigley Field. It was 100% commission-based, so it was all about the hustle.

Andres Glusman: So as a ballpark vendor, your voice is your most important asset. So you adopt vendor voice, you go, "Hey, peanuts. Hey, peanuts. Get your peanuts here." Putting on a little bit of a show, making it theatrical and kind of going 1920s-esque vendor, also really added to the sales. I would vary how I would talk to people when I would sell them bags of peanuts and I would track my sales every single day. And what I found was that when I used a certain structure, when I would give people their change and say, "And you get, 75 cents is your change," they'd be much more likely to say, "Hey, go ahead and keep it," than when I'd simply say, "75 cents is your change." So using you twice, I would walk home with an extra 20 bucks in my pocket that night, which was a very big deal to a college student.

 

"You learn how hard it is to change human behavior and how prepared you need to be to get kicked in the teeth over and over and over again as you try and figure out how to make that thing actually happen. But the reality is that even when you're trying to change any behavior at all, the default state is that it won't work."

 

Dan Blumberg: After college, Andres brought his passion for experimentation to online advertising, running split or A/B tests on some of the web's first banner ads. Years later, he joined a young startup with a mission to connect us.

You were at Meetup for nearly 15 years leading product and strategy and other critical roles. What were some of the things that you learned there as you were seeking to create a platform that helps people meet in real life?

Andres Glusman: The number one thing you learn is how hard it is to change human behavior, and how prepared you need to be to get kicked in the teeth over and over and over again as you try and figure out how to make that thing actually happen. But the reality is that even when you're trying to change any behavior at all, the default state is that it won't work. You won't make the change.

So understanding that's the default and then doing everything in your power to address the situation knowing that, has a profoundly different effect on your psychology and your approach than if you go in thinking that every single thing you're going to do is going to work out beautifully.

Dan Blumberg: One of the problems Andres had at Meetup was that online, this free, easy-to-use platform appeared to be buzzing with events and activities, but at some of the actual events, the organizers would be no-shows.

Andres Glusman: So while it looked awesome that we had all of these groups that were in existence, all these meetups getting scheduled, the actual core user experience was a bit of a dud because the organizers didn't have any skin in the game. And we basically made the hard call to start charging organizers just a nominal fee per month, which created, one, the way for the company to become a sustainable growing organization. But then, two, it also created a little bit of skin in the game and the no-shows were no longer a problem.

Organizers that were there were committed, they were aligned, and they were the ones that helped fund the growth of the company. And then we were aligned with them because we would help deliver the people who would actually show up to their events. So it was a much more virtuous circle and once we had that flywheel going, it really helped take care of itself.

Dan Blumberg: At Meetup, Andres and his team embraced and evangelized lean experimentation. This was before The Lean Startup became a best-seller, the acronym MVP became inescapable, and test-and-learn methodologies became so common. I first learned about Andres and Meetup's methods from a presentation that Andres created in 2011. It went viral on SlideShare.

Andres Glusman: Yeah, it was a fun deck. The reason I think it resonated with people was, one, we open-sourced our playbook on something that a lot of people would've been very guarded about. And two is the approach we took to usability testing. We essentially took all of the rules of usability testing at the time and turned them on their head and completely changed the game. And I'm really proud of the way in which we looked at what goes into usability testing and carefully thought about whether any one of those pieces deserves to be a part of that pie.

Dan Blumberg: Back then, and still at some places today, companies would pay tens of thousands of dollars so that a dozen or so test subjects could sit in a lab setting with a moderator, complete tasks, and give their feedback on a website.

Andres Glusman: What we completely rethought was, what if you changed up every single bit of how that works? And instead of trying to operate in a sterile environment because you believe that was going to do away with error, what if you got very comfortable with the idea that every single method for understanding users and user behavior is filled with error? And that's a part of life.

And what if instead of trying to remove it, you just embraced it? And what we did was we substituted volume for precision and we organized a system such that instead of having spent $32,000 and having 10 people go through, what if you spent $32,000 and had 400 sessions over the course of a year? And that's essentially what we built.

Dan Blumberg: At Meetup, they invited people into their office every day of the week to get feedback on whatever products were in development. Product managers, designers, and engineers could watch these sessions live and whisper in moderators' ears to ask certain questions.

Andres Glusman: So the goal was to shorten the amount of time between anybody in the organization wondering how somebody would react to something and actually watching them use it. So, in fact, instead of being scripted, it was an extremely agile experience. So all of these ideas that go into traditional usability testing are all about making it so that if you can plan ahead and think of every contingency, you can eliminate error. And our model said, "Forget that noise, instead, jump in, embrace the error, have a lot of volume, and set up the system so you can very quickly respond to the situation that unfolds in the course of a test."

There's so many approaches you can take. You could have them go through a usability experience on your direct competitor. And even with your own stuff, there's no reason why the underlying experience that you're putting somebody through has to be polished in any way. All of these things that you're putting in front of users are just prompts to get them to reveal a truth about themselves and the way they're going to behave. And it doesn't really matter what that prompt is as long as it gets you closer to the truth. And the art of all of this is figuring out what truth to put in front of somebody.

 

"And I suddenly really just had a flash of inspiration for Do What Works and was thinking, 'Wait, if I were to do X, Y, and Z, I could actually figure out what Netflix is testing? That's really, really wild.'"

 

Dan Blumberg: In 2019, after almost 15 years at Meetup, and two years after the company was acquired by WeWork, Andres decided it was time to do something new. He wanted to found a company, was exploring lots of ideas, and one theme kept coming up.

Andres Glusman: I was also really, really fascinated with this idea of how do you make things that are behind a curtain in front of a curtain? So how do you make things that are invisible visible?

Dan Blumberg: Andres had given himself a year to find the right opportunity and he was joined by his Meetup colleague, Will Howard.

Andres Glusman: And we were meeting with tons and tons of people and just asking them, "How can I put wind in your sails? What are the problems that you're working on? What are the different things that you're facing?" Once a week, I'd go into the city and try to have as many meetings as I could, and then the rest of the week think about, "What are the patterns that I was seeing and what are the things that I was really fascinated by?" And I suddenly really just had a flash of inspiration for Do What Works and was thinking, "Wait, if I were to do X, Y, and Z, I could actually figure out what Netflix is testing? That's really, really wild."

 

"And now, fast forward a few years later, it's snowballed: six of the top streaming brands are using us, and four of the top meal kit companies, and eight B2B SaaS unicorns, et cetera, et cetera."

 

Dan Blumberg: So Andres asked Will if the idea was doable. Will said yes, made some adjustments, and a few days later, they had an MVP. Andres and Will had built a way to see what Netflix and other companies were testing.

Andres Glusman: And I showed it to my friends in the product world and I said, "Hey, does this seem interesting to you? Would you buy it?" And they said, "Yeah, that's really, really unique. That's amazing. I can't believe you can get access to that data. If you built something like this, I would totally pay for it."

Dan Blumberg: Yes, but would their friends actually pay? Time for an experiment. Andres and Will added a payment form to their very simple website, sent the link to their friends, and lo and behold, their friends put money down. They were committed.

Andres Glusman: We sent them the dashboard and that sent us on our journey. That was good enough to have validation that there were three people in the world that were willing to pay for this thing. And now, fast forward a few years later, it's snowballed: six of the top streaming brands are using us, and four of the top meal kit companies, and eight B2B SaaS unicorns, et cetera, et cetera.

But it really all started kind of with this simple proof point of having a very, very, very rough prototype and a Stripe form on our URL that gave us the reason to believe that people care and that people are willing to pay for it. So that knocked down kind of two fundamental questions and that bought us a ticket to go try and answer a whole bunch of other questions over the course of the next several years.

"The reality though is that it's really hard to run split tests, not because the technology's hard, it's easier than it's ever been to launch a test. The problem is that once you get a test launched into the world, it takes forever to get results. If you're a larger enterprise, you might be lucky to get results on that page after one month."

 

Dan Blumberg: What was the problem that your friends in the product community needed to solve? When you came to them and you said, "Would you use this?" they were like, "Oh my God, yes."

Andres Glusman: The problem that I was remembering at Meetup, but also the solution that I was remembering, was there are so many reasons to run split tests. And if you aren't running split tests, you're just guessing. And the industry now has gotten to a point where we all pretty much agree that is true. The reality though is that it's really, really hard to run split tests, not because the technology's hard, it's easier than it's ever been to launch a test. The problem is that once you get a test launched into the world, it takes forever to get results. If you're a larger enterprise, you might be lucky to get results on that page after one month. So if your job is to optimize conversion on a specific experience and it takes you one month to get results, that means you are lucky if, over the course of a year, you can get 12 experiments in, 12.
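The arithmetic behind that one-month wait is worth making concrete. As a back-of-the-envelope sketch (this is not anything from Do What Works' engine; the baseline rate, lift, and traffic figures are made-up illustrative numbers), the standard normal-approximation formula for comparing two proportions shows why even a busy page can only support roughly one test per month:

```python
import math

def split_test_duration(baseline_rate, relative_lift, daily_visitors,
                        z_alpha=1.96, z_beta=0.84):
    """Rough days needed for a two-arm split test at ~95% confidence
    and ~80% power, using the classic normal-approximation formula
    for comparing two proportions. Illustrative only."""
    p = baseline_rate
    delta = p * relative_lift  # absolute lift we want to detect
    # Per-variant sample size: n = 2 * (z_a + z_b)^2 * p(1-p) / delta^2
    n_per_variant = math.ceil(
        2 * (z_alpha + z_beta) ** 2 * p * (1 - p) / delta ** 2
    )
    # Traffic is split across two arms, so total need is 2n
    days = math.ceil(2 * n_per_variant / daily_visitors)
    return n_per_variant, days

n, days = split_test_duration(baseline_rate=0.05, relative_lift=0.10,
                              daily_visitors=2000)
```

With a 5% baseline conversion rate, detecting a 10% relative lift needs roughly 30,000 visitors per arm; at 2,000 visitors a day, that works out to about a month per test, which is exactly the 12-experiments-a-year ceiling Andres describes.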

And the reality is that 80% of the experiments that people launch on their platform do not move the needle. So 12 shots a year, 80% fail. That means, if you are in this industry, your job is really playing for the two to three wins a year. And the reason why 80% of those experiments fail, in our opinion, is because everyone is learning in a vacuum. No one is learning from anyone else. Everyone is repeating the exact same mistakes. We see people test button color over and over and over again. Our engine is able to detect experiments on any lever that anyone is running on an acquisition experience. I've seen so many button color tests and ultimately, we believe that button color for the vast majority of people just does not make a difference as a concept. If all the buttons are this color, or all the buttons are this other color, it just doesn't matter.

There might be one or two exceptions, but for the vast majority of people, they're just wasting their time when they're running experiments that are related to that. And they could use one of those 12 cycles on something that's much more likely to work. And it's a crime. It's such a waste of their time, it's a waste of their team's time. The reality is that the answer is sitting in other people's Google presentations and PowerPoint slides. The idea was, if we could make it so that you could learn from everyone else's experiments, that would save you from having to run every single test yourself and allow you to place your precious chips on the bets that are more likely to bear fruit and to do it in a way that's more likely to work. And that's what the engine does at scale.

I used to get together with buddies in the New York community who were at Shutterfly and Etsy and various other companies, and we would compare notes, "What did you do last month? Did it work or did it not work? How did that work out for you?" This allows us to do that, but basically, at scale, for every company in your direct space that you're competing with and companies in adjacent spaces that you should be learning from, in a way that's automated for the user and allows everyone to learn. And ideally, we should raise the bar on what we as an industry spend our time testing and what we don't.

Dan Blumberg: If I'm a user of Do What Works today, what do I see, what do I get, on what cadence, how does it help me and my team? Is it the team even who hires you? Is it the product team?

Andres Glusman: So Do What Works is typically used by marketing and product teams, and the intersection of the two, which is often a growth team. It's used by the people in the organization who are most in need of driving conversion optimization on key golden pages: your homepage, your pricing page, your signup page. You spend all this time and money trying to get people to them, and the question is, "How do you get more people through that front door?"

The ways in which our customers use us depends a little bit on the size of the organization, but ultimately, they've got more ideas than they know what to do with and they need to figure out which ones they're going to spend time working on, or they're trying to get other people in the organization to buy in.

A reality that I've learned over time is that product management is a team sport, but it's a full-contact team sport. So it helps to have data before you run an experiment on what's more likely to work so that you can get buy-in. And then, ultimately, how do you execute it in a way that makes a difference?

Dan Blumberg: Andres didn’t reveal too much about how their advanced web crawler and patented technology works, but he did share how he knows that Do What Works works.

Andres Glusman: So when we were getting off the ground, a handful of folks knew us and were just really interested in what we were working on. So early adopters, classically, just want to jump in and try something new. At a certain point though, you start getting to folks that would say, "Okay, well, how do I know your engine works?" And our answer was, "Well, we'll send you some examples of your own experiments." So we did and they signed up. So that's about the best validation. They didn't tell us we were right or wrong necessarily. They neither confirmed nor denied, but they signed up. So that was good enough.

Dan Blumberg: What have been some of the biggest challenges you've faced as you've grown the company, whether it's on the technical front, explaining what you do, or acquiring new customers?

Andres Glusman: At this stage, we are facing an individual challenge and the reward for solving that challenge is to get to work on a new challenge. And there's basically a series of questions that we are posing and answering that unlocks the next question. So question one, for example, was does anybody care about this? Two, is anybody willing to pay for it? If they're willing to pay for it, are they going to use it? If they use it, are they going to keep using it? If they keep using it, are they going to keep paying for it? Can we find other people like them? How do you figure out how to sell to them? Can you do that in a way that's cost-effective? Are they going to stick around? How do you spread the word? How do you grow an organization? Every one of these questions is a question we get to answer, but only after we've earned the right to by virtue of answering the previous question.

 

"The underlying product that I think Google is, is really the collection of data they have around me and the ability to create delightful user experiences against it that make me want to give them more. That's one of those products right now where I just feel like, "Ugh, I could not live without." If that one company went away, the internet would stop being the internet to me."

 

Dan Blumberg: I'm curious, of all the products that you've used, is there one that comes to mind as something that just really changed your view of what a great product is or is capable of?

Andres Glusman: The product that I am most enamored with is the Google ecosystem of products and how effectively they're able to use my data to give me stuff that I love. Their music app is so good for me, it knows exactly what I like. And their news app then gives me news articles on the bands that I'm listening to and they're just able to keep using my data to suck me deeper and deeper into the Google vortex. The underlying product that I think Google is, is really the collection of data they have around me and the ability to create delightful user experiences against it that make me want to give them more. That's one of those products right now where I just feel like, "Ugh, I could not live without." If that one company went away, the internet would stop being the internet to me.

Dan Blumberg: Yeah. I hear Google does everything by gut feel and they don't run any experiments either.

Andres Glusman: Yeah, which is funny because they've been known... While I love experimentation, the funny thing about me and my personal journey with experimentation is that I ran tests at Meetup and I was really successful with it and we were able to start driving growth, and our growth went from this to kind of hockey-sticking upwards. So my natural reaction was, "We've got to run a lot more tests." We doubled and tripled the size of the team to be able to run the experiments and suddenly the answer to everything was, "Let's test it." And invariably what ended up happening is the experiments got smaller and smaller and the risk aversion got higher and higher and people refused to make any decisions on gut and they wanted to feel certain that everything was going to work out the way that they hoped it would. And the only way to do that is by testing small thing by small thing, being very, very, very scientific.

But again, if you're able to trade off precision for volume, like add some volume in, you don't have to test everything. And the biggest lesson I learned as a person who was so enamored with it was that there are so many times where it just doesn't make sense to run an experiment and you're just going to have to run it on gut. And if you do, your organization is going to move faster and is going to be able to find bigger wins faster. So it's one of those things where you have to love experimentation but know when not to run a test and when it's okay to just move forward. The message I don't want to convey is that running lots and lots of tests is the only way to run a company. I think it's about striking the right balance and testing the right things and not testing the stuff you don't need to test.

Dan Blumberg: Yeah. I've seen what you're talking about. I see some people have a strong aversion to the phrase data-driven. They say data-informed, not data-driven, right? Don't blindly do what the data says.

Andres Glusman: I think people get a little too hung up on it, quite candidly. There's so many different kinds of data and ultimately, you're seeking truth. And I think if we all understand that there's error in every single bit of data that you look at and that you just do your best to approximate the truth because that's ultimately what you're after, the closer you can get to approximating it, the more likely it is you're going to make something really cool.

Dan Blumberg: That's such a great way to end. Thank you, Andres, this has been so much fun.

Andres Glusman: Yeah, it's my pleasure. It's a real treat.

Dan Blumberg: That was Andres Glusman, co-founder and CEO of Do What Works, and this is Crafted from Artium. At Artium, we build incredible products, recruit high-performing teams, and help you achieve the culture of craft you need to build great software long after we're gone. We artisans love partnering with creative people to build their visions of the future.

If you've got an opportunity you'd like to discuss or just want to learn more about us, check us out at thisisartium.com or drop us a line at hello@thisisartium.com. This podcast is new and we'd love your support. If you like today's episode and, hey, you've made it this far, maybe text a few craft-minded friends a link to the show, and please subscribe and join us as we highlight more great products and the people who make them. I'm Dan Blumberg, this is Crafted. See you next time.