Why CRO NEEDS To Change

December 1, 2022


Hosted By

Rabah Rahil
CMO at Triple Whale


Ash Melwani
Co-Founder & CMO of MyObvi
Shane Rostad
The go-to designer for companies that want a high-converting website

Episode Description

In this episode of Ad Spend we break down the landscape and discuss CRO and why it needs to change. #AdSpend

Notes & Links

🐦 Follow us on Twitter for Industry insights https://twitter.com/triplewhale

Follow the people featured in this episode here:

- Rabah's Twitter: https://twitter.com/rabahrahil
- Ash's Twitter: https://twitter.com/ashvinmelwani
- Shane's Twitter: https://twitter.com/shanerostad


Shane Rostad (00:00:00):

You shouldn't be running A/B tests. Like, you just shouldn't. You should just stop doing that and just build stuff. Like, hey, maybe we should add a quiz? Just fucking build a quiz. Don't be like, oh, let's test it and let's put it on our page for like 50% of the time. Just fucking build the quiz and talk to your customers. Literally, if you don't have the quantitative data to learn from doing that, just go and talk to your customers and get on the phone with them.

Rabah Rahil (00:00:30):

We are back folks, with your favorite DTC podcast. I actually had to dip out of our, uh, event here in Austin a little bit early. Ash told me I'm not allowed to miss the podcast; what Ash wants, Ash gets. Um, as always, joined by my partner in crime and wonderful human Ash Melwani. And today we have a fantastic guest. It's actually another manifestation of the Bird app. Um, Shane Rostad. Shane, welcome to the show.

Shane Rostad (00:00:56):

Hey, thanks for having me.

Rabah Rahil (00:00:58):

Yeah, absolutely. So Shane, for people that don't know you: you were really deep in the CRO space, and still are, but give people a little color into your background, what you've worked on, kind of, uh, what your specialties are.

Shane Rostad (00:01:11):

Yeah, um, so about a couple years ago I guess I started doing more work in the CRO space, specifically in e-commerce. Um, before that I was just doing a bunch of website design. I had a marketing background rather than an art background, which is how a lot of people get into design. And I liked business, so I was always thinking about, how is this gonna get better business results? So it just naturally led into thinking more about CRO. And obviously if you're helping people make more money, that's just a better space to be, and people actually care about their website in e-commerce and in SaaS, and e-commerce was just what I was into. Um, so I started doing that, started a newsletter called CRO Weekly, which I'm no longer publishing, but I was publishing weekly for a while, tweeting, um, over the last like two years, and uh, yeah, worked with a bunch of clients doing ongoing iterative testing campaigns.


So we were doing anywhere from like two to five tests per month, um, across different aspects of their site. Um, and so that's what I've been up to. And now I'm no longer doing that as much. I'm working with just a couple clients, not a full client list like I used to have when I was working all the time, but uh, yeah, I'm still in the space. But obviously what we're gonna talk about today is that transition, why I didn't decide to double down on CRO, um, a little bit. And then as well as just all other things CRO related. Yeah, that's, that's me.

Rabah Rahil (00:02:36):

Amazing. And you're coming live from Florida, and then Ash is right above you, and then I'm to the left of both of you guys. So we have a pretty good swath of the states covered. And for people that don't know, um, CRO is just conversion rate optimization, so we wanna make sure we can bring everybody along, but I won't keep saying conversion rate optimization cuz that's just too fucking long <laugh>. Um, so give us some color in terms of, you used to be a big CRO maxi, and now the more you skilled up, the more you've realized there might be more correlation than causal mechanisms, and there's also just a lot of intricacies to running an experiment correctly. Mm-hmm <affirmative>. And so yeah, tell me a little bit about that, and then we can get into some stuff, cuz Ash has some really good questions for you as well.

Shane Rostad (00:03:24):

Yeah, so um, I guess to kinda kick things off with where I learned: I knew about CRO, but I went and took all the CXL courses. They have this mini degree, which is from Peep Laja, who's like one of the original, like

Rabah Rahil (00:03:36):

Austin Guy.

Shane Rostad (00:03:37):

Uh, yeah, yeah. One of the big promoters of CRO as a thing, built a big agency, and also talks a lot about how you have to do research. Like, it can't just be, I look at your site and go, oh, here's an idea, you should change that. That's not CRO, that's best practices. And what I learned through him and all of his experience was: best practices don't work. You need to get your hands dirty and go deep. Um, and one of the things that he also preached was, statistics matter, you can't just fake them. And so as I got into it, obviously when you're first starting out you're not working with as big of clients, and it's a little bit harder to get those statistics to really work.


You know, you're running an experiment, it's going on, and I'm running a test for four weeks for a brand doing millions of dollars in revenue, but they're selling $150, $200, $300 products, so they're not getting a ton of orders. And I'm realizing pretty quickly, oh God, you know, this is a 7% increase, but when you do the math... and I'm not a statistician, I know enough to use the calculators, essentially. I'm not gonna actually break down the math for you and explain it at a deep level, but I know what is valid and what is not and how to understand that. So, you know, I started doing that and realized really quickly that that 7% was just not provable. It just wasn't, the more I tried to learn about it. And really where it started to break down was, the more clients I worked with, I found I was doing more storytelling and less real statistics. I was more like, it's 7%, but we don't have the perfect numbers, but we could assume based on blah blah blah, and just kind of hand-waving away the important statistical concepts that say, you can't be confident in this.


Like if we were in a lab, they'd be like, yeah, well, you better try something else, cause there's no way, we can't be confident in this at all. Um, but so yeah, I kind of went through that process and started realizing, oh, this is really hard. And then I was working with bigger brands that were doing a lot more orders, and it was easier to get the statistics down, but it was just as hard to get good results. You know, you're running a month of four tests and none of them work. And I'm like, do I know anything about design? Do I know anything about doing research? I'm talking to customers, I'm doing the work, and things aren't working. And I know that's just part of the process, which I think is an important thing to note.


Um, CRO in general is not just an instant win, but it's hard, you know? That's the thing, it's really hard to get results that are positive, and especially hard to get results that are positive and also real in a statistics sense. Um, and so as I was doing that, working with clients, I kind of started slowly pulling back. I also had other interests, and we don't have to go into all the other things I was just more interested in, but I was like, this is really challenging, and it's hard to sell this when I can't provide the confidence that I need, if I have to storytell this much. It was harder for me to sell and really say, you're gonna get the ROI from this. I'll end after this.


But you start doing the ROI math, and people are like, oh, if you're doing a million dollars a year and we improve your conversion rate by 10%, that's a hundred thousand dollars next year that you're gonna make. But then you start talking to brands, and every brand is at like a 10% profit margin, if that. And it's like, okay, so they made 10 grand in profit over the course of the next year because of the work you did, and how much are you charging them? Even if you charge them a couple grand a month, that's five months' worth of work to get that. All of a sudden they just spent all that money, profit-wise, and you're not getting them a real ROI when you actually break it down. So it was harder for me. I wanted it to be a home run. Like, you're paying me, I don't want you to have to think about it. I want you to be like, yeah, this is amazing, I'm getting so much value. And so it just wasn't lining up as much, um, and was harder, and then I had other interests. That's kinda the long path from the start to where I've gotten to with CRO stuff.

Rabah Rahil (00:07:50):

So, totally tracking, a couple things there. So when, for example, I do have a measurable lift in my ecosystem, is your issue that it wasn't the change that caused that lift? Cuz, for example, you can objectively say, hey, this month we made 20% more after we hired you, Shane. And then you're saying that even though the experiment did not show it. Does that make sense, what I'm saying? So is it like a causation kind of issue, or

Shane Rostad (00:08:22):

It's more, it's an attribution issue. It's the same thing as with marketing spend: how are you actually attributing what's working? And it's really hard. Think about advertising, you know, this show is called Ad Spend. Think about with advertising: you're spending money on Facebook, YouTube, and yeah, it's digital, so you should be able to track it. But we all know that Facebook is counting conversions that came from Snapchat or from YouTube, and sure, it's messy. And so when you start making changes on your site, okay, where did the lift in conversion come from? Was it because you finally removed the product that nobody likes off your site, so it removed that distraction? Or is it because you ran a 30% off deal? Oh, we also happened to run a free gift with purchase this month while we're looking at the stats, and that crushed it. And like, so this test we are running...


Is it still valid? So just looking at it holistically in your analytics, you see a 20% improvement, but your ad creative could have popped off. There are so many things. And if you look at it at the test level, this is where it gets into test design. There's the actual statistics of analyzing the test results afterwards, and then there's the actual designing of the test. Cuz, um, you know, I guess that's where you run into a lot of issues, and we can kind of go into those, I don't wanna just rant on here about it, but there's those two aspects: designing a test, where you can run into a lot of issues where you'd be reaching, saying that that was the thing that caused the change; or doing the statistical analysis afterwards to say, we know for a fact that this update made that change. Does that make sense?

Rabah Rahil (00:10:05):

Yeah. So let me push back a little bit more, or not push back, but I wanna be, uh, the CRO maxi to your CRO bear, just for fun conversation. And then I want to ask you, Ash, when I toss it over to you, about your experimentation path, because I think you actually do a really good job of trying to mitigate some of the things that Shane's touching on that are very valid. So, uh, experiment design I think is huge, probably one of the biggest things. And when we say experiment design, we're saying you shouldn't just throw everybody to your homepage or whatever. There's fancy terms like confounding variables, right? There's just so many things that can start, to your point, mm-hmm <affirmative>, to dissolve the credit, where it's like, who gets the credit?


And so for example, let me give you kind of a hypothetical, and then you can tear it apart. Say I'm Ash and I have a landing page that nobody else is on. Nobody can find it unless they go through this ad. And then I have an ad that I've been running that I know is successful. I don't change any of the copy, I don't change anything, and I only run it on this one channel, to use another fancy term: controls, right? So I put as many controls in place as possible to make this an experiment where I'm only changing one variable, right? And so I push all this traffic to this landing page, and then I get a baseline, and we can call that the status quo or what have you. And then I change something else, say a button color or the ordering of the page, just one thing.


Like, I don't change the copy and the ordering and the color. I only change one thing, to see, okay, what's the best ordering for people that I showed this Facebook ad to? Yeah. What would be wrong with using those results to then roll out to the broader website? So if, say, Ash is converting at 5% on his product pages or whatever, and then he has this landing page that is now converting at like 7.5%, why wouldn't he take that ordering from that landing page and say, hey, let's test it on the main site's landing page? Does that make sense?

Shane Rostad (00:12:14):

Yeah, yeah. The question I would have to ask is: how much money are you putting behind it? How much traffic are you driving to this landing page? Cause, like, let's say...

Rabah Rahil (00:12:23):

Say a hundred thousand impressions. Say a hundred thousand, like...

Shane Rostad (00:12:26):

A hundred thousand impressions, and you know you're at some CPM, so let's just say $5 CPMs. Right, a hundred thousand impressions, you're at a 5% conversion rate. Yep. And so you start getting down to the numbers, and you're like, okay, you said a hundred thousand impressions, so that would be like 5,000.

Rabah Rahil (00:12:52):

Maybe we should do reach, maybe reach is a better metric, cuz reach is reflective of a person, versus you can be impressed multiple times. Say reach, or clicks.

Shane Rostad (00:13:01):

Actually, clicks, like cost per click I think is probably better, people that actually go to the website. Cuz you have to look at it: clicks are more expensive than impressions. I'm just saying this because you might have to spend tens of thousands of dollars to get enough people to go to that landing page to get statistically valid results. You know what I mean? You might need a thousand conversions on each side, and probably more depending on your change. So let me just ask you a couple quick questions before getting into this. What is the change in this test, in this hypothetical? How big of an improvement is this change?

Rabah Rahil (00:13:35):

Let's call, um, well, let's call it: we changed the ordering of the information on the landing page. So instead of ordering X, we're now trying ordering Y.

Shane Rostad (00:13:45):

And what is the delta in conversion

Rabah Rahil (00:13:47):

Plus? Yeah, plus two and a half points. So 2.5%. So the OG page is getting a 5% conversion rate and the new page is getting a seven and a

Shane Rostad (00:13:56):

Half. I can estimate, just based on the past: you would need at least three to five thousand conversions on that landing page, maybe even on each side of the test. So there's A, the control, which is what you have currently. Sure. And then B is the variant. You would need probably 3,000 people to convert on A and 3,000 people to convert on B to know if that 2.5% is statistically valid. To actually know. And if you say it's a 25% change, sure, you probably need like 800 on each side, you know what I mean? That's how the statistics actually break down. So when you run that experiment, you're like, we're running traffic to this landing page. Are you really driving 6,000 purchases through that landing page, let's say in a single month?


Luckily, Obvi I know sells a shitload of products and also doesn't have a super high AOV. So for a brand like that, it might actually really work. If you sell a ton of products, a ton of conversions, the numbers might actually make sense, um, and it might be great. But for a lot of brands, let's say it's a 5% change or 2.5%, getting 5,000 conversions in the test, 2,500 on A and 2,500 on B, or 6,000 total, is really expensive. That's a very expensive test when you actually do the math. We're driving a lot of traffic; that might be your primary landing page anyway. But if you were doing this as a test, like, hey, we want to test this new landing page and see the effectiveness of these two variations, let's spend... I don't know, you guys do the ad spend, I have no clue how much that would cost, but I assume it would be pretty expensive to drive 5,000 conversions, right?
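The napkin math in this exchange can be checked with the standard two-proportion sample-size formula. This is an illustrative normal-approximation sketch, not the exact calculator Shane used, and it counts visitors per arm rather than conversions:

```python
# Rough per-arm sample size for an A/B test on conversion rate
# (two-proportion z-test, normal approximation).
# alpha = 0.05 two-sided, power = 0.80 -- the "laboratory standards"
# discussed in the episode. Illustrative sketch only.
import math

Z_ALPHA = 1.959964  # z-score for 95% two-sided confidence
Z_BETA = 0.841621   # z-score for 80% statistical power

def visitors_per_arm(p1: float, p2: float) -> int:
    """Visitors needed in EACH arm to detect a lift from rate p1 to rate p2."""
    p_bar = (p1 + p2) / 2
    numerator = (Z_ALPHA * math.sqrt(2 * p_bar * (1 - p_bar))
                 + Z_BETA * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# The hypothetical from the conversation: 5% baseline lifted to 7.5%
# (a big 50% relative swing) vs. a tiny 5% relative swing.
print(visitors_per_arm(0.05, 0.075))   # big swing: low thousands per arm
print(visitors_per_arm(0.05, 0.0525))  # small swing: six figures per arm
```

Small relative lifts are what make tests expensive: halving the detectable lift roughly quadruples the traffic required, which is the trap with "move one section" tests.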

Rabah Rahil (00:15:38):

Yeah. I mean, even if you say it's $10 a conversion, right, that's $50k right there. So yeah, I'm totally tracking what you're saying. I guess my biggest pushback is there's kind of two pathways, right? Either you never experiment, or you can use directionality to extrapolate.

Shane Rostad (00:15:57):

I don't think that you sh... so, again, I can argue both sides here, and I'm happy to, cause I've done it. I've done this, and I have sold this: hey, we should do this anyway, because you should just test things. But the problem is that when you actually look at the statistics, you might have a statistically significant result, which says there is a 95% chance that the change we made... and this is a terrible definition, I'm not a statistician, remember... there's a 95% chance that the change we made had an effect on the page. That the delta of conversion rate isn't chance; it is actually something that we did in the experiment.


But the main metric that people miss is called statistical power. The null hypothesis, the default hypothesis in a test, is: this is all chance, any change is chance. That's the default hypothesis. And when you accept the test, like, this is right, this is a winning test, you are rejecting the null hypothesis. You're saying this is not due to chance. And in science, the standard is 80% statistical power, which is basically an 80% chance that you did not do that incorrectly, that you did not accidentally reject the null hypothesis when it was actually supposed to be accepted. So you can have 95% statistical significance, and I'm not a statistician so I can't explain the intricacies of how that can happen, but you can get results that look valid on a statistical significance basis while only having like 10% statistical power.


And so what people get caught up in is: you have a 10% chance that you correctly assigned the winner, and a 90% chance that it's not a winner. Either it's null, it's flat, or it's actually a loser. There's a 90% chance. So if you're not getting that statistical power, and the standard in science is 80%... yeah, you can run a test at 60%, and that's part of it. We're not curing cancer here, you know? You can interpret things, that's part of the job, and when you do it enough you start to get a feel. But if you're way off, like at 60% or 50%, you can maybe interpret. If you're at 70%, you could be really confident, like, hey, we're almost at laboratory standards.


But a lot of people get into this at like 20% statistical power. Like I said, the reason why you need 5,000 conversions is to get that statistical power. But the thing that comes up is that tests can run away from you. So you're running a test, you're at 3,000 conversions, and you're at a 10% increase in conversion rate. But the statistics say, hey, you should probably run this for another week to get enough conversions. If everything stays the same, a week from now you'll have enough conversions to be at 80% statistical power. But what happens is that over the course of the next week, the performance might go down, and now instead of 10% it's a 7% improvement, and all of a sudden you don't have statistical significance anymore, or you're not even close to the 80% power.


And so you run it, and then you're like, okay, we'll run it for another week, cuz that's what it says. Then the next week you look at it and you're like, we need more conversions to prove a smaller change. And it keeps going; most tests revert towards the mean. So the longer you run it, the smaller that impact gets, and then the longer you have to keep running it. So you get into this... you're not gonna do that, you know what I mean? But then you're just guessing. So that's where you get caught, and you have to interpret things, but you have to trust the interpreter a lot, that they know better, that they can tell you what the statistics can't, which is really hard to do. To be fair, I would do it, and a lot of people do do it, and I think it's a valid thing to do.


I don't wanna say it's like... at a certain point, if you're going to test, these tests are gonna run away from you. Sometimes you have to make a decision. You can't run them forever; you have a business to run, you have other tests to run. But it's really hard, is what I'm saying. And this is the thing that happens a lot: people just end it when it's convenient. They're like, oh, we got 95% statistical significance and we got a 5% lift with 500 conversions on each side. It's been a week, let's just end it. And then they're like, oh, that ordering change worked. I don't have to go into it too deep, but we can talk about the cycle that creates, what you might call marketing debt, like technical debt, you know, like computer programmers have old code they have to get rid of, which is really hard to do. You get this CRO debt, where you have these false beliefs about your site that pile up, and then it's really hard to go back and be like, let's just ignore all that and retest. But there's a lot there. So I know I just went on a little bit of a rant, but um, yeah, that's the issues that I was facing that I had a hard time justifying to clients, essentially.
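The "end it when it's convenient" trap Shane describes can be demonstrated with a quick simulation; this is an illustrative sketch, not anything from the episode. Both arms here convert at the exact same rate (an A/A test), so any "winner" declared by peeking at the p-value and stopping early is a false positive:

```python
# Simulate the peeking problem: check an A/B test repeatedly and stop the
# moment p < 0.05. Both variants have the SAME true conversion rate, so
# every "significant" result here is a false positive. With repeated peeks,
# the false-positive rate climbs well above the nominal 5%.
import math
import random

def z_test_pvalue(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a pooled two-proportion z-test."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (conv_a / n_a - conv_b / n_b) / se
    return math.erfc(abs(z) / math.sqrt(2))  # P(|Z| > z) for standard normal

random.seed(42)
BASE_RATE = 0.05        # both arms convert at 5% -- no real difference
VISITORS_PER_PEEK = 500  # traffic per arm between checks
PEEKS = 10               # "let's look at it every few days"
SIMS = 400               # number of simulated A/A tests

false_positives = 0
for _ in range(SIMS):
    conv_a = conv_b = n = 0
    for _ in range(PEEKS):
        conv_a += sum(random.random() < BASE_RATE for _ in range(VISITORS_PER_PEEK))
        conv_b += sum(random.random() < BASE_RATE for _ in range(VISITORS_PER_PEEK))
        n += VISITORS_PER_PEEK
        if z_test_pvalue(conv_a, n, conv_b, n) < 0.05:
            false_positives += 1  # we'd have shipped a "winner"
            break

rate = false_positives / SIMS
print(f"False-positive rate with {PEEKS} peeks: {rate:.0%}")
```

A test checked once has roughly a 5% false-positive rate by construction; checked ten times with optional stopping, it is several times that, which is exactly the "CRO debt" of false beliefs piling up.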

Rabah Rahil (00:21:10):

Now I'm tracking. There's one more thing... or go ahead, Ash. You haven't talked and it's been like a million years. Yeah.

Shane Rostad (00:21:15):

Sorry, podcast.

Ash Melwani (00:21:15):

Sorry, go ahead. No, you're all good. Um, no, a hundred percent, there's a lot to unpack there. Um, especially because we typically do a lot of changes that are bigger changes, not kind of the minute stuff. So for example, if we're split testing, um, our homepage, it's completely different. It's not even moving sections around, it's completely different. Like, do we go very product-based, just showcasing product, or is it storytelling, right? Yeah. So when it comes down to you saying that you need like 5,000 conversions on each variant, I would imagine that's probably for, that's for

Shane Rostad (00:22:00):

A 2.5% change. Like, that's for a small change that only gets you 2.5%. Yeah. If it's 25%... that's the thing, you gotta swing for the fences, and yeah, there are ways to do it. I agree, I'm totally with you. It's just that a lot of people are not swinging for the fences when they need to, you know? Yeah. So what you're talking about is great. Yeah.

Ash Melwani (00:22:19):

Yeah. That makes total sense to me, cuz there are certain ideologies where it's like, okay, make small changes and have those stack up, right? Yeah. Cause that makes sense, having 2.5% swings and having like 10 to 20 of them, right? Yeah. But what you're saying is that it needs a lot more conversions, impressions, visitors, whatever it is, to even confirm that 2.5%. Is it even worth doing those, versus making these bigger changes? So for us, like I said, the homepage: completely night and day difference. Landing page, um, completely night and day difference. Whether it's a completely different offer entirely. Yeah. Right. So this way, when you do get those bigger swings... so for example, say one landing page is at like a 5% conversion rate, um, versus, you know, you now get it to 8%, right? That's a pretty big difference, and I feel like you can at least tell over maybe 5,000 visitors. Yeah. Right.

Shane Rostad (00:23:24):

Yeah, no, and I don't mean to cut you off, but I think this is a great difference between hiring someone, a service, to do CRO versus doing it internally. And you may have people externally that help you, but when you do it internally, you don't have to fit... I was just talking to someone about this. The challenge with being a consultant is that you have to create a product, a service offering, and you have to fit CRO into a box that you can sell. And the way I did it was like, hey, we're gonna do two to four tests per month, um, depending, and that's what we're gonna do. And when you're doing two to four tests per month, I can't rebuild your entire homepage twice.


You know, I can't rebuild your entire homepage and rebuild your entire product page every month. You know what I mean? It just doesn't make sense. And so the incentive of a consultant doing CRO, myself, is to do smaller and smaller tests. I was subject to that, and I had to mentally try and convince myself to not fall into it. Cause I'm like, oh, this is gonna take me five minutes to build, and that's one of the tests that they're paying me for. Sometimes you'd just be lucky: it would be a big thing, but it would be easy to change. But most of the time, it's hard to justify doing a full homepage refresh within the construct of a service offering. Um, and that's where you get this... what you're saying is, basically, there's that misaligned incentive, when you have to swing for the fences.


Cuz I actually have the data here. For example, on CXL they have this A/B test calculator, so anybody listening can go and just play with numbers. It's called a pre-test analysis. It's literally an A/B test calculator, probably the first thing that comes up when you Google it. But they have this pre-test analysis so you can kinda look into the future a bit. Um, and if you're getting 20,000 visits a week with 200 conversions... so you're at a 1% conversion rate, let's make that 2%. If you run the test for one week with that amount of traffic, your minimum detectable effect is 25%. Um, so a 25% improvement is going from 2% to 2.5%, essentially, that's a 25% change. Which is really hard. Honestly, it's very difficult.


And if you go for two weeks, if you run it with that amount, like 40,000 visitors, you're looking at a 17.6% minimum detectable effect in order to get the confidence level, the statistical significance and the statistical power. So exactly what you're saying: the less traffic you have and the less time you wanna run tests for, the bigger the changes you have to make. Cuz you're not gonna get a 25% change by moving things around, like you're talking about, moving one section above another. Maybe once in a blue moon, but it's not reliable, you know?
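The pre-test analysis Shane describes can be approximated in a few lines: given visitors per arm and a baseline conversion rate, what is the smallest relative lift the test can reliably detect? This is a rough normal-approximation sketch; CXL's calculator makes slightly different assumptions, so its exact percentages differ a bit:

```python
# Rough minimum detectable effect (MDE) at a given traffic level -- the
# "pre-test analysis" idea. alpha = 0.05 two-sided, power = 0.80.
# Illustrative sketch; CXL's calculator will give slightly different numbers.
import math

Z_ALPHA = 1.959964  # 95% two-sided confidence
Z_BETA = 0.841621   # 80% statistical power

def relative_mde(visitors_per_arm: int, base_rate: float) -> float:
    """Smallest relative lift detectable with this many visitors per arm."""
    se = math.sqrt(2 * base_rate * (1 - base_rate) / visitors_per_arm)
    absolute_lift = (Z_ALPHA + Z_BETA) * se
    return absolute_lift / base_rate

# 20,000 visits a week split 50/50 between A and B, 2% baseline conversion:
for weeks in (1, 2, 4):
    mde = relative_mde(weeks * 10_000, 0.02)
    print(f"{weeks} week(s): need roughly a {mde:.0%} relative lift")
```

Note that running twice as long shrinks the detectable lift only by a factor of about 1.4 (the square root of 2), which is why "just run it longer" stops paying off quickly.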

Rabah Rahil (00:26:23):

Can I, uh, interject a little bit and say, so to your point, we're not making medicine.

Shane Rostad (00:26:30):


Rabah Rahil (00:26:31):

What is the... like, if I'm making more money with the changes. For example, if I have a thousand people come to my site and I'm a small store, and I know that a thousand people bought more on this landing page than that landing page. The thesis for you is that that's not enough to make that a winner.

Shane Rostad (00:26:55):

It's not statistically, but like you, I mean that's, that's

Rabah Rahil (00:26:58):

I guess

Shane Rostad (00:26:59):

That's, I'm

Rabah Rahil (00:27:00):

Trying to get at. I don't care about statistical

Shane Rostad (00:27:02):

No, no, it doesn't matter. Yeah, yeah.

Rabah Rahil (00:27:05):

I'm making more money. Like, who cares? But

Shane Rostad (00:27:07):

Right, you're not killing people. Yeah, yeah, yeah. You shouldn't be running A/B tests. Like, you just shouldn't. You should just stop doing that and just build stuff. Like, hey, maybe we should add a quiz? Just fucking build a quiz. Don't be like, oh, let's test it and let's put it on our page for like 50% of the time. Just fucking build the quiz and talk to your customers. Literally, if you don't have the quantitative data to learn from doing that, just go and talk to your customers, get on the phone with them, do the work, but just make the change. That's stuff that I still do today: I do less testing and more building, cuz it's just easier, talking to customers and shipping.


So just don't pretend that you're doing CRO. Don't run an A/B test that means fucking nothing, and then say, we got you 20%, and then tell the client, hey, we just made you an extra 20 grand this year, when that's fucking bullshit. It's just a lie. Do it, do the work, but don't lie about the impact you're having. Just be okay with: we're gonna keep making things better every month. I think that's a better approach for the majority of brands. You can make things better and you just know, like you said, hey, this is better. You know? That's...

Rabah Rahil (00:28:24):

Don't know. I'm on board with this more. Okay. I'm, I'm way more on board with this now. Yeah. Ash, talk to me then about, tell him about your post-purchase surveys and how you use that to integrate into changes. Cause I think this is kind of very aligned in terms of what Shane's saying. I'm tracking what you're saying now, Shane. Okay, now, now I'm with it.

Shane Rostad (00:28:44):

Do the work. Don't just stop doing the work but just plug,

Rabah Rahil (00:28:46):

I get what you're saying now.

Ash Melwani (00:28:48):

I like your example.

Rabah Rahil (00:28:50):

It was muddy, it's clear now.

Ash Melwani (00:28:51):

Yeah. <laugh>, I like your example of the quiz, because like that was something that we did, right? Like I really wanted to add it, cuz the biggest question, right? To Rabah's point, like in our post-purchase surveys is like, you know, where, how do I get started? Like what's the information that I need? Blah, blah, blah. Mm-hmm. <affirmative>. And even like pre-purchase, the questions are, which product is right for me, right? We have collagen, weight loss, immunity, like what do I, like, what am I supposed to use, right? So putting a quiz on the product page was an answer to that. And what I wanted to do was not see if it's like statistically significant, like gonna increase like conversions or whatever it is. I wanted to make sure that it wasn't gonna break the site, right? Yes. So if I'm gonna A/B test it, am I gonna, like, if people are getting distracted and not buying, but rather going to the quiz, am I going to lower my conversion rate?


Like, is it gonna be a, yeah, yeah, yeah, flat-out loser where I need to take it off entirely, or is it gonna be somewhat around the same? And if it is, that's added value, right? Then I'm just gonna keep that as a business owner. I'm just gonna keep that, and it's for the customer, the consumer. If they need to access it, they can, right there and then, right? So that I agree with, right? Mm-hmm. <affirmative>. And then even, for example, adding our sample packs, right? Like adding that to like the, the navigation. It's like, do we, do we kill our AOV because now people are buying a $12 product, or does AOV kind of remain the same? And that was a test that we did too, and AOV was hovering around the same amount, right? Yeah. Maybe it was like 60 versus 59, but like, is that enough for you to be like, oh no, no, remove this? Versus people are getting samples that they're looking for and have a higher chance of finding a product that's better for them, right? Mm-hmm. <affirmative>. So like there's, there's more than just the, like, the data, right? Yeah. Like if something's really screwing with something, obviously remove it. But like if things are kind of, like, you know, on par, then I, I, I don't
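Ash's guardrail framing here, running the split only to confirm a change isn't tanking conversions rather than to crown a winner, can be sketched as a quick confidence-interval check. This is a rough illustration; the visitor counts and rates below are made up, not from the episode:

```python
import math

def diff_ci(conv_a, n_a, conv_b, n_b, z=1.96):
    """Rough 95% CI on the conversion-rate difference (variant minus
    control), using the normal approximation. A guardrail check, not
    a winner call."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return (p_b - p_a) - z * se, (p_b - p_a) + z * se

# 1,000 visitors per arm: control converts 5.0%, the quiz page 4.8%.
low, high = diff_ci(conv_a=50, n_a=1000, conv_b=48, n_b=1000)
# The interval straddles zero, so there is no "winner" at this volume,
# but `low` is the worst drop still plausible, which is the number a
# guardrail actually cares about.
```

At this volume the interval runs from roughly minus 2.1 points to plus 1.7 points: far too wide to declare anything a winner, but enough to say whether the quiz could plausibly be "really screwing with something."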

Shane Rostad (00:30:48):

If customers are asking you for a quiz Yeah. And you add it

Ash Melwani (00:30:51):

For it. Yeah.

Shane Rostad (00:30:52):

Like, you know, it's the right thing to do. Like, just build a quiz and add it. And like you said, there's so much nuance here. Cause like you said, even with a client that I'm working with now, where we build stuff and ship it, and sometimes test when it makes sense to. Like with the quiz, like, perfect idea. If you're putting something on your product page that could just completely, really hurt your conversions, run the test, and just don't look for, is this a valid result? Just look at, like, it's not destroying your conversions. Like if you're down 10% and it's like steady, then keep running the test. And just, like, you may never get the statistically valid result, but you could be like, look, this isn't looking good. Maybe we should move it.


You know, like, you can infer that. But you can't do the attribution of, like, look at how much money I'm making you. That's where it gets a little, like, that is really hard to prove. You could be like, look, the site's better, ask your customers. Like, that's real. But the attribution of the money back, and like, there's one more thing with the quiz that's a perfect example. It's like, if you're gonna test a quiz, like the effectiveness of a quiz, where do you put it to test it? Do you put it in your hero, like your, your first homepage section? Do you put it in the nav? Do you put it behind a dropdown? Cuz like, are you gonna keep it? Like, you could put it somewhere like your hero where it's easier to test, like it's just a homepage, it's just, you know, 50/50, easier to test, but like you're not gonna keep it there.


So like if it works or doesn't work, like you're gonna change that anyway. So like why are you testing it there? And it just, it's hard to like make the argument for like, hey let's test a quiz. It's like, okay, yeah, let's spend hours coming up with questions for a quiz. Like you can come up with like the mvp but then you're like, this is the whole test design thing where you get in a bad spot cause you're like, hey let's do an MVP quiz and then you make some shitty quiz using like, I know brands that use Typeform for their quizzes and they, they crush. So no offense to Typeform, but like you're like, let's just add three questions and it doesn't actually add any value and it distracts people and then you ask for an email and then they leave your site and it's like, oh, quizzes don't work. And then like, do they not work or did you just make a shitty quiz cuz you wanted to fit it into a test? You know? Yeah.

Rabah Rahil (00:33:03):

So I don't know.

Rabah Rahil (00:33:05):

Like, at that point there's so many variables.

Shane Rostad (00:33:08):

It's hard. Dude, this is the thing, is like, this is hard. And there's a lot of value in having somebody who knows these things working with you, whether you're getting that percentage increase or not. But like, you just have to know kind of what you're, what you're getting. You know, it's not just, let's see number go up. I think that's my main gripe, is that the current conversation is all about number go up, conversion rate go up, or AOV go up. And it's like, it should be more like, what about reducing customer service tickets? Like, what if customer service volume is down 25%? Is that not a ton of value? Like, there's so much you can do on your site that's beyond just number go up, and that requires a lot of hand waving and storytelling, but there's a lot of value you can get outta doing that stuff. So that's just, like, I dunno, that's what gets me worked up. Is that, like, hyperfocus on that. Um, when there's more to it.

Rabah Rahil (00:33:59):

So in that sense, your thesis is then just basically perpetual experimentation versus, like, A/B holdouts kind of thing.

Shane Rostad (00:34:10):

I recommend to people, and like, you know, what I do with current clients is, we don't test first. It's not this testing-first thing. It's, hey, every month we're gonna ship new things. Like, for example, sure, one client I work with is named Only Curls. I've been working with them like forever, and we started doing tests, and then we got to a point where there were some things they just wanted. They were like, do we have to test this? Like, we know we want this regardless of if it helps or hurts. We wanna have a progress bar in our cart. Like, we like it, whether it really helps or hurts. Like, do we have to wait four weeks with the 50/50 before just publishing? Like, can we just publish it?


And I was like, yeah, that's a good point, like, why don't we just ship it? And then we started building more. And today they just launched a bundle builder, another thing that's just so hard to test, but also took like a few weeks of design and development time, sure, to get right. So it's like, you can't fit it in this framework. So it's harder to do, but my core thesis is, just make the goal to improve the customer experience every month. And a lot of people like to assign metrics to that quantitatively and say, like, oh, conversion rate is a measure of customer experience, but oftentimes it's not. Um, and so that's my approach, is just, hey, let's ship things and talk to customers and hear what they want, and build things that enable marketing campaigns. Like, hey, let's build a feature, like a little free-gift-with-purchase modal popup thing that's custom, rather than using an app that sucks and slows down your site.


So, let's just build that. And we built like an add-to-cart little thing, like, we have two products that you could add to your cart, and we don't use an app like Rebuy, because it just didn't work. And so we just built it, and now it works, and we don't have to deal with all the limitations of not being able to, like, show this product but hide that product, and we can do it really custom, and it's a much better UX. But we can't do that in a testing environment. So that's my thesis, is just, build things. And if you want to test and you're smaller, like, make, exactly what Ash is saying, make big changes. If you really wanna run a test, make big changes, and you can see big results, or you'll see no results.


Like, that's the only way to get the statistically valid stuff. Or run tests when you're worried that it might negatively impact things, just to see, like, you could actually see, does this hurt or not? Not perfectly, we just talked about how you're kind of interpreting a lot, but that's what you're doing. And then once you get to like 10,000-plus orders per month, yeah, you can do small changes and measure like 4% differences, you know what I mean? And that can go hand in hand. Then you could have people building and shipping bigger features and doing the smaller tests. But most brands are not at that 10,000-orders, 20,000-orders-per-month level. So it's like, that's the actual stuff. Just don't focus on the number go up. Just make your site better, and consistently ship and improve things, um, subjectively rather than objectively. Um, I know that, I dunno, some people are like, oh, well, how do you really know? And I don't know, you know? It's, it's hard, but it's the honest approach, you know? It's the more honest approach, in my opinion.
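Shane's order-volume rule of thumb lines up with the standard sample-size math for comparing two conversion rates. A rough sketch using the normal approximation (95% confidence, 80% power); the 3% base rate and the lift sizes are assumed purely for illustration:

```python
import math

def visitors_per_arm(p_base, lift, z_alpha=1.96, z_beta=0.84):
    """Rough visitors needed per arm (95% confidence, 80% power) to
    detect a relative lift in conversion rate. Normal approximation;
    a sketch, not a substitute for a proper power calculator."""
    p_new = p_base * (1 + lift)
    variance = p_base * (1 - p_base) + p_new * (1 - p_new)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p_new - p_base) ** 2)

small = visitors_per_arm(0.03, 0.04)  # a 4% relative lift on a 3% base rate
big = visitors_per_arm(0.03, 0.50)    # a swing-for-the-fences 50% lift
```

At a 3% base rate, a 4% relative lift takes on the order of hundreds of thousands of visitors per arm, while a 50% change needs only a few thousand, which is why big changes are the only realistically testable ones for small stores.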

Rabah Rahil (00:37:33):

It ain't much, but it's honest work. People get that meme. Uh, so, so, question for you then, cuz I'm actually even more on board with you now. Because, um, I'd always preach this to people, where, to your point, if you're doing $10,000 a month, like, a 1% lift in conversion rate, you're not gonna feel that. But if you're doing, you know, a million dollars a month, or a million orders a month, excuse me, orders, mm-hmm <affirmative>, a 1% conversion rate lift, you're gonna feel that. Like, that's substantial. Yeah. And so I totally agree with you there, that as you get bigger, the optimizations become more important. It's almost like a spectrum of building to optimization, and that's positively correlated with your run rate or your order volume.


Excuse me. Yeah. And so, the less order volume you have, you just need to be building and shipping, and then the more order volume you have, the more sense it makes to optimize. Cuz not only do you get the quote-unquote stat sig, blah blah blah, all the fancy stuff, but you feel it, mm-hmm. <affirmative>. Like, you actually will feel a one- or two-point bump. For a big, big established business like that, a 1% change or a 2% change in their conversion rate is gonna be substantial. Like, it's real money. It's not like, well, it's 30 more dollars this month.

Shane Rostad (00:38:47):

That's why the institutional buyers are, like, happy with the 2% return, and then all the normal people are on Reddit, like, looking for Robinhood options trading during COVID. You know what I mean? Like, it's just the difference of what feels satisfying to you versus what actually moves the needle. It's like, when you don't have a lot, you gotta swing for the fences. I don't recommend that. Like, this is not financial advice. Can't

Rabah Rahil (00:39:09):

Leave them out. Don't.

Shane Rostad (00:39:10):

But like, you know, but at the same time when you're big, those small changes, you really, like you said you feel them a lot and you're like, oh wow, that's an extra, you know, 200 grand by that one test. Like it's, yeah,

Rabah Rahil (00:39:21):

It's, it's funny you bring investing up, because there's an axiom in investing where it's better to be right less frequently but at a bigger magnitude than it is to be frequently right at smaller bets. Mm-hmm. <affirmative>. Said a different way, it's better if you hit on two bets a year but you put the house on those bets; that's a better payoff than if you hit 20 bets but you only bet pennies on those bets, if that makes sense. And so it's a little bit aligned with what you're saying, in terms of your thesis, like, these little paper cuts aren't gonna be as important. To the analogy of return on investment versus, um, making more money for your DTC store: those little paper cuts, even if you're right 20 times, aren't gonna add up to, um, Ash totally changing the homepage and now getting a 7% bump in conversion rate or something like that. It's interesting.

Shane Rostad (00:40:15):

Yeah, I, yeah, it's, um, you know, I think it really drives the point home, what Ash brought up, which I wasn't even thinking about. I obviously have my own, you know, uh, um, blanking on the word here now, but, um, my own biases around what's top of mind. But even as Ash brings it up, I'm like, okay, yeah, there are situations, like the quiz thing, where it makes a lot of sense, and swinging for the fences. Like, there's, um, a lot of those situations where it does make sense to be doing some of these things, and it would be good to know how. Um, and it would also just be good to get, like, it's not that complicated, the statistical stuff. I don't really know it that well, but I know it enough to know what's right and what's wrong. Um, but just a baseline understanding isn't that hard to get, um, to know if you're tracking against that, and then just strategize around that. Like, I think the main lesson is, if you're small, just make big changes and ship, and don't hold yourself back by thinking you need to get perfect results, cuz you're not. Um, and yeah, there's a lot more that you could do on your site, rather than focusing on that single metric of conversion rate.

Rabah Rahil (00:41:30):

Interesting. What do you got Ash? You you got the, the perplexed smirk over there?

Ash Melwani (00:41:37):

Um, I think this kind of just backs up this argument more than anything. But I think a lot of people also tend to look at it as black and white, when things right now, in terms of, like, quality of traffic, aren't black and white. Um, especially with just the volatility in quality, whether it's on Facebook or TikTok. Running these tests, and, you know, say you launch something, say, say you launch a split test, let, let's just call it the quiz, right? Put the quiz on the product page, and Monday rolls around and you have like the lowest conversion rate, and you're like, was that because of this? Like, is it the, is it the test that's happening? Like, what if it's not that? What if it's just quality coming in from, from Facebook, right? Mm-hmm. <affirmative>. What if it's quality coming in from TikTok? It's

Rabah Rahil (00:42:24):

Labor Day or something, right? Yeah,

Ash Melwani (00:42:26):

Yeah, yeah. Right. And it's like, you immediately feel like you have to pull that test, cuz internally you're like, oh, well, I, I, I can't, I can't, you know, stomach a lower conversion rate, blah, blah, blah. And then people kind of let it ride out, and then you start to see the results get closer and closer to each other later down the line, and it's like, well, actually, this is fine now. Right? Yeah. So you have to, you know, that's the biggest thing. It's like, when we talk about conversion rates on ads, you know, just in general, it's like, oh, this ad has this conversion rate versus, like, this ad has that conversion rate, and it in itself is like an A/B test, right? Mm-hmm. <affirmative>. It's, it's purely the quality of the, the traffic that's coming in, right?


So there's so many variables, which is why, again, just to drive this point home, test the bigger things, because then you could be a little bit more certain that it's either making a difference or not, right? Mm-hmm. <affirmative>. So, like, whether you're doing really high-quality polished videos as an ad versus your usual UGC-style content, that's a good test to me, where it's like, oh, which style does better? But using two UGC-style videos against each other, it's gonna attract a different audience. Mm-hmm. It's gonna, kind of, you know, the traffic coming in is gonna be a little bit different, where it's like, eh, the, the results aren't as, you know, statistically significant, as you would say. So I think, again, just driving the point even further home. But yeah, that's just one thing that I always like to tell people. There's so many variables, and especially when it comes to paid traffic, Facebook could just have an off day, and it ruins the rest of the, the experiment.
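Ash's point that platform traffic quality can masquerade as a test result is easy to see in a toy simulation. The conversion rate and the size of the quality swings below are invented numbers, purely for illustration:

```python
import random

random.seed(42)

def day_orders(visitors, base_rate, quality):
    """One day of a test arm: `quality` scales the conversion rate to
    mimic swings in paid-traffic quality (a toy model, not real data)."""
    return sum(random.random() < base_rate * quality for _ in range(visitors))

# The page is identical in every case below -- any gap is pure traffic noise.
bad_monday = day_orders(500, 0.04, quality=0.7)  # cold Monday traffic
good_day = day_orders(500, 0.04, quality=1.3)    # a strong traffic day
# Over a month, both arms see the same mix of good and bad days.
month_a = sum(day_orders(500, 0.04, random.uniform(0.7, 1.3)) for _ in range(30))
month_b = sum(day_orders(500, 0.04, random.uniform(0.7, 1.3)) for _ in range(30))
```

With identical pages in both arms, a single cold day can look like a test effect; letting it ride for weeks lets the quality swings average out, which is the convergence Ash describes.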

Shane Rostad (00:44:12):

To your, to your point, the thing I mentioned previously, but that I want to drive home, is the reason why doing traditional CRO when you don't have the data necessary can have a negative effect. When you run a test, like, let's say you changed the headline on your homepage, and you see, oh, that's a minus 0.2%, and maybe that's not statistically valid, but you're like, that headline is bad. Or you change a headline and it shows a 10% improvement, and then you love that headline. Like, that headline is, you're never changing it. And then they're like, wait, let's test a new headline. And you're like, no, this is the best one. This is 10%. Why would we waste our time? We already tested that. We have other things we could test. Like, why would we waste our time going back?


And you, like, every test you run, you build up either like a negative stance or a positive stance towards the change and you're like, or just neutral like, ah, that didn't really do anything and you get this. It really does build up to that. You get to the point where you're discussing other changes, you're like, but we already tested that. How are we gonna make sure that that's not gonna hurt the results that we got here? Or why are we doing that? And like, you get into this, you just kinda almost get, um, you like freeze up, like you decide, like you stop being able to really make the changes that you should, you get feedback and you're like, Oh, but this customer told us this is so confusing. Like it makes no sense. But we tested that and it got us a 10% improvement, but it wasn't a real 10% improvement.


It's like, if it was, then that customer's probably wrong. But when you're doing this, you get this false confidence about positives and negatives, and it becomes this technical debt, this CRO debt, um, design debt, I guess, is more apt. Like, you get this design debt on your store where you're afraid to make changes, or you don't like things. Um, and that can build up. And I've seen it happen with my own kind of stuff that I've done, is that the more tests you run, the more things you learn. The best part about testing is that learning. Like, every time you do something, you're like, oh, up or down. Like, you do that enough, like, four tests a month, three tests a month, two tests a month, one test, like, every month you're learning something new. But you have to make sure that you're learning the right thing.


You could be learning the wrong thing over and over and over again, and just going a bump up, a bump down, a bump up, bump down. And then all of a sudden, at the end of the year, you're flat, and you're like, wait, he told me that we were getting, I saw the numbers going up, and we only kept the ones that went up. Like, how are we flat, or how are we down? And it's like, oh, well, then you kind of argue, like, oh, well, there's so many other factors, to your point. It's so easy to hand wave this stuff away. And I've done it. Like, I don't hold this against anybody, like, I've done it. It's just more complicated. Like, there's so much going on. And one thing that popped into my head is that, for things you should test, focus on the big things, like offers.


Like, I think you should be testing, on Facebook and on your site, what you're actually offering people, discount-wise, promotion-wise. Like, for example, one other client that I work with, they have a subscription attachment to their product. Originally we started off with, like, monthly and annual. Of course we wanna get all the annual subscriptions, yada yada. We were testing, like, should we remove it, change the copy, do this? We were testing, they have a lot of traffic, and we were trying to learn what works. This is really important. This was, like, a big investment. And all of a sudden, like, this past weekend, we changed it, and we took away the annual offer entirely, which you'd think would've been better to keep. But we were like, let's take that away. Let's do a month free.


So their first month is free. And let's actually make it so that, with, like, Shopify, it shows when they add it to their cart, it actually charges them $0. Like, let's do the technical work to make sure that everything actually applies correctly. And then let's give them a small discount on the device. So they're getting a small discount, they're getting their first month free, and let's see if we can get people in the door. And they sold, like, hundreds of units a day, when previously we were selling, like, it was scary, like, five or 10. And they were like, oh my God, is this ever gonna work? And then over the past weekend, everybody's like, holy shit, this works. Like, this is actually working. And none of that was statistically significant or valid. We were just testing it via, like, sending out emails and stuff.


Just getting a feel, like you're saying, getting that feel, that more intuitive approach. And then it was like, oh wait, we sold 200 yesterday, and every day before that you sold 10. There's no argument. You know what I mean? It's just, cuz you found it, it clicked finally. And I think a lot of brands are at that point where it hasn't clicked. They're grinding, their Facebook spend isn't working. It's like, you have to just try things, and you'll know. Like, you will know when it works, cuz if it's not working, it's really not working. And when it works, it'll probably be like that first moment when you run that ad that just really hits and you get those insane results. Like, you're like, oh my God, Facebook works. Like, there's a lot of brands that I've talked to that had that moment where they're like, everybody said Facebook works.


They were lying, I swear. And then one day they're like, oh my God, Facebook works. They love Facebook ads, and they just wanna invest more money into it, cuz they had that one thing where finally they were like, oh, we, we figured it out. And a lot of brands don't get there, but that is the, the moment that you should be looking for when you're smaller. Um, and don't focus on the minutiae too much. Um, guess that's the moral of the story. Just test the things that really make an impact. Um, and keep trying. Don't worry too much about the statistics. If you're small, just keep shipping. Um, and if you're big, worry about the statistics and find somebody that knows what they're doing and does it right. There are a lot of people out there that do it really well. Like, I'm not gonna shout anybody out or call anybody out for not doing it right. But there are people in the space that do it well and are at least really trying to get good results for people. Um, and I think most people are, but there are people who are just genuinely doing a great job at it. Um, I don't wanna shout anyone out, cause I don't wanna not shout someone else out. Um, but yeah, there, there are people doing great work for brands that are big enough to really do CRO well.

Rabah Rahil (00:50:02):

Amazing. So, moral of the story: give shit away for free. It works. Come on. I've been sitting on that

Shane Rostad (00:50:10):

On that.

Rabah Rahil (00:50:10):

It's amazing. Come,

Shane Rostad (00:50:11):

You know, it works. Yeah.

Rabah Rahil (00:50:13):

Um, no, I, I'm totally tracking with you there. Uh, alright. I have some user-submitted questions, so, mm-hmm. <affirmative>, you might have to take off your CRO bear hat, cuz these are a bit CRO.

Shane Rostad (00:50:26):

I'm, I'm happy to. I'm a big, big CRO fan. We

Rabah Rahil (00:50:28):

Play the side,

Shane Rostad (00:50:29):

But Yeah. Yeah.

Rabah Rahil (00:50:30):

Um, what's the, and this has kind of basically been the whole episode, but what's the biggest CRO mistake ecom brands make? And let's say, let's say, uh, smaller ecom brands. So under, under, you know, $5 million a year kind of ecom brands.

Shane Rostad (00:50:45):

Um, yeah, I mean there's probably a lot, uh, besides all the stuff that we talked about, about just testing, like just don't test cause it doesn't make sense to, um, build, build

Rabah Rahil (00:50:55):

Ship. And

Shane Rostad (00:50:56):

I think the biggest, honestly the biggest CRO mistake that people make is that, if people aren't buying your product, you really need to just talk to your customers. Like, stop. This is gonna sound shitty, but you might just not have a great product right now, and you might need to work on it, cuz people are seeing it and they're not interested. Like, there are a lot of people who start brands and they build a product and they kind of have that Steve Jobs moment. And I think just going back and talking to customers, and literally going up to people on the street or in a store or whatever. Like, I even heard Nik Sharma had a point where, when he was at, um, like, Hint, he would go down to the bodega and talk to people who were buying it, and then they would say things, and he'd learn from that.


Like, do that work. And that's the work that you do that gets you that unlock where you're like, oh my God, Facebook works, because you were selling the right message. It comes down to messaging a lot at that point. Like, your value proposition has to be correct, and you might not have a value proposition that resonates, and so you might have to work on your product. But, um, that's the biggest mistake I think I see people make, is obsessing about, my website looks like this. It's like, no, I think you might just have an undifferentiated product in a very crowded market, and you're not selling it in a way that's really resonating with people. Um, and that's a hard thing to figure out. I'm not an expert at that, but that's what I see. I've tried to solve that problem with website stuff, and it just doesn't work.

Rabah Rahil (00:52:22):

Yeah, that's a really, um, there's a really super wicked smart, uh, we just, it's actually going on today and tomorrow, our Blue Whale group, where, um, our biggest stores, we invite them into Austin and we put them in a room and just have them talk. Mm-hmm. <affirmative>. And the CFO of Thread Wallets, they make just all sorts of cool little wallets, and they have all these little gizmos, like places to put your ChapStick, to key rings built in the wallet. And he did the kind of Sharma thing as well, where whenever people would go out, or he would see them, like, at a table and stuff, and people would, you know, sometimes you'd take your wallet out and just kind of put it on the table so you're not sitting on it, and he would just pepper these people with questions and stuff like that. Mm-hmm. <affirmative>. And that was, uh, he said, to your point, one of the biggest unlocks for them to, uh, get their product differentiation. So go talk to your customers, people. Mm-hmm. <affirmative>. Um, I know this is, this is gonna be like nails on the chalkboard for you, Shane, but the people submitted the question, so we're

Shane Rostad (00:53:13):

Gonna have to answer. No, that's fine. Fine. Go for it. I, I won't, I'm Yeah,

Rabah Rahil (00:53:16):

You're being a good Go

Shane Rostad (00:53:18):

For it. It, yeah.

Rabah Rahil (00:53:18):

Yeah. What's the, what was the most surprising split test you ran?

Shane Rostad (00:53:23):

Um, the most, well, this is an easy one. That same brand, Only Curls. This is a great example of why, with best practices and just looking at a site, you're probably wrong. Um, if you just look at a site and you're like, you should change this, you probably don't know enough about their customers. And so that brand, Only Curls, we had just started working together, and I was like, home-run idea. I look at the site, and they don't have a hero section, they don't have a value proposition. It just says, like, I don't even remember, I was looking at it today. It's like, get your perfect curls, like, four products for your perfect curls, and it's a collection grid of their four core products, and that's the hero where you land.


And I was like, that is not best practices. I'm gonna have an easy win right off the bat, and I'm gonna look so good. And I put together this, like, I did the customer research, I surveyed, I talked to customers, I found out what they really care about. Crafted a headline with an image, and it was about frizz, like, that's the main thing that everybody cared about, like, frizzy curls. And it was, like, perfect, a before-and-after, frizz versus not, in the hero, and having people land there. And it was so bad. Like, it literally was a negative 10%. Right off the bat, I was like, this is probably, like, I can let it run for a little bit. But two weeks in, I was like, this is so embarrassing, but I think we have to stop this test. Like, it's been two weeks.


It hasn't gotten any better. It hasn't even gotten close to going back to normal. Like, we should get rid of it. And then it was a moment where I was like, oh, this is where you really have to do the work, and why this is hard. I was realizing, oh, it was that first initial period where, to get tests running, I would just pick ideas and be like, let's just get these going and we'll continue to do more research. And so I didn't look into the quantitative stuff as much yet. And 50% of their customers are repeat, cuz they don't do, uh, subscriptions, and so everybody comes back to buy. And so people land on the homepage, type in Only Curls, and the four products that they buy every time are right there, and they click add to cart, like, it's right there. And I removed that, or I put it below the fold, and people are like, where are my products?


And that was a terrible idea. And what we did, again, this is the more iterative thing, was, okay, terrible idea, get rid of that. But they have a starter pack. So I was like, let's just put a little banner that says, are you new here? Because most people are not. Are you new here? You should check out our starter pack. And it was just a little banner across the top that did not push anything out of the way, and that test was a winner. It actually improved things by like seven, eight, nine percent. We ran it for like four weeks, and maybe at the end it was a little bit hand-wavy, like, it looks good, kind of thing. But that was so surprising. I think I wrote a Twitter thread about it and was super dramatic about how dumb I was, because I really felt like I knew nothing. I was like, am I that bad? I just ran this test, I was so confident, and it was so bad. Um, yeah, that's definitely the most surprising change of events, where I also learned a ton.
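That "a little bit hand-wavy" read at the end of a test can be made concrete with a quick significance check. Here is a minimal sketch of a two-sided two-proportion z-test; the visitor counts and rates below are made-up illustrative numbers (the actual Only Curls figures aren't given in the episode):

```python
from math import sqrt, erf

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided z-test for the difference between two conversion rates.
    Returns (z statistic, p-value)."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / std_err
    # Normal CDF via erf; p-value is the two-tailed area beyond |z|
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 3.00% control vs. 3.24% variant (an 8% relative lift), 20k visitors per arm
z, p = two_proportion_z_test(600, 20_000, 648, 20_000)
print(round(z, 2), round(p, 2))  # prints 1.38 0.17
```

A p-value around 0.17 on those numbers is exactly the "looks good but not proven" zone Shane describes: at typical DTC traffic levels, four weeks often can't separate a real single-digit lift from noise.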

Rabah Rahil (00:56:22):

Candidly, I did the same thing, where I got a really amazing audit done of a site, got it all done up, blah, blah, blah, and the conversion rate tanked because of the biggest-spending returning people. It's like getting a new bartender, right? Who the fuck's this guy? I don't know you, I'm gonna go to another bar. So that's interesting. Yeah. Oh, okay. This will be fun for you. What's a popular piece of CRO advice that's kind of BS?

Shane Rostad (00:56:50):

Um, test everything. I don't have to go any further. We've talked about it, the whole podcast but yeah. Test everything is

Rabah Rahil (00:56:56):

Statistical significance.

Shane Rostad (00:56:58):

Yeah, yeah, yeah. Like statistical

Rabah Rahil (00:57:01):

Significance. Stat sig, son. Yeah. Yeah.

Shane Rostad (00:57:02):

Stat sig and test everything are often put in the same bucket. And I'd say that, for most brands, that's terrible advice, for all the reasons that we've talked about. Um,
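The reason "test everything" is terrible advice for most brands comes down to traffic math. A back-of-the-envelope sample-size sketch using the standard two-proportion power approximation (roughly 95% confidence and 80% power; the 2% conversion rate and 10% lift below are made-up illustrative numbers):

```python
from math import sqrt, ceil

def visitors_per_variant(base_rate, relative_lift, z_alpha=1.96, z_power=0.84):
    """Approximate visitors needed per arm to detect a relative lift
    in conversion rate at ~95% confidence and 80% power."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# A store converting at 2% that wants to detect a 10% relative lift
# needs roughly 80,000 visitors per variant
print(visitors_per_variant(0.02, 0.10))
```

At a few hundred visitors a day, that is months of traffic for a single test, which is why iterating and talking to customers usually beats queuing up A/B tests at that scale.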

Rabah Rahil (00:57:15):

Build and talk to your customers, people. Yep. Just talk to your customers. Okay, what are some of your favorite websites, your top three, that you think are put together really well?

Shane Rostad (00:57:25):

I mean, honestly, I don't know how certain sites convert, because I've seen beautiful sites and they don't convert great. I've seen terribly, over-the-top beautiful sites that are just obviously terrible experiences for buying products. But Bite toothpaste, I think, is my favorite website. Huge site. It's because of their co-founder, I'm blanking on his name because I haven't looked in so long, but he's a designer, and I think his wife, who's the co-founder, handles the product. Their site is amazing, and that's a site where I go and definitely steal some ideas, around how they do navigation and things like that. I love this trend, maybe it's not really a trend, of not using collection pages if you're that mid-size collection, and putting things in the navigation instead. So when people hover or click open, you have that mega menu, like, you click shop and it's, here's our products, rather than, here's a collection of them.


They do a mix of both, and I think they do it really well. And I know that they're constantly updating things. Sometimes there's a little bit too much going on, so it gets laggy, and I'm like, okay, you probably went over the top on this one. But they're having fun with it, and I think that's the most important thing, they clearly are having fun with it, and I love that. On the back of that, the other one, that's not as super beautiful and fun, is Native deodorant. Because I just think that

Shane Rostad (00:58:51):

It works. Yeah. Their site just works, and I know that it converts well. They clearly explain things, it has no frills, almost the complete opposite of Bite. Bite has the scrolling animations, the moving text, all that stuff that looks so cool, and honestly, I love it. But Native is just, hey, we have a promotion, and the banner, sometimes the image isn't even cropped a hundred percent right, because they're just, we're just shipping, you know? And I definitely respect that approach a lot.

Rabah Rahil (00:59:21):

I love that. You got any sites you like, Ash? Schoolyard Snacks? Obvi, obviously.

Ash Melwani (00:59:27):

Shout out to Helen. Yeah, that's my

Rabah Rahil (00:59:30):

Favorite site. They're a client. Yeah, they're awesome. That is a beautiful site. Actually, Shane, going back to your previous point, there's a site called Drink Update, and it was so beautiful. It's like little clean energy drinks, kind of thing. So beautiful. Mm-hmm <affirmative>. I bought from it.

Shane Rostad (00:59:48):

Yeah. <laugh>

Rabah Rahil (00:59:49):

And I'm not trying to poop on the product, but the product's okay. Every single can came and blew up on me, it blew up in my car. And it's a perfect example of, yes, you can get people to buy, but they won't ever get me to buy again. Yeah. So really being product-first and product-focused, and understanding that the main thing is to make sure the product is awesome. Mm-hmm <affirmative>. Because the site's one of the most beautiful sites I've ever seen. And it's

Shane Rostad (01:00:15):

Probably done by an agency in New York City that charges a lot of money. And there are a couple that are very well known that make really cool sites that have no care in the world for whether people buy products. And, I don't know, on branding, one person I'll shout out is Xavier Armand with the Vaan Group. He's the opposite of me. Where I'm like, all right, that frill stuff isn't that important, he's a very strong proponent of brand and that impact, like, hey, it's so cool that I'm gonna buy the product because I love it. He's really a big proponent of that, but he also has a nice mix of, it has to be usable. And there are a lot of agencies that just do one or the other, which, I don't know, that's their thing. They look cool. I'm a fan. They're inspirational from an aesthetic standpoint. So, yeah.

Rabah Rahil (01:01:06):

I also think it depends on the vertical, because a lot of luxury, quote unquote, products are notorious for putting aesthetic over conversion. Yeah. Because you want the product so bad that you're gonna jump through the hoops to get it, mm-hmm <affirmative>, you're gonna figure out how to buy. So converting isn't really top of mind for them. But then the other thing, one of the most data-driven companies in the world is Amazon, and look at their site, right? Yeah. I took a screenshot of their buy page and there's 78 million things going on, and, you know, this has been tested and, to your point, it's probably stat sig, because they have the traffic to push it. Yeah, yeah. And so it just kind of blows my mind how there are these two different ends, aesthetic and conversion. And obviously the most converting site is gonna be an Amazon design, but to be fair, Amazon's brand is built more on trust, as well as speed of shipping. Right. And they have

Shane Rostad (01:02:10):

They have Prime, you know, two-day shipping. It's

Rabah Rahil (01:02:12):

Different one time. Exactly.

Shane Rostad (01:02:13):

For most brands, my number one takeaway is, yes, great, but you're not Amazon, so don't even look at Amazon for ideas. Because you don't have Prime. Dude, their conversion rate's like 50% or something. You're not getting anywhere close to that. I'm sorry.

Ash Melwani (01:02:31):

There are so many things on there that are so counterintuitive, too. Yeah,

Shane Rostad (01:02:35):

I know. They just have so much, it almost doesn't matter. They can't turn people away, I think, is the thing. No matter how many buy buttons they have, nobody's gonna be like, I'm not gonna buy this. So every little buy button is another chance to get someone to click and buy something. Yeah, it's

Ash Melwani (01:02:54):

Hilarious. "Get it tomorrow" is probably the biggest driver for anything. Oh yeah. So, and

Shane Rostad (01:02:59):

It's more like free shipping, for the most part. Yeah, and you already have your info in there. It's just a

Rabah Rahil (01:03:07):

Different thing. Prime, the toggle, the whole thing. Yeah. But I love what you're saying there, Shane, because there's also an investment maxim. I studied economics, so I always go back to investing and economics, that kind of thing. It's basically, when Warren Buffett's giving advice on how he's investing his money, unless you have that wealth, that advice does not apply to you. So I think that's the same analog: using Amazon's conversion rate strategies probably doesn't apply to you.

Shane Rostad (01:03:39):

Yeah, exactly. You're just, you're different. They're great at what they do. Um, clearly. But yeah, <laugh>, I wouldn't, I wouldn't copy their site.

Rabah Rahil (01:03:50):

Okay. One last question and then we'll wrap it up. Um, what's something that seems stupid but actually increases conversion rate?

Shane Rostad (01:03:57):

Um, I'm trying to think of something that's measurable. Let me think. Ash, I don't know if you have any thoughts while I think of an example, or an idea. Something that's stupid that actually increases conversion rate. I don't wanna repeat things that I've already said, so I'm trying to think of a unique example. Ash, do you have anything?

Ash Melwani (01:04:34):

Honestly, I haven't really tested anything stupid. <laugh>

Rabah Rahil (01:04:38):

Of course not. Of course, of course you haven't. This

Shane Rostad (01:04:40):

Well, probably all of the tests that you haven't run are all the stupid things that probably increase conversion rate, you know? Exactly. You know what's a stupid thing? Removing all of the scarcity indicators. All of that advice from, what is it, the psychology of persuasion, Influence, all of those principles, like scarcity.


Not overdoing it does work. I mean, it sounds silly, and I don't know if that really fits, but using those principles to put scarcity out there, if you're overbearing with it or you're lying, will often backfire, is what I'm saying. So if you have those, they're dumb, remove them, and then you'll probably have an increase in conversion rate. I think that's what I was trying to get at, but that's the only stupid thing that really comes to mind.

Rabah Rahil (01:05:43):

Amazing. And easy there, Chill-dini. He's one of my influences. Dope book's fantastic.

Shane Rostad (01:05:49):

Oh no, it's great. It's a great marketing book and I love it. I think it's just another one of those where people take it as gospel and then lie to their customers. And, you know, a lot of marketers on Twitter read Influence and learned a lot, you know, like me. Me, actually, I'm one of them, you know? <laugh>

Ash Melwani (01:06:06):

I just thought of one. <laugh> What we ended up doing was, so we offer free shipping all the time, right? It's just set, right? But on our gift bar in the cart, we say you've unlocked it after a certain amount. So it's pushing AOV, because they think they've unlocked something, but regardless, if they didn't hit the threshold, you're still gonna get it. So, mm-hmm <affirmative>, I don't know if it's stupid, but it's sneaky. But I mean, it works, so, weirdly, we

Shane Rostad (01:06:36):

Kept it. No, related to that, actually, I think it's that brand, Boom by Cindy Joseph, run by, I'm blanking on his name, a guy who's well known in the space. They do this thing, and it's actually not this brand, they have another brand, where you buy a box of something and they just add random shit to your cart for free, like free stickers and free this. And I know it definitely works, and the cost of those little things, it's like a free mystery sticker pack or whatever, and it's probably 30 cents, but they add it in, and it's just there after you hit Add to Cart, and it's free stuff, like, get it. And sometimes they might have the scarcity of, get it before it's gone, or whatever. But that's the kind of dumb stuff where you can surprise people and just be like, hey, here's some free stuff. And if people are constantly coming back, you don't want to give them too much of the same free stuff, it'll probably wear down. But stuff like that can definitely work. And it's kind of dumb, just random shit that you might find on Alibaba for a dollar or 10 cents or something.

Rabah Rahil (01:07:45):

Amazing. Surprise and delight your customers. Go talk to your customers, go build some more. Shane, how can the people find you? Are you taking clients? This time is yours, my friend.

Shane Rostad (01:07:56):

No, um, well, they can find me on Twitter. It's at Shane Rostad, S-h-a-n-e R-o-s-t-a-d. I'm not taking any clients on. I mean, I don't wanna have a flood of DMs from people, but if you're looking for who you should talk to, I can't give everybody specific advice on their site, and I wouldn't, because I don't have time to do that research, but I can maybe recommend a number of people that I've seen doing good things. I'm not gonna shout anyone out here, because I'm gonna forget someone, and that's not gonna be a nice thing to do. And what you can also do, you could definitely go to MentorPass. Rabah has a profile on MentorPass, and, you know, this guy's smart. I mean, Triple Whale is over a hundred employees now, largely thanks to him. I'd go check out MentorPass and his profile, and maybe Ash too, I think he's on there. But, you know, we'll see.

Rabah Rahil (01:08:52):

We'll see how this A/B test works out, whether it's statsy or not. And make sure to ask Shane in the DMs what's the best A/B test for his site. He'll tell you how to get statistical significance. I think you set a record for least talking, Ash. This has to be your quietest episode.

Ash Melwani (01:09:11):

Y'all were dropping some heat, man. I was just, like, absorbing this.

Shane Rostad (01:09:14):

I was ranting a bit. I told you I had things I needed to get off my chest about this stuff, because I kind of left and I just left people hanging. Yeah, definitely. I mean, I kind of left, and I just stopped posting and stopped talking, and I feel like I never talked about why. And then a lot of people started posting, and I felt like, you know, I made a lot of those mistakes of over-promising and under-delivering, and I think it's important to just set the right expectations. So this definitely felt almost like me apologizing for all of the times I did it, and a warning, because people are continuing to do it. But yeah, this was great.

Rabah Rahil (01:09:53):

Amazing. And then if you do wanna run an experiment and you drive by a vitamin shop, what should you do? Ash Melwani?

Ash Melwani (01:10:01):

I need you to go in there, pick up some, a <laugh>, take a picture

Rabah Rahil (01:10:05):

Of it,

Ash Melwani (01:10:06):

And send it to me on Twitter. Mm-hmm <affirmative>. But yeah, follow me at ashvinmelwani on Twitter, and on MentorPass. Q4's coming up, Black Friday, everything. Get your strategy in place. If you need some help there, we've already planned our whole Black Friday, so let me know. Find me on Twitter and MentorPass.

Rabah Rahil (01:10:23):

Also, also, you and Ron got something cooking. Can we talk about that yet, or is that still on the low-low? Yeah. I'm in a DTC fantasy league. Are you in the fantasy league? You didn't get in, did you?

Ash Melwani (01:10:36):

No, nah, I'm not, I'm not in on that stuff. Me neither. Yeah, but there is something. I'll drop a little sum-sum. Ron and I are probably gonna be dropping some new stuff, whether it be a new pod, maybe a newsletter.

Rabah Rahil (01:10:55):

Now, are you gonna say the name yet, or no? The branding and the name are fantastic, but

Ash Melwani (01:11:00):

Yeah, I'm gonna drop it on Twitter

Rabah Rahil (01:11:02):

Chest. Okay.

Rabah Rahil (01:11:04):

Breaking news here. This guy's too big for the podcast now. He's gonna break his news on his own new podcast. Yeah,

Shane Rostad (01:11:09):

Yeah, exactly right. He doesn't wanna give you the break, he doesn't wanna give you the story, right? He's like, oh,

Ash Melwani (01:11:15):

Unbelievable. Maybe, maybe next week. Twitter clout. Maybe next week. Maybe next week.

Rabah Rahil (01:11:19):

Maybe next week. All right, well, it looks awesome. Ron dropped a little screeny in the DTC fantasy chat, and it looks cool, so I'm excited for you guys. All right, folks, I hope this episode was helpful. By no means is Shane saying don't test, but I do think he's really onto something, in the sense that at a certain scale it makes way more sense to talk to your customers, build, and ask: can you create more value for your customers? That's what your success is gonna be rooted in. Don't get caught up in moving decimals here or there. Try to put commas in the account, and don't worry about that half-percent lift. Amazing. If you do wanna get more involved with Triple Whale, it's triplewhale.com. We're on the Bird app at Triple Whale. And we have a wonderful newsletter that goes out every Tuesday and Thursday called Whale Mail; you can subscribe right on our site at triplewhale.com/whale-mail. And if you enjoy this podcast, be sure to share it with your friends and subscribe on our YouTube channel. We put up the videos if you wanna see our beautiful faces, or if you just wanna listen to that sweet nectar in your ear, it's also available on any podcast catcher. Melwani, it's always a pleasure. When am I gonna get you to Austin? It's cooling down, man. It's cooling down. When are you gonna get out here?

Ash Melwani (01:12:27):

I think, uh, I think we said January. Oh. Oh yeah. Whales. Wait, no, you gotta come to New York, man.

Rabah Rahil (01:12:33):

I'll be out there.

Shane Rostad (01:12:34):

Hey, Ash, are you in New York City?

Ash Melwani (01:12:37):

Uh, New Jersey, yeah.

Shane Rostad (01:12:38):

Okay, I'm gonna be in the city pretty soon, so maybe I'll show up and we can hang out.

Ash Melwani (01:12:44):

Amazing. We're having the Parker launch party September 29th.

Shane Rostad (01:12:49):

Let's go. I'm not sure when I'll be there, I'll check my calendar, but yeah, we'll figure something out. Parker

Rabah Rahil (01:12:56):

Contract that cash conversion cycle, let's go. And they're not even paying us for ads yet. Sponsor this, Parker, let's go. That's all we've got, folks. Thanks so much, Shane, thanks for the time. Enjoy Florida. Let me know when you make it to Austin. Melwani, we'll get you out sooner than later, and I'll let you know when I get out to the East Coast. I think I'm actually coming out for something, I can't remember. There's a bunch of cool stuff going on, and I think I'm speaking out there or something. Who knows? Anyways, folks, we love all of you. Thanks so much for listening, and that's another Ad Spend in the books. See you later, everybody.
