Webinar

How to Run a Cost-Efficient Optimization Program With a Limited Budget

Duration - 40 minutes
Speakers
Jan Marks

SVP Customer Success

David Otero

Sr. Consultant International Data

Deepak Lamba

Ex-CRO Leader & Head of Europe Business

Key Takeaways

  • Use professional tools: To avoid wasting time and resources, it's crucial to use professional tools that allow for efficient project management and optimization.
  • Run concurrent experiments: Utilize tools like VWO to run multiple experiments at the same time. This can help identify winning strategies more quickly and effectively.
  • Implement winning strategies promptly: Don't let IT bottlenecks delay the implementation of successful strategies. Use tools that allow you to direct 100% of traffic to winning variants promptly.
  • Leverage tools for implementation: Use tools like VWO Deploy to implement winning campaigns directly, skipping the need for dev support and speeding up the process.
  • Gain experience and test frequently: Good ideas come from experience and frequent testing. The more you test, the better your results will be.

Summary of the session

The webinar, hosted by Deepak from VWO, features Jan Marks and David Otero from Multiplica, discussing the challenges and solutions in implementing effective optimization programs, particularly in the context of limited IT resources and budgets. They share their experiences of projects where the lack of professional tools led to time-consuming revisions and the inability to implement winning campaigns due to IT bottlenecks.

They highlight VWO’s solution, VWO Deploy, which allows for the direct implementation of successful variations to the audience, bypassing the need for IT approval. The speakers, with their extensive industry experience, offer valuable insights into creating high-converting user experiences and encourage audience interaction.

Webinar Video

Webinar Deck

Top questions asked by the audience

  • What percentage of experiments are inherent wins versus failures? How many times are we proven wrong when we make hypotheses? Are these failures a loss?

    I would have to take the mask off. I don't know. Well, first of all, we all love winners, right? To give you a better idea of what I've seen, in projects we've been involved in, around 70 to 90% were winners. Is that correct, more or less? That is the last number that I have seen. Are the failures a loss? No. First of all, you're safe from implementing the wrong stuff. You know? If you had not tested it, you would have implemented it, wasted resources, added code to your page, and it wouldn't have delivered the result. That's one thing. The other thing is that you learn from every experiment; whatever the result is, you learn for the next ones. The example that I gave you earlier shows that, for instance, markets act completely differently. So I would say a loser is not a loser. It's not a loss.
  • What are your thoughts on how to deal with the VP? We have 2 VPs on the call who get in the weeds on approving test creative for every effort, slowing down program velocity.

    Well, yeah, good point. We've seen that; you've seen that. I think it can be, to a certain extent, complicated at the C-level. That's for sure. But I think it depends very much on the case. If you have the support of somebody on the board and the buy-in of the related stakeholders from the very beginning, then you're much better prepared. The worst thing you can do is hear something about conversion rate optimization, run a little trial, keep it in a niche, and so on, until at some point the chief technology officer finds out that you're playing around with the site. So it's better to make sure that you have buy-in. Once you or your agency explain the huge potential of testing, you'll easily get the buy-in, and then the general consent from managers and senior vice presidents to do these experiments, and you'll have less trouble afterwards. I don't know if this answers the question.
  • How do we test the hypothesis?

    - by John
    Okay. So, the way we do it at Multiplica is we have what we call a prioritization framework, in which we use different variables to define the complexity and the return on investment of an experiment. We take into account whether we need custom graphics or pictures, whether we need to write content, or whether we need custom HTML or JavaScript: all the parameters that can make an experience or an experiment more complex. With that, we get a complexity score, and then we run the experiments based on that. That's what we use to decide what makes sense, and to separate the quick wins from the rest.
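Multiplica's exact prioritization variables aren't spelled out here, but the idea David describes, scoring each experiment's complexity and weighing it against expected return to surface quick wins, can be sketched in a few lines. The variables, weights, and impact scores below are illustrative assumptions, not Multiplica's actual framework:

```python
# Hypothetical sketch of a complexity-vs-impact prioritization score,
# loosely following the framework described above. Weights and impact
# estimates are invented for illustration.

def complexity_score(needs_custom_graphics, needs_copywriting, needs_custom_code):
    """Sum up effort drivers; a higher score means a more complex experiment."""
    return (2 * needs_custom_graphics
            + 1 * needs_copywriting
            + 3 * needs_custom_code)

def priority(expected_impact, complexity):
    """Quick wins score high: large expected impact relative to effort."""
    return expected_impact / (1 + complexity)

experiments = {
    "new CTA copy":      priority(expected_impact=5,
                                  complexity=complexity_score(False, True, False)),
    "redesigned banner": priority(expected_impact=6,
                                  complexity=complexity_score(True, True, True)),
}

# Run the highest-priority (quickest-win) experiments first.
queue = sorted(experiments, key=experiments.get, reverse=True)
print(queue)  # "new CTA copy" ranks first: similar impact, far less effort
```

The point of the division by effort is exactly David's "quick wins versus the rest": a modest idea that needs no custom code can outrank a bigger idea that needs design, copy, and development.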

Transcription

Disclaimer- Please be aware that the content below is computer-generated, so kindly disregard any potential errors or shortcomings.

Deepak from VWO: Hi, good morning, everybody. This is Deepak from VWO. Welcome to yet another webinar from VWO. Today, we’re gonna talk about how to run effective optimization programs. Along with me, I have two special guests from Multiplica – Jan Marks and David Otero. 

 

Jan Marks: 

Hi, everybody. 

 

Deepak: 

Jan, you’re wearing a mask. I think video distancing is the new normal now?

 

Jan:

Well, actually, I had to travel today. I’m calling in from an airport. And if you could see it around, I’m kind of surrounded by some watchmen who just wait for me to take the mask off so that I could have to pay a ticket. So, I’m sorry, but I have to wear a mask today.

 

Deepak:

Right. Right. No issues. You know, it’s better to be safe than to be sorry. Yeah. Hi, David. How are you doing today?

 

David Otero:          

Hi, Deepak. I’m doing alright. I am not wearing a mask, thankfully, because I’m at home. So no mask needed for me.

 

Deepak:

Okay. Right. So, you know, let’s get started. You know, but a few people joined in. So, Jan, you can start presenting the screen, and I’ll introduce the topic.

 

Jan:

Okay. Alright. Yeah. Hello again, everybody. Let me just put my camera off, because it’s probably better.

So, that’s right. Welcome! We had the idea the other day when we noticed from many conversations that cost efficiency in situations of more limited budgets is a very important subject for many of us. So, we wanted to share with you a couple of things that are related to this very subject. So, let me just jump right into that.

Well, with us is my colleague David Otero. He is in charge of growth and innovation at Multiplica.

Hi, David!

 

David:

Hi, guys.

 

Jan:

Alright. Great. Okay. Let’s jump right into it. We’ll talk about 4 different subjects, which all relate to how to run conversion rate optimization with limited resources. The first one is effectiveness: doing the right things, testing the right things. The second one is efficiency: optimizing with a lean and agile process. The third one is staying flexible: how can you avoid fixed costs? And the fourth one, well, Deepak will explain it much better than I do: what is a great, affordable technology for conversion rate optimization? 

Who is Multiplica? Just to introduce us a little bit – Our company was born 20 years ago. We are around 300 digital artisans or experts. And we’ve served around 750 clients with more than 3000 projects. Compared to other agencies, we are really strongly concentrated on creating high-converting user experiences; that’s our core business.

And if you talk to our customers, what they say is, we’re getting things done. We operate many conversion rate optimization programs as a service provider. These include several brands, some of which you might recognize, ranging from well-known travel and insurance companies to retailers and others. We work with both larger corporations and smaller businesses.

Yeah. Deepak, 3 words to VWO.

 

Deepak:

Absolutely. So, I’m pretty sure that most of the audience knows about us. We are one of the leading A/B testing solutions and optimization platforms, well-connected in the industry. Our biggest differentiator is that we provide all the optimization tools under one roof.

We serve companies like Ubisoft and Norwegian. We’ve been in the business for 10 years and offer one of the most affordable technologies out there. 

We’ll talk more about this in the next slides.

 

Jan:

Okay. Great. Well, let’s jump right into it. David, please tell us more about effective prioritization of experiments.

 

David:

Yes. So as we were saying, we want to run a cost-effective program, but we need to prioritize effectively. And to do that, we need to look at data. That data is key.

Yeah. And in this day and age, we all have access to a Google Analytics account, or other free tools that allow us to understand what our users are doing. And that is basic to focusing our effort.

Alright. Then we also need to go deeper. So we have analytics to understand what the user is doing and where we’re gonna have more impact. And then we can combine that with user research techniques to give the user a voice and understand the pain points they are suffering.

So with these 2 combined sources, we can understand which parts of our website we need to optimize the most. Then we have the journey. Users are different from when they first come to our website to when they finish buying; they have what we call the user journey. When they discover us, they go through different phases and experiences, encountering various pain points.

So we need to really understand at what point each user is to effectively improve that user’s experience. That’s how we detect the opportunities.

 

Jan:

Right. Thank you, David. One thing that I wanted to point out: when we’re talking about data sources, one of my favorite tools is actually a package called VWO Insights, which is less known than VWO Testing, at least from what I’ve noticed in the market. It’s a range of beautiful tools. Deepak, what is VWO Insights?

 

Deepak:

VWO Insights is one of a kind: a user behavior analytics platform, I’d say. It combines 5 products: session recordings, heatmaps, form analysis, surveys, and the ability to track your goals and funnels. So, all in all, it’s one suite where you can collect data on all the metrics for your visitors. It then converts this data into observations and hypotheses, providing a ready set of data to craft effective testing strategies. It’s one of the most well-known packages in the market. For more information, you can check out the website: vwo.com.

 

Jan:

Thanks, Deepak. Wrapping things up, David, you said data is important. From your experience having run many projects, do you often notice that people just jump into optimization without having had a deep look into their data? And how important is it really to look a little deeper than just glancing at a couple of pages in Google Analytics, from your personal point of view?

 

David:

Yeah. So, sometimes, yes, people go with the buzzword, which is right; CRO can optimize, and then they just go for it. And, obviously, as I was mentioning, everyone has access to some kind of data. So they look at maybe the pages with the highest abandonment rate and whatnot. But absolutely, what we were seeing on the previous slide, with all the solutions that VWO offers, gives you a much more in-depth insight into what’s happening on the website. We can take those specific pages that get a lot of abandonment, or where clicks are going where we don’t want them, and by using a heatmap, for example, we can understand how most users are interacting with the site. Even session recording is amazing, because you can see what each user is doing on your site. So, absolutely, this is very, very important.

 

Jan:

I think some people in the audience might wonder: okay, the headline of today’s webinar is how to run cost-efficient optimization programs. I’d like to point out again, to summarize this: it’s all about testing the right things. If you do not fully understand what the user is feeling, why they’re dropping off, and where they’re bouncing, you might be testing things you think are extremely important. But in the end, it turns out that if you had tested something else, your lift in conversion rate and revenue would have been much higher.

So in other words, testing the right things is mandatory to make sure that your limited resources go into the experiments that are the most important and most promising ones. 

Let me jump to the second criterion. So let’s assume for a second that we’ve looked into data. We found some really important things. The second challenge is to set up a lean and agile optimization process.

When we start collaborating with any brand on conversion rate optimization, we always point out how important it is to have a proper organization in place, a proper process in place, a well-defined workflow that enables the company to actually launch and run an optimization program without a huge budget. Let me explain what I mean by workflow. What is the workflow we are working with? It’s the well-known standard process of conversion rate optimization, and it’s also the process we are following.

Start with analysis. We then develop strategies and create ideas. We evaluate the feasibility and potential of these ideas, and together with the client, we prioritize and decide what’s to be done next. Our UX people create mockups, and the customer or product owner approves them. Then the designers do the high-resolution design, followed by another approval, and the developers start coding. Once they’re done, quality assurance makes sure it looks good everywhere.

We set the right audience and target groups, do the last validation, and then we launch and monitor. That is the process, and it’s quite obvious that it makes sense to run one experiment after another through this particular workflow. So, what are the best ways to waste your time and money, or rather, to avoid wasting them? One sure way to waste money and time in your program is to set up too many meetings. We’ve seen this so frequently that I really wanted to point out how important it is to avoid.

Reduce the number of meetings and also the number of participants in each meeting; stick to ownership of particular tasks and roles. Teamwork does not mean that we always have to sit together and decide everything together. So that’s one pretty easy step to save a lot of money.

The second one is to be specific in your briefings to avoid unwanted loops and unplanned iterations. For instance, if the designer just gets a briefing of “well, let’s put a dark blue banner at the top of the page,” then it’s almost certain there are going to be a couple of iterations until they really understand what was meant by that. So the briefing as such, and the level of detail given, is really helpful, because these loops can be pretty expensive, and they can happen at the UX level, the design level, and so on. That’s the next important point.

Another one: your conversion rate optimization efforts very often depend on people who are not aware that if they do not deliver certain things on time, everything is on hold, and that is also pretty expensive. A good example: you have set up a beautiful experiment, everything’s done, and your product owner loves it. But then the person in charge of the content database takes weeks to find the suitable photos and images you need for a particular design, and everything’s on hold. This can happen many times.

What is the solution? You need to get buy-in and commitment from all departments that contribute to this optimization process. They need to be part of it. If people outside the program do not understand why the program is important, then when, at a certain point, you ask for input from them, you will most probably see that it takes a long time to actually get what you need to proceed.

That’s another important point. And the last one I wanted to point out: I want to encourage you to have a deeper look into your test results and eventually refocus. We discussed this yesterday. A good example: you have created a wonderful idea and worked a lot on it.

You were expecting a big lift in conversion rate, and now that you have monitored the experiment, unfortunately, the lift is not as big as you hoped. There are 2 things you can do. One is to write off all the effort you’ve made and just continue with your next idea. The other is to have a deep look into it, and eventually find out that, for instance, in the German-speaking markets, this experiment worked like hell and created a 7% increase in conversion rate, while in the British market, it decreased by 6%. So what’s the point?

It’s that users are different. People in different markets surf differently, and with a simple refocus, a change of the audience, you can iterate this experiment. Without having to invest again from the very beginning, you create a very positive impact. So do not give up too early on experiments that did not deliver the results you expected. Analyze and refocus.
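The refocus Jan describes, a test that looks flat overall but wins in one market and loses in another, falls out of a simple per-segment breakdown of the same test data. A minimal sketch with invented numbers matching the +7%/−6% example:

```python
# Illustrative per-segment lift analysis for an A/B test whose overall
# result looks flat. All visitor and conversion counts are invented.

results = {
    # segment: (control_visitors, control_conv, variant_visitors, variant_conv)
    "de": (10_000, 500, 10_000, 535),  # German-speaking market
    "uk": (10_000, 500, 10_000, 470),  # British market
}

for segment, (cv, cc, vv, vc) in results.items():
    control_rate = cc / cv
    variant_rate = vc / vv
    lift = (variant_rate - control_rate) / control_rate
    print(f"{segment}: {lift:+.1%}")   # de: +7.0%, uk: -6.0%

# Pooled together, the two segments roughly cancel out. Restricting the
# audience to "de" and iterating there salvages the experiment without
# rebuilding it from scratch.
```

Before acting on a per-segment win like this, it is worth confirming the segment sample is large enough for the difference to be statistically meaningful, otherwise segmentation just mines noise.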

So these are just four things that are really important when you run a program. If you do not take care of them, you will see that both internal and external resources waste a lot of time and money. The third point is: how can you make sure that you stay flexible? Let’s have a look at what you need to run this program. You will need a set of skills.

You need somebody who runs the overall program: a project manager, conversion rate officer, or whatever you call it. You have a data analyst who supports this process. You have a strategist who understands the market and the environment, and comes up with a lot of best practices, etcetera. Then you have people who understand your UI, you have coders and developers who make sure the code works everywhere, and you have a content manager.

So you need these skills. And the dilemma is that if you want to run a scalable optimization program, your workload is going to look something like a curve over time: start slowly, then you have more, then less again, more again, eventually more and more. So your team needs to stay flexible. You need to find ways.

And of course, one of the ways is to collaborate with a service provider or an agency, or find other help for the peaks of your experimentation work. It needs to be flexible; otherwise, you will not be able to scale up your optimization efforts. I think Deepak will also refer to that. The common goal needs to be that when the budget is tight, you might want to start small, but you also need the capacity to handle the peaks and use resources that create variable, not fixed, costs.

So, when you are working on optimization with an agency, whether it’s us or anybody else, it’s really important to have a chat with your agency about this flexibility. Really make sure that, if budgets are difficult for you, you have a very flexible model with them. Or, if you have a more strategic plan, you should discuss a set of conditions that can provide variable resources to your optimization program. That definitely needs to be planned from the very beginning. And the most beautiful subject of today, of course, is how to choose technology.

Let’s talk about optimization technology for a second. Deepak, I was told you know a bit about that.

 

Deepak:

Thanks a lot. Thanks for the kind words, Jan. Let’s talk about how you choose affordable technology at less cost, and the main tools and software out there. I’ll start where we left off: I’ll be talking about VWO Insights.

When we look at the free tools, you’d see that the first importance I’ve given is to the tools and software that help us collect data, like Mixpanel, Kissmetrics, and even Amplitude. I’ve given a lot of importance to collecting data because we need to be running data-driven campaigns. It’s an industry fact that only 1 out of 8 experiments or A/B test campaigns yields a positive result. That means we need to sweat a lot while collecting a lot of data. Hence, we need the tools that give us the right kind of data.

There are many magnificent tools out there, such as Mixpanel and Kissmetrics. In fact, VWO is in talks with Mixpanel to build a very tight integration that would help the customers of both sides reap the benefits of this integration. Google Optimize: there are a couple of opinions out there in the market about whether Google Optimize is good enough for scaling purposes. I’d say no, because it’s a difficult technology to use. But on the other hand, I’d like to credit Google as a company for educating and bringing awareness to the market by providing a free product. It really handed this product to millions of people across different markets so that they could play around and get exposure to the much-needed optimization technology.

Optimizely Rollouts: for the smaller product companies, which were finding it very difficult to manage their products online and did not have access to expensive technologies, Optimizely did a great job by offering a free rollout feature that allowed a lot of product marketers to test out their products at different volumes. Affordable tools: Crazy Egg and Hotjar top the list, and they have been around, I don’t know, since time immemorial. It has become a pseudo-standard: if you have a website, you tag it with GA and put Hotjar or Crazy Egg on it to monitor user behavior. A great piece of technology to be used here.

SurveyMonkey: [indistinct] they do a great job helping a lot of people with feedback, user testing, usability testing, and such. They were one of the initial companies that started providing session recordings. Last but not least, we have the all-in-one platform of VWO Insights and Testing. A really powerful combination.

So you have the analysis part, you have the user behavior part: you’re covered with Insights. You create observations and hypotheses; that’s the VWO plan, which is built in with both products. Then you convert the entire program into a testing program.

And you have the VWO testing product for that. And you’d be surprised that, you know, all of this comes in at less than $500 a month. So quite affordable, very powerful. The CEO of Multiplica and I were discussing this yesterday, and we had two opinions. One of them was: is it really free?

I think my opinion is: although it is a bit time-consuming, it’s still worth getting started because it’s risk-free. You’re not putting in any investment. You’d look at the free tools as a source of educating yourselves.

You know, you can spend a couple of weeks with Google Optimize just to see how things work out, and then you can eventually move to an affordable technology like VWO. On the other hand, I think the ability to scale up is important. We can move to the next slide, Jan. Right.

So, what factors do we consider when we’re looking at affordable or free technologies? We need to look not just at the cost of the software; we need to include the cost of design, development, operations, and the different teams involved. With free software, the design effort, and specifically the development effort, gets pretty high. While the technology is free when you’re creating a campaign in Google Optimize, the manpower that goes into creating the campaign could be, like, 5 to 6 hours.

The same kind of campaign can be created in VWO in less than about 30 minutes. So if you add up the man-hour cost, you’ll figure out that Google Optimize isn’t really free when you’re trying to scale things up. Flexibility is crucial; we should be able to easily make changes on the go without needing a dev every time. 
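Deepak's point about "free" tooling can be made concrete with back-of-the-envelope math. The hourly rate and campaign volume below are assumptions for illustration; the 5–6 hours, 30 minutes, and $500/month figures come from the webinar itself:

```python
# Back-of-the-envelope total cost per month: tool fee + man-hours.
# HOURLY_RATE and CAMPAIGNS_PER_MONTH are assumed values; substitute
# your own team's numbers.

HOURLY_RATE = 60          # assumed blended developer/marketer cost, USD/hour
CAMPAIGNS_PER_MONTH = 10  # assumed testing velocity

def monthly_cost(tool_fee, hours_per_campaign):
    """Tool subscription plus the labor to build each campaign."""
    return tool_fee + CAMPAIGNS_PER_MONTH * hours_per_campaign * HOURLY_RATE

free_tool = monthly_cost(tool_fee=0,   hours_per_campaign=5.5)  # "5 to 6 hours"
paid_tool = monthly_cost(tool_fee=500, hours_per_campaign=0.5)  # "about 30 minutes"

print(free_tool, paid_tool)  # 3300.0 vs 800.0
```

Under these assumptions the "free" tool costs roughly four times as much per month once labor is counted, and the gap widens as testing velocity grows, which is exactly the scaling argument being made.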

Now, this happens with a lot of free technologies; the setup is difficult, the UI is broken, and we need some technology expert to help us get off the ground. With paid tools or affordable technologies like VWO, everything is nicely set up, the UI is easy, and the support is there. So you can get started by setting up a few funnels, session recordings, views, heat mapping views, etc.

And you can have the VWO editor at your disposal to start creating some very nice and crafty campaigns. Support becomes very inherent. So with Google Optimize, the trouble is that if you get stuck, you only have the Google community to support you. And this could become very tricky, and it could demotivate you to make any further progress. So that becomes the support.

The product support that VWO provides is pretty much top-notch. We do not differentiate really between different kinds of clients that we have. We just look at providing great support to ensure that the customers get successful services. And now I’d say when you’re looking to scale up your optimization program, it’s always advisable that you get an expert on the team who has done this for a number of years. The key advantage to bring to the table is they help you to do the 0 to 1 journey fairly quickly. They avoid the pitfalls.

They’d let you know what kind of mistakes that you could probably avoid, and they just provide you the much necessary velocity that you need to be in the optimization stratosphere. So, you know, talk to Multiplica guys and see what they can do to help you off the ground. 

Let’s move to the last slide. Next one. The benefits of having considered those factors, as I mentioned, it gives you the velocity.

Moving at speed is okay, but velocity gives you direction, so you adapt quickly. We learn immediately, and we fail fast. The concept here is: you give yourself, let’s say, a couple of weeks to play around with a free platform. You get accustomed to the process and how things work, and you develop the mentality. Once you’re ready to take the high road, you start playing with a paid platform, and things become easier. And then, when you’re ready to take off, you get an expert on the team who helps you make this journey fairly quickly, because now the expert and you are speaking the same language. You have done the learning.

You have prepared yourself for the optimization journey. You get one single platform to manage it all. VWO is not just about insights and testing; it’s about server-side testing and mobile app testing.

So under the hood of VWO, you get 7 different products, which is about the maximum amount of software that you’d ever need to optimize various channels. The leadership is really happy; that’s the eventual goal. You know, a lot of end users in the market over the years have asked me: “Everything looks pretty neat. Where do I go to get a substantial report? How do I make my managers and the management really happy?” 

A very detail-oriented reporting mechanism is a key aspect of VWO. With just a couple of button clicks, you get a report detailed enough to easily compile and send across to management, explaining your hypothesis, the reason for the test, what was tested, and what the final results look like, with a comparison between the control and the variation. 

Generally, Multiplica would come into the picture to shape and ease that journey as well. For more information, you can submit a query at vwo.com. We’d be more than happy to give you a free tour of the platform.

 

Jan:

Deepak, those are really, really friendly words. Thank you very much, Deepak. I really appreciate that. And I would like to ask David.

David, well, Deepak has compared Google Optimize and VWO from different angles. We’ve been in this situation. We’ve seen customers who wanted to start, and then the question was: should they start with Google Optimize, or should they take a more complete solution? So what was your experience with this scenario, and what was our recommendation when this happened?

 

David:

So, as he was mentioning, sometimes free is not good enough. Google Optimize is just a tool, in other words. Google tried to impress the world as they did with Google Analytics. And from our point of view, with the experience of running optimization programs, Google Optimize is just a tool to play with, not to run a successful program. It lacks so many things that will become roadblocks to your success. I mean, it’s free, but you will pay with your time. There are going to be a lot of problems if you lack the support, and the same goes for the documentation.

So it’s just trial and error on your side. Also, if they want to change something one day, they just change it, and you’re not important to them. You are just another number, another free user. So, if you want to play around to see what CRO is about, do it, play with it. But if you want to start a serious CRO program, I would say you will need to invest in order to be safe and operate with some security. I don’t know if that answers your question.

 

Jan:

Yeah, yeah, it does. Actually, I agree. The situations I’ve seen always had some real downsides, like when, in the course of an experiment, you needed to change or correct something, and you had to go back to square one and ramp up everything again, losing a lot of time. I remember 2 years ago, you and I were on a project where this happened all the time, and we both wished we could migrate the project to a more professional tool as quickly as possible.

Another thing I saw at a large German tour operator the other day is the limited number of concurrent experiments. A lot of companies, I think, are using VWO as a workaround to bypass limited IT resources. This tour operator was running 65 experiments at the same time; 45 of them were clear winners, and they were actually not implemented because there was such a long queue on the IT side. So what they did was just give 100% of the traffic to the variant they had created within VWO. And that’s really important. 

That is important because most companies, or many companies, just do not have IT resources available at all times. They work from release to release. And in the meantime, you cannot deploy the improvements that you’ve seen in your testing. And I think, actually, VWO created a specific tool for this implementation purpose, Deepak.

 

Deepak:

Absolutely. From an implementation point of view as well, a lot of concerns have been raised. Let me tell a bit of the story here. I was speaking to an expert on a marketing team who said: you know what? I’m able to run some great campaigns with the editor.

The only trouble comes when I need dev support. And the biggest trouble is that, once I have a winning campaign, I have to go ahead and submit a request, the request needs to go through and be approved, and then an engineer picks it up and implements it. We heard that problem. We created a product out of it. We call it VWO Deploy.

Now by just a simple click of a button, you can implement. You can deploy the winning variation directly to the 100% audience, and you can maximize your profits. So you’re skipping that step and you can show the ROI to the organization and to your leaders. So VWO has been one of those organizations where we hear our customer’s concerns and problems, and we go ahead and solve various problems. When it comes to the implementation part, we have tried to put in so many things into the editor.

But at the end, I’d say this, Jan, you know, product can do, to a certain extent, and we’re doing really great at it. But when it comes to the implementations, you really need to have good ideas, and that can come from experience. So more and more testing, good results.

 

Jan:

Great. Thank you. I don’t know if we have had the chance yet to encourage our audience to come up with some questions. I’d love to know a bit more about who’s listening in.

So feel free to share which industry you’re working in, whether you have some experience with conversion optimization or are just starting out, and any burning questions. Now’s the time; there is a button where you can submit your questions. Deepak, do we already see something coming in?

 

Deepak:

As of now, we don’t see any questions, but I would urge the attendees to make the best use of the time and the experts on the panel to clarify any thoughts, questions, or anecdotes that you’d like to share with us.

 

David:

Since we are talking about costs here, something I wanted to mention is that sometimes, by trying to cut costs, you actually incur additional costs. A couple of things that I think are important, and that people maybe don’t think about, are QA and analytics. QA is really important because otherwise you will need to redo things once they’re live. Spending the QA time before launching a CRO experiment is key.

And it’s sad that sometimes these things just slip by and we don’t do them. So that’s one of the things I would take into account. The other is having an analyst, or somebody who takes on the role of an analyst, to actually analyze what’s going on. As Jan mentioned earlier, sometimes you see a losing test, but if you look deep into it, it’s not losing for everyone. That’s where segmentation comes into play.

Maybe German visitors are more prone to react positively to a specific experiment, or we have other segments we can play with. So spending some time or money on an analyst, somebody who can actually review the data, pays off. It’s a nice way of avoiding losing money. I just wanted to mention that.
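The segment-level analysis David describes can be illustrated with a small sketch. All segments, visitor counts, and conversion numbers below are invented for demonstration; this is not data from any real test:

```python
# Illustrative sketch: an overall "losing" test can hide segment-level winners.
# All numbers below are made up for demonstration.

def conversion_rate(conversions, visitors):
    """Conversions divided by visitors, guarding against empty segments."""
    return conversions / visitors if visitors else 0.0

# (visitors, conversions) for control vs. variant, split by visitor segment
results = {
    "Germany": {"control": (4000, 200), "variant": (4000, 260)},
    "Spain":   {"control": (6000, 330), "variant": (6000, 250)},
}

overall = {"control": [0, 0], "variant": [0, 0]}  # [visitors, conversions]
for segment, arms in results.items():
    for arm, (visitors, conversions) in arms.items():
        overall[arm][0] += visitors
        overall[arm][1] += conversions
    lift = (conversion_rate(arms["variant"][1], arms["variant"][0])
            - conversion_rate(arms["control"][1], arms["control"][0]))
    print(f"{segment}: lift {lift:+.2%}")

total_lift = (conversion_rate(overall["variant"][1], overall["variant"][0])
              - conversion_rate(overall["control"][1], overall["control"][0]))
print(f"Overall: lift {total_lift:+.2%}")
```

With these made-up numbers the variant loses slightly overall but wins clearly for one segment, which is exactly the situation where an analyst digging into the data pays off.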

 

Deepak:

And can I…

 

Jan:

Sorry, Deepak, one question coming up on something you mentioned: that playing around for a while can be okay. I think, and I also saw this in a couple of projects, VWO is one of the very few tools that actually offers a trial. I never understood why the others don’t.

Maybe you have spoken with your beloved competitors, but I have seen what a huge impact this makes for people who don’t know the subject. Is that still the case?

 

Deepak:

That’s absolutely true. I’ve spoken to so many partners and customers and heard so many anecdotal experiences. They have always felt that the cost of running a campaign with these tools is pretty high, and then they have to convince people. So it takes time for different agencies to convince their customers why they should wait a bit longer for campaigns to run. If the variation did not win, it doesn’t mean it was a failed campaign, because you still learned from it.

There have been various experiences as well. I think we have a question around the implementation in many companies.

 

Jan:

Okay.

 

Deepak:

Another one: what percentage of experiments are outright wins versus failures? How often are we proven wrong when we make hypotheses? Are these failures a loss?

 

Jan:

David, do you wanna answer that?

 

David:

I mean, probably no. I would say yes. They are, no? Will you agree? Yeah, I think.

 

Jan:

I would have to take the mask off. I don’t know. Well, first of all, we all love winners, right? To give you a better idea: in projects we’ve been involved in, I’ve seen around 70 to 90% winners.

Is that correct, more or less? That is the last number I have seen. And are the failures a loss? No. First of all, you’re safe from implementing the wrong stuff.

If you had not tested it, you would have implemented it, wasted effort, added code to your page, and it wouldn’t have delivered the result. That’s one thing. The other thing is that you learn from every experiment; whatever the result is, you learn for the next ones.

Take the example I gave you earlier: you see that markets can act completely differently. So I would say a loser is not a loser. It’s not a loss.

 

David:

It’s learning.

 

Jan:

As long as you analyze the why. That is also why I love that VWO has this integrated insights package, which I haven’t seen in other tools: you can quite easily switch into observation mode and dig deeper into why A or B won a particular experiment. You can first look at the heatmaps, then at some session recordings. And if that’s not enough, you can create an online survey and ask customers who have done certain things. You can ask them: why did you leave at this step?

Was it a, b, or c? After a couple of days, you have 120 people giving you a good answer. So a losing test is a learning, as long as you analyze it afterwards. And of course, if you do proper prioritization and create your ideas based on data, the probability of having lots of losers in your experiments is much lower.

 

Deepak:

Great! A very interesting question coming up, Jan. Thoughts on how to deal with a VP? We have 2 VPs on the call who get into the weeds approving test creative for every effort, slowing down program velocity. VPs get into the weeds on everything, not just this, so how do you get approvals from the stakeholders?

 

Jan:

Well, yeah, good point. We’ve seen that, you’ve seen that. I think it can be, to a certain extent, complicated at the C-level. That’s for sure.

But I think it depends very much on the case. If you have the support of somebody on the board and the buy-in of the related stakeholders from the very beginning, then you’re much better prepared. The worst thing you can do is hear something about conversion rate optimization, run a little trial, keep it in a niche, and so on, until at some point the chief technology officer finds out that you’re playing around with the site. So it’s better to make sure you have buy-in.

Once you have the general idea, once you or your agency explain the huge potential of testing, then you’ll easily get the buy-in and then the general consent from managers and senior vice presidents to do these experiments and have less trouble afterwards. I don’t know if this answers the question.

 

Deepak:

My comment on that is: in order to get approval, you need to have done your research right and have the data to back it up. For example: “I want to test this because for the last 2 weeks I’ve observed that the number of clicks on the CTA is going down. We’ve got to change the color, the button, the location; we’ve got to do something about it. That’s what the data says.” If the person still pushes back, they can hardly argue with the logic; the numbers can’t be wrong.

So, the only way to deal with the C-suite, what I’ve learned, is data, data, and more data.

 

Jan:

Yeah. Thanks. Exactly!

 

Deepak:

So, you need a platform that gives you the data in the easiest possible format. So that’s my answer to the question.

 

David:

Yeah. Switching from “I think” to “I know”. I know that we have a problem, and we need to fix it. So, data is there.

 

Jan:

Although I have seen CEOs and CMOs who have seen the data, and the answer was I don’t care. I don’t want this. I want this color because I like it more.

 

David:

Yeah.

 

Jan:

But that’s more the exception, I would say. I think data is really, really strong. And this is something we actually want to work on more. As we head into November, it’s budgeting time for next year, and some people in the audience might be preparing to defend an optimization program. So here’s the answer.

If you ran some tests this year, you can show the potential effect of conversion rate optimization. In most cases, your CFO or your management will approve the budget for conversion rate optimization once it’s clear what the real potential is. For most initiatives, you don’t know what the outcome will be, right? You can say, “I need €1 million for these marketing projects, these strategy projects, these campaigns,” and so on.

So what would be the ROI? Well, we expect this and that ROI; it’s an estimate. But testing is different. You can run it, you can show it. You can say, “You’ve seen these few experiments we have run; let’s do it and scale it up.” That’s also why I like conversion rate optimization. Any more questions in our chat, Deepak?

 

Deepak:

So, John is clarifying his question: it was more about how we test a hypothesis, not what we test at a high level based on data insights. John, I think what you’re referring to is how we prioritize: how we decide what to test first and which hypothesis gets implemented first.

 

David:

Okay. The way we do it at Multiplica is with what we call a prioritization framework, in which we use different variables to define the complexity and the expected return on investment of an experiment. We take into account whether we need custom graphics or pictures, whether we need to write content, whether we need custom HTML or JavaScript: all the parameters that can make an experiment more complex.

With that, we get a complexity score, and then we run experiments based on it. That’s what we use to decide what makes sense: which are the quick wins versus the rest.

 

Jan:

And in addition to the complexity, of course, we look at some other data. We always weigh 2 main factors, well, 3. One is the effort; another is size, in terms of how many people are looking at a particular page. Think about the product detail page, which in many businesses acts as the landing page and receives a lot of traffic, often more than the homepage. As an agency, we can benchmark certain KPIs of such a page against similar figures from other companies.

So we look into the data of an airline and see a bounce rate of 47% on a certain destination page, while on other airlines the bounce rate is somewhere between 20% and 30%. There’s big traffic on the page and a big gap in performance. When these two things come together, the page has a high priority because it promises a higher lift in conversion rate. Whereas if it is more of a niche page that very few people see, and you don’t have a real benchmark for it,

you can’t talk about a high probability of it becoming a big winner. I always try to focus on things that are significantly worse than in other cases we have seen. And again, as we said at the very beginning of our meeting, that means being data-driven: looking at your drop-off rates, your bounce rates, at what is actually failing, either in comparison to similar companies in the same industry or over time.

For instance, in the pandemic, a lot of things have changed, and users behave differently, and so on. So when we start an analysis, we not only try to find benchmarks from other companies, but also look at what exactly is different in 2020 compared to 2019. Then we see the pages that performed significantly worse than before the pandemic and say, “Oh, that is where we need to address something.”

So either we compare different times, or we compare different sites. And within your own range of products as well: if you’re offering 200 products online, you look at product detail pages of the same category and find pages that show significantly worse numbers, worse ratios. We put all of this into a scoring scheme so we can reduce it to numbers and work out what promises the biggest effect at the least complexity.
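The scoring scheme the speakers describe, weighing traffic, a performance gap versus a benchmark, and implementation complexity, can be sketched roughly as below. The formula, variable names, and numbers are assumptions for illustration, not Multiplica’s actual framework:

```python
# Illustrative prioritization sketch: score = (traffic share x benchmark gap)
# divided by complexity. Higher score = test this hypothesis first.

def priority_score(traffic_share, observed_rate, benchmark_rate, complexity):
    """Higher traffic and a bigger gap versus the benchmark raise priority;
    implementation complexity (1 = trivial, 5 = heavy dev work) lowers it."""
    gap = max(observed_rate - benchmark_rate, 0.0)  # e.g. excess bounce rate
    return (traffic_share * gap) / complexity

hypotheses = [
    # (name, share of site traffic, observed bounce, benchmark bounce, complexity)
    ("destination page", 0.30, 0.47, 0.25, 2),
    ("checkout step 2",  0.08, 0.35, 0.30, 4),
    ("homepage hero",    0.25, 0.28, 0.27, 1),
]

# Rank hypotheses from most to least promising
ranked = sorted(hypotheses, key=lambda h: priority_score(*h[1:]), reverse=True)
for name, *args in ranked:
    print(f"{name}: {priority_score(*args):.4f}")
```

With these invented numbers, the high-traffic destination page with the 47%-vs-25% bounce gap ranks first, matching the airline example in the discussion; the exact weighting any team uses would differ.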

 

Deepak: 

Yep. I think we can wrap up now; we’ll be sending the presentation to everybody who attended. I would urge you all to have a word with David and Jan Marks, to consult with them and see if they can help you in your optimization journey. They’d be more than happy to take any questions you have. Jan, in particular, is a very chatty guy.

So you’ll hear a lot of stories from him. He did not introduce himself, but he’s actually German, though that’s difficult to tell. David is also very experienced and well-known in the industry. So I’ll keep it brief. I hope everybody really enjoyed this session.

We’d be more than happy to get your sincere feedback. I sincerely thank my trusted partners, David and Jan, from Multiplica. Thanks, guys, for your time, and I hope everybody enjoyed the session.

 

David:

Thank you, Deepak. Thank you for your time today.

 

Jan:

Thank you very much. Thank you, everybody, for attending. Bye.
