Basing marketing decisions on your gut and hoping for the best is one approach, but most pros know that data is the best guide for a successful strategy. If you’re torn between two approaches, or want to test how different elements of your marketing collateral impact your audience, A/B testing is the way to go.
In this #ContentChat, community staple Patrick Delehanty (@MDigitalPatrick), marketing manager at Marcel Digital, gave us the 101 on A/B testing and explained how it can improve your content marketing performance.
Let’s do a quick #contentchat poll. How many of you are currently doing some A/B testing of your content?
— Erika Heald | Content Marketing Consultant (@SFerika) March 25, 2019
Q1: What is A/B testing? And why should I be using it as a content marketer?
A/B testing involves showing two versions of a piece of marketing collateral (web page, newsletter headline, email subject line, social content and imagery, etc.) to segments of your audience. Part of your audience sees version A, part sees version B, and the rest then see whichever version performed best (based on your goals for the content).
A1: A/B testing is presenting two versions of the same web page, or specific page elements, to a split audience and measuring which one will ultimately engage your audience at a higher level. Winner becomes the page your audience will ultimately see. #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
A1: An a/b test is essentially an experiment. You show two similar, but slightly varied (maybe different copy or different images, for ex) pages or ads to a user. You’re testing to see which version performs better. #contentchat
— Brafton (@Brafton) March 25, 2019
A1: It’s a bit like going to the optometrist: “#1 or #2” — the goal is to find out which option is clearer or prettier or whatever it is you want. #ContentChat https://t.co/EFmNJiz4TD
— Jen Brass Jenkins (@chrliechaz) March 25, 2019
A/B testing allows you to collect data on different approaches to see which performs better.
A1b. A/B testing allows you test hypotheses on how to make your content, layouts, or calls to action engage your audience more effectively, so you can always be sure you’re providing the best experience to your users. #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
A1: A/B testing is the most scientific way to ensure that your creativity isn’t just a hunch, it’s working to drive results for your organizations. #ContentChat
— Maureen Jann (@SuperDeluxeMo) March 25, 2019
A/B Testing is the perfect way to experiment and see what resonates with your audience! For instance, sending the same newsletter but half with a catchier headline to a segmented list helps determine what your audience is attracted to and actually clicks. #ContentChat https://t.co/sNJb6lH5LJ
— The Karcher Group (@KarcherGroup) March 25, 2019
A1. The core of A/B testing is presenting two options of a web layout (a landing page, a banner, an ad, a piece of content) to segments of your audience. This way, there’s no need to rely on hypotheses about the efficiency of either. Data will give you the answer. #contentchat
— Pitchbox App (@PitchboxApp) March 25, 2019
A1: There will be many technical answers here. For me, I tell clients this:
“You really can’t decide which idea would work better? There’s a way to prove it. Let’s put a little money where everyone’s mouth is and see what happens.” (Used and proven many times.) #ContentChat pic.twitter.com/T0HhwNxGRT
— Shawn Paul Wood (@ShawnPaulWood) March 25, 2019
A1: “A/B testing is essentially an experiment where two or more variants of a page are shown to users at random, and statistical analysis is used to determine which variation performs better for a given conversion goal.” ~ @optimizely #contentchat
— Shelly Lucas (@pisarose) March 25, 2019
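To make the "split audience" idea concrete, here's a minimal sketch of how a testing tool might assign visitors to a variant. This is an illustrative example, not any particular tool's implementation; the `user_id` string and experiment name are assumptions. Hashing the ID gives a stable 50/50 split, so the same visitor always sees the same version for the life of the test:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "headline-test") -> str:
    """Deterministically assign a user to variant A or B.

    Hashing the user ID together with an experiment name (so
    different tests split independently) yields a stable, roughly
    50/50 split: repeat visits always land on the same variant.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The assignment is stable across repeat visits by the same user.
print(assign_variant("visitor-123"))
```

Deterministic assignment matters because a visitor who sees version A on Monday and version B on Tuesday would muddy your results.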
Q2: I have a limited budget. How can I do A/B testing without using any additional software or tools?
There are several free tools out there, depending on what collateral you’re A/B testing. Google Optimize is great for your website, and there are also free plugins to monitor website performance.
A2. @google offers Google Optimize which works with your Google Analytics implementation and is free. Also, CMSs like WordPress offer free plugins that can help with A/B testing as well. Here’s a great list from @VWO: https://t.co/ocytEB3r3O #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
A2: It seems Split Testing was made for a limited budget. It’s the trial run before the big investment. Use those tools @MDigitalPatrick shared and analyze the results. The report should earn the budget you really need. #ContentChat
— Shawn Paul Wood (@ShawnPaulWood) March 25, 2019
A2: Each channel will have some options available – Google is one good resource for website…with eCRM (email/Sms) you can do split lists and split creative to measure results #ContentChat
— Bernie Fussenegger 🐝✌️the7️⃣ (@B2the7) March 25, 2019
Email marketing tools like Mailchimp allow for A/B testing, too.
Q2: I’ve found that a lot of tools have flexibility for manual work-arounds. They’re not ideal, but they can work in a pinch. They just take a little more work. For example, segmenting in MailChimp is free. #ContentChat
— Ashley Hoffman (@ashhmarketing) March 25, 2019
On social you have a few options. Facebook has A/B testing for paid social built in, you can manually track the performance of posts, and it never hurts to poll your audience to see what they prefer.
A2: For social content, Facebook has an A/B testing tool for paid social built in. I rarely run a new creative without doing an A/B test for one component or another. #contentchat
— Scott Lum (@ScottLum) March 25, 2019
A2: #ContentChat
For content like social posts, simply keep track of the two pieces and measure the performance using the KPI’s you decide on beforehand – no additional tools/resources necessary other than patience.
Also, check out @MDigitalPatrick‘s answer! Great stuff! pic.twitter.com/ureSbdqMUK
— Click Control Marketing (@ClkContrl) March 25, 2019
A2 – You can put a poll on a social media platform, like @Twitter or @facebook to get feedback, reaction, and a ‘winner’ from your audience. #ContentChat
— John Buglino (@JBugs10210) March 25, 2019
Q3: What are some common A/B tests for content marketing, and how do I conduct them?
Almost anything can benefit from an A/B test. Headlines, CTAs, links, use of imagery, fonts, color, and the below community suggestions are all valid. Pick an element, create a baseline, then measure how the data changes based on your tweaks.
A3a: Common tests include headlines, calls to action, links, content types + media, images, fonts, colors, etc. Before you test anything, analyze your data – don’t just blindly start testing – do some research! #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
A3b. If you just randomly start testing, with no clear goal or purpose, you run the risk of adversely impacting engagement on your website, and worse, conversions or important actions like sales. #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
A3: You can test for almost any variable really. What content performs best on what channel? What language variations in headlines or content perform best? What images perform best? #contentchat
— John Cloonan (@johncloonan) March 25, 2019
Q3: Test your CTAs, your social copy, your emails (links vs buttons, colors, etc), your headlines, your images.
Make small, iterative changes. One at a time. Then set a specific timeframe and monitor results. Don’t change timeframe or make more than one alteration! #contentchat
— Ashley Hoffman (@ashhmarketing) March 25, 2019
A3: I’ve always found gated testing to be vital and often forgotten. Don’t ever assume you have an audience that will respond to CTAs for “signing up” here and “downloading” there. Gauge their interests. #ContentChat
— Shawn Paul Wood (@ShawnPaulWood) March 25, 2019
A3 – CTAs & Headlines – both need to be clear, concise, & enticing enough to get the conversions you desire. #ContentChat
— John Buglino (@JBugs10210) March 25, 2019
A3: Common A/B tests for content: titles, copy, wireframes/layout, forms, images and colors. #contentchat
— Shelly Lucas (@pisarose) March 25, 2019
A3: You can test different copy, graphics, CTAs, etc. You just want to be strategic about how you’re testing and make smaller tweaks at a time so you can narrow down what’s actually performing well. #ContentChat
— Express Writers (@ExpWriters) March 25, 2019
A3: My favorite is using the exact same copy on a #socialmedia post but then changing the image to something that evokes a different feeling. Ex: text over an image of a laptop vs. happy group of people. #ContentChat
— Claire Kennedy (@claireakennedy_) March 25, 2019
A3: Common A/B tests are ad creative, headline creative, button color, header images. #contentchat
— Maureen Jann (@SuperDeluxeMo) March 25, 2019
A3: #ContentChat
Across many types of content, A/B tests often compare design elements, copy, CTA’s & placements.
The best way to conduct A/B tests is to start based on a hypothesis – that means doing a little research first! Then decide the KPI’s, launch the test, and measure. pic.twitter.com/xXAGOmqJ3e
— Click Control Marketing (@ClkContrl) March 25, 2019
Q4: What are some common pitfalls that can undermine A/B testing?
Always have a purpose and plan for your testing. Pick your element to test, set your KPIs and timeline, and see which approach performs better.
A4: Again, not properly analyzing data, testing without purpose, or testing multiple page elements at once with no structure – if you don’t know WHY and WHAT you’re testing – it’s all moot. Also – having a bad setup of tools or splitting audiences wrong. #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
A4: Changing timeframes halfway through, not properly setting up pixel/goals…
I.e. I just realized my Google analytics was tracking just /thanks instead of possibly /thanks/ #contentchat
— Ashley Hoffman (@ashhmarketing) March 25, 2019
Stay focused on your tests and change only one variable at a time (unless you’re skilled in multivariate testing, which we explore below).
A4: The problem I see most often is that people change too many variables at once. Then it’s not clear which change actually triggered the differences in results. One at a time. #ContentChat
— Brafton (@Brafton) March 25, 2019
A4: Teams throw in more variables, essentially negating the purpose of A/B testing. Multivariate testing requires its own approach. #contentchat
— Shelly Lucas (@pisarose) March 25, 2019
A4 – Forgetting that you should only test ONE element at a time + Analysis-Paralysis, you test too much and never get the content launched to your audience #ContentChat
— John Buglino (@JBugs10210) March 25, 2019
Depending on what you’re testing, you should start with a 50/50 split and run tests for at least a month. This will vary based on your typical traffic or audience size, but 95% confidence in your results is ideal. Patrick shares more insight below regarding sample size.
Patrick, can you address the sample size? Is there an “N” that is too small to test and derive any meaningful results? #ContentChat
— Derek Pillie (@derekpillie) March 25, 2019
I usually split traffic audiences 50/50 and wait for a 95% confidence (or statistical relevance) in results. The length of a test depends on how much traffic your test(s) get, so lengths can vary, but always start with 50/50 splits of traffic. That make sense? #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
So, if talking statistical relevance, would a sample size of 100 be enough? Or would we need 300? 1,000? #ContentChat
— Erika Heald | Content Marketing Consultant (@SFerika) March 25, 2019
Just saw this, sorry! Great question – there is no “dead set” sample number that you should consider for statistical relevance. It completely comes down to how much traffic you get and the test you are running (headlines? Forms? CTAs? Images? etc).
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
You could, out of the gate, run high impact tests or settle on a shorter time frame to run your test if you don’t get a whole lot of traffic to your site, but that could mean you get data that’s not totally correct or relevant. Smaller samples / shorter timeframes…
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
…mean the data is less likely to be accurate than if you let a test run for a longer period of time so more traffic can be tested. Now, if you have a large site that gets a ton of traffic, this shouldn’t be an issue. But if you have a small site…
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
…it means that your tests could run longer if you’re trying to get the most accurate data. I usually recommend that a small website run a test for at LEAST a month to get a good hold on the data, and share that page via your social media / email lists to get more eyes / traffic.
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
I recommend checking out this resource for more information or thoughts on it: https://t.co/Kfuf1PFz2C
Hopefully this was helpful! Please let me know if you have more questions!
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
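The 95% confidence threshold Patrick mentions can be checked with a standard two-proportion z-test, which is roughly the kind of calculation testing tools run under the hood. Here's a sketch in plain Python; the conversion numbers are illustrative, not from the chat:

```python
from statistics import NormalDist

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is B's conversion rate significantly
    different from A's at the 95% confidence level (p < 0.05)?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis (no real difference).
    p = (conv_a + conv_b) / (n_a + n_b)
    se = (p * (1 - p) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal distribution.
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_value, p_value < 0.05

# 50/50 split: 200 of 2,000 visitors converted on A, 260 of 2,000 on B.
p_value, significant = ab_significance(200, 2000, 260, 2000)
print(f"p = {p_value:.4f}, significant: {significant}")
```

This is also why small samples need longer tests: with only a few hundred visitors per variant, the standard error stays large and small differences never clear the significance bar.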
Q5: Can I test more than one variable at a time with my testing?
Multivariate testing is possible, but not recommended without previous experience or training.
A5a: Multivariate testing allows you to test multiple changes at one time, and different combinations of those changes. The whole purpose of multivariate testing is to show you what combination of changes has the greatest effect on your conversion rate. #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
A5b: Multivariate testing allows you to test these changes at once, giving you the data you need for a truly optimal user experience and the boost your conversion rate needs. But! This is an advanced test, not something I’d recommend to a new tester! #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
A5 – Yes, multivariate testing can be done – though more difficult to manage and maintain if done incorrectly. I would recommend software and conducting research on the best ways to complete. #ContentChat
— John Buglino (@JBugs10210) March 25, 2019
For most folks, it’s easiest to test one variable at a time, which keeps the data clearer.
A5: If you test more than one variable at the same time, it’s difficult to tell which one has the greatest impact. Do multiple A/B tests keeping everything else constant for best results. #contentchat
— Scott Lum (@ScottLum) March 25, 2019
A5: I would only ever recommend testing one variable at a time. When there are multiple variables you could get false positives and negatives. Keep your data clear! #ContentChat
— Caitlin Kinser (@caitlinmarie89) March 25, 2019
A5: Yes, this is possible. However, many people do not set up their testing or structure their analysis in a way that supports this. They end up getting confounded results. #contentchat
— Brafton (@Brafton) March 25, 2019
With new tech on the horizon, multivariate testing will become more accessible.
New tools are coming out that leverage AI to test multiple variables with different segments simultaneously. Eventually, they will make multivariate testing easier–and, with time–more affordable. #contentchat
— Shelly Lucas (@pisarose) March 25, 2019
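One reason multivariate testing is harder than it looks: combinations multiply fast, and every combination needs enough traffic to reach significance on its own. A short sketch illustrates the math; the page elements and their values are made up for the example:

```python
from itertools import product

# Hypothetical page elements to test together (illustrative values).
headlines = ["Save time today", "Work smarter"]
cta_labels = ["Start free trial", "Get started"]
hero_images = ["laptop", "team-photo"]

# A full-factorial multivariate test needs every combination:
# 2 x 2 x 2 = 8 variants. Each added element multiplies the count,
# which is why multivariate tests demand far more traffic than a
# simple two-variant A/B test.
variants = list(product(headlines, cta_labels, hero_images))
print(len(variants))  # 8 combinations
for headline, cta, image in variants:
    print(headline, "|", cta, "|", image)
```

Splitting the same traffic eight ways instead of two means each variant gets a quarter of the sample, so tests run much longer before a winner emerges.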
Q6: Let’s talk landing pages. What are some common A/B tests I should be conducting to improve my conversion rate?
Landing pages can be used to test essentially everything listed in the answers for Q3, including headlines, offers, CTAs, and the size and placement of page elements. Seemingly insignificant changes can make a world of difference in driving results, so test early and test often.
A6: Not only can you test things like headlines, offers, calls to action, images, ad copy, social proof, etc with landing pages, BUT you can also use them to test and influence site redesign ideas or new layouts. I love landing pages – SO much potential – use them! #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
A6: Calls to action, button sizes and placements, colors, headers, offer language. #contentchat
— John Cloonan (@johncloonan) March 25, 2019
A6: Headlines and CTA are huge for landing page success. Also consider layout, hero image, length of text and readability/reading level of copy. #ContentChat
— Caitlin Kinser (@caitlinmarie89) March 25, 2019
A6: Things to test: Pop-up or no pop-up, position on page for primary action – upper-right or upper-left, colors of your CTA buttons, etc. #contentchat
— Scott Lum (@ScottLum) March 25, 2019
A6: There are SO many elements to test with landing pages, and many involve content, so that’s always a good place to start. Some tests you might run: button CTAs, testimonials, shorter text vs. longer, more educational text, and even video vs. static content. #ContentChat
— Jessica Thiefels #ContentMarketing (@JThiefels) March 25, 2019
A6: If you want to improve your conversion rate, you definitely want to test your call to action and see what actually moves people to purchase. #ContentChat
— Express Writers (@ExpWriters) March 25, 2019
A6: #ContentChat
I’ve never personally conducted landing page tests, but things I’d test and measure are:
– CTA’s
– Images
– Colors
– Fonts
– Copy
– Organization/layout
– Brand recognition
– Sales
– And I’m sure a bunch more I can’t think of! pic.twitter.com/3hVgGbHcJ7
— Click Control Marketing (@ClkContrl) March 25, 2019
Q7: What are some tools or apps that can make it easier to conduct A/B testing?
Some suggestions include Google Optimize, Hello Bar, Optimizely, Oracle Maxymiser, Unbounce and VWO.
A7: We use @Google Optimize here at our agency but also leverage @VWO and @Optimizely! All three kick ass – I recommend making a list of tools, seeing what they offer, and picking the best for your business needs. #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
A7: Some that I have used in the past are Optimizely and Maxymiser…want to test out Google Optimize. On other programs such as email, many larger Marketing Clouds have A/B testing as part of their tools #ContentChat
— Bernie Fussenegger 🐝✌️the7️⃣ (@B2the7) March 25, 2019
A7: Tools that allow you to run two tests at the same time are key to getting results fast—as opposed to running one test and then making changes to run the next version. @unbounce is great to do this w/ landing pages. I also love @thehellobar for testing pop-ups. #ContentChat
— Jessica Thiefels #ContentMarketing (@JThiefels) March 25, 2019
Q8: What are some of your favorite A/B testing tips?
First tip: Take the time to A/B test, and be disciplined about it.
A8: Do it. #contentchat
— John Cloonan (@johncloonan) March 25, 2019
Do it. And try really, really, really hard to be disciplined about it. Otherwise it’s not A/B testing, it’s just changing your mind. =] #ContentChat
— Maureen Jann (@SuperDeluxeMo) March 25, 2019
Start small and expand your strategy with time.
The other thing about A/B testing? It can get out of hand and feel really overwhelming. If you haven’t done it before, start small. Allow complexity to build with your skillset. #ContentChat
— Maureen Jann (@SuperDeluxeMo) March 25, 2019
A8: #ContentChat
My number one tip is to have patience!
Take the process one step at a time, starting with research, letting results roll in, then going from there. pic.twitter.com/nY9TEt8dqy
— Click Control Marketing (@ClkContrl) March 25, 2019
As with any marketing efforts, have clear goals, objectives and KPIs.
A8: Set clear goals and objectives along with the KPIs that you are measuring #ContentChat
— Bernie Fussenegger 🐝✌️the7️⃣ (@B2the7) March 25, 2019
Always be testing. You should have ongoing testing to see how you can keep improving. Just because you reach your initial goal does not mean that’s the best you can achieve, and you’ll have to keep testing to find your full potential.
A8: My biggest tip (besides knowing your goals / researching) is to include ongoing testing in your digital marketing strategy. There are ALWAYS ways to make your pages, elements, and content more engaging and effective. There’s always more than one test. Keep going! #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019
A8. My main tip is to do a/b testing regularly and incorporate it into your marketing strategy. Don’t rely solely on one person’s creativity – test and test and test what resonates best with your TA. #contentchat
— Pitchbox App (@PitchboxApp) March 25, 2019
A8: Never stop looking for ways to test. There’s ALWAYS something new your audience will find fascinating. Just invest time to make (and test) the shiny toy. #ContentChat pic.twitter.com/soOBhabbue
— Shawn Paul Wood (@ShawnPaulWood) March 25, 2019
Just because you got engagement or conversions to increase doesn’t mean that there’s nothing else on that page, or the website at large, that needs to be tested. Or better yet, what if the new conversion rate isn’t the best it COULD be? Keep testing! #ContentChat
— Patrick Delehanty (@MDigitalPatrick) March 25, 2019