When it comes to reaching readers, authors often rely on intuition: Will this subject line grab attention? Will this cover stop a scroll? But guesswork and gut instinct can only take you so far.

A/B testing, a method of comparing two options to see which performs better, offers a powerful way to make data-driven decisions. Many author newsletter platforms, such as MailerLite, make it easy to run these tests. Yet there’s more to learn beyond the basics: advanced multivariate testing, AI-powered ad optimization through tools like Meta’s Advantage+, and even rotating link features from platforms like Switchy. By understanding and applying these strategies, you can refine everything from newsletters to cover designs and beta-reading processes.

What Is A/B Testing?

A/B testing, sometimes called split testing, involves creating two versions of something—say, a newsletter subject line—and sending each version to a random half of your audience. The version that yields better results—higher open or click-through rates, in the case of a newsletter—is considered the better option. Easy-peasy.

This approach is straightforward because you’re testing one variable at a time. Think of it as baking an apple pie: If you swap out only the cinnamon for nutmeg and your pie turns out better, you know the spice made the difference. But if you change the crust, filling, and sugar all at once, you don’t know which tweak improved or ruined the pie. A/B tests give authors that same clarity; when readers respond more to version A than version B, you know exactly what caused the shift in results.

Why do some readers respond better to a curiosity-driven subject line such as “You won’t believe this twist!” while others prefer clarity, as in “Your free chapter is here”? Human psychology and behavior drive these differences. Cognitive biases, personal preferences, and habits shape how audiences engage with your content.

For basic A/B testing, you look for “statistical significance”: confirmation that your winner outperformed the loser because of the variable you tested, not random chance. Statistical significance means the difference in results (say, a 30 percent open rate versus 28 percent) is large enough that you can be confident it isn’t a fluke. If the sample size is too small or the difference too tiny, you can’t be sure one version is actually better.
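To make this concrete, here is a minimal sketch (in Python, standard library only) of the two-proportion z-test commonly used to check whether a gap between two open rates is significant. The recipient counts are invented for illustration:

```python
import math

def z_test_two_proportions(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: is the difference between two open
    rates likely real, or plausibly just random chance?"""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # Pooled open rate under the null hypothesis (no real difference)
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 30 percent vs. 28 percent open rate, 1,000 recipients per version
z, p = z_test_two_proportions(300, 1000, 280, 1000)
print(p)  # well above 0.05: not significant at this sample size
```

By convention, a p-value below 0.05 is treated as significant. Notice that a 30-versus-28 percent gap on 1,000 recipients per version doesn’t clear that bar; you would need a much larger list (or a bigger gap) before declaring a winner.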

Once you grasp the basics, you can move to more complex methods, like multivariate testing, which involves testing multiple elements simultaneously—a feature platforms like MailerLite offer. Instead of just comparing two subject lines, you might test two subject lines, two types of content—text-heavy vs. image-heavy, for example—and two call-to-action (CTA) buttons. That’s eight combinations (2 × 2 × 2). Sorting through the data to see which combination truly excels can be overwhelming, but AI can quickly identify patterns and interactions between variables, making sense of complex results and speeding up your decision-making process.

Pro Tip: If the term “AI” worries you, consider that this is really just computational analysis of a range of numbers and statistics to find results. In these examples, it acts as a fancy calculator or spreadsheet, helping you analyze lots of data quickly and easily.
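To see how quickly the combinations stack up, here is a short Python sketch. The element names and click counts are invented for illustration and aren’t pulled from any platform:

```python
from itertools import product

# Hypothetical test elements: two options for each of three variables
subject_lines = ["Friendly", "Hard sell"]
content_styles = ["Text-heavy", "Image-heavy"]
cta_buttons = ["Buy now", "Read a sample"]

# Every possible combination of the three choices
combos = list(product(subject_lines, content_styles, cta_buttons))
print(len(combos))  # 8 combinations from just three two-way choices

# Suppose each combo went to an equal slice of your list and you
# recorded clicks (made-up numbers); pick the best performer:
clicks = dict(zip(combos, [41, 37, 52, 33, 48, 60, 29, 44]))
winner = max(clicks, key=clicks.get)
print(winner)
```

Add a fourth variable, such as send time, and the count doubles to sixteen, which is why tooling that summarizes the results for you becomes valuable fast.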

Newsletter A/B Testing for Authors

Now that we’ve covered why A/B testing matters, let’s consider how each step unfolds in the author sphere to provide the data that will help boost your open rates, clicks, and overall reader engagement.

  1. Choose what to test. Start simple—try two subject lines, such as version A: “Your members-only missing chapter is here!” and version B: “Download the missing chapter before Sunday evening when it will disappear forever!” Here we’re testing whether our audience prefers a friendlier subject line that makes them feel like they’re part of the cool kids’ club (A) or a more sales-driven, time-limited hard sell (B).
  2. Set clear goals. Are you optimizing for open rates or click-through rates? In other words, is your goal that readers open the email and read, or is there a link to click or another CTA? Decide your goals upfront.
  3. Analyze your results. Suppose version A increases open rates by 12 percent. You’ve learned that a curiosity-driven and friendlier subject line resonates with your audience. Next time, try another small tweak. Over time, these incremental improvements add up, giving you a better idea of how to connect with your subscribers and keep them interested.

As you grow more comfortable, also consider leveraging multivariate testing. Test not just subject lines but also sending times, such as morning versus evening, and CTA styles, such as “buy now” versus “read a sample.” The combinations multiply, but so do your insights. Your email service provider’s built-in analytics, possibly coupled with AI-driven tools, can help you parse these complex results and reveal your ideal newsletter strategy based on your audience.

Beyond Newsletters

A/B testing isn’t limited to newsletters. You can apply the same principles to all facets of your author platform. Consider testing variables in the following aspects of your business to decide on an option your audience will like best.

Book Covers

Before committing to a cover design, run a quick test. Show two versions to different segments of your audience on social media. Measure engagement—likes, comments, or clicks on a preorder link. By testing covers before finalizing one, you can launch with greater confidence. Just make sure your cover designer is open to this kind of testing, and get clarity upfront on the cost of revisions.

Pro Tip: It’s always easier for people to comment on what they don’t like. If you’d like better feedback, ask them what they like about the image they chose.

Beta Reading

Thinking of altering a chapter’s ending? Split your beta readers into two groups, and present each with a different version. Gather feedback, and see which resonates more. There may not always be a clear winner, but the data you collect can help you make narrative decisions guided by reader response rather than guesswork.

Social Media Ads and Product Descriptions

For those with ads on Meta platforms, Meta now offers Advantage+, an AI-driven ad optimization tool. Instead of manually choosing what to test, Advantage+ does the heavy lifting, mixing and matching creatives, audiences, and placements. It continuously learns which combinations yield the best results and adapts in real time. This is a form of ongoing, automated multivariate testing that can save authors time and energy, helping ensure efficient ad spend.

Not all testing requires a built-in email or ads feature. With tools like Switchy, you can create short links that rotate traffic among multiple destinations. Let’s say you’d like to test different blurbs for the same book. With a static website, you might not have the capability for A/B testing, but you could create two landing pages for the same book. Switchy will rotate the link, taking half your audience to landing page A and the other half to landing page B. After gathering sufficient data, you should know which page performs better. This method is especially useful if you’re driving traffic from social media profiles, guest posts, or any environment where you can’t easily run a traditional A/B test.

As time goes on, you can keep the same link and change or add more destinations.
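If you’re curious what a rotating link does behind the scenes, here is a hypothetical round-robin sketch in Python. Switchy’s actual implementation isn’t public, so this shows only the general idea:

```python
from itertools import cycle

class LinkRotator:
    """Minimal sketch of a rotating short link: each successive
    visitor is sent to the next destination in the rotation.
    (Illustrative only, not how any particular product works.)"""

    def __init__(self, destinations):
        self._rotation = cycle(destinations)
        self.hits = {url: 0 for url in destinations}  # visits per page

    def redirect(self):
        url = next(self._rotation)
        self.hits[url] += 1
        return url

rotator = LinkRotator(["/landing-page-a", "/landing-page-b"])
for _ in range(100):  # simulate 100 clicks on the short link
    rotator.redirect()
print(rotator.hits)  # traffic splits evenly: 50 visits to each page
```

Because the rotation is even, each landing page sees the same share of traffic, so any difference in sales or sign-ups between them can be credited to the page itself.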

Alternatives to A/B Testing

Although A/B and multivariate testing are powerful, these methods are not always the best fit. Sometimes you need qualitative feedback to understand why readers prefer one version over another. In these situations, there are other ways to collect information on your audience’s preferences. Asking a small group of devoted readers what they think can offer rich insights. Surveys let you tap into your audience’s feelings and motivations, going beyond raw metrics to understand their preferences on a deeper level.

Pro Tip: This might sound like a lot of work—it can be! But hop on the phone with a fan for even just ten minutes, ask a few questions, and listen. You’ll get more specific feedback and build a deeper connection with your readers by involving them directly in your decision-making process.

A Mindset of Experimentation

Whether you’re fine-tuning a newsletter subject line or deciding on a cover design, testing is your secret weapon. By starting with simple A/B tests and gradually exploring multivariate testing with additional tools and strategies, you gain a better understanding of what resonates with your readers.

At the heart of it all is a mindset of experimentation and curiosity. Don’t be afraid to test, learn, and iterate. Begin with something small—one subject line or one CTA. Analyze the results. Refine.

Over time, you’ll develop a testing toolkit that allows you to make confident, data-driven decisions, helping you connect with readers more effectively and propel your author career to new heights.

Bradley Charbonneau


BIO:

Bradley Charbonneau wanted to be a writer. Trouble was, he didn’t write. A friend was running a “Monthly Experiment” (no coffee for a month, wake up at 5 AM, etc.) and created one where everyone had to write every single day for 30 days. Bradley took the challenge. “Hmm, that wasn’t so bad.” Then he kept going. 100 days. 365. 1,000. 2,808 days and 31 books later, he found out it’s simple. Not necessarily easy, but simple. #write #everysingleday
