A/B Testing
A method of comparing two versions of a web page to determine which drives more conversions.
A/B testing (also called split testing) is a controlled experiment in which two versions of a web page, email, or ad are shown to different segments of an audience to determine which version performs better. In the context of testimonials, A/B testing is crucial for optimizing placement, format, and content.

Common testimonial A/B tests include:
- Video vs. text testimonials
- Testimonial placement (above vs. below the fold)
- Number of testimonials displayed
- Testimonial format (carousel vs. grid)
- Whether to include photos and company logos

To run valid A/B tests, you need sufficient traffic volume, a clear success metric (usually conversion rate), and statistical significance (typically a 95% confidence level).
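The significance check mentioned above can be sketched with a standard two-proportion z-test. This is a minimal illustration using only the Python standard library; the variation names and example numbers are hypothetical, and a production test would typically use an experimentation platform or a statistics library instead.

```python
import math

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for conversion rates.

    conv_a / conv_b: conversions for each variation.
    n_a / n_b: visitors shown each variation.
    Returns (z, p_value) for a two-sided test.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variations convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via the error function).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: text testimonial converts 5.0%, video 6.5%.
z, p = ab_significance(120, 2400, 156, 2400)
significant = p < 0.05  # winner at the 95% confidence level
```

A p-value below 0.05 corresponds to the 95% confidence threshold mentioned above; with these example numbers the difference would be declared significant.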
Frequently Asked Questions
What testimonial elements should I A/B test first?
Start with the highest-impact tests: video versus text testimonials, and placement above versus below the fold. These two variables typically produce the largest measurable differences in conversion rate. Once you have winners, move on to testing the number of testimonials shown, whether to include customer photos and logos, and carousel versus grid layouts.
How long should I run an A/B test on testimonials?
Run your test until you reach statistical significance, typically at 95% confidence. For most sites this means at least two full weeks to account for day-of-week variation, with a minimum of 1,000 visitors per variation. Ending a test too early can produce misleading results because short-term fluctuations may not reflect true performance differences.
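To make the "how long" question concrete, the required sample size per variation can be estimated before the test starts with the standard two-proportion sample-size formula. This is a rough sketch with hypothetical inputs (5% baseline conversion, hoping to detect a 20% relative lift); it assumes 95% confidence and 80% statistical power, and shows why 1,000 visitors per variation is a floor rather than a guarantee.

```python
import math
from statistics import NormalDist

def min_sample_size(baseline_rate, min_detectable_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variation to detect a given
    relative lift in conversion rate at the chosen significance and power.
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_detectable_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * pooled * (1 - pooled))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Hypothetical: 5% baseline, detecting a 20% relative lift (5% -> 6%)
# requires roughly 8,000+ visitors per variation.
n = min_sample_size(0.05, 0.20)
```

Dividing the required sample size by daily traffic per variation gives a rough test duration, which should then be rounded up to whole weeks to cover day-of-week variation.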
