Josiah Sternfeld - 1 year ago
Would love to see this functionality implemented. While I understand the point about winners not being determined automatically, one could theoretically set up the feature in a way that you get regular emailed reports about the A/B testing performance (like Marketo), with reminders about choosing a winner. Then you could change the random sample split test step to be 100% towards the winner.
It's still manual of course, but better than trying to hack around it by checking whether someone's phone number ends in a 4 or something.
A/B testing in Engagement Studio is currently under product review. To ensure your use case is considered during further evaluation of our path forward with this functionality, please share your use cases in comments of this idea or via the Trailblazer Community feedback post here:
Kelsey McKinny - 1 year ago
Emilio said it best, but simply put, here are our goals for A/B testing in engagement studio:
1. Increase open rates in drip email campaigns based on subject lines and/or senders
2. Increase CTR in drip email campaigns based on content
3. Increase conversion rates based on drip email content
4. Decrease the amount of time spent A/B testing a LIST EMAIL (one and done) to later include it in a drip campaign (very manual process - so manual that I haven't even done it because, well, it should be something we can do already)
5. Achieve world domination
Basically, THE SAME EXACT GOALS AS WHEN YOU A/B TEST ANY EMAIL.
Rachel McLean - 1 year ago
Essentially working out the fastest way to get prospects to convert via clicking a specific link. Doing this by A/B testing an entire "welcome series" workflow - including autoresponder content and layout, and the content, layout and timing of the subsequent behavioural-based emails within the workflow.
Great feedback, Emilio! Thank you for providing such detailed use cases and expectations. I agree the hiccup with the "random split" approach is that a winner is never determined automatically, so a "losing" version will continue to be sent until someone manually decides and changes it. Email A/B Testing in Engagement Studio is currently under Product review.
Emilio Reyes Le Blanc - 1 year ago
Hi Kylie! I remember we chatted briefly about this idea at the Pardot CAB.
The fundamental goal here is to build journeys that generate greater email engagement from prospects (which will presumably yield more conversions). A/B testing in Engagement Studio would address this by helping marketers lead prospects along journeys with content that performs best -- eliminating templates from journeys that haven't been performing as well as competitive templates sent at the same journey stage.
The standard email testing functionality for list email sends is familiar and useful. You can vary template content as well as subject lines. You can select your test group size. You can set the criteria by which one of the test conditions will be chosen for the rest of the email send. All of these features are great!
Imagine how this might look in Engagement Studio. Add a 'Send Email' action. Instead of selecting one template, you might imagine the option to select multiple templates. You'd select the criterion for choosing one template over the other(s): more clicks, more opens, etc. Then you'd set the constraint for ending the test: "After X prospects have passed through the node, choose the template with the most Ys for all future prospects traveling through this node."
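The node behavior described above can be sketched in a few lines of Python. This is purely illustrative -- the class name, counters, and `engaged` flag are my own assumptions about how such a node might be modeled, not anything Pardot actually exposes:

```python
import random


class SplitTestNode:
    """Hypothetical model of an A/B 'Send Email' node in Engagement Studio.

    Templates are picked at random until `test_size` prospects have passed
    through the node; then the template with the most wins (clicks, opens,
    etc.) is locked in for every future prospect.
    """

    def __init__(self, templates, test_size):
        self.templates = list(templates)
        self.test_size = test_size
        self.seen = 0                             # prospects through the node
        self.wins = {t: 0 for t in templates}     # engagement count per template
        self.winner = None                        # set once the test ends

    def pick_template(self):
        # Once a winner is chosen, all future prospects get it.
        if self.winner is not None:
            return self.winner
        return random.choice(self.templates)

    def record_send(self, template, engaged):
        """Record one send; `engaged` is True if the prospect clicked/opened."""
        if self.winner is not None:
            return
        self.seen += 1
        if engaged:
            self.wins[template] += 1
        if self.seen >= self.test_size:
            self.winner = max(self.wins, key=self.wins.get)
```

So "after X prospects, choose the template with the most Ys" is just a counter plus a one-time `max()` over per-template engagement tallies; everything after that point bypasses the random split.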
I would expect it's neither feasible nor desirable to have a WYSIWYG editor in the Engagement Studio environment. Nevertheless, you could get around that by requiring that the selected templates be published distinctly.
There are workarounds where you can choose an arbitrary rule for routing prospects to one piece of email content over another. (E.g., if the prospect's last name contains an 'a', route them to the first email template; otherwise, send the prospect to the other template.) However, the fundamental issue is that Engagement Studio can't just route future prospects to the better-performing piece of content. Instead, the marketer would need to pause the journey, remove the poorer-performing template, delete the rule node, and then restart the journey. (It's kind of an infelicitous process.)
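To make the limitation concrete, the rule-node workaround boils down to a static split like this (a sketch with made-up template names, not a real Pardot API):

```python
def route_by_last_name(last_name: str) -> str:
    """Hypothetical rule-node workaround: split prospects on an arbitrary
    attribute instead of a true random sample.

    Note the split is hard-coded: no engagement data feeds back into the
    decision, so the 'losing' branch keeps receiving prospects until a
    human pauses the journey and rebuilds the node.
    """
    if "a" in last_name.lower():
        return "template_one"
    return "template_two"
```

The point is that this function never changes shape on its own; adapting to results requires editing the journey by hand, which is exactly what a real A/B node would automate.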
If you've upvoted the idea for A/B testing in Engagement Studio, we'd appreciate your feedback on this poll in the Pardot B2B Marketing Automation Trailblazer Community Group!