Kurt Stoll - 1 year ago
A/B Testing should mean that the Marketer can configure a % of prospects to receive each email (say 5% for Email A and 5% for Email B), and whichever email receives the most opens or clicks (again, configurable by the Marketer) should automatically become the "winner" after a set period of time (for example, 2 days). The remaining % of prospects should then automatically receive the "winning" email.
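To make that request concrete, here is a minimal Python sketch of the behaviour being asked for. It is not any Pardot API; names like EmailVariant, split_prospects, and pick_winner are made up for illustration, and the metric numbers are placeholders for whatever Pardot would report after the wait period.

```python
import random
from dataclasses import dataclass

# Hypothetical stand-in for an email variant and its post-send metrics.
@dataclass
class EmailVariant:
    name: str
    opens: int = 0
    clicks: int = 0

def split_prospects(prospects, test_pct=0.05, seed=None):
    """Randomly carve out two equal test slices plus the remainder."""
    rng = random.Random(seed)
    shuffled = prospects[:]
    rng.shuffle(shuffled)
    n = int(len(shuffled) * test_pct)
    return shuffled[:n], shuffled[n:2 * n], shuffled[2 * n:]

def pick_winner(variant_a, variant_b, metric="opens"):
    """After the configured wait period, compare the Marketer-chosen metric."""
    return variant_a if getattr(variant_a, metric) >= getattr(variant_b, metric) else variant_b

# Usage sketch: 5% per variant, winner by opens after 2 days,
# the remaining 90% would then be sent the winning email.
prospects = [f"prospect_{i}" for i in range(1000)]
group_a, group_b, remainder = split_prospects(prospects, test_pct=0.05, seed=42)
a = EmailVariant("Email A", opens=37, clicks=12)  # metrics gathered after 2 days
b = EmailVariant("Email B", opens=51, clicks=9)
winner = pick_winner(a, b, metric="opens")
print(f"{winner.name} goes to the remaining {len(remainder)} prospects")
```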
Also, for the record, this should have been put in place years ago! All of your major competitors have this feature (e.g. Eloqua, Responsys, Marketo, etc.). Even lower-tier competitors like Constant Contact and MailChimp have it. It should be an embarrassment to Salesforce/Pardot not to have this.
MARKETING - 1 year ago
For us, the A/B feature in Engagement Studio would be useful not so much for testing one piece of content against another (in an email), but for testing one conversion funnel against another. Therefore, there should be a simple "action" step that randomly splits the selected list 50/50 (without having to create a workaround) and sends each half through one of two different funnels. At the end, there should be the option to choose the better-performing funnel (i.e. more opens, more clicks, etc.).
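A rough sketch of that "random split" action, again purely illustrative and outside any Pardot functionality (the function name and prospect identifiers are invented):

```python
import random

def random_split(prospects, seed=None):
    """Randomly assign the selected list 50/50 to two funnels."""
    rng = random.Random(seed)
    shuffled = prospects[:]
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Each half then flows through its own branch of the engagement program;
# afterwards the better-performing funnel (more opens, more clicks, etc.)
# could be chosen to receive all future prospects.
funnel_a, funnel_b = random_split([f"prospect_{i}" for i in range(200)], seed=7)
print(len(funnel_a), len(funnel_b))  # 100 100
```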
Mike Vickroy - 1 year ago
Please make this, or A/B testing of email templates, a reality. It seems like an obvious miss. I understand there is an argument that there is no control to determine a winner, but we are all professionals and should have the ability to analyze a simple A/B test.
To minimize Pardot's liability for poor decision-making on our behalf, perhaps make this an on/off option in account settings, similar to:
Josiah Sternfeld - 1 year ago
Would love to see this functionality implemented. While I understand the point about winners not being determined automatically, one could theoretically set up the feature so that you get regular emailed reports about A/B testing performance (like Marketo), with reminders to choose a winner. Then you could change the random-split test step to send 100% towards the winner.
It's still manual, of course, but better than trying to hack around it by checking whether someone's phone number ends in a 4 or something.
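For context, here is a small Python illustration (not anything Pardot exposes; the field names and sample data are invented) of why the phone-number trick is such an awkward stand-in for a real random split:

```python
# The "phone number ends in a 4" workaround: a deterministic stand-in for a
# random split. It yields roughly a 10% slice (one digit out of ten) rather
# than a configurable split, and prospects with no phone number fall out of
# both branches, so the groups are neither random nor balanced.
def ends_in_digit(phone, digit="4"):
    return bool(phone) and phone.strip().endswith(digit)

prospects = [
    {"email": "a@example.com", "phone": "555-0104"},
    {"email": "b@example.com", "phone": "555-0199"},
    {"email": "c@example.com", "phone": ""},   # lands in neither group
]
group_a = [p for p in prospects if ends_in_digit(p["phone"])]
group_b = [p for p in prospects if p["phone"] and not ends_in_digit(p["phone"])]
print(len(group_a), len(group_b))  # 1 1
```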
Kylie Yancey - 1 year ago
A/B testing in Engagement Studio is currently under product review. To ensure your use case is considered during further evaluation of our path forward with this functionality, please share your use cases in the comments on this idea or via the Trailblazer Community feedback post here:
Marketing Team - 1 year ago
Emilio said it best, but simply put, here are our goals for A/B testing in engagement studio:
1. Increase open rates in drip email campaigns based on subject lines and/or senders
2. Increase CTR in drip email campaigns based on content
3. Increase conversion rates based on drip email content
4. Decrease the amount of time spent A/B testing a LIST EMAIL (one and done) before including it in a drip campaign (a very manual process - so manual that I haven't even done it because, well, it should be something we can already do)
5. Achieve world domination
Basically, THE SAME EXACT GOALS AS WHEN YOU A/B TEST ANY EMAIL.
Rachel McLean - 1 year ago
Essentially, we're working out the fastest way to get prospects to convert by clicking a specific link. We're doing this by A/B testing an entire "welcome series" workflow - including the autoresponder's content and layout, and the content, layout and timing of the subsequent behaviour-based emails within the workflow.
Kylie Yancey - 1 year ago
Great feedback, Emilio! Thank you for providing such detailed use cases and expectations. I agree the hiccup with the "random split" approach is that a winner is never determined automatically, and a "losing" version will continue to be sent until one is manually decided and changed. Email A/B Testing in Engagement Studio is currently under Product review.