A/B Testing of Recommendation Blocks
A/B testing lets you automatically determine the best-performing variant of a recommendation block based on CTR and conversions.
You can test:
- data sources — for recommendations in mobile applications and those received via the JavaScript API,
- names, data sources, appearance, and block placement — for out-of-the-box recommendations.
Identify the most effective options and create new blocks based on test results.
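The comparison rests on two metrics: CTR and conversion rate. The article does not specify the exact denominators, so the definitions in this sketch (conversion rate measured against clicks) are assumptions:

```python
# Illustrative sketch of the two metrics the test compares.
# Whether conversion rate is measured against clicks or impressions
# is an assumption here; the platform's exact definition may differ.

def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate of a recommendation block."""
    return clicks / impressions if impressions else 0.0

def conversion_rate(conversions: int, clicks: int) -> float:
    """Share of clicks that ended in a conversion (assumed denominator)."""
    return conversions / clicks if clicks else 0.0

print(round(ctr(120, 4000), 3))            # 0.03
print(round(conversion_rate(18, 120), 3))  # 0.15
```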
A/B Testing Settings
1. Go to the Site → Recommendations section and click on the desired recommendation.
2. Go to the A/B testing tab and click the Start A/B testing button.
Testing Parameters
1. Select the number of variants to test — two by default; to add more, click the corresponding button (the maximum is 8).
2. Specify the percentage weight of each variant, or distribute the weights evenly by clicking the corresponding button.
3. Optionally, add a description of each variant so you can later recognize the key differences between the tested variants.
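The percentage weights from step 2 determine how traffic is split between variants. The platform performs this split on its side; the sketch below shows how such a weighted assignment can work (the function name and the sticky seeding on a contact id are assumptions):

```python
import random

# Illustrative sketch of mapping percentage weights to traffic allocation.
# Seeding on the contact id (an assumption) keeps the assignment sticky:
# the same contact always lands in the same variant.

def assign_variant(contact_id: str, weights: dict[str, int]) -> str:
    """Pick a variant with probability proportional to its weight."""
    assert sum(weights.values()) == 100, "weights must total 100%"
    rng = random.Random(contact_id)
    roll = rng.uniform(0, 100)
    cumulative = 0
    for variant, weight in weights.items():
        cumulative += weight
        if roll < cumulative:
            return variant
    return variant  # fallback for the rare roll == 100 edge case

weights = {"A": 50, "B": 25, "C": 25}  # up to 8 variants are allowed
print(assign_variant("contact-42", weights))
```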
Below, select the parameters to test. It is best to test one parameter at a time so you can tell which factor influences the results.
- Data source. Specify which algorithm generates the recommendations. To change it, click the option and select the desired one from the list.
- Placement. Specify where the recommendations appear on the page.
- Appearance. Specify the display variants to include in the test.
- Titles. Write down different options to see which one is more attractive.
After making your settings, click the Save button at the top of the page.
Testing Status and Its Assessment
In the general list of created recommendations, each recommendation with an A/B test is marked with a button.
The button's color depends on the current status (completed/not completed) and the key results (CTR and achieved conversions). Hovering over it opens a short report; click the View test button to see the results in detail.
Color coding:
- Green – testing is complete; there is a winner by both CTR and conversion.
- Gray-yellow – insufficient data to determine a winner by CTR; conversion results are the same.
- Yellow – CTR and conversion statistics are the same; testing continues.
- Green-yellow – there is a winner by CTR; conversion statistics are the same. Testing continues.
- Yellow-green – there is a winner by conversion; CTR results are the same.
- Gray – insufficient data to determine a winner by any metric.
Test Results Report
The first screen of the test report shows the winner together with its results.
To determine a winner, one of the variants must be in the lead with a probability of at least 90%, and the difference in probability between the variants must be more than 5%. Testing will continue until this minimum threshold is reached.
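The article does not document the statistical model behind these thresholds. A common way to estimate the "probability of being in the lead" for conversion-style metrics is a Beta-Binomial simulation; the sketch below (function names and the (successes, trials) input format are assumptions) applies the 90% / 5-point rule on top of it:

```python
import random

# Illustrative sketch, not the platform's documented method: estimate each
# variant's probability of being best via Beta(1,1)-prior posterior sampling,
# then apply the thresholds stated in the article (>= 90% lead probability,
# gap of more than 5 percentage points between the top two variants).

def prob_to_be_best(variants: dict[str, tuple[int, int]],
                    draws: int = 20_000, seed: int = 1) -> dict[str, float]:
    """variants maps name -> (successes, trials); returns P(best) per variant."""
    rng = random.Random(seed)
    wins = {name: 0 for name in variants}
    for _ in range(draws):
        samples = {
            name: rng.betavariate(1 + s, 1 + t - s)  # posterior draw
            for name, (s, t) in variants.items()
        }
        wins[max(samples, key=samples.get)] += 1
    return {name: count / draws for name, count in wins.items()}

def pick_winner(variants, min_prob=0.90, min_gap=0.05):
    """Return the winning variant name, or None if the test should continue.
    Expects at least two variants."""
    probs = prob_to_be_best(variants)
    ranked = sorted(probs.values(), reverse=True)
    leader = max(probs, key=probs.get)
    if probs[leader] >= min_prob and ranked[0] - ranked[1] > min_gap:
        return leader
    return None  # thresholds not reached; keep the test running

data = {"A": (180, 4000), "B": (120, 4000)}  # (conversions, impressions)
print(pick_winner(data))  # "A" wins here: 4.5% vs 3% at this sample size
```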
If there is enough data, you can stop the test by clicking the Choose Variant and finish test button (the better-performing variant is shown in bold).
If different variants win by CTR and by conversion, the winner by conversion is considered the best, since sales matter more than the click-through rate.
Below the winner description, a report with the testing results for each variant is displayed.
The green frame shows the winning variant's CTR and conversion rates.
In the last block of the report (Activity dynamics), graphs visualize the results obtained compared to other test variants.
To see the metric values for a certain period, hover over the graph.
The history of completed tests is displayed at the bottom of the page.