“Yandex.Direct, experiments” report

Experiments help you test different versions of ads and landing pages, as well as different forecasts for advertising in Yandex.Direct. You can test campaign settings, campaign types, and media plans.

The report helps you determine how well an experiment is working: you can compare the test groups with each other and with the control group (if your experiment uses one) across a variety of metrics, such as bounces, time on site, and conversion rate. In other words, the report lets you analyze the results of an experiment based on a set of metrics.

If you need a clear answer based on CPA, use the A/B testing reliability calculator, which also lets you change the significance level (p-value). In the “Yandex.Direct, experiments” report, the p-value is fixed at 68%.

To view the report: Reports → Standard reports → Sources → Yandex.Direct, experiments.

  1. Conditions for getting the report
  2. How to conduct an experiment
  3. Evaluating experiment results
  4. Data interpretation
  5. Report structure and settings
  6. Questions and answers

Conditions for getting the report

To conduct testing, create an experiment in Yandex.Audience, specifying how many segments and what proportions to divide the audience into. Then configure ad campaigns for the experiment in Yandex.Direct.

If you deleted an experiment in Yandex.Audience, statistics on it and its segments are available in Yandex.Metrica for the period prior to deletion.

How to conduct an experiment

Before launching the experiment, give clear names to experimental campaigns in Yandex.Direct. This makes it easier for you to analyze statistics in Yandex.Metrica.

In the name, specify the experiment and the group that the campaign targets. For example, if one of the groups in your experiment is a control group (the one you'll compare the other groups' results against), indicate this in the name of the campaign it's designed for.

After you launch an experimental campaign in Yandex.Direct, you can check that the experiment actually started. To do this, in the Yandex.Metrica “Yandex.Direct, experiments” report, create a segment with all the campaigns involved in the experiment. If the report shows sessions and other statistics for these campaigns, then the experiment was launched.

Tip. Save the created segment: you'll need it later when viewing statistics.
Why do I need to create a segment

After you create an experiment in Yandex.Audience, the system randomly divides Yandex users into experimental groups. The experiment may include users who visited the advertiser's site and users who didn't. Users who visited the site may not have arrived via a Yandex.Direct ad. To select only sessions that came from ads, create a segment with the relevant Yandex.Direct campaigns.

Evaluating experiment results

Note. We recommend that you start evaluating data using the “Yandex.Direct, experiments” report two weeks after launching an experiment.

In the “Yandex.Direct, experiments” report:

  1. Select the experiment you want to analyze.
  2. Select the segment with the Yandex.Direct campaigns that you created for the experiment. If you don't have a segment yet, create one following our recommendations.
  3. Compare the experiment segments. Click Comparison mode and select a control group to compare the results of other groups to.
Note. To compare the effectiveness of an experimental and control campaign, you can add previously created goals to the report.
In comparison mode, some metrics are highlighted in color:
  • Green or red means there's enough data to evaluate the results of the experiment. Green means that this metric is significantly better than in the control group. Red means it's significantly worse.
  • Gray means there's enough data, but the metric values in the test group don't differ significantly from the control group.

No color is shown if the comparison mode isn't available for the selected metric or there isn't enough data accumulated. Wait a while longer or start an experiment with other settings that allow you to collect more traffic.

Metrics that are highlighted in color
  • Pageviews
  • Sessions
  • Converted sessions
  • Users
  • Converted users
  • Goal completions
  • Conversions per user
  • Conversions
  • User conversion rate
  • Bounces
  • Item views
  • Users who have viewed the item
  • Items added to basket
  • Items removed from basket
  • Number of users who added an item to their basket
  • Items purchased
  • Number of users who bought items
  • Percentage of missed calls
  • Percentage of first-time callers
  • Number of calls from unique phone numbers

You can add other metrics to the report. Only the ones from the list above get highlighted.

Data interpretation

You can use the report to compare ad campaigns across different metrics. This helps you understand on which metrics one campaign performs better than another.

To verify the results of an experiment, you can take the margin of error into account. This is the possible deviation of the observed metric value from its true value, meaning the value it would tend towards if the experimental campaign were run indefinitely. To account for the margin of error, click Comparison mode and enable display of the margin of error.

The margin of error in the report is relative: it expresses the possible deviation as a percentage of the metric value. The lower the margin of error, the higher the probability that the metric value is accurate; the higher it is, the greater the possible deviation. All else being equal, you can assume that an advertising campaign with a lower margin of error has more reliable indicators.
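Yandex.Metrica doesn't publish the exact formula it uses, but as an illustration of how a relative margin of error can be read, here is a minimal sketch using a standard normal-approximation interval for a conversion rate. The function name and the choice of z = 1.0 (roughly 68% confidence under a normal distribution, matching the significance level quoted for this report) are illustrative assumptions:

```python
import math

def relative_margin_of_error(conversions, sessions, z=1.0):
    """Relative margin of error for a conversion rate, using a normal
    approximation. z=1.0 corresponds to roughly 68% confidence.
    Illustrative sketch only, not Yandex.Metrica's actual formula."""
    rate = conversions / sessions
    se = math.sqrt(rate * (1 - rate) / sessions)  # standard error of the rate
    return z * se / rate  # deviation as a fraction of the observed rate

# Example: 120 conversions out of 3000 sessions -> about 8.9%
err = relative_margin_of_error(120, 3000)
print(f"{err:.1%}")
```

A margin of error of 8.9% here would mean the true conversion rate plausibly lies within about ±8.9% (relative) of the observed 4% rate.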

Report structure and settings

The data in the report is grouped by the name of the experiment created in Yandex.Audience. You can only add the groups and segments that were created for conducting the experiment to the report.

Some metrics are calculated and displayed only when the corresponding condition is met:

Set up goals:
  • Converted sessions
  • Converted users
  • Conversions
  • Conversions per user
  • Conversion rate
  • User conversion rate

Pass E-Commerce data to Yandex.Metrica:
  • Item views
  • Users who have viewed the item
  • Items added to basket
  • Items removed from basket
  • Number of users who added an item to their basket
  • Items purchased
  • Number of users who bought items

Pass information about calls to Yandex.Metrica:
  • Percentage of missed calls
  • Percentage of first-time callers
  • Number of calls from unique phone numbers

You can view general statistics on experiments in other Yandex.Direct reports in Yandex.Metrica, such as “Yandex.Direct, summary”. The name of the experiment is available there as a grouping condition, so you can, for example, view statistics on an ad campaign as part of an experiment.

Questions and answers

How to compare bounce rate and conversion

To compare the bounce rate and conversion rate in the experimental and control campaigns, switch the relevant column to relative values in comparison mode. Instead of showing the metric's share of the total/average, as in other Yandex.Metrica reports, the column shows the metric value in the experimental campaign as a percentage of its value in the control one.

For example, if you see 100% in the Bounce rate column for an experimental campaign, this means that the bounce rate is the same in the control and experimental groups.
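Based on the 100% example above, the relative value can be read as the experimental metric expressed as a percentage of the control metric. A minimal illustration of that arithmetic (the ratio interpretation is inferred from the example, not from a published formula):

```python
def relative_value(experimental, control):
    """Experimental metric as a percentage of the control metric:
    100% means the two groups are identical on this metric."""
    return experimental / control * 100

# Bounce rates: 12% in both groups -> 100.0 (identical)
print(relative_value(12.0, 12.0))
# Bounce rate 9% in the experiment vs 12% in the control -> 75.0
print(relative_value(9.0, 12.0))
```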

How metrics are compared

Yandex.Metrica uses different methods to verify the statistical significance of each metric, depending on that metric's behavior. The comparison takes into account the spread of values in the experimental and control groups, not just the absolute values.
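Yandex.Metrica doesn't document the exact tests it applies, but for a conversion-type metric this kind of comparison is commonly done with a two-proportion z-test. The sketch below is a generic textbook version, not Metrica's implementation; all numbers are made up for illustration:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is the conversion rate in group A
    significantly different from group B? Returns the z statistic."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Experimental group: 150/2000 converted; control group: 110/2000
z = two_proportion_z(150, 2000, 110, 2000)
```

Under a normal distribution, |z| > 1.0 roughly corresponds to the 68% level quoted for this report, while |z| > 1.96 would correspond to the conventional 95% level.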


If there's no data on your campaigns in the report, check that:

  • The tag ID is specified in the Yandex.Direct campaign settings, and Tag links for Yandex.Metrica is enabled.
  • Tag settings in the Yandex.Metrica interface don't have excessively narrow filters set up on the Filters tab.
  • The tag is installed on all the landing pages. To see data for pages that have the Yandex.Metrica tag installed, go to Reports → Standard reports → Content → Popular. Data is shown if site users visited these pages. To check whether the tag is installed correctly on any of the site pages, see Checking the tag.
  • The site was functioning correctly during the selected report dates.

Yandex: Unknown means that the session registered by Yandex.Metrica couldn't be traced to a specific click on a Yandex.Direct ad. Apart from the advertising system itself, none of the other data could be determined either, including the campaign, keyword, and search query.

The reason for Yandex: Unknown appearing in a Yandex.Direct source might be a gap in time between generating the yclid tag and registering the session in Yandex.Metrica. This can happen in the following situations:
  • The user clicked an ad but left the source page open on a browser tab, and later refreshed the page.
  • The user clicked a link from an ad, but then forwarded it to someone else. A repeat visit using this link won't be associated with the click on the Yandex.Direct ad, either.

If UTM label data is missing, first make sure the label is formed correctly. A UTM label has five parameters, presented in any order and separated by ampersands (&):

http://example.com/?utm_source=yandex&utm_medium=cpc&utm_campaign=campaign123&utm_content=ad456&utm_term=keyword
If the page URL already contains parameters, the label should be added after an ampersand (&):
http://example.com/?uid=1234&utm_source=yandex&utm_medium=cpc&utm_campaign=campaign123&utm_content=ad456&utm_term=keyword
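To avoid malformed labels (a second `?`, a missing `&`), the appending logic described above can be automated. A sketch using only Python's standard library; the `add_utm` helper name is our own:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def add_utm(url, **utm):
    """Append UTM parameters to a URL, preserving any existing
    query parameters and using '&' as the separator."""
    parts = urlsplit(url)
    query = parse_qsl(parts.query, keep_blank_values=True)  # existing params
    query += [(f"utm_{k}", v) for k, v in utm.items()]      # add utm_* pairs
    return urlunsplit(parts._replace(query=urlencode(query)))

print(add_utm("http://example.com/?uid=1234",
              source="yandex", medium="cpc", campaign="campaign123"))
# http://example.com/?uid=1234&utm_source=yandex&utm_medium=cpc&utm_campaign=campaign123
```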

In addition, UTM labels might not be counted because:

  • The Yandex.Metrica tag isn't installed on landing pages or is installed incorrectly. Check how the tag is installed.
  • The ad specifies a URL that redirects to a page that doesn't have the Yandex.Metrica tag installed.
  • The ad specifies a URL that redirects to another page, and the UTM labels are lost.
  • The ad specifies an invalid URL that doesn't open a page.

The standard report on UTM labels has a hierarchical structure: utm_source, utm_medium, utm_campaign, utm_content, and utm_term.

If you use every UTM label except utm_content, data for utm_term won't be available in the standard report in the Labels group. To make those labels show up, remove utm_content from the report.
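This hierarchy behaves like a drill-down path where each level only exists under its parent, so a missing utm_content cuts off access to utm_term. A small sketch of that behavior (our own illustration, not Metrica's code):

```python
# Report hierarchy: each level nests under the previous one.
ORDER = ["utm_source", "utm_medium", "utm_campaign", "utm_content", "utm_term"]

def drill_path(params):
    """Return the values reachable by drilling down the report hierarchy.
    The walk stops at the first missing level: lower levels are unreachable."""
    path = []
    for name in ORDER:
        if name not in params:
            break
        path.append(params[name])
    return path

# utm_content is missing, so utm_term is cut off -> ["yandex", "cpc", "c123"]
print(drill_path({"utm_source": "yandex", "utm_medium": "cpc",
                  "utm_campaign": "c123", "utm_term": "kw"}))
```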

Label parameters have standardized names: utm_source, utm_medium, utm_campaign, utm_content, and utm_term. You can't change them or create custom names (like utm_keyword or utm_word): such parameters aren't treated as UTM labels, and their data isn't reflected in the “UTM labels” report.

For example:
http://example.com?utm_source=yandex&utm_medium=cpc&utm_campaign={campaign_id}&utm_keyword={keyword}

where utm_keyword is a custom parameter.

The “UTM labels” report shows information for the utm_source, utm_medium, and utm_campaign parameters. Information for the utm_keyword parameter is only available in the report Standard reports → Content → By URL parameters.


This happens because reports are based on the Last non-direct click attribution model by default. This model calculates the conversion rate more precisely by taking the user's session history into account when selecting the traffic source.

To remove statistics on a stopped or archived campaign from the report, select the Last click model. In this case, Yandex.Metrica detects the current traffic source for each session.
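The difference between the two models can be sketched as a selection over a session's source history, which also shows why a stopped campaign can keep appearing under the default model. A simplified illustration (not Metrica's actual implementation; the history values are made up):

```python
def last_click(sources):
    """Last click: attribute the session to its current traffic source."""
    return sources[-1]

def last_non_direct_click(sources, direct="direct"):
    """Last non-direct click: walk back through the session history and
    take the most recent source that isn't a direct visit."""
    for src in reversed(sources):
        if src != direct:
            return src
    return direct  # only direct visits in the history

# The user once clicked an ad, then kept returning directly:
history = ["yandex.direct: campaign123", "direct", "direct"]
print(last_click(history))             # attributed to the direct visit
print(last_non_direct_click(history))  # still attributed to the old campaign
```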

See the reasons why data may differ. If the reasons have been resolved but you still have questions, fill out the form below.