
Thu Sep 13 2018

How to do A/B Testing with Bitmovin Analytics

Overview

Bitmovin Analytics offers the ability to do A/B testing out of the box on any data field you want. Since our API allows you to filter or group on any field we track, you can often do A/B testing even on data you did not explicitly set up for A/B testing.

For example, let's say you have deployed a new version of the player. Analytics will automatically pick up the new player version and save it, so to compare the old player with the new one you just need to query a metric and group by the PLAYER_VERSION field.

Comparing startup time between player versions using the Bitmovin JavaScript/Node.js API client is as easy as:

```javascript
bitmovin.analytics.queries.builder.median('STARTUPTIME')
  .between(moment().startOf('day').toDate(), moment().toDate())
  .interval('MONTH')
  .groupBy('PLAYER_VERSION')
  .filter('STARTUPTIME', 'GT', 0)
  .query()
```

But some changes can't be picked up automatically, so we have added the experimentName and customData fields, which you can set yourself through the analytics configuration to mark such changes.

Let's say we want to compare our custom ABR logic in three variants; we could add the following:

```javascript
var analyticsConfig = {
  key: "",
  experimentName: 'ABR-Tests',
  customData1: 'abr-variant-1'
}
```
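How a given user ends up with a particular variant is up to you. As a minimal sketch (the hash helper, the user ID, and the variant names here are hypothetical, not part of the Bitmovin API), a stable user ID could be mapped deterministically to a variant:

```javascript
// Hypothetical helper: deterministically map a stable user ID to one of the
// test variants, so the same user keeps the same ABR logic across sessions.
function assignAbrVariant(userId, variants) {
  // djb2-style string hash, kept in unsigned 32-bit range with >>> 0
  let hash = 5381;
  for (let i = 0; i < userId.length; i++) {
    hash = ((hash * 33) ^ userId.charCodeAt(i)) >>> 0;
  }
  return variants[hash % variants.length];
}

const abrVariants = ['abr-variant-1', 'abr-variant-2', 'abr-variant-3'];

var analyticsConfig = {
  key: '', // your analytics license key
  experimentName: 'ABR-Tests',
  customData1: assignAbrVariant('user-1234', abrVariants)
};
```

Because the assignment is a pure function of the user ID, no extra state needs to be stored to keep the split consistent between sessions.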

Obviously, customData1 would vary depending on which variant of the new custom ABR logic a given customer gets. Some logic on the server is required to decide which users get which variant, but once that is sorted out you can simply compare video startup times with the following query:

```javascript
bitmovin.analytics.queries.builder.median('VIDEO_STARTUPTIME')
  .between(moment().startOf('day').toDate(), moment().toDate())
  .interval('MONTH')
  .groupBy('CUSTOM_DATA_1')
  .filter('EXPERIMENT_NAME', 'EQ', 'ABR-Tests')
  .filter('VIDEO_STARTUPTIME', 'GT', 0)
  .query()
```

The same goes for CDN_PROVIDER, which you just have to set in the analytics configuration (the field is called cdnProvider - see our simple example) and can then use in conjunction with all the other data to compare CDN performance.
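As a sketch, such a configuration could look like this (the CDN name is a placeholder; set whichever CDN actually served the session):

```javascript
// Sketch: report the serving CDN with every impression.
// 'akamai' is a placeholder value, not a recommendation.
var analyticsConfig = {
  key: '', // your analytics license key
  cdnProvider: 'akamai'
};
```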

For example, if we are interested in the best-performing CDN in the US in terms of video startup time, we can do:

```javascript
bitmovin.analytics.queries.builder.median('VIDEO_STARTUPTIME')
  .between(moment().startOf('day').toDate(), moment().toDate())
  .interval('MONTH')
  .groupBy('CDN_PROVIDER')
  .filter('COUNTRY', 'EQ', 'US')
  .filter('VIDEO_STARTUPTIME', 'GT', 0)
  .query()
```

We can even group by multiple fields, for example to compare how each CDN fares for DASH versus HLS:

```javascript
bitmovin.analytics.queries.builder.median('VIDEO_STARTUPTIME')
  .between(moment().startOf('day').toDate(), moment().toDate())
  .interval('MONTH')
  .groupBy('CDN_PROVIDER')
  .groupBy('STREAM_FORMAT')
  .filter('COUNTRY', 'EQ', 'US')
  .filter('VIDEO_STARTUPTIME', 'GT', 0)
  .query()
```
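The resolved result can then be post-processed client-side. As an assumption for illustration, suppose each row comes back as [timestamp, cdnProvider, streamFormat, medianStartuptime] (the exact row shape depends on the interval and groupBy fields you used); picking the fastest CDN per stream format is then a small reduction:

```javascript
// Hypothetical post-processing: for each stream format, find the CDN with
// the lowest median startup time. Assumes rows shaped as
// [timestamp, cdnProvider, streamFormat, medianStartuptime].
function bestCdnPerFormat(rows) {
  const best = {};
  for (const [, cdn, format, startuptime] of rows) {
    if (!(format in best) || startuptime < best[format].startuptime) {
      best[format] = { cdn, startuptime };
    }
  }
  return best;
}

// Example rows as they might come back from the query above (made-up values)
const rows = [
  [1536796800000, 'akamai', 'dash', 420],
  [1536796800000, 'fastly', 'dash', 380],
  [1536796800000, 'akamai', 'hls', 510]
];

const best = bestCdnPerFormat(rows);
// best.dash → { cdn: 'fastly', startuptime: 380 }
// best.hls  → { cdn: 'akamai', startuptime: 510 }
```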
