Enable AB Tasty to send campaign data to your Data Warehouse integrations, export information to your productivity tools, and more.
This guide will walk you through the steps to set up the AB Tasty connector with Notion, allowing you to track and manage your campaigns directly in Notion.
A new Notion page will be automatically created, displaying your campaigns and their details, such as:
Campaign Name
Campaign Description
Campaign ID
Campaign Link
Campaign Type
Created Time
Start Date
End Date
Number of Live Days
Number of Variations & Variation Names
Reporting link
Learning Library Inputs
This page will update in real time to reflect any changes in campaign status, days passed, the number of visitors …



Follow these steps to set up and manage the integration efficiently:
1. Go to AB Tasty Integration HUB
Log in to your AB Tasty account.
Navigate to the Integration HUB, where you can find and configure third-party connectors.
2. Create a Google Sheets Connector
Under Productivity Tool, open Google Sheets
Click Create Connector => Authorize and Create
3. Connect with Google
Give a name to your connector (to retrieve it inside the integration)
Give a name to your Google Sheet
Click on “Continue with Google”
You’ll be prompted to log in with your Google account.
Grant permissions to allow AB Tasty to manage and update Google Sheets.
4. Your Spreadsheet is Now Created
Once the connection is established, AB Tasty will automatically generate a Google Sheet for the integration.
The spreadsheet will include pre-defined headers for your campaign data, such as:
Campaign Name
Campaign Description
Campaign ID
Campaign Link
Campaign Type
Created Time
Start Date
End Date
Number of Live Days
Number of Variations & Variation Names
Reporting link
Learning Library Inputs
Retrieve and Share AB Tasty Campaign Data
All updates in AB Tasty will sync automatically to the connected Google Sheet.
You can share the spreadsheet with team members, use it for presentations, or integrate it with other reporting tools (see the sketch below).
Real-Time Data Updates: Campaign performance metrics are always up-to-date.
Collaboration: Share insights effortlessly with team members and stakeholders.
With this integration, managing and sharing AB Tasty campaign data has never been easier.
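If you want to pull the synced rows into another tool programmatically, here is a minimal sketch assuming Node.js, the official googleapis package, and an already-authorized client; the spreadsheet ID and range are placeholders, not values provided by AB Tasty:

// Minimal sketch: read the campaign rows that AB Tasty keeps in sync.
// Assumes `auth` is an authorized OAuth2 or service-account client.
const { google } = require('googleapis');

async function readCampaignRows(auth) {
  const sheets = google.sheets({ version: 'v4', auth });
  const res = await sheets.spreadsheets.values.get({
    spreadsheetId: 'YOUR_SPREADSHEET_ID', // placeholder
    range: 'Sheet1!A:L', // one column per header listed above
  });
  return res.data.values; // rows, the first one being the pre-defined headers
}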





Context: This page is linked to the Snowflake connector creation process. It will allow you to use the Public Key authentication method.
Integrating Slack with AB Tasty lets you receive real-time updates on your experiments, personalizations, and other platform activities directly in your Slack workspace. This integration helps streamline team collaboration by ensuring everyone stays informed without leaving their Slack environment.
Once connected, the Slack integration with AB Tasty provides several key features:
Integrating Microsoft Dynamics 365 Commerce with AB Tasty brings the power of AB Tasty's Experience Optimization Platform to Microsoft Dynamics 365 Commerce. Through the integration, you will be able to run A/B, Multipage or Multivariate tests. You will also benefit from AB Tasty's automation, including dynamic allocation, and the ability to personalize right on your store to increase conversions and transactions.
AB Tasty is fully compatible with client-side Contentful websites through the AB Tasty tag and visual editor. This allows marketers to easily create and run client-side experiments.
The server-side AB Tasty app for Contentful goes beyond that:
The native integration of Contentful and AB Tasty simplifies structured content experimentation by reducing the dependency on developers. Once configured, it allows teams to create and run experimentation campaigns directly in AB Tasty, selecting the content to display for each variation from Contentful.
This application is designed to facilitate test management and content personalization. It offers the ability to choose a project, a campaign, and its variations, then associate each variation with a content entry. This process is code-free, ensuring a fluid and accessible workflow even for non-technical users.
👉 To install and configure the AB Tasty app, check the how-to guide in the AB Tasty documentation.
👉 To access the app directly, go to the Contentful Marketplace
openssl genpkey -algorithm RSA -out rsa_key.pem -pkeyopt rsa_keygen_bits:2048
openssl rsa -in rsa_key.pem -pubout -out rsa_key.pub

use role securityadmin;
alter user YOUR_USER_NAME
set rsa_public_key='YOUR PUBLIC KEY'; // Paste here your public key
show users like 'YOUR_USER_NAME';

Real-time notifications: Get updates on experiment launches, creation of new campaigns and other key events.
Specific Accounts & Campaigns subscriptions: Focus on what you want to follow.
Slack commands: Use specific commands to retrieve experiment details directly from Slack.
Multi-channel support: Choose where notifications are sent based on your team's workflow.
Prerequisite: Admin access to your Slack workspace.
1. Access AB Tasty Integration Hub
2. Under Productivity, locate the Slack card
3. Click on Create a connector
4. Authorize the Installation
You can follow the on-screen prompts to give AB Tasty the necessary permissions to access your Slack workspace. If you're not the workspace's admin, it may require admin approval.
Once the app is installed in your workspace, you can connect AB Tasty to Slack directly from within Slack:
Open Slack and navigate to the Apps section.
Search for AB Tasty
On the Home Page, click on Log In to continue
Once completed, your AB Tasty account will be linked to your Slack, enabling commands & notifications
If you prefer to set up the integration directly from AB Tasty:
Log in to AB Tasty and go to the Integrations section.
Find Slack and click Create a connector
Follow the authorization flow to link your Slack Account and AB Tasty successfully
Once the integration is active, you can use the following commands within Slack to interact with AB Tasty:
/abt-campaigns → Displays a list of active experiments from your last connected account.
/abt-account-info → Displays a list of active experiments from the selected account.
/abt-campaign-info → Displays specific details about the selected campaign.
/abt-subscribe-users → Allows a user to manage subscriptions to specific accounts or campaigns.
/abt-subscribe-channel → Allows a Slack channel to follow specific accounts or campaigns
/abt-unsubscribe → Manages current subscriptions.
/abtasty help → Provides a list of available commands.
AB Tasty’s Slack integration brings your campaign insights directly to your workspace, making it easier to stay informed and collaborate efficiently.
With the AB Tasty Slack integration, you can:
✅ Subscribe/unsubscribe to campaigns – Get only the updates that matter to you.
✅ Retrieve campaign details – Access specific data directly from Slack
✅ Get real-time status updates – Never miss a key change.
✅ Add the app to any channel – Keep campaign discussions in context with dedicated updates per channel.
By integrating Slack with AB Tasty, teams can improve collaboration and ensure that important experiment updates are easily accessible. Whether you connect via Slack or AB Tasty, the straightforward process enhances visibility into your optimization efforts.
To learn more about AB Tasty policy, please read our dedicated page:

This article assumes you already have your store on Microsoft Dynamics 365 Commerce and want to create and run Experimentation or Personalization for your online store using AB Tasty.
Beware: There are two kinds of AB Tasty tags: the generic tag will allow you to apply the JavaScript layer for your page modifications, and the transaction tag will send transaction data to your reports. Only the Generic tag can be implemented with Microsoft Dynamics 365 Commerce.
Get the AB Tasty Generic tag code snippet. To do so:
Log in to your AB Tasty Account
Access Settings > Tags implementation > Generic Tag. Here you will find AB Tasty's JavaScript code snippet to copy.
Copy the code snippet
Log in to your Microsoft Dynamics Commerce account.
Follow the steps for creating an External script module from this
The external script module includes the “Executed Script Asynchronously” and the “Defer Script Execution” configuration properties:
The “Executed Script Asynchronously” property specifies whether the script should be run in an asynchronous mode.
The “Defer Script Execution” property specifies whether the script should run after the page has finished parsing.
The following illustration shows an external script injector module that is configured on a page template. The Script source property box is where you add the URL that points to the script source code that will be injected into the HTML for the rendered page.
Check From the Network Tab
The easiest way to QA your implementation is to have a look at the network tab of your browser. By filtering on the try.abtasty.com domain, you should see our tag being downloaded. By opening its preview tab, you will see its last update time and its version.
Check for the JS Object
Our tag creates an ABTasty JS object with useful variables and methods when correctly executed.
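For example, you can run a quick check in your browser console (a minimal sketch; the exact properties exposed on the object depend on your tag version):

// Run in the browser DevTools console on a page where the tag is implemented
if (window.ABTasty) {
  console.log('AB Tasty tag executed correctly');
} else {
  console.log('ABTasty object not found: the tag did not execute');
}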
Check for the Browser Storage
If no storage restriction is set (consent management), the tag will automatically create cookies and entries in the local storage and session storage.
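A minimal sketch to spot those entries from the console; the exact key names depend on your tag version and settings, so matching on "ABTasty" is an assumption:

// List cookies and storage keys that mention AB Tasty
console.log(document.cookie.split('; ').filter(c => c.includes('ABTasty')));
console.log(Object.keys(localStorage).filter(k => k.includes('ABTasty')));
console.log(Object.keys(sessionStorage).filter(k => k.includes('ABTasty')));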
This feature is currently available in Early Adoption. Please contact your CSM to enroll into the Early Adopter program.
The Data Warehouse integration feature allows you to design a report from data collected by AB Tasty and export it to your external Data Warehouse. Daily exports are available to keep your external Data Warehouse up to date.
To manage your Data Warehouse integrations, go to the Integration Hub > Data Warehouse category.
An export is generated once an integration is configured and will execute daily at the configured time.
You need to set up a connector and an export for the Data Warehouse integration to work.
The Connector manages the connection between AB Tasty and your Data Warehouse. Please refer to the following user guides to set it up:
BigQuery connector
Snowflake connector
Redshift connector
You can create an export once a connector is configured.
All collected data can be exported through our Data Explorer API. You will be asked to provide a payload, which you can generate from the Data Explorer form. The payload corresponds to a data query.
We recommend testing the Data Explorer API and validating the output data report before setting up an export.
For more information, please refer to the Developer portal’s article about the
To generate your payload, use the new version of the Data Explorer form, available from the Data Explorer category of the main navigation.
Please follow these steps to generate a payload:
In the Data Explorer:
Define a start date and an end date
Choose the dimension campaignId
Fill in the value: the campaignId of the campaign(s) you want to extract data from
Leave the “limit” field empty
You can apply a filter by campaignId to limit the payload to one campaign extraction. You can add more filters and dimensions to your payload.
You must define at least one metric to export
Once you click on “Test query data consumption”, the Payload will be generated as follows:
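For illustration only, a generated payload could look like the sketch below; the field names are assumptions, not the official Data Explorer schema, so always use the payload produced by the form:

// Hypothetical payload shape (illustrative field names, not the official schema)
const payload = {
  startDate: '2024-01-01',
  endDate: '2024-01-31',
  dimensions: ['campaignId'],
  filters: [{ dimension: 'campaignId', values: ['123123'] }],
  metrics: ['visitors'], // at least one metric is required
  // "limit" left empty, as recommended above
};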
At this stage, you’ve created the following:
your connector between your AB Tasty account and your external Data Warehouse
the payload to generate the data export
In this example, we will use BigQuery as provider. The process is the same for Snowflake and Redshift.
In the Integration Hub library, select your provider: here BigQuery.
Select the connector you’ve previously created to link your AB Tasty account to your BigQuery account
Name your export. For example, you can use the name/id of your campaign, to easily identify the data in your Data Warehouse. Ex: “A/B Test CTA Product page - id 123123”
Name the table that will be generated in BigQuery
Use the payload you’ve previously generated with the data explorer form
AB Tasty automatically checks the payload for syntax errors and displays an error message if one is found.
Save and Create export
This will create a full export of the existing campaign data, from the first minute the campaign was running to now. It will test the connection and create a daily report, ready to be activated at the next step.
When you click on “Save and create export”, AB Tasty checks the following:
If the Payload contains the filter campaignID
If the campaignID is filled in correctly
If the campaignID is linked to the current account (i.e., the campaign was built in the current account)
If one of the above conditions is not met, the API returns an error message.
Now that you have created and tested your export, you can activate it. This will make it run daily at the specified time.
Every day, AB Tasty will send:
All the data collected for the declared campaignID since the last export, as long as the campaign is live and the daily export is active.
You can run as many exports as you want on a daily basis.
Data Warehouse integration usage and extracts are limited by Data Explorer v2.
Your requests to Data Explorer (either by integration or direct) will be limited by a monthly quota. Your export quota is set every first of the month. The usage depends on the volume of data fetched.
AB Tasty provides tools to help you estimate the cost of your requests and check your remaining quota.
You can find more information on quota management.
This method only needs to be implemented once. However, it requires a front-end developer to update the site’s code so that the AB Tasty Variation Container from Contentful can be connected with the Flagship SDK. It will combine the Contentful API with a Feature Experimentation SDK to decide which content variation should be displayed to a user.
In the example below, a pseudo-backend retrieves the AB Tasty Variation Container from Contentful and leverages Feature Experimentation to select the appropriate variation for the visitor.
When you fetch a variation container entry from the Contentful API, the response includes the main fields:
This guide walks you through installing and configuring the AB Tasty app on the Contentful Marketplace. It explains how to connect your Contentful space with AB Tasty so you can:
Add variation containers inside entries,
Link them with AB Tasty campaigns,
And run experiments directly on your content.
Choose this option if you want visitor consent to be managed by Didomi. Didomi is a Consent Management Platform (CMP) that enables you to manage your vendors (or third-party tools), and their associated purposes, and to configure the way your visitors can consent to data usage. There are two ways of integrating AB Tasty with Didomi:
Using our native integration: This method allows AB Tasty to execute without consent and to wait for the visitor’s consent to start collecting and storing data. This method is based on the Restrict cookie deposit option.
Using Didomi to manage tag injection: This method allows AB Tasty to execute only when the visitor has given their consent. This means no AB Tasty campaign will run on your website until the visitor gives consent.
We don’t recommend using the second method since the AB Tasty tag won’t be injected without the visitor’s consent.





The Export manages data export. It defines the following elements:
The data to export
Where to export it
The export frequency

experimentId
meta
variations
Use the Flagship SDK to decide which variation to present.
AB Tasty selects the correct variation for the given visitorId based on the flag decision.
You can then use the variationId to retrieve the matching content entry from the variation container’s meta field.
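As a minimal sketch of that mapping (the full example appears further below), assuming container is the entry fetched from Contentful and variationId comes from the Flagship SDK:

// The meta field maps each AB Tasty variationId to a Contentful entry id
const entryId = container.fields.meta[variationId];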
From Contentful > Settings Menu > API keys
Retrieve SpaceID & Content Delivery API token
Click on Add an API key
Copy the Space ID
Copy the Content Delivery - access token
Click save if you've made any changes (You will be able to look back at both the space ID & the access token by clicking on your API name)
Use this code example to retrieve a variation entry (replace the variables accordingly):
This approach ensures a seamless link between the Flagship SDK and Contentful by mapping variationIds to content entries.
For each visitor, the correct variation is automatically selected and delivered—scalable, reliable, and consistent.
{
  "sys": {
    "space": {
      "sys": {
        "type": "Link",
        "linkType": "Space",
        "id": "test"
      }
    },
    "id": "FAKE_ID",
    "type": "Entry",
    "createdAt": "2025-08-28T13:49:03.977Z",
    "updatedAt": "2025-08-28T13:49:03.977Z",
    "environment": {
      "sys": {
        "id": "master",
        "type": "Link",
        "linkType": "Environment"
      }
    },
    "publishedVersion": 3,
    "revision": 1,
    "contentType": {
      "sys": {
        "type": "Link",
        "linkType": "ContentType",
        "id": "abTastyContainer"
      }
    },
    "locale": "en-US"
  },
  "fields": {
    "environmentId": "ci84rm4uf6t1jrrefeig",
    "environment": "prod",
    "experimentID": "d1ilgv373e5iv8esho80",
    "experimentName": "TEST ContentFull",
    "variations": [
      {
        "sys": {
          "type": "Link",
          "linkType": "Entry",
          "id": "5pDbSiwvTqtBthib8G1opA"
        }
      },
      {
        "sys": {
          "type": "Link",
          "linkType": "Entry",
          "id": "5zKtmbcqb9ndM2wgU1KEmk"
        }
      },
      {
        "sys": {
          "type": "Link",
          "linkType": "Entry",
          "id": "6QJmXjfiu91B7UpbjlKl1Z"
        }
      }
    ],
    "meta": {
      "d1ilgv373e5iv8esho90": "5pDbSiwvTqtBthib8G1opA",
      "d1ili4eg4ajm59kcr1kg": "5zKtmbcqb9ndM2wgU1KEmk",
      "d1j7hjfdv6k265cudqg0": "6QJmXjfiu91B7UpbjlKl1Z"
    },
    "projectId": "ci84rmkuf6t1jrrefejg"
  }
}
const { Flagship, DecisionMode } = require('@flagship.io/js-sdk');

Flagship.start(process.env.FLAGSHIP_ENV_ID, process.env.FLAGSHIP_API_KEY, {
  decisionMode: DecisionMode.DECISION_API,
  fetchNow: false,
});

const visitor = await Flagship.newVisitor({
  visitorId: String("VISITOR_ID"),
  context: {},
  hasConsented: true
});

async function getVariationForCampaign(visitor, campaignId) {
  if (!campaignId) {
    throw new Error('getVariationForCampaign: missing campaignId');
  }
  // fetch the flags
  await visitor.fetchFlags();
  // get the metadata from the visitor
  const flags = await visitor.getFlags();
  const metaMap = await flags.getMetadata(); // Map
  if (!metaMap || typeof metaMap[Symbol.iterator] !== 'function') {
    throw new Error('getVariationForCampaign: invalid metaMap');
  }
  // Find the right campaignId → variationId mapping
  for (const [flagKey, meta] of metaMap.entries()) {
    if (!meta) continue;
    if (meta.campaignId !== campaignId) continue;
    return meta.variationId;
  }
  throw new Error(`getVariationForCampaign: no metadata for campaignId=${campaignId}`);
}

const variationId = await getVariationForCampaign(visitor, campaignId);

// Import the official Contentful SDK
const contentful = require('contentful');

// Create a Contentful client using environment variables
const client = contentful.createClient({
  space: process.env.CTF_SPACE_ID, // Contentful Space ID
  accessToken: process.env.CTF_CDA_TOKEN, // Content Delivery API (read-only) token
});

// Fetch entries of type "abTastyContainer"
const containerResp = await client.getEntries({
  content_type: 'abTastyContainer',
  limit: 1,
  include: 2,
});

// Extract the first container
const container = containerResp.items[0];

// AB Tasty campaign ID linked to this container
const campaignId = container.fields.experimentID;

// Includes contain all linked entries and assets
const includes = containerResp.includes;

// Retrieve the variationId for this campaign and visitor
const variationId = await getVariationForCampaign(visitor, campaignId);

// Map variationId → entryId from the container
const containerMeta = container.fields.meta;
const entryId = containerMeta?.[variationId];

// Find the linked entry in the includes
const entry = includes.Entry?.find(e => e.sys.id === entryId);

Prerequisites:
A Contentful account with a dedicated space for your website.
An AB Tasty account with a Feature Experimentation project available
You have configured your website so you can use the AB Tasty SDK to serve the right variation

Create a Snowflake account
Go into your account to create a new SQL Snowflake worksheet
Copy the code below (changing the annotated fields: role, username, password, etc.)
Paste it inside the console and run it
Once the script is done, refresh the page. In the left panel, a new line will appear with the name you gave the Data Warehouse inside the script.
In the above example, the name of the database is YOUR_DATABASE
On the Snowflake homepage, click at the bottom left of the page and copy the account URL
Go to the integration page > Data Warehouse > Snowflake > setup connector
Enter a name for the connector
Host: This corresponds to the account URL you’ve already copied
Role: the role we created in the previous script
Warehouse: the warehouse we created in the previous script
Data Warehouse: the Data Warehouse we created in the previous script
Authorization method: choose username and password
Username: the username we created in the previous script
Password: the password we created in the previous script
Loading Method: choose internal staging
You will get an error message if one of the fields contains an error. Your connector is now set up, and you can go ahead and set up your Export.
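If you want to double-check the credentials outside AB Tasty first, here is a minimal sketch assuming Node.js and the snowflake-sdk package; all values are the ones defined in the setup script:

// Sanity-check the Snowflake credentials created by the setup script
const snowflake = require('snowflake-sdk');

const connection = snowflake.createConnection({
  account: 'xy12345.eu-west-3.aws', // derived from your account URL (placeholder)
  username: 'YOUR_USER_NAME',
  password: 'your_password',
  role: 'YOUR_ROLE',
  warehouse: 'YOUR_WAREHOUSE',
  database: 'YOUR_DATABASE',
});

connection.connect((err, conn) => {
  if (err) {
    console.error('Unable to connect:', err.message);
  } else {
    console.log('Connected, session id:', conn.getId());
  }
});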
To set up your daily export, please take a look at the guide: Data Warehouse integrations: General information.
Use the Data Warehouse article to create your payload.
Export name: the name of your export; give an explicit name to retrieve it easily in AB Tasty
Name of the table: the name of the table we will create in your Snowflake
Data exporter query: paste here the payload of your data explorer query
Click save and create.
The Snowflake integration is now complete, and you will soon see the data flowing into your dedicated Data Warehouse (it can take up to 2–3 hours, depending on the size of your report).
The export is activated upon creation, and new data will be appended to the current one, daily. The following screenshot shows that the export is activated on creation:
In the Vendors & Purposes section of your consent notice configuration in Didomi, select AB Tasty 2.0 to use the most recent AB Tasty vendor.
The purposes related to the AB Tasty 2.0 vendor are as follows:
Measuring content performance
Developing and improving product
Storing and/or accessing information (cookies and others)
Selecting personalized content
Creating personalized content profiles
These last two are used for DMP usage in AB Tasty only; they won’t prevent the AB Tasty tag from collecting and storing regular data.
If you don’t want to use our default vendor, you can create your own directly in Didomi via Settings > Vendors. Set up the name and purposes, and write down the generated custom vendor ID, as it will be useful further along in the process.
For the Privacy policy field, you should refer to the AB Tasty Privacy policy.
To enable Didomi in AB Tasty, follow these steps:
Go to AB Tasty Settings > Implementation > Cookie and Privacy > Privacy.
Select Manage visitor consent directly in AB Tasty, to restrict AB Tasty data collection and storage until the consent has been given.
Select which mode you want AB Tasty to operate on. In default mode, the tag will still execute but won’t collect or store any data on the visitor. In strict mode, the tag won’t execute until the consent condition is met.
Under AB Tasty cookies will be placed once, select Didomi.
If you have a Custom Vendor ID set up in Didomi, fill in the corresponding field; otherwise, leave it empty.
Click Save.
The AB Tasty tag will check Didomi’s state each time a change is detected and will apply the corresponding rule.
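If you want to observe that state yourself while debugging, here is a minimal sketch using the Didomi web SDK's standard hooks; the vendor ID shown is a placeholder, check your Didomi console for the real one:

// Log the consent status for the AB Tasty vendor whenever it changes
window.didomiOnReady = window.didomiOnReady || [];
window.didomiOnReady.push(function (Didomi) {
  Didomi.on('consent.changed', function () {
    // 'abtasty-vendor-id' is a placeholder for your actual vendor ID
    console.log('AB Tasty consent:', Didomi.getUserConsentStatusForVendor('abtasty-vendor-id'));
  });
});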
You can use Didomi to inject the AB Tasty tag after consent has been granted by the visitor.
This method gives you more control and prevents our tag from downloading without a visitor’s consent.
However, we don’t recommend using this method as it means that no AB Tasty campaign can be displayed until visitor consent has been given. This may generate a flickering effect when consent is granted (e.g.: with a patch campaign).
To use Didomi to manage tag injection, follow these steps:
Refer to Didomi’s developer documentation to set up the AB Tasty tag in Didomi.
Go to Settings > Cookies > Cookie Deposit Method and toggle Restrict cookie deposit to No.
Check the box labeled: The cookie deposit restriction and its consent collection are handled on my side. AB Tasty will consider that consent has been given once it has started executing. Proof of consent will be sent as soon as it is executed.
For more information on Didomi, please refer to Didomi’s developer documentation.




Google BigQuery is a serverless, highly scalable and fully managed data warehouse that comes with a built-in query engine. BigQuery enables scalable analysis of petabytes of data.
The Google BigQuery integration allows you to export any data collected by AB Tasty’s tracking system to a Google BigQuery dataset, daily.
We will now proceed to configure the connector and the export.
Go to your console and activate BigQuery.
The service account is used to generate your credentials (in JSON format). More information on service accounts.
Go to the service account
Click on create Service account
Enter a name and a description
Add a role: you must add “BigQuery User” and “BigQuery Data Editor”
Click on “done” to validate.
Now that the service account is created, we will create the credential keys and export them.
Click on the newly created service account
Go into your service account, then “Keys” and click “add key” > “create new key”
Select “JSON” and click “create”.
Download the key.
The content of the key should look like this:
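For reference, a downloaded service-account key typically has the following shape (all values below are redacted placeholders):

{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "…",
  "private_key": "-----BEGIN PRIVATE KEY-----\n…\n-----END PRIVATE KEY-----\n",
  "client_email": "your-sa@your-project-id.iam.gserviceaccount.com",
  "client_id": "…",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}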
Go back to BigQuery
In the Explorer menu choose your GCP project and click the three dots, then click “Create dataset”
Give it an ID and a location (no other mandatory options)
Click “create dataset”
Your dataset is now created and should appear in the Explorer menu. By clicking on your dataset you should be able to display its details.
More information on how to create a BigQuery dataset can be found in Google Cloud's documentation.
In AB Tasty, go to the Integration Hub page > Data Warehouse > BigQuery > setup connector
Enter a name for the connector
Enter the dataset location (info can be found in the details of the created dataset)
Enter the dataset ID (info can be found in the details of the created dataset). Copy and paste the part to the right of the "." (see the above screenshot).
You will get an error message if one of the fields contains an error.
Your connector is now set up, and you can proceed to set up your Export.
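Optionally, you can verify outside AB Tasty that the key can reach the dataset; a minimal sketch assuming Node.js and the @google-cloud/bigquery package (dataset ID and key path are placeholders):

// Check that the service-account key can access the dataset
const { BigQuery } = require('@google-cloud/bigquery');

async function checkDataset() {
  const bigquery = new BigQuery({ keyFilename: './service-account-key.json' });
  const [dataset] = await bigquery.dataset('your_dataset_id').get();
  console.log('Dataset reachable:', dataset.id, dataset.metadata.location);
}

checkDataset().catch(console.error);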
To set up your daily export, please refer to the guide: Data Warehouse integrations: General information.
Refer to the article to create your payload.
Export name: the name of your export, give an explicit name to easily retrieve it in AB Tasty
Name of the table: the name of the table we will create in your BigQuery
Data exporter query: paste here the payload of your data explorer query
Click save and create.
The Google BigQuery integration is now complete, and you will soon see the data flowing into your dedicated Data Warehouse (it can take up to 2–3 hours, depending on the size of your report).
The export is activated upon creation, and new data will be appended to the current one, daily. The following screenshot shows that the export is activated on creation:
-- set variables (these need to be uppercase)
set abt_snowflake_role = 'YOUR_ROLE'; // field to be defined by you
set abt_snowflake_username = 'YOUR_USER_NAME'; // field to be defined by you
set abt_snowflake_warehouse = 'YOUR_WAREHOUSE'; // field to be defined by you
set abt_snowflake_database = 'YOUR_DATABASE'; // field to be defined by you
set abt_snowflake_schema = 'YOUR_SCHEMA'; // field to be defined by you
-- set user password
set airbyte_password = 'your_password'; // field to be defined by you
begin;
-- create your role
use role securityadmin;
create role if not exists identifier($abt_snowflake_role);
grant role identifier($abt_snowflake_role) to role SYSADMIN;
-- create your user
create user if not exists identifier($abt_snowflake_username)
password = $airbyte_password
default_role = $abt_snowflake_role
default_warehouse = $abt_snowflake_warehouse;
grant role identifier($abt_snowflake_role) to user identifier($abt_snowflake_username);
-- change role to sysadmin for warehouse / database steps
use role sysadmin;
-- create your warehouse
create warehouse if not exists identifier($abt_snowflake_warehouse)
warehouse_size = xsmall
warehouse_type = standard
auto_suspend = 60
auto_resume = true
initially_suspended = true;
-- create your database
create database if not exists identifier($abt_snowflake_database);
-- grant your warehouse access
grant USAGE
on warehouse identifier($abt_snowflake_warehouse)
to role identifier($abt_snowflake_role);
-- grant your database access
grant OWNERSHIP
on database identifier($abt_snowflake_database)
to role identifier($abt_snowflake_role);
commit;
begin;
USE DATABASE identifier($abt_snowflake_database);
-- create schema for Airbyte data
CREATE SCHEMA IF NOT EXISTS identifier($abt_snowflake_schema);
commit;
begin;
-- grant Airbyte schema access
grant OWNERSHIP
on schema identifier($abt_snowflake_schema)
to role identifier($abt_snowflake_role);
commit;

After a successful authorization, you'll be redirected back to Contentful automatically.
Select the environment (production / pre-prod / ..)
Define the content type for which you want to enable A/B testing
Click the Install button at the top right corner.
Connect each variation to new or existing entries before publishing.
Go to Content → Add entry and select the AB Tasty container.
In the Experiment section, all AB Tasty campaign variations will appear automatically. You can then link each variation to the appropriate content.

Enter the project ID where your dataset is located. Copy and paste the part to the left of the "." (see the above screenshot).
Choose Service account as the Authorization Method
JSON credentials: paste the content of the key (JSON file) that was downloaded when you created the credentials.
Click on “Test connection”
Validate by clicking on “Next step”.

Amazon Redshift is a fully managed, petabyte-scale data warehouse service in the cloud. Amazon Redshift Serverless lets you access and analyze data without all the configuration of a provisioned data warehouse.
Create an account
Go to AWS Redshift (using the search bar).
Once you're on AWS Redshift, click on “create Cluster” (top right of the screen). More information.
Input an identifier of your choice
Choose your own configuration (AWS can help you if you choose “Help me choose”).
Inside Data Warehouse configuration, you need to create an admin username
Enter a username
Once your cluster is created, it should be displayed on AWS Redshift Cluster.
Click on the new cluster, open the Properties tab, and click on the instance under VPC security group.
Click on the Security group ID instance.
Inside Inbound rules, check that at least one rule has the following configuration:
IP version at IPv4
Type: All traffic
Source: 0.0.0.0/0
If that is not the case, you should edit one rule with this specific configuration.
Go back to your cluster, click on Actions, and select Modify publicly accessible setting.
Click on “Turn on Publicly accessible” and save changes
Connect to the Data Warehouse
Stay inside your cluster and click on query data => Query in query editor V2.
A new window will open and a new modal will be displayed for the first connection.
Choose Data Warehouse user name and password
By default, the name of your Data Warehouse is dev. Don’t edit this input
User name: enter here the username you registered at step 1 (the admin user of your cluster), along with the matching password.
Click on connection: the connection is now established.
Create a new user
Now that the connection is established, you need to create another user (which is not the admin of the cluster).
Go to the console and copy/paste the script below; the user name and password it creates are the ones you will enter in the Integration Hub.
Replace username and Userpassword with the username and password of your choice.
The username has to be in lowercase, and your password must contain at least one digit.
Then run the script.
Give rights to the new user
Give the new user all the rights it needs to write data. Copy and paste this script inside the console:
Replace username with the name of your user:
Select the first line and execute it
Select the second one and execute it
Go to AWS S3 (Enter S3 or Bucket inside the search bar). Click on create Bucket
Create a bucket by using the following configuration:
Bucket name: Name of your choice
AWS Region: Choose the appropriate region
ACLs disabled
Uncheck Block all public access
Click on create Bucket
Now that the bucket is created, click on it and go to the Permissions tab → Edit Block public access
Uncheck Block all public access.
Go back to the bucket > Permissions tab > Edit Bucket Policy
Copy / Paste the following script:
Replace NAME BUCKET with the real name of the created bucket.
We need to add a new IAM user who will have the rights on the S3 Bucket
Search for IAM via the search bar:
Click on User => Create user
Enter a user name and click on next
On the second step, you have to create a new user group before finalizing the creation of the user
User group name: Enter the name of the user group you want.
Permissions policies: add to this new group:
As soon as the new user group is created, add it to the new user
Click on next and finalize the creation of the new user.
As soon as the user is created, click on it: Security credentials tab / Create access key
Choose Third-party service option
Add a description
Click on create access key
A user key and a user secret are now created.
We recommend saving them as you will need them later.
Go to the integration page > Data Warehouse > Redshift > Create connector
Enter a name for the connector
Host: Go to AWS Redshift and, on the cluster screen, copy the endpoint URL.
Paste it in the Host field, deleting all characters after the domain name.
Ex: endpoint URL: julien-nied-cluster.cgtw9e5st7qf.eu-west-3.redshift.amazonaws.com:5439/dev
Enter only: julien-nied-cluster.cgtw9e5st7qf.eu-west-3.redshift.amazonaws.com
S3 Bucket path: No value needed.
S3 Bucket region: Choose the region where the bucket is created (ex: eu-west-1)
AWS access key Id: Paste the key ID you created when creating the new IAM user
AWS access secret key: Paste the secret key you created when creating the new IAM user
You will get an error message if one of the fields contains an error.
Your connector is now set up, and you can proceed to set up your Export.
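To double-check the user and network configuration outside AB Tasty, here is a minimal sketch assuming Node.js and the pg package (Redshift speaks the PostgreSQL protocol); host and credentials are the values created in the previous steps:

// Sanity-check the Redshift credentials created above
const { Client } = require('pg');

const client = new Client({
  host: 'your-cluster.xxxxxxxxxx.eu-west-3.redshift.amazonaws.com', // endpoint without port/database
  port: 5439,
  database: 'dev',
  user: 'username',
  password: 'Userpassword',
  ssl: { rejectUnauthorized: false },
});

async function check() {
  await client.connect();
  const res = await client.query('SELECT current_user;');
  console.log('Connected as:', res.rows[0].current_user);
  await client.end();
}

check().catch(console.error);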
To set up your daily export, please refer to the guide: Data Warehouse integrations: General information.
Refer to the article to create your payload.
Export name: the name of your export, give an explicit name to easily retrieve it in AB Tasty
Name of the table: the name of the table we will create in Redshift
Data exporter query: paste the payload of your data explorer query
Click save and create.
The Redshift integration is now complete, and you will soon see the data flowing into your dedicated Data Warehouse (it can take up to 2–3 hours, depending on the size of your report).
The export is activated upon creation, and new data will be appended to the current one, daily. The following screenshot shows that the export is activated on creation:
Enter a password (we recommend adding it manually)
Click on “create cluster”
Encryption type: Server-side encryption with Amazon S3 managed keys
Bucket key: Enabled
Data Warehouse: By default, it’s “dev” (you can find it on the endpoint URL. If it is not the same, copy and paste it)
Schema and JDBC parameters URL: Leave empty.
Authorization method: Choose Username And password
Username: enter the username you created above (create a new user inside the console)
Password: enter the password you created above (create a new user inside the console)
Loading Method: Choose S3 staging
S3 Bucket name: The name of the bucket you created
Click “Test connection”. If everything works well, you can validate by clicking on “Next step”.
CREATE USER username PASSWORD 'Userpassword';

GRANT CREATE ON DATABASE dev TO username;
GRANT usage, create on schema public TO username;
GRANT select on svv_table_info to username;

{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "ListObjectsInBucket",
"Effect": "Allow",
"Principal": "*",
"Action": "s3:ListBucket",
"Resource": "arn:aws:s3:::NAME BUCKET"
},
{
"Sid": "AllObjectActions",
"Effect": "Allow",
"Principal": "*",
"Action": "s3:*Object",
"Resource": "arn:aws:s3:::NAME BUCKET/*"
}
]
}
