
Tableau CRM: Using Dataset Builder, Dataflows, and Recipes

If you’re just getting started with Tableau CRM (formerly Einstein Analytics), we’ve written a comprehensive blog post to help you do just that. In today’s post, however, we’ll be taking a deeper look into Tableau CRM’s backend infrastructure, including when to use dataflows and recipes to manipulate your datasets.

Before we get into the nitty gritty of dataflows and recipes, though, let’s look at an overview of what it takes to successfully plan and execute a Tableau CRM implementation.

Overview
Best Practices for Planning an Implementation
Approaching a Data Problem the Smart Way
Using Tableau CRM’s Dataset Builder
Dataflows vs. Recipes
Using Dataflows
Using Recipes
Wrap Up

Best Practices for Planning an Implementation

Implementing a business intelligence tool is a task that should not be taken lightly. As a best practice, it’s a good idea to approach it in phases. After all, identifying goals that can build upon one another is essential for executing an implementation successfully.

Here’s an example of an implementation plan for a Salesforce Sales Performance app:

  1. Determine where critical data regarding Sales Performance lives
    a) Is it all in Salesforce?
    b) Does any critical data exist in your data warehouse?
  2. Set up automated pipelines between your platforms and Tableau CRM
    a) Tableau CRM natively connects to Salesforce and has many connectors to other popular sources.
  3. Use dataflows and recipes to transform the data and load it into one or more datasets
  4. QA the prepped data against existing reporting to ensure alignment
  5. Begin building out new lenses or dashboards to fill reporting needs

Each of these steps has multiple subtasks that will move you toward a successful implementation.

Approaching a Data Problem the Smart Way

Tableau CRM offers a wide variety of tools to help solve virtually any data problem and answer any data-related questions that might arise within your organization. But before diving in, it’s important to understand exactly what the project stakeholders are looking for. Knowing their requirements will guide you towards solving the problem.

In keeping with the example of our Sales Performance app, we might break those steps down into the following parts.

  1. Define the Problem. It’s crucial to understand exactly what stakeholders are asking and how that question translates into a data problem. The problem statement should be clear, concise, and, most importantly, measurable. Is this a one-time question, or is there value in creating automation that keeps the results up to date for future reference?
  2. Determine the Data Framework. This step is best addressed by asking yourself a few questions. What data do we need to solve this problem? Is it all stored in one place? Will we need to bring in and incorporate external data sources? Having this framework in mind before you start will help you operate efficiently and stay on task when creating datasets and dataflows.
  3. Collect and Prepare the Data. Now it’s time to gather the data required to solve the problem. If all the data lives in Salesforce, Tableau CRM makes it easy to import and transform that data using tools we’ll discuss later in this post. If external sources are required, Tableau CRM offers connectors that make pulling from databases, data warehouses, and other systems a simple exercise. Preparing your data for analysis is handled mainly through dataflows and recipes; more on those tools shortly.
  4. Perform the Analysis. Once your data is prepped in new datasets, it’s time to start exploring it to solve the initial problem. At this point, it’s important not to get distracted with making the end results look nice. The focus here should be on whether the resulting queries or charts provide correct answers.
  5. Present the Results. Now that one or more queries answer the initial problem, it’s time to make the results presentable for your end users. Whether results are delivered via a lens or a dashboard, take some time to focus solely on the design and presentation of your solution. End users should be able to interpret the data quickly and easily.

Using Tableau CRM’s Dataset Builder

Now that we’ve covered planning and executing an implementation as well as an overview of approaching a data problem, let’s take a closer look at the Tableau CRM tools that will help you work with the data itself.

The Tableau CRM Dataset Builder is a point-and-click tool that facilitates the creation of datasets. It allows users to easily extract data from relevant Salesforce objects and augment it with related data from another object (for example, adding account information to opportunity data). It’s important to keep the concept of data grain in mind when creating a dataset with this tool.


When creating a new dataset, you should always ask yourself “What will the data stored in this dataset be about?” Generally, the answer to this question will be the object you start with, or your lowest grain (for more information on this, check out this video).

For example, if you want to answer sales-related questions that involve data from the Product, Opportunity, and Account objects, you should choose the Product object as your base. This is because the Product object holds the lowest level of detail out of the three objects when considering the question you want to answer.


After choosing a base object, pick the fields you want to pull in, then augment the base data with related data from other objects.


To accomplish this, click the “+” button beside the ‘Root’ node and choose the ‘Relationships’ option. Next, select the object you wish to relate to your base data. In our example below, we are relating the ‘Opportunity Product’ data to the ‘Opportunity’ data.

Image: Relating the ‘Opportunity Product’ data to the ‘Opportunity’ data

Once joined, click the “+” beside the new node and choose the fields you wish to pull in from this object.

Any time you want to create a new dataset that relies on Salesforce data, this is the process you’ll want to follow. Just remember to start with the object of lowest grain and you’ll be creating insight-rich datasets in no time!
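Under the hood, Dataset Builder generates a dataflow definition in JSON. As a rough sketch, and with illustrative node names and field lists, the Opportunity/Opportunity Product example above might translate into something like this:

  {
    "Extract_OpportunityLineItem": {
      "action": "sfdcDigest",
      "parameters": {
        "object": "OpportunityLineItem",
        "fields": [ { "name": "OpportunityId" }, { "name": "Quantity" }, { "name": "TotalPrice" } ]
      }
    },
    "Extract_Opportunity": {
      "action": "sfdcDigest",
      "parameters": {
        "object": "Opportunity",
        "fields": [ { "name": "Id" }, { "name": "Name" }, { "name": "Amount" } ]
      }
    },
    "Augment_Product_Opp": {
      "action": "augment",
      "parameters": {
        "left": "Extract_OpportunityLineItem",
        "left_key": [ "OpportunityId" ],
        "right": "Extract_Opportunity",
        "right_key": [ "Id" ],
        "relationship": "Opportunity",
        "right_select": [ "Name", "Amount" ]
      }
    },
    "Register_Sales": {
      "action": "sfdcRegister",
      "parameters": {
        "alias": "SalesPerformance",
        "name": "Sales Performance",
        "source": "Augment_Product_Opp"
      }
    }
  }

Each of these node types is covered in more detail in the dataflow section below.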

Dataflows vs. Recipes

Now, let’s look at when and how to use Tableau CRM dataflows and recipes for solving your data problem.

Dataflows and recipes can both be used to accomplish the same task, but when should you use one over the other? Here at CloudKettle, we lean toward dataflows as the tool of choice when combining data from multiple sources, while recipes are great for transformations on a single dataset, such as changing date formats or adding a column from a related dataset.


Though both tools will be changing with the release of Tableau CRM Data Prep, for now they’re still the go-to choices for any ETL work in Tableau CRM.

Let’s walk through both tools.

Using Dataflows

The dataflow tool is used to extract and transform data from Salesforce or other connected data sources. Once processed, the data is registered (stored) in a new Tableau CRM dataset. A dataflow follows a set of instructions, starting with the extraction of data. Below, we’ll go a bit deeper into each of the features in the Dataflow Builder.

Image: The Dataflow Builder


Dataflow features from left to right:

Dataset Builder: Navigates to Tableau CRM’s point-and-click Dataset Builder mentioned in the previous section of this post.

SFDC Digest: Extracts data from connected objects in your local Salesforce org.
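In the dataflow’s JSON definition, an SFDC Digest node might look like the sketch below (the object and field names are just examples):

  "Extract_Account": {
    "action": "sfdcDigest",
    "parameters": {
      "object": "Account",
      "fields": [ { "name": "Id" }, { "name": "Name" }, { "name": "Industry" } ]
    }
  }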

Digest: Extracts data from sources that are synced via external connections (a data warehouse, for example) or from an external Salesforce org that is synced.
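A digest node is similar, but points at a connected source instead of the local org. A hedged sketch, assuming a connection named ‘WarehouseConnection’ has already been set up:

  "Extract_External_Orders": {
    "action": "digest",
    "parameters": {
      "connectionName": "WarehouseConnection",
      "object": "orders",
      "fields": [ { "name": "order_id" }, { "name": "total" } ]
    }
  }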

Edgemart: References an existing dataset so its data can be used in other operations in the dataflow.
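An edgemart node only needs the API name (alias) of the dataset you want to reference; ‘SalesTargets’ here is illustrative:

  "Load_SalesTargets": {
    "action": "edgemart",
    "parameters": { "alias": "SalesTargets" }
  }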

Append: Use this transformation to combine rows from multiple datasets into one.
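For instance, an append node that stacks the rows of two extract nodes could be declared like this (the source node names are placeholders):

  "Append_AllRegions": {
    "action": "append",
    "parameters": {
      "sources": [ "Extract_Region_A", "Extract_Region_B" ]
    }
  }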

Augment: Use this transformation to add columns to a dataset from a related dataset.
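Here is a sketch of an augment node that adds account fields to opportunity rows, assuming digest nodes like the ‘Extract_Opportunity’ and ‘Extract_Account’ examples above. Augmented columns come through prefixed with the relationship name (for example, ‘Account.Industry’):

  "Augment_Opp_Account": {
    "action": "augment",
    "parameters": {
      "left": "Extract_Opportunity",
      "left_key": [ "AccountId" ],
      "right": "Extract_Account",
      "right_key": [ "Id" ],
      "relationship": "Account",
      "right_select": [ "Name", "Industry" ]
    }
  }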

ComputeExpression: Use this transformation to create derived fields in your dataset, such as concatenated text fields or the results of mathematical operations. Expressions are written in SAQL.
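As a rough example, a computeExpression node that builds a full-name field from two text fields might look like this (the node and field names are placeholders):

  "Add_FullName": {
    "action": "computeExpression",
    "parameters": {
      "source": "Extract_Contact",
      "mergeWithSource": true,
      "computedFields": [
        {
          "name": "FullName",
          "label": "Full Name",
          "type": "Text",
          "saqlExpression": "FirstName + \" \" + LastName"
        }
      ]
    }
  }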

ComputeRelative: Use this transformation to perform more advanced analytical calculations on your dataset, such as tracking how long a lead remains in each stage of the sales process. This enables you to perform a trend analysis through the use of partitions (similar to window functions in SQL).
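A hedged sketch of a computeRelative node that pulls the previous row’s Amount within each opportunity’s history, ordered by date (names are illustrative):

  "Compute_PreviousAmount": {
    "action": "computeRelative",
    "parameters": {
      "source": "Extract_OppHistory",
      "partitionBy": [ "OpportunityId" ],
      "orderBy": [ { "name": "CreatedDate", "direction": "asc" } ],
      "computedFields": [
        {
          "name": "PreviousAmount",
          "expression": {
            "sourceField": "Amount",
            "offset": "previous()",
            "default": "0"
          }
        }
      ]
    }
  }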

Dim2Mea: This transformation creates a new ‘Measure’ field from an existing ‘Dimension’ field. It’s useful when, on import, some fields that should be measures are not recognized as such.
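A dim2mea node might be declared roughly like this, converting a text field into a numeric measure; the names are placeholders and the exact parameter set may vary slightly by version:

  "Convert_Quantity": {
    "action": "dim2mea",
    "parameters": {
      "source": "Extract_Orders",
      "dimension": "Quantity_Text",
      "measure": "Quantity",
      "measureDefault": "0",
      "measureType": "long"
    }
  }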

Flatten: Allows the user to flatten hierarchical data. Salesforce gives the example of flattening the role hierarchy in Salesforce data, which allows row-level security to be implemented at a lower grain.
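As an illustrative sketch under the same role-hierarchy assumption, a flatten node might look something like this:

  "Flatten_Roles": {
    "action": "flatten",
    "parameters": {
      "source": "Extract_UserRole",
      "self_field": "Id",
      "parent_field": "ParentRoleId",
      "multi_field": "Roles",
      "path_field": "RolePath"
    }
  }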

Filter: This transformation removes records from an existing dataset based on filter criteria. Filters can be written with simple logical operators or as a SAQL expression.
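For example, a filter node that keeps only closed-won opportunities could use the simple operator syntax; a ‘saqlFilter’ parameter can be used instead for more complex SAQL expressions:

  "Filter_ClosedWon": {
    "action": "filter",
    "parameters": {
      "source": "Extract_Opportunity",
      "filter": "StageName:EQ:Closed Won"
    }
  }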

Slice: Allows the user to remove fields from an existing dataset. With this subset of fields, you can create a new dataset or use these fields in other transformations.
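A slice node (declared with the ‘sliceDataset’ action) that drops a couple of fields might look like this sketch (field names are examples):

  "Slice_DropContactInfo": {
    "action": "sliceDataset",
    "parameters": {
      "source": "Extract_Contact",
      "mode": "drop",
      "fields": [ { "name": "Email" }, { "name": "Phone" } ]
    }
  }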

Update: Updates the fields in an existing dataset with values from another dataset. Similar to a lookup operation, you specify a source dataset to pull values from, along with a lookup key that determines which records are updated. The results are stored in a new dataset.
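A rough sketch of an update node that refreshes unit prices in an existing sales dataset from current Product data; the dataset, node, and field names here are assumptions:

  "Update_UnitPrices": {
    "action": "update",
    "parameters": {
      "left": "Load_SalesDataset",
      "right": "Extract_Product",
      "left_key": [ "ProductCode" ],
      "right_key": [ "ProductCode" ],
      "update_columns": { "UnitPrice": "UnitPrice" }
    }
  }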

SFDCRegister: Registers a dataset to make it available for use in queries; you cannot query unregistered datasets. Normally, this is the final operation in a dataflow.
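Finally, a register node is short. A minimal sketch, with an illustrative alias and source node:

  "Register_Final": {
    "action": "sfdcRegister",
    "parameters": {
      "alias": "SalesPerformance",
      "name": "Sales Performance",
      "source": "Filter_ClosedWon"
    }
  }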

Using Recipes

Now that we’ve taken a closer look at dataflows, let’s move on to recipes. Because of the similarities between the two, we don’t need to do a major deep dive, but remember: the dataflow tool is our top choice when combining data from multiple sources while recipes are great when performing transformations on a single dataset.

In a nutshell, a recipe is a user-interface tool that lets you: 1) take data from your existing datasets and connected objects, 2) apply transformations, and 3) load the results into a new dataset.

Recipes offer many of the same features available in dataflows; however, this tool also provides a visual interface where you can see how transformations affect the underlying data in the dataset. See the below image for an example.

Image: Previewing a lookup transformation in the recipe editor

Wrap Up

As fun as it may be to jump headfirst into Tableau CRM, having a problem-solving plan in place will help you implement your solutions much more smoothly and efficiently. It is crucial to clearly understand the issue your stakeholders are experiencing and to map out how you want to execute the solution.

Furthermore, utilizing dataflows and recipes for your Tableau CRM solution is a great way to manipulate the data into datasets. Plus, these tools can display your data in useful, digestible ways (for example, via dashboards) so your end users can truly see the value Salesforce data has to offer.

If you have any questions about this blog post or how to leverage Tableau CRM to solve enterprise business challenges, reach out today!
