Setup and Data Integration: New Google Analytics Connector, SAQL in Dataflow Filters, Event-Based Dataflow Scheduling
Load your website statistics with the Google Analytics connector. Create more targeted dataflow filters with SAQL. Schedule dataflows to run after data sync with event-based scheduling.
Analyze More Data Under the Einstein Analytics Plus License
For the Einstein Analytics Plus license, the maximum number of rows for all registered datasets has increased to 1 billion.
Step Confidently Through First-Time Analytics Setup
With the refreshed Getting Started page in Setup, enabling and setting up Analytics is now more intuitive. The page includes help with each step and recommended actions to complete your setup.
Bring Your Website Statistics into Analytics with the Google Analytics Connector
Use the Google Analytics connector to bring in data about your websites, and analyze the data using the power of Einstein Analytics.
Connect to Your Data in Oracle Eloqua and NetSuite (Beta)
Connect to even more enterprise application data with the new Oracle Eloqua and NetSuite connectors.
Sync More Data with the Snowflake Computing Connector
Previously, this connector could sync up to 1 million rows or 500 MB per object. Now it can sync up to 20 million rows or 10 GB per object, whichever limit is reached first.
Test Your Connections When You Create Them
You can now test connection settings when you create or update a connection. Previously, you discovered that a connection didn't work only when you tried to use it, and then had to return to the connection setup to fix it.
Create More Complex Dataflow Filters with SAQL
Filter nodes in your dataflows now support Salesforce Analytics Query Language (SAQL), so you can create more complex filters to get just the data that you need in your datasets. SAQL gives you a wider range of operators. For example, you can filter on an array of values, on partial matches, and on relative dates.
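For illustration, here are a few SAQL filter conditions of the kind a filter node can now evaluate (in a standalone SAQL query, each would follow "filter q by"). The object and field names used here, such as StageName, Account_Name, and CloseDate, are hypothetical placeholders rather than part of the release.

Filter on an array of values:
    'StageName' in ["Closed Won", "Closed Lost"]

Filter on a partial match:
    'Account_Name' matches "Acme"

Filter on a relative date range (rows that closed in the current quarter):
    date('CloseDate_Year', 'CloseDate_Month', 'CloseDate_Day') in ["current quarter" .. "current quarter"]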
Orchestrate Data Sync and Dataflow Runs with Event-Based Scheduling
Scheduling a dataflow for Salesforce objects used to require some planning and could be tricky if you didn't know how long the data sync would take. But now, by setting an event-based schedule to run the dataflow when the data sync finishes, you can run them back to back, without the guesswork. Use event-based scheduling to schedule your dataflow to run whenever your Salesforce Local connection finishes syncing.
Set Up Notifications Without Scheduling
Notification options for dataflows and connectors now have their own home. Previously, these options were hidden away with the schedule options. We made them easier to find by creating a separate item for them on dataflow and connection menus.
See Which Dataflows Are Taking Longer Than Expected
Find out when a dataflow has been running longer than a specified length of time. When a dataflow takes longer than expected, it can impact dependent dataflows and recipes scheduled to run afterward. Now you can set an elapsed-time notification that alerts you when a dataflow is still running, so you can take action.
Run More Data Jobs
Your dataflows and recipes run as data jobs in Analytics. As you work with more data, we want to make sure that you can run all these jobs, so we increased the number of data job runs allowed in a rolling 24-hour period from 50 to 60.
Get a Better Metadata View on the Dataset Edit Page
We added last queried and size information to the top of the dataset edit page to give you a complete at-a-glance metadata view whenever you’re working with a dataset. We also moved the extended metadata file information to the top to complete the picture.