The Databricks data lakehouse unifies the best of data warehouses and data lakes in one simple platform to handle all your data, analytics, and AI use cases. You can create a direct connection to export data from Crisp to Databricks via Databricks-to-Databricks Delta Sharing (an open protocol created by Databricks for the secure, real-time exchange of large datasets).

Requirements

To create this connection, you will need to meet the following requirements and gather the required information:

  • Have a Databricks account with admin access.
  • Set up a Unity Catalog metastore (with Delta Sharing enabled) that is associated with the workspace to which you want Crisp data to flow. To ensure your Databricks account is set up correctly to share data, make sure you have completed the following steps (you can click the links to see the relevant Databricks documentation).
    Note: Databricks documentation is cloud provider specific, so these links display Amazon Web Services documentation by default, but you can change the cloud provider in the top-right corner of the screen.
    1. Set up Delta Sharing for your account
    2. Create a Unity Catalog Metastore
    3. Enable Delta Sharing on a metastore
    4. Enable a workspace for Unity Catalog
    5. Create a workspace
  • Get your Databricks sharing identifier.
    1. Requesting a sharing identifier. Note: Though this article is written as if you were directing a data recipient to locate their sharing identifier, the steps for locating your own sharing identifier are the same.
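Once you have retrieved your sharing identifier, it should follow the shape Databricks documents: `<cloud>:<region>:<metastore-uuid>`. The following sketch is a hypothetical helper (not part of Crisp or Databricks) you could use as a sanity check before pasting the value into the connector form; the example identifier shown is made up.

```python
import re

# Matches the documented sharing identifier shape <cloud>:<region>:<metastore-uuid>.
SHARING_ID_PATTERN = re.compile(
    r"^(aws|azure|gcp):"   # cloud provider
    r"[a-z0-9-]+:"         # cloud region, e.g. us-east-1
    r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$"  # metastore UUID
)

def is_valid_sharing_identifier(identifier: str) -> bool:
    """Return True if the string looks like a Databricks sharing identifier."""
    return bool(SHARING_ID_PATTERN.match(identifier.strip()))
```

For example, `is_valid_sharing_identifier("aws:us-east-1:19a84bee-54bc-43a2-87ab-023d0ec16013")` returns `True`, while a truncated or reformatted value returns `False`.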

Setting up the connection in Crisp

  1. Log into your Crisp account.
  2. From the main menu at the top-right of the screen, select Outbound Connectors.

    Amazon_Redshift_003.png

  3. Select the Add connector button.

    Connecting_Retailer_Portals_002.png

  4. Select the Databricks connector tile. 

  5. If you want, update the default name of the connector, then enter or paste your Databricks sharing identifier. For instructions on getting your sharing identifier, see Databricks documentation > Requesting a sharing identifier. Note: Though this article is written as if you were directing a data recipient to locate their sharing identifier, the steps for locating your own sharing identifier are the same.

    Databricks_001.png


  6. From the Databricks region drop-down menu, select a Databricks cloud region (e.g., us-east-1). Note: You can select any cloud region, but for optimal performance we recommend selecting the same cloud region as your workspace in Databricks. If you need to look up your cloud region, you can find it on the Workspaces page in Databricks. For more information, see Databricks documentation > Update a workspace.

    Databricks_002.png
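Because the sharing identifier embeds the metastore's cloud region, you can read the recommended region straight out of the value you pasted in the previous step. This is a minimal sketch, assuming the identifier follows the documented `<cloud>:<region>:<metastore-uuid>` shape; the example identifier is made up.

```python
def region_from_sharing_identifier(identifier: str) -> str:
    """Extract the cloud region from a <cloud>:<region>:<uuid> sharing identifier."""
    _cloud, region, _uuid = identifier.strip().split(":")
    return region
```

For example, `region_from_sharing_identifier("aws:us-east-1:19a84bee-54bc-43a2-87ab-023d0ec16013")` returns `"us-east-1"`, the region to pick in the drop-down for optimal performance.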

  7. In the Connector source data section, select the Select button.
    Databricks_003.png
    The Select source data screen appears.

  8. Select the retailer(s) for which you want to export data to Databricks (e.g., UNFI), then choose which data tables and columns to include in the export. For detailed instructions on making your outbound data selections, see Selecting Outbound Data Sources.
    Excel_Outbound_003.png
    Hint: You can select the book icon to the right of a table name to see its documentation.

  9. To complete your selections, select the Accept button.

    Excel_Outbound_004.png
    The window closes and you return to the connector setup screen.


  10. Select the Save button.

    Connecting_Retailer_Portals_005.png

    The connection setup is complete. You can check the status of the connection by selecting the new Databricks tile.

Accessing Crisp Data in Databricks

The Crisp connection uses Databricks-to-Databricks Delta Sharing. For more information on accessing shared data in Databricks, see Databricks documentation > Read data shared using Databricks-to-Databricks Delta Sharing.
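On the Databricks side, the typical recipient step is to mount the share as a catalog with `CREATE CATALOG ... USING SHARE <provider>.<share>`. The sketch below builds that statement; the catalog, provider, and share names used in the usage comment are placeholders (the real provider and share names appear under Catalog > Delta Sharing > Shared with me in your workspace).

```python
def create_catalog_sql(catalog: str, provider: str, share: str) -> str:
    """Build the SQL statement that exposes a Delta share as a local catalog."""
    return f"CREATE CATALOG IF NOT EXISTS {catalog} USING SHARE {provider}.{share};"

# In a Databricks notebook you would then run, for example:
#   spark.sql(create_catalog_sql("crisp", "crisp_provider", "crisp_share"))
# after which the shared tables are queryable as crisp.<schema>.<table>.
```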