Introducing Local File Upload Connector in Data Cloud (Beta)

We know that Data Cloud is optimized for big data, but what if you have just a little bit of data? You don’t need a full ingestion pipeline to do a single, one-time ingestion. With the new Local File Upload connector (Beta), you can easily upload comma-separated value (CSV) files directly from your local drive and ingest them into data lake objects (DLOs). This feature is ideal for users who need to quickly bring in small amounts of data or who do not want to set up a full data pipeline. It’s also great for testing and experimentation.

When should I use the Local File Upload connector?

With so many options to get data into Data Cloud, choosing a path can sometimes be difficult. Here are a few recommendations:

  • If your file is on a local drive, under 10MB, and you only need to ingest it once, then the Local File Upload connector is likely a good choice.
  • If you need to ingest a Parquet file, use one of the cloud storage connectors (Amazon S3 Storage, Google Cloud Storage, Azure Storage).
  • You can ingest Excel files from OneDrive, SharePoint, or Google Drive using the connectors for those respective services. Soon you’ll be able to use the Local File Upload connector to upload small Excel files directly from your local drive too.
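The first recommendation above is easy to script. Here's a minimal sketch of that check in Python; the 10MB guideline comes from the recommendation above, while the helper name is our own:

```python
import os

MAX_UPLOAD_BYTES = 10 * 1024 * 1024  # the 10MB guideline noted above

def is_local_upload_candidate(path: str) -> bool:
    """Return True if a local file looks like a fit for Local File Upload:
    a CSV file under the 10MB guideline."""
    return path.lower().endswith(".csv") and os.path.getsize(path) <= MAX_UPLOAD_BYTES
```

Anything that fails this check (a Parquet file, a large file, or data that needs recurring refreshes) is better served by one of the other connectors listed above.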

Enabling the Local File Upload connector in Data Cloud

Since the Local File Upload connector is in Beta, you’ll need to start by enabling Beta Connectors. To do this, navigate to Data Cloud Setup, click Feature Manager, and make sure Beta Connectors is enabled.

Note: Unlike other connectors, you don’t need to create a Local File Upload connection; one will automatically be created for you when you enable Beta connectors.


Image of the Data Cloud Setup screen with Feature Manager and Beta functionality.

Using the Local File Upload connector

Navigate to the Data Streams tab and click New.


Image of the Data Streams page with the New button, Delete Data Stream button, and Update Status button in the top-right corner.

Click the File Upload tile and then click Next.


Image of the File Upload tile on the New Data Stream page.

Click Upload Files to choose a file using your browser, or drag and drop a file onto the target.


Image of the New Data Stream modal displaying the Upload File button and the Drop Files option.

When the file upload notification appears, click Next if the wizard doesn’t advance automatically.

Next you’ll see a preview of your data, making it easy to verify that your file was read correctly.

Choose your DLO category and select the primary key.

Next, provide a name for your DLO, or accept the automatically generated name.


A screenshot showing the creation of a new profile data stream with fields and the data lake object label and API name.

To see the automatically detected data types, or to change the type of any field, click the Supported Fields tab. You can also change the column labels and API names on this tab.


A screenshot of the Supported Fields tab when creating a new data stream.
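To picture how automatic type detection works, here is an illustrative Python sketch; this is not the connector's actual algorithm, just a simplified version of the idea: try to parse every value in a column as a number or date, and fall back to text:

```python
import csv
from datetime import datetime

def infer_type(values):
    """Illustrative only: guess a column type from its string values."""
    def all_parse(parse):
        try:
            for v in values:
                parse(v)
            return True
        except ValueError:
            return False

    if all_parse(int) or all_parse(float):
        return "Number"
    if all_parse(lambda v: datetime.strptime(v, "%Y-%m-%d")):
        return "Date"
    return "Text"

def infer_csv_types(path):
    """Map each header column to a guessed type, similar to what the
    Supported Fields tab displays after upload."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    return {col: infer_type([row[i] for row in data]) for i, col in enumerate(header)}
```

If the detected type for a field isn’t what you want, this is the tab where you override it before ingestion.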

Once the ingestion has started, you’ll see the Data Stream page for your new data stream. Ingestion usually completes within a few seconds! You can use the Data Explorer or Query Editor to view your ingested data.


Image of the newly created data stream and associated data lake object.

Note: The Last Run Status, Last Refreshed, Last Processed Records, and Total Records columns will not be populated. You can use the Query Editor or Data Explorer to query your new DLO and confirm the data has been ingested. In a future release, these fields will be populated automatically.

Key considerations

  • A header row is required in your CSV file.
  • Data streams created using Local File Upload cannot be scheduled or refreshed.
  • Currently, status fields are not automatically populated when the data stream is created.
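The considerations above lend themselves to a quick pre-flight check before you upload. This hedged sketch uses Python's `csv.Sniffer`, whose header detection is heuristic, so treat its result as a hint rather than a guarantee:

```python
import csv
import os

def preflight_csv(path, max_bytes=10 * 1024 * 1024):
    """Check a file against the considerations above: a header row is
    required, and the file should stay within the 10MB guideline.
    Returns a list of problems; an empty list means it looks good."""
    problems = []
    if os.path.getsize(path) > max_bytes:
        problems.append("file exceeds the 10 MB guideline")
    with open(path, newline="") as f:
        sample = f.read(4096)
    # csv.Sniffer's header detection is heuristic; treat this as a hint.
    if not csv.Sniffer().has_header(sample):
        problems.append("no header row detected (a header row is required)")
    return problems
```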

Known issues

New functionality and Beta features sometimes come with a few issues, and this is no exception. We are currently working to resolve a bug with the Local File Upload connector (Beta). Until the bug is resolved, attempts to create data streams and data lake objects with the Engagement category will fail. When this issue is resolved, we will update this blog post.

Conclusion

Data Cloud gives you the flexibility to ingest data from your files no matter where you store them. The new Local File Upload connector will make it easier to quickly get (relatively) small amounts of data into Data Cloud for testing or validation, while existing connectors give you all the features you need to set up recurring jobs for files of any size.

About the authors

Danielle Larregui is a Senior Developer Advocate at Salesforce focusing on the Data Cloud platform. She enjoys learning about cloud technologies, speaking at and attending tech conferences, and engaging with technical communities. You can follow her on LinkedIn.

Ken Reppart is a Director of Product Management at Salesforce Data Cloud. He is a customer-obsessed data nerd who enjoys writing SQL and using Tableau to create dashboards for visualizing and understanding product usage. Follow Ken on LinkedIn.

The post Introducing Local File Upload Connector in Data Cloud (Beta) appeared first on Salesforce Developers Blog.