From Hype to Reality: 4 Things I Learned Implementing Zero Copy with Data 360

When I first came across the concept of Zero Copy for Data 360, I wondered how it worked and wanted to investigate further.

I learned that Zero Copy eliminates the need to move data, enabling organizations to act upon it directly where it resides without duplication and in real time. It seemed like a highly promising solution.

Feasibility ultimately depends on the architecture and ecosystem of each organization. And as with any emerging technology, real-world application comes with nuances.

This is why I want to share four key insights I gained from translating the promise of Zero Copy for Data 360 into practice. If you are considering or are already engaged in a Zero Copy initiative with Data 360, these lessons may help you avoid setbacks.

1. Designing and planning Zero-Copy implementations

Too often, we focus more on the technical side of a project than on something more fundamental: defining use cases and asking “What problem am I really trying to solve?” The design and planning stages of Zero-Copy implementations usually account for 80% of the project’s success (or failure).

What users say they want does not always align with what they truly need. Nor is it enough to rely on what we assume they need. Identifying the real needs is critical in the early stages.

Another takeaway: never configure without a solid design. That said, it is perfectly natural to want to see how the concept works in practice. This is where a PoC is invaluable (more on that below). The point is that large-scale implementations are not necessary from the start.

Recommendations for planning a Zero Copy implementation, based on my experience

  • Conduct a comprehensive inventory of your data sources — origins, formats, current volumes, and projected growth. From this, build a data dictionary that will serve as your primary reference throughout the implementation.
  • Identify the data you need and how frequently it is updated or queried. This step is critical for defining the ingestion type (see the sketch after this list).
  • Whenever possible, design a transformation and harmonization plan outside of Data 360. This helps prevent system overload and ensures that only the highest-quality data enters the environment.
  • Estimate future query loads through simulations. Doing so allows you to anticipate potential costs and performance latencies.
  • Most importantly, document all metadata, relationships, and design decisions. This documentation will become your roadmap when the project scales.
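
To make that last point concrete, here is a minimal sketch of what a data-dictionary entry might look like in Python. Every name, source, and number in it is hypothetical; the point is simply to record origin, format, volume, projected growth, refresh frequency, and the ingestion decision in one place.

```python
from dataclasses import dataclass

@dataclass
class SourceEntry:
    """One row of the data dictionary: where a dataset lives and how it behaves."""
    name: str              # logical dataset name
    origin: str            # physical location, e.g., "snowflake.sales_db.orders" (hypothetical)
    fmt: str               # table or file format
    rows_today: int        # current volume
    yearly_growth: float   # projected growth rate
    refresh: str           # how often the source changes
    ingestion: str         # "live_query", "acceleration", "file_federation", or "copy"

data_dictionary = [
    SourceEntry("orders", "snowflake.sales_db.orders", "table",
                rows_today=120_000_000, yearly_growth=0.25,
                refresh="hourly", ingestion="acceleration"),
    SourceEntry("web_events", "s3://lake/web_events/", "parquet",
                rows_today=2_000_000_000, yearly_growth=0.60,
                refresh="streaming", ingestion="file_federation"),
]

# Projected volume a year from now feeds directly into the query-load simulations.
for entry in data_dictionary:
    print(entry.name, f"{int(entry.rows_today * (1 + entry.yearly_growth)):,} rows next year")
```

Even a lightweight structure like this keeps each ingestion decision tied to the facts that justify it, which pays off when the project scales.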

2. Zero Copy in action: getting results

There is no single approach to implementing Zero Copy. In fact, there are several ways to do it. The key lies in determining which option — or combination of options — best fits your use cases.

Once you have gathered the necessary requirements, I recommend building a proof-of-concept (PoC) — it is an effective way to evaluate new functionality in depth.

The good news is that Zero Copy greatly facilitates the creation of a PoC: with a solid plan and clearly defined objectives, the initial connection can be established in just a few clicks. But it’s critical to define use cases and data sources in advance.

Getting from a proof-of-concept to results

  • Technical reality: Even if you are not duplicating data, you remain dependent on the quality, governance, and performance of the original sources.
  • Architectural considerations: Every architecture is unique; what works in one environment may not scale effectively in another. The technological maturity of the organization must be carefully assessed.
  • Security as a foundation: Permissions and governance are not details; they form the backbone of the entire model.
  • Trade-offs in efficiency: While reduced duplication lowers storage needs, it shifts the focus to query performance and costs. For intensive use cases, I recommend copying targeted subsets of data into Data 360 to optimize queries.
  • The human factor: Shifting the business mindset from “I want all the data on the platform” to “work with the data where it resides” can be as challenging as the technical implementation, if not more so.
  • Iteration is key: Start small, test, measure, and adjust. This incremental approach makes it possible to build a sustainable Zero-Copy reality.

In my experience, Zero Copy delivers results, but as with any innovation, success lies in how well the details are executed.

3. Dual cost: How to maximize value with Zero Copy

One might assume that “not moving data” automatically means lower cost, but a closer look reveals more nuance.

Today, there are three methods for implementing Zero Copy: Live Query, Cached Acceleration, and File Federation. It’s crucial to understand your use cases so you can choose the best connection strategy.

Your costs will vary depending on which method you implement:

  • Live Query gives you the ability to execute federated queries against an external storage system, e.g., a data lake. This method results in costs both in the external storage system and in Data 360. This is what I mean by dual cost: you’re using resources on both sides, Salesforce and the external source, when accessing data without copying it. In Data 360, the cost is 70 credits per million rows retrieved.
  • Cached Acceleration makes sense when processing performance is important. Acceleration implies caching the data, hence much faster access. The cost is 2,000 credits per million rows cached, and it can be configured in incremental mode. You will also incur costs in your external data storage system.
  • File Federation is the third way to connect to external data lakes and, in my experience, the best. It consumes the same number of credits as Live Query but does not generate any cost on the external lake. And if your data lake and Data 360 are in the same region, the cost is almost zero on both sides.

If you frequently query large tables, process heavy transformations, or perform complex joins and you aren’t using in-region file federation, your external costs can grow just as much as your internal Salesforce cost.
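
To see how the dual cost plays out, here is a rough back-of-the-envelope comparison. The credit rates are the ones cited above (70 credits per million rows retrieved, 2,000 credits per million rows cached); the workload figures are hypothetical, so treat this as a template rather than a price list.

```python
# Hypothetical monthly workload against one external table.
QUERIES_PER_MONTH = 3_000        # federated queries issued by segments and activations
ROWS_PER_QUERY = 2_000_000       # rows retrieved per query
CACHED_ROWS = 50_000_000         # rows held in the accelerated cache
REFRESHES_PER_MONTH = 30         # incremental cache refreshes
ROWS_PER_REFRESH = 1_000_000     # new or changed rows per refresh

LIVE_QUERY_RATE = 70 / 1_000_000        # credits per row retrieved (Live Query / file federation)
ACCELERATION_RATE = 2_000 / 1_000_000   # credits per row cached

# Option 1: pure Live Query. Every query also consumes compute on the external
# warehouse, which is the other half of the dual cost and is not measured in credits.
live_query_credits = QUERIES_PER_MONTH * ROWS_PER_QUERY * LIVE_QUERY_RATE

# Option 2: cached acceleration. Pay to cache once, then refresh incrementally;
# the external source is hit mainly on refresh.
acceleration_credits = (CACHED_ROWS + REFRESHES_PER_MONTH * ROWS_PER_REFRESH) * ACCELERATION_RATE

print(f"Live Query:   {live_query_credits:,.0f} credits per month + external compute")
print(f"Acceleration: {acceleration_credits:,.0f} credits per month")
```

With these made-up numbers the cached option comes out cheaper on the Salesforce side, but reverse the query frequency and the balance flips; the value is in running your own figures, not in the example itself.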

To minimize total expenditure:

  • Use queries that retrieve only what’s strictly necessary.
  • Reduce how often you perform federated queries.
  • Identify queries that run repeatedly and cache their results, avoiding repeated hits on the external source (a small sketch of this pattern follows the list).
  • Use file federation when your external infrastructure (e.g., Snowflake, Databricks) is located in the same region as your Data 360 setup.
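
For the third point above, caching repeated results can start with something as simple as memoizing the query call on your side. This is a minimal sketch with a hypothetical run_federated_query helper and a time-based expiry; in practice you would lean on Cached Acceleration or your own caching layer rather than an in-process dictionary.

```python
import time
from functools import wraps

def cache_with_ttl(ttl_seconds=900):
    """Memoize query results so identical repeat queries skip the external source."""
    def decorator(fn):
        store = {}
        @wraps(fn)
        def wrapper(sql):
            hit = store.get(sql)
            if hit and time.time() - hit[0] < ttl_seconds:
                return hit[1]                 # still fresh: no federated call, no dual cost
            result = fn(sql)                  # otherwise pay the dual cost once
            store[sql] = (time.time(), result)
            return result
        return wrapper
    return decorator

@cache_with_ttl(ttl_seconds=900)
def run_federated_query(sql):
    # Hypothetical placeholder for whatever actually executes the federated query.
    ...
```

The TTL is the governance knob: the longer a result is allowed to live, the fewer federated hits you pay for, at the price of staler data.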

Always monitor and compare both cost sides — Salesforce and your external source — so you can decide when to use pure Zero Copy, when to rely on caching, or even when copying data locally might be more cost-effective overall.

It’s also essential to design a total cost of ownership (TCO) model that covers both sides — Salesforce and the external data store or lake (including file federation options) — to pinpoint when it makes sense to use full Zero Copy, a hybrid approach, or traditional ingestion.

4. Optimizing Zero-Copy performance

When working with complex audiences or executing frequent activations, federated queries can encounter elevated latency or scalability challenges. Common underlying causes include:

  • Queries that are overly broad or insufficiently filtered on external datasets
  • Inefficient structures or poorly designed indexes within the external storage or data lake
  • Resource saturation in the external system supplying the data
  • “Joins” between data already stored in Data 360 and federated data, which often result in performance degradation

How to get closer to real-time performance with Zero Copy

  • Apply the maximum possible filtering before federating (restrict columns, define conditions at the source).
  • Leverage caching or selective acceleration for the most frequently queried datasets.
  • Proactively anticipate demand spikes by stress-testing with heavy loads to identify bottlenecks (see the sketch after this list).
  • In certain cases, consider temporarily “promoting” frequently queried data into Data 360.
  • Whenever feasible, leverage file federation in alignment with your ecosystem, as it is often the more effective option.
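
On the stress-testing point, even a small script that fires concurrent federated queries and records latencies will surface bottlenecks before an activation does. This is only a sketch: run_federated_query is a stand-in, and the concurrency levels and query are arbitrary.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_federated_query(sql):
    # Stand-in for the real federated call; replace with your actual query path.
    time.sleep(0.1)

def measure(sql, concurrency, requests=50):
    """Run the query `requests` times with `concurrency` workers and report latency."""
    def timed_call(_):
        start = time.perf_counter()
        run_federated_query(sql)
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(timed_call, range(requests)))

    p95 = latencies[int(len(latencies) * 0.95) - 1]
    median = latencies[len(latencies) // 2]
    print(f"concurrency={concurrency}: median={median:.2f}s, p95={p95:.2f}s")

# Ramp up until latency degrades; the level where p95 jumps is your bottleneck.
for level in (1, 5, 20):
    measure("SELECT order_id FROM federated_orders WHERE region = 'EMEA'", level)
```

If the p95 climbs sharply at a concurrency you expect in production, that is the signal to filter harder, accelerate that dataset, or promote it into Data 360.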

In summary, the promise of real time can only be delivered when performance is deliberately and carefully optimized. 

With Zero-Copy experience comes wisdom

Zero Copy for Data 360 enables real-time use of data without duplication, but success depends on discipline and strategy. The essentials are clear: strong design and planning, well-defined use cases, and starting with a proof-of-concept to validate architecture and performance.

Costs must be viewed holistically. Federated queries consume resources in both Salesforce and external platforms, making continuous monitoring crucial. Governance and data quality are equally critical: poor source data or unmanaged schema changes can undermine results.

Ultimately, Zero Copy is not only a technical shift but also a cultural one, requiring organizations to move from “copy everything” to “work where data resides.” Treated strategically, it delivers sustainable value beyond the hype.