
Salesforce to Replace RSA Key Exchanges with TLS 1.3

IMPORTANT: Salesforce is enhancing Transport Layer Security (TLS) measures for customers. Starting May 1, 2025, Salesforce will no longer support RSA key exchanges for incoming TLS connections. TLS 1.3 will become the preferred protocol for Salesforce, but TLS 1.2 will […]

Salesforce to Replace RSA Key Exchanges with TLS 1.3 Read More »
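As a rough illustration of what this change means for client code: TLS 1.3 drops static RSA key exchange entirely, and on TLS 1.2 a server can still negotiate a forward-secret (ECDHE) cipher suite, so a client that enforces TLS 1.2 or newer with a modern default context should remain compatible. A minimal sketch using Python's standard `ssl` module (the host and exact policy here are illustrative, not Salesforce-specified):

```python
import ssl

# A default context already prefers TLS 1.3 when both sides support it
# and verifies certificates; we additionally refuse anything below TLS 1.2.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# After connecting (e.g. to login.salesforce.com:443), you could inspect
# the negotiated protocol with ssl_sock.version() -- "TLSv1.3" or "TLSv1.2".
print(ctx.minimum_version)
```

Most modern HTTP clients (requests, urllib3) accept such a context, so the same floor can be applied without changing application logic.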

5 Questions Admins Should Ask When Building an Agent With Agentforce

It’s possible that by now you’ve been to a Salesforce event or gone through some Trailhead modules, and you’ve built your first agent in Salesforce. What’s even more likely is that you know about artificial intelligence (AI) and have in some way been involved in company discussions about its use. For me, those types of discussions

5 Questions Admins Should Ask When Building an Agent With Agentforce Read More »

GIFT-Eval: A Benchmark for General Time Series Forecasting Model Evaluation

TL;DR: Time series forecasting is becoming increasingly important across various domains, so high-quality, diverse benchmarks are crucial for fair evaluation across model families. Such benchmarks also help identify model strengths and limitations, driving continued advances in the field. GIFT-Eval is a new comprehensive benchmark designed for evaluating general time series forecasting models, particularly foundation

GIFT-Eval: A Benchmark for General Time Series Forecasting Model Evaluation Read More »

How to Use the Python Connector for Data Cloud

The ability to efficiently access data wherever it resides is crucial when building visual data models, performing analytical operations, or training machine learning models. The Data Cloud Python Connector abstracts Data Cloud’s Query APIs to help developers quickly authenticate and access data within Data Cloud. In this blog post, we’ll delve into the key features

How to Use the Python Connector for Data Cloud Read More »

Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts

TL;DR: We propose Moirai-MoE, the first mixture-of-experts time series foundation model, achieving token-level model specialization in a data-driven manner. Extensive experiments on 39 datasets reveal that Moirai-MoE delivers up to 17% performance improvements over Moirai at the same level of model size and outperforms other time series foundation models, such as Chronos (from Amazon) and

Moirai-MoE: Empowering Time Series Foundation Models with Sparse Mixture of Experts Read More »