How Agents Can Take Smarter Actions With Prompt Builder

Agentforce is now generally available, empowering enterprises to build, customize, and deploy autonomous and assistive artificial intelligence (AI) agents. This technology helps you foster both customer and employee success. But where does Prompt Builder fit in? Does Agentforce replace Prompt Builder? Should you bypass prompt creation and dive straight into building agents? We’ll show you

Who Is an Agentblazer?

You’ve heard about AI agents that can take on tasks and innovate at companies in ways we never thought possible. But what can they do for you? How can they save your company money and make your operations more efficient? How will they shape the future? If you’re asking these questions, you’re an Agentblazer. And

Dynamic Memory Networks for Visual and Textual Question Answering

Neural network architectures with memory and attention mechanisms exhibit certain reasoning capabilities required for question answering. One such architecture, the dynamic memory network (DMN), obtained high accuracy on a variety of language tasks. However, it was not shown whether the architecture achieves strong results for question answering when supporting facts are not marked during training

MetaMind Neural Machine Translation System for WMT 2016

Neural Machine Translation (NMT) systems, introduced only in 2013, have achieved state-of-the-art results in many MT tasks. MetaMind’s submissions to WMT ’16 seek to push the state of the art in one such task, English→German news-domain translation. We integrate promising recent developments in NMT, including subword splitting and back-translation for monolingual data

The WikiText Long Term Dependency Language Modeling Dataset

The WikiText language modeling dataset is a collection of over 100 million tokens extracted from the set of verified Good and Featured articles on Wikipedia. The dataset is available under the Creative Commons Attribution-ShareAlike License. Compared to the preprocessed version of Penn Treebank (PTB), WikiText-2 is over 2 times larger and WikiText-103 is over 110

Teaching Neural Networks to Point to Improve Language Modeling and Translation

Imagine you were a young child and wanted to ask about something. Being so young (and assuming you are not exceedingly precocious), how would you describe a new object whose name you have yet to learn? The intuitive answer: point to it! Surprisingly, neural networks have the same issue. Neural networks typically use

New Neural Network Building Block Allows Faster and More Accurate Text Understanding

In deep learning, there are two very different ways to process input (like an image or a document). Typically, images are processed all at once, with the same kind of computation happening for every part of the image simultaneously. But researchers have usually assumed that you can’t do this for text data: that you need
