Snowflake Summit 2024: Snowflake and NVIDIA partner to revolutionize custom AI applications for enterprises


At the Snowflake Summit 2024, Snowflake, the AI Data Cloud company, announced an innovative partnership with NVIDIA designed to help customers and partners create tailored AI applications within the Snowflake environment. This collaboration leverages NVIDIA AI technology to enhance Snowflake’s capabilities, facilitating seamless integration and optimization for diverse business needs.

As businesses increasingly seek to harness the power of AI, there is a growing demand for data-driven customization. Snowflake and NVIDIA’s collaboration allows organizations to quickly develop bespoke AI solutions tailored to specific use cases. This empowers enterprises across various industries to fully realize the potential of AI.

Snowflake Cortex AI and NVIDIA AI Enterprise Software integration

The collaboration focuses on integrating key technologies from NVIDIA AI Enterprise software, including NeMo Retriever, into Cortex AI. This empowers business users to efficiently build and leverage custom AI applications, maximizing their AI investments.

Here’s a breakdown of the key functionalities:

NVIDIA NeMo Retriever: Delivers high-accuracy information retrieval with exceptional performance, ideal for businesses building retrieval-augmented generation-based AI applications within Cortex AI.

NVIDIA Triton Inference Server: Provides the ability to deploy, run, and scale AI inference for any application across any platform.
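To illustrate that deployment pattern, the minimal sketch below sends one inference request to a Triton server using the open-source Python HTTP client. The server address, model name, and tensor names ("text_classifier", "input_ids", "logits") are hypothetical placeholders rather than anything defined by the Snowflake integration.

```python
# pip install "tritonclient[http]" numpy
import numpy as np
import tritonclient.http as httpclient

# Connect to a Triton server assumed to be running locally on the default HTTP port.
client = httpclient.InferenceServerClient(url="localhost:8000")

# Hypothetical token IDs for a model named "text_classifier" that takes an INT64
# "input_ids" tensor and returns "logits"; adjust to match whatever model is
# actually loaded in the server's model repository.
input_ids = np.array([[101, 2023, 2003, 1037, 3231, 102]], dtype=np.int64)
infer_input = httpclient.InferInput("input_ids", list(input_ids.shape), "INT64")
infer_input.set_data_from_numpy(input_ids)

response = client.infer(model_name="text_classifier", inputs=[infer_input])
print(response.as_numpy("logits"))
```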

Furthermore, NVIDIA NIM inference microservices, pre-built AI containers included in NVIDIA AI Enterprise, can be deployed directly within Snowflake as a native application powered by Snowpark Container Services. This empowers organizations to effortlessly deploy a range of foundation models right within the Snowflake platform.
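To make that deployment path concrete, the sketch below shows the general Snowpark Container Services pattern such native apps build on: create a GPU compute pool, then launch a containerized model image as a service. Every name, the instance family, and the image path are illustrative assumptions, not the actual NIM packaging shipped through the native app.

```python
# pip install snowflake-snowpark-python
from snowflake.snowpark import Session

# Illustrative connection parameters; substitute real account credentials.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "role": "CONTAINER_ADMIN",
    "warehouse": "MY_WH",
}).create()

# A small GPU compute pool to host the container (name and sizing are hypothetical).
session.sql("""
CREATE COMPUTE POOL IF NOT EXISTS nim_gpu_pool
  MIN_NODES = 1
  MAX_NODES = 1
  INSTANCE_FAMILY = GPU_NV_S
""").collect()

# Run a containerized model image from a Snowflake image repository as a service.
session.sql("""
CREATE SERVICE llm_service
  IN COMPUTE POOL nim_gpu_pool
  FROM SPECIFICATION $$
spec:
  containers:
  - name: llm
    image: /my_db/my_schema/my_repo/nim_llm:latest
$$
""").collect()
```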

Enhanced support for Snowflake Arctic

Snowflake Arctic, the industry’s most open, enterprise-grade LLM, will be accelerated with NVIDIA TensorRT-LLM software, delivering markedly faster inference for users.

Snowflake Arctic, the state-of-the-art LLM launched in April 2024 and trained on NVIDIA H100 Tensor Core GPUs, is now available as an NVIDIA NIM, enabling users to get started within seconds. The NVIDIA-hosted Arctic NIM is live on the NVIDIA API catalog, where developers can access it with free credits, and it will also be offered as a downloadable NIM, giving users more flexibility to deploy the most open enterprise LLM on their preferred infrastructure.
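For developers who want to try the hosted Arctic NIM, the sketch below calls it through the NVIDIA API catalog's OpenAI-compatible endpoint. The base URL and the "snowflake/arctic" model identifier are assumptions based on the catalog's usual conventions; confirm both on the Arctic page in the catalog, where the API key is also generated.

```python
# pip install openai
from openai import OpenAI

# The API catalog exposes OpenAI-compatible endpoints; the base URL and model id
# below are assumptions to be confirmed against the Arctic listing in the catalog.
client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",
    api_key="nvapi-...",  # key generated from the NVIDIA API catalog (free credits)
)

completion = client.chat.completions.create(
    model="snowflake/arctic",
    messages=[{"role": "user",
               "content": "Explain in one sentence what Snowflake Arctic is."}],
    max_tokens=128,
    temperature=0.2,
)
print(completion.choices[0].message.content)
```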

Earlier this year, Snowflake and NVIDIA expanded their initial collaboration to create a unified AI infrastructure and compute platform within the AI Data Cloud. This latest announcement at Snowflake Summit 2024 marks significant strides in their mission to support customers on their AI journeys.
