Nvidia and Snowflake announce partnership for custom generative AI models

Nvidia (NVDA) and Snowflake (SNOW) have announced a new partnership that will allow the cloud data company’s more than 8,000 customers to build their own AI assistants.

The news was announced during a conversation between Nvidia CEO Jensen Huang and Snowflake CEO Frank Slootman at the Snowflake Summit on Monday.

The move will allow Snowflake users to build custom AI models using their own internal data. That’s a big deal for companies that want to take advantage of large language models and generative AI but need company-specific, focused answers to their queries.

Jensen Huang, Co-Founder, President and CEO of Nvidia Corporation. (AP Photo/Ross D. Franklin, file)

This means that Snowflake customers will be able to, among other things, build their own AI-powered, self-contained chatbots that pull information from their massive internal databases.

Nvidia provides the core toolkit, called NeMo, which includes a large language base model that Snowflake customers can customize using their own data. Nvidia will also provide the infrastructure, including the GPUs customers will need to train their generative AI models.
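The announcement doesn’t spell out what that workflow looks like in code, and NeMo’s own pipeline is config-driven and considerably more involved, but the general pattern the companies describe — start from a pretrained base model, then fine-tune it on a company’s internal text — can be sketched with the open source Hugging Face transformers library. The base model name, sample documents, and training settings below are hypothetical placeholders, not Snowflake’s or Nvidia’s actual setup.

```python
# Illustrative sketch only: fine-tune a pretrained causal language model on
# a company's own text, the general pattern described in the article.
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "gpt2"  # small stand-in for a vendor-supplied foundation model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Hypothetical internal documents a customer might fine-tune on.
internal_docs = [
    "Q3 revenue for the EMEA region grew 12% quarter over quarter.",
    "Support tickets tagged 'billing' must be escalated within 24 hours.",
]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

dataset = Dataset.from_dict({"text": internal_docs}).map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="custom-assistant",  # hypothetical output path
        num_train_epochs=1,
        per_device_train_batch_size=1,
    ),
    train_dataset=dataset,
    # Causal language modeling: the model learns to continue the company's text.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # at real scale, this is the step that needs Nvidia GPUs
```

In the Snowflake arrangement, the equivalent steps would run against data the customer already keeps in Snowflake, with NeMo supplying the base model and Nvidia’s GPUs doing the training.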

In May, the graphics chip maker announced a somewhat similar partnership with ServiceNow (NOW). Rather than letting customers train their own generative AI models, ServiceNow trains the models itself. The idea is to give customers a quick way to take advantage of AI’s generative capabilities without necessarily having to train the models on their own data.

This also isn’t the first software offering that lets companies create their own AI applications.

In May, Microsoft (MSFT) announced the launch of its Azure AI Studio, which lets Microsoft customers create custom AI-powered applications called copilots. Those copilots, like Snowflake’s offering, can take a number of forms, including chatbots.

Generative AI exploded onto the scene when OpenAI launched ChatGPT in November 2022. Since then, companies ranging from Microsoft to Google (GOOG, GOOGL) to Meta (META) and Amazon (AMZN) have either released products or discussed working on the technology.

But Nvidia, which has been investing in developing both chips designed to power AI systems and the software that runs them, has easily been one of the biggest beneficiaries of the boom.

The company’s shares are up 159% over the past 12 months and 189% year to date. The reason? Nvidia is the go-to company when it comes to AI chips, because graphics chips turn out to be exceptionally good at the kind of parallel processing AI requires.

Sure, AMD has its own graphics capabilities, and Intel is building its own AI offerings, but Nvidia is at the top of the heap. And for the foreseeable future, it will remain there.

Daniel Howley is a tech editor at Yahoo Finance. He has been covering the tech industry since 2011. You can follow him on Twitter @DanielHowley.

