Snowflake Puts LLMs In The Hands Of SQL And Python Coders
Cloud data warehouse biz Snowflake has launched a fully managed service designed to spare developers who want to build LLMs into their applications the onerous task of creating the supporting infrastructure.
Currently in private preview, the bundle of services built under the Snowflake Cortex brand aims to provide the building blocks for employing large language models (LLMs) inside data-driven applications such as anomaly detection.
The service comes in the form of scale-up/scale-down serverless functions, which can be called up using SQL or Python code, helping developers build applications without requiring AI expertise or complex GPU-based infrastructure management, the vendor said.
Snowflake was founded in 2012 and was one of the early proponents of separating storage from compute to make a more flexible data warehouse and analytics architecture. In late 2020, it listed on the New York Stock Exchange, after which its market capitalization briefly hit a staggering $120 billion. US food giant Kraft Heinz, Swiss-Swedish engineering group ABB, and insurance and pensions company Axa are customers.
Sridhar Ramaswamy, senior veep of AI, told us that Cortex hosts a set of commonly used machine learning and language models, including Meta's Llama, as well as Snowflake's own models, such as a text-to-SQL LLM set to launch in private preview.
"This is great for our users because they don't have to do any provisioning. We do the deployment. It looks just like an API, similar to, say, what OpenAI offers, but it is run right within Snowflake Data Cloud," he said.
As well as general-purpose functions like text-to-SQL, the service comes with a set of task-specific functions geared around sentiment analysis, question answering, text summarizing, and translation.
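A sketch of how those task-specific functions might be called from SQL inside a Snowpark Python application follows; the SNOWFLAKE.CORTEX.SENTIMENT, TRANSLATE, and SUMMARIZE names are assumptions and may differ in the preview.

```python
# Assumed SNOWFLAKE.CORTEX.* function names; treat as illustrative.
from snowflake.snowpark import Session


def enrich_reviews(session: Session):
    # Score sentiment, translate German reviews to English, and summarize them
    # with plain SQL functions, rather than standing up a separate ML service.
    return session.sql("""
        SELECT
            review_id,
            SNOWFLAKE.CORTEX.SENTIMENT(review_text)              AS sentiment_score,
            SNOWFLAKE.CORTEX.TRANSLATE(review_text, 'de', 'en')  AS review_en,
            SNOWFLAKE.CORTEX.SUMMARIZE(review_text)              AS summary
        FROM product_reviews
    """).collect()
```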
James Malone, Snowflake's director of product management, told The Register that the services would help developers use machine learning as part of their applications. "If you want to do predictive analytics, you can do all that through Snowflake just using simple functions. You don't have to develop an ML application to create something like anomaly detection. You call an anomaly detection routine directly in your code. We're trying to package up and bring GenAI and ML models to Snowflake users without them needing to be an ML developer."
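As a sketch of the "call a routine directly" workflow Malone describes, a built-in anomaly detection model might be created and invoked along the following lines; the SNOWFLAKE.ML.ANOMALY_DETECTION object and its parameter names are assumptions drawn from Snowflake's documented ML functions, not the preview's confirmed interface.

```python
# Illustrative only: object and parameter names are assumptions.
from snowflake.snowpark import Session


def detect_sales_anomalies(session: Session):
    # Train a built-in anomaly detection model on historical sales...
    session.sql("""
        CREATE OR REPLACE SNOWFLAKE.ML.ANOMALY_DETECTION sales_model(
            INPUT_DATA => SYSTEM$REFERENCE('VIEW', 'sales_history'),
            TIMESTAMP_COLNAME => 'ts',
            TARGET_COLNAME => 'amount',
            LABEL_COLNAME => ''
        )
    """).collect()
    # ...then flag outliers in the latest data with a single call.
    return session.sql("""
        CALL sales_model!DETECT_ANOMALIES(
            INPUT_DATA => SYSTEM$REFERENCE('VIEW', 'sales_latest'),
            TIMESTAMP_COLNAME => 'ts',
            TARGET_COLNAME => 'amount'
        )
    """).collect()
```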
But users might want to keep an eye on the efficient use of resources when reaching for fashionable LLMs to solve problems already tackled by more straightforward ML approaches.
Alex Savage, head of digital analytics at broadcaster ABS-CBN Corporation, said in a blog post: "Due to their well-established algorithms and less intensive resource requirements, traditional ML models often outperform other methods in terms of speed and accuracy. They provide a reliable foundation for anomaly detection, identifying irregularities with minimal computational overhead."
While LLMs might offer language comprehension for real-time prediction in video streaming, for example, their computational needs should also be considered.
"Compared to traditional ML models, they are typically more resource-intensive, necessitating significant processing power to parse and comprehend the 'language' of streaming data," he said.
Snowflake might have made high-performance analytics more accessible to a new generation of data scientists and developers through its Data Cloud platform, but critics have said that its consumption-based pricing model has also led to unexpected costs.
As a result, optimization has come back into fashion. In August, e-commerce platform Instacart, a prominent Snowflake customer, showed how its payments to the data analytics vendor had fallen over the years.
Snowflake was forced to explain that it had worked with Instacart to "meet the company's massive demand growth, and then to optimize for efficiency. Optimizations are undertaken on a workload-by-workload basis and have been extremely successful."
Snowflake is now pushing a raft of new LLM and machine learning features, including new LLM capabilities for its Python app framework Streamlit, but this time around, users might be more wary of consumption before they jump into these services with both feet. ®