Exploring the Dynamics of Generative AI and LLMs in Snowflake Cortex

In today’s technology-driven world, Generative AI models have advanced to engage in conversations, provide answers, generate images and videos, compose narratives, and develop source code of any description. Despite their remarkable capabilities, the potential of Generative AI and LLM services often remains underestimated. Hence, companies need to embrace these cutting-edge technologies to drive innovation, boost efficiency, and extract maximum value from their data sources.

Let’s explore some of the functions built on Snowflake Cortex and how generative AI (GenAI) and large language models (LLMs) are reshaping work at a global level, helping users incorporate LLMs into analytical processes in seconds, develop GenAI-powered apps in minutes, and execute powerful workflows within hours, all with the necessary security measures.

Snowflake’s Foundational Services for Generative AI and LLM App Development

To bring generative AI on top of already governed data and build LLM-powered applications within the Snowflake boundary, two foundational services are needed.

Snowflake Cortex

Snowflake Cortex is an intelligent, fully managed service that allows organizations to quickly analyze data and build AI applications. Any user can access it to work with industry-leading AI models and large language models (LLMs), perform vector searches, and more.

Generative AI and LLM services within Snowflake Cortex, such as Meta AI’s Llama 2 model, enable enterprises to build data-driven apps regardless of their technical expertise in managing complex GPU-based infrastructure. LLM-powered features such as Document AI (currently in private preview), Copilot (currently in private preview), and Universal Search (currently in private preview) are also included in Snowflake Cortex as additional services.

Snowpark Container Services

Snowpark Container Services (public preview soon in select AWS regions) runs within a Snowflake account and gives developers the ability to deploy, manage, and scale containerized workloads, including customizing and tuning open-source LLMs, on secure Snowflake-managed infrastructure with GPU instances.

Snowflake Cortex’s Serverless Functions

Snowflake Cortex provides serverless functions that accelerate analytics and AI application development. Users can instantly access ML or LLM models built for specific tasks with just a line of SQL or Python code, alongside general-purpose models for prompt engineering and contextual learning. Because Snowflake Cortex is a fully managed service, users avoid running expensive GPU infrastructure themselves while retaining Snowflake’s governance framework and secure, managed access to their data.
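As a rough illustration of that “single line of SQL or Python,” the sketch below uses the Snowpark Python API to score sentiment over a table. The connection parameters, table, and column names are placeholders rather than anything from this article.

    # Minimal sketch, assuming the snowflake-snowpark-python package is installed
    # and that Cortex LLM functions are available in the account and region.
    from snowflake.snowpark import Session

    # Placeholder connection details; replace with your own account information.
    connection_parameters = {
        "account": "<account_identifier>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "<warehouse>",
        "database": "<database>",
        "schema": "<schema>",
    }
    session = Session.builder.configs(connection_parameters).create()

    # One line of SQL scores the sentiment of every row in a hypothetical reviews table.
    session.sql(
        "SELECT review_text, SNOWFLAKE.CORTEX.SENTIMENT(review_text) AS sentiment_score "
        "FROM customer_reviews"
    ).show()

The same call works unchanged from a Snowflake worksheet, since the function itself is plain SQL.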

The available models fall into the following categories:

LLM-based models can be used to work with unstructured data to (a brief sketch follows this list):

  • Extract answers from unstructured data (Extract Answer)
  • Detect the sentiment of text (Detect Sentiments)
  • Summarize long documents for easier consumption (Text Summary)
  • Translate text at scale as needed (Text Translate)
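For a sense of how these task-specific functions look in practice, here is a hedged sketch that calls each of them from Python inside Snowflake. The sample document, question, and language pair are invented for illustration; the function names follow Snowflake’s public Cortex documentation (EXTRACT_ANSWER, SENTIMENT, SUMMARIZE, TRANSLATE).

    # Sketch only: assumes it runs inside Snowflake (for example, a Python worksheet
    # or Streamlit in Snowflake) where an active Snowpark session already exists.
    from snowflake.snowpark.context import get_active_session

    session = get_active_session()
    doc = "The invoice dated 2023-10-01 totals 4,200 USD and is payable within 30 days."

    # Extract an answer from unstructured text
    session.sql(f"SELECT SNOWFLAKE.CORTEX.EXTRACT_ANSWER('{doc}', 'What is the invoice total?')").show()
    # Detect the sentiment of the text (score between -1 and 1)
    session.sql(f"SELECT SNOWFLAKE.CORTEX.SENTIMENT('{doc}')").show()
    # Summarize a longer document
    session.sql(f"SELECT SNOWFLAKE.CORTEX.SUMMARIZE('{doc}')").show()
    # Translate the text from English to German
    session.sql(f"SELECT SNOWFLAKE.CORTEX.TRANSLATE('{doc}', 'en', 'de')").show()

String interpolation is kept naive here for brevity; real pipelines would bind parameters or read the text from a table column.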

ML-based models can be used to work with historical data to (see the sketch after the list below):

  • Forecast
  • Detect anomalies
  • Monitor data pipelines
  • Identify dimensions of CDC and metrics
  • Categorize and classify data with patterns
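As one hedged example of the ML-powered side, the sketch below creates a forecasting model over a hypothetical daily sales table. The object and column names are invented, and the CREATE SNOWFLAKE.ML.FORECAST syntax is shown as described in Snowflake’s documentation for the forecasting function, so treat it as a sketch rather than a drop-in script.

    # Sketch only: assumes an active session inside Snowflake and a table
    # daily_sales(sale_date TIMESTAMP, revenue FLOAT) invented for this example.
    from snowflake.snowpark.context import get_active_session

    session = get_active_session()

    # Train a forecasting model on the historical data
    session.sql("""
        CREATE OR REPLACE SNOWFLAKE.ML.FORECAST revenue_model(
            INPUT_DATA => SYSTEM$REFERENCE('TABLE', 'daily_sales'),
            TIMESTAMP_COLNAME => 'sale_date',
            TARGET_COLNAME => 'revenue'
        )
    """).collect()

    # Ask the trained model for the next 14 periods
    session.sql("CALL revenue_model!FORECAST(FORECASTING_PERIODS => 14)").show()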

State-of-the-art models can be used for general-purpose use cases to (a sketch follows this list):

  • Complete text from a prompt or a keyword
  • Generate SQL from natural language text (TEXT 2 SQL)
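Below is a hedged sketch of the general-purpose COMPLETE function. The llama2-7b-chat model name corresponds to the Llama 2 model mentioned above, and the text-to-SQL prompt only approximates what Snowflake Copilot provides natively; the table described in the prompt is purely hypothetical.

    # Sketch only: assumes an active Snowpark session inside Snowflake and that
    # the llama2-7b-chat model is available through SNOWFLAKE.CORTEX.COMPLETE.
    from snowflake.snowpark.context import get_active_session

    session = get_active_session()

    # Text completion from a simple prompt
    session.sql(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE('llama2-7b-chat', "
        "'Write a one-sentence product description for a winter jacket')"
    ).show()

    # A rough text-to-SQL style prompt; Snowflake Copilot offers this as a built-in experience
    session.sql(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE('llama2-7b-chat', "
        "'Given a table orders(order_id, customer_id, amount, order_date), "
        "write SQL to get the total amount per customer for 2023')"
    ).show()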

Invoking Snowflake Cortex’s Powerful AI and Semantic Search Abilities 

 Snowflake Copilot:

An LLM-powered assistant that generates and refines SQL queries from natural language text, allowing users to refine queries through conversation and narrow in on the most appropriate result without additional setup.

Universal Search:

An LLM-powered search assistant for quickly discovering and accessing applications and data. This universal search feature enables swift navigation through tables, views, databases, schemas, Marketplace data products, and Snowflake documentation articles, streamlining information access and retrieval for users.

Document AI: 

An LLM-powered, pre-trained model with an intuitive interface that allows users to extract data from documents across a wide range of use cases. By leveraging Document AI, users can seamlessly process various document formats, including PDFs, Word documents, text files, and screenshots, to obtain accurate answers to their questions. A key advantage of this approach lies in its scalability, enabling the setup of an extraction pipeline that efficiently addresses ongoing data extraction needs.

Snowflake Cortex Support with Streamlit in the Snowflake Data Cloud

Streamlit in Snowflake (public preview) gives users the ability to develop interfaces in just a few lines of Python code without any front-end experience, helping them create LLM applications faster. In addition, it facilitates the deployment and sharing of these applications within organizations through unique URLs.

With its serverless functions and search abilities, and without the need to set up and manage complex infrastructure, Snowflake Cortex helps users build AI applications in minutes, with instant access to ML and LLM models in the Snowflake Data Cloud, with or without specialized AI expertise. By integrating these serverless functions into a chat interface built with Streamlit, users can securely develop LLM applications in minutes or hours, as in the sketch below.
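Here is a hedged sketch of a minimal Streamlit in Snowflake app that passes a user question to a Cortex serverless function. The model name and the single-turn prompt handling are illustrative assumptions, not a production chat design.

    # Sketch only: intended for Streamlit in Snowflake, where get_active_session()
    # returns the already-authenticated Snowpark session.
    import streamlit as st
    from snowflake.snowpark.context import get_active_session

    session = get_active_session()

    st.title("Cortex chat sketch")
    prompt = st.text_input("Ask a question")

    if prompt:
        # Naive quote escaping for brevity; prefer bind parameters in real apps.
        safe_prompt = prompt.replace("'", "''")
        result = session.sql(
            f"SELECT SNOWFLAKE.CORTEX.COMPLETE('llama2-7b-chat', '{safe_prompt}') AS response"
        ).collect()
        st.write(result[0]["RESPONSE"])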

About the author

Mansoor Sherif

Mansoor is a Sr. Practice Manager and an Enterprise Architect with over 19 years of experience in the IT industry, who specializes in helping organizations overcome complex challenges through innovative Big Data, Cloud technologies, Advanced Analytics, and AI solutions. As a thought leader and blogger, Mansoor shares valuable insights on emerging trends and best practices, empowering businesses to leverage cutting-edge technologies for digital transformation and long-term success. Passionate about driving impactful change, Mansoor combines deep technical expertise with strategic vision to deliver transformative solutions that meet the evolving needs of clients.
