Securely Deploy Custom Apps and Models with Snowpark Container Services, Now Generally Available
Snowpark Container Services is now generally available in Azure as well as AWS regions. Learn more here.
Since introducing Snowpark Container Services, we’ve seen overwhelming adoption across industries from customers and partners, including Landing.AI, RelationalAI, H2O.ai, SailPoint, AIR MILES, Spark NZ, and Eutelsat OneWeb. These organizations and many more are using Snowpark Container Services to deploy everything from custom front-ends and large-scale ML training and inference to open source and homegrown models, all easily and securely within Snowflake.
Today we are excited to announce the general availability (GA) of Snowpark Container Services in all AWS and Azure commercial regions. Customers can get fast access to GPU infrastructure without needing to self-procure instances or make reservations with their public cloud provider. (GPU availability may be limited in certain regions.)
And with such widespread adoption of the feature, we’re excited to announce that we’ve lowered costs by 50% across all instance types!
Interested in learning more about why we created Snowpark Container Services? Check out this blog.
Deploy custom workloads securely, without complexity
Security, simplicity and value: This is why customers are so excited about Snowpark Container Services.
First, security. Snowpark Container Services gives developers the ability to bring any containerized workload to the data they already keep secure in Snowflake — ReactJS front-ends, open source large language models (LLMs), distributed data processing pipelines, you name it. Data doesn’t need to move across a patchwork of services, which opens up security and governance risks; it can stay within Snowflake while you analyze it, transform it and build with it.
Second, simplicity. Stitching together various container registries, management services, compute services and observability tools is complicated. It creates maintenance overhead for developers and adds complexity to architectures. Snowpark Container Services makes it simple. It is a fully managed service that provides a single, integrated experience.
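To make that single experience concrete, here is a minimal sketch of how a containerized app might be deployed end to end: an image repository to hold the container image, a compute pool to run it, and a service created from a short specification, all driven with Snowflake SQL through the snowflake-connector-python package. The object names and image path below are hypothetical placeholders, and your spec will vary with your container.

```python
# Minimal sketch: deploying a containerized workload with Snowpark Container Services.
# All object names and the image path are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    role="SYSADMIN",
    database="TUTORIAL_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# A registry for container images, managed inside Snowflake.
cur.execute("CREATE IMAGE REPOSITORY IF NOT EXISTS my_repo")

# Dedicated compute for the service; instance families range from small CPU to GPU.
cur.execute("""
    CREATE COMPUTE POOL IF NOT EXISTS my_pool
      MIN_NODES = 1
      MAX_NODES = 1
      INSTANCE_FAMILY = CPU_X64_XS
""")

# The service runs an image previously pushed to the repository
# (for example, with `docker push` against the repository URL).
cur.execute("""
    CREATE SERVICE IF NOT EXISTS echo_service
      IN COMPUTE POOL my_pool
      FROM SPECIFICATION $$
        spec:
          containers:
          - name: echo
            image: /tutorial_db/public/my_repo/echo_app:latest
          endpoints:
          - name: api
            port: 8080
            public: true
      $$
""")

# Once the service is ready, its public endpoint URL can be retrieved.
cur.execute("SHOW ENDPOINTS IN SERVICE echo_service")
print(cur.fetchall())
```

Once the service reports it is ready, the public endpoint returned by SHOW ENDPOINTS can be opened in a browser or called from client code, with no separate registry, orchestrator or load balancer to stand up.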
Third, value. The simplicity of a fully managed service reduces overhead and operational burden, maximizing the value you get out of the service. Additionally, we’ve added budget controls that enable you to monitor and manage resources cost effectively. And as we mentioned previously, we’ve reduced costs by 50% for all instance types: See Table 1(b) in the Snowflake rate card.
What’s new in GA?
In addition to reaching general availability, we’ve worked closely with design partners throughout the preview to advance Snowpark Container Services across the following key areas.
- Improved security and governance: We enhanced control over egress, ingress and networking (a sketch of configuring controlled egress follows this list). Register here to learn more in our on-demand security deep dive.
- Cost reduction: With overwhelming adoption of Snowpark Container Services, Snowflake is lowering costs by 50% for all instance types. See Table 1(b) in the Snowflake rate card.
- Increased storage options: We added more diverse storage solutions, including local volumes, memory, Snowflake stages and configurable block storage, to support additional use cases, such as deploying high-performance LLMs and low-latency applications.
- More diverse instance types: We introduced high-memory instances and dynamic GPU allocation for intensive workloads.
- More flexible, GPU-powered compute in Snowflake Notebooks: Container Runtime (currently in public preview) provides seamless access to distributed processing with CPU and GPU options, which is ideal for resource-intensive machine learning tasks in Snowflake ML, such as deep learning. Users can get started with the Container Runtime directly from Snowflake Notebooks (currently in public preview) with optimized data loading from Snowflake, automatic lineage capture and Model Registry integration.
- Observability with Snowflake Trail: With Snowflake Trail, you can get a comprehensive set of telemetry signals, including metrics, logs and traces, all within Snowflake. Built with OpenTelemetry standards, schema and open ecosystem integrations in mind, Snowflake telemetry and notification capabilities integrate with some of the most popular developer tools, including Datadog, Grafana, Metaplane, Monte Carlo, PagerDuty and Slack.
- Streamlined DevOps: With GA, Snowpark Container Services supports programmatic ingress, spec templating and integration of jobs with services, which help automate software development and IT operations (a sketch of running a containerized job appears below).
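As a sketch of the egress controls mentioned in the first item above (object names and the host are hypothetical, and an appropriately privileged role is assumed), a network rule can describe the only external destination a service may reach, and an external access integration bundles that rule so the service can opt into it:

```python
# Sketch: granting a service controlled egress to a specific external host.
# Object names and the host are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    database="TUTORIAL_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Network rule describing the only destination the service may reach.
cur.execute("""
    CREATE OR REPLACE NETWORK RULE allow_api_host
      MODE = EGRESS
      TYPE = HOST_PORT
      VALUE_LIST = ('api.example.com:443')
""")

# External access integration bundling the rule for use by services.
cur.execute("""
    CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION api_access_integration
      ALLOWED_NETWORK_RULES = (allow_api_host)
      ENABLED = TRUE
""")

# The integration is then referenced when the service is created, for example
# via the EXTERNAL_ACCESS_INTEGRATIONS parameter of CREATE SERVICE.
```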
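And as a sketch of the job capability mentioned in the last item (again with hypothetical names), a containerized batch job can be run to completion on an existing compute pool:

```python
# Sketch: running a containerized batch job on an existing compute pool.
# The job name, pool and image path are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    database="TUTORIAL_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# EXECUTE JOB SERVICE runs the container to completion and returns when it finishes.
cur.execute("""
    EXECUTE JOB SERVICE
      IN COMPUTE POOL my_pool
      NAME = nightly_etl_job
      FROM SPECIFICATION $$
        spec:
          containers:
          - name: etl
            image: /tutorial_db/public/my_repo/etl_app:latest
      $$
""")
```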
Get started with Snowpark Container Services
Here are a few resources to help you get started:
- Deploy your first container in Snowflake with this quickstart (Note: Snowpark Container Services is not available for free trial accounts).
- Get started building models in Snowflake ML with Snowflake Notebooks on Container Runtime with this quickstart.
- Get notified when new regions become available through GitHub.
- Learn more about Snowpark Container Services in our documentation.
- Watch the tech talk to learn how Landing.AI is effortlessly and securely deploying large vision models in Snowflake.
- Check out this YouTube playlist full of demos of developers using Snowpark Container Services for everything from call center analytics and drug discovery to running Doom within Snowflake!
We look forward to seeing all the cool things you build in the AI Data Cloud with Snowpark Container Services.