How to Leverage Snowflake for Machine Learning

Learn how to use Snowpark for seamless ML model training and deployment within Snowflake, as well as how to integrate with external ML tools like Amazon SageMaker and Google Cloud AI Platform.

Businesses are constantly looking for ways to extract insights from their vast data repositories. Snowflake, a cloud-based data warehousing platform, has been making waves in the industry for its scalability and performance. But did you know it's also a powerhouse when it comes to machine learning (ML)? Let's see how you can leverage Snowflake's support for ML using Snowpark and external ML integrations.

The Snowflake ML Ecosystem

Snowflake's approach to ML is twofold:

  1. Snowpark: A developer framework that lets you write code in your preferred language (Python, Java, or Scala) and execute it directly within Snowflake.
  2. External ML Integrations: Snowflake's ability to connect with popular ML platforms and tools.

This dual approach gives data scientists and ML engineers the flexibility to choose the best tool for their specific needs. Whether you're a Python aficionado or prefer working with specialized ML platforms, Snowflake's got you covered.

Getting Started with Snowpark for ML

Snowpark is Snowflake's answer to bringing ML workloads closer to the data. It's like having a Swiss Army knife for data processing and ML tasks, all within the Snowflake environment.

Setting Up Your Environment

First things first, let's set up our Python environment:


pip install snowflake-snowpark-python pandas scikit-learn

Now, let's connect to Snowflake:


from snowflake.snowpark import Session

# Replace with your Snowflake account details
connection_parameters = {
    "account": "your_account_locator",
    "user": "your_username",
    "password": "your_password",
    "role": "your_role",
    "warehouse": "your_warehouse",
    "database": "your_database",
    "schema": "your_schema"
}

session = Session.builder.configs(connection_parameters).create()
print(session.sql("SELECT current_warehouse(), current_database(), current_schema()").collect())

If everything's set up correctly, you'll see your current warehouse, database, and schema printed out.

Data Preparation with Snowpark

Let's say we're working on a customer churn prediction model. First, we'll need to prepare our data:


# Assuming we have a table named 'customer_data'
df = session.table("customer_data")

# Basic data exploration
print(df.count())
print(df.schema)

# Feature engineering
df = df.with_column("tenure_months", df["tenure_days"] / 30)
df = df.with_column("total_charges", df["monthly_charges"] * df["tenure_months"])

# Select relevant features
features = ["tenure_months", "total_charges", "contract_type", "internet_service", "payment_method"]
target = "churned"

model_data = df.select(features + [target])

# Split the data
train_data, test_data = model_data.random_split([0.8, 0.2], seed=42)

Training a Model with Snowpark

Now, let's train a simple logistic regression model:


from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
import pandas as pd

# Convert Snowpark DataFrames to Pandas for scikit-learn
train_pdf = train_data.to_pandas()
test_pdf = test_data.to_pandas()

# Prepare features and target
X_train = train_pdf[features]
y_train = train_pdf[target]
X_test = test_pdf[features]
y_test = test_pdf[target]

# One-hot encode categorical variables, then align the test columns to the
# training columns so both frames share the same dummy columns
X_train = pd.get_dummies(X_train)
X_test = pd.get_dummies(X_test).reindex(columns=X_train.columns, fill_value=0)

# Scale numerical features
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)

# Train the model
model = LogisticRegression()
model.fit(X_train_scaled, y_train)

# Evaluate the model
accuracy = model.score(X_test_scaled, y_test)
print(f"Model accuracy: {accuracy:.2f}")

Deploying the Model in Snowflake

Now that we've trained our model, let's deploy it back to Snowflake as a Python UDF. Rather than serializing the model by hand, we can let Snowpark pickle the fitted model, scaler, and training column order that the function captures from its enclosing scope, and ship them to Snowflake along with the UDF:


# Remember the exact one-hot encoded column order used at training time
# so inference rows can be aligned to it
feature_columns = list(X_train.columns)

def predict_churn(tenure_months: float, total_charges: float,
                  contract_type: str, internet_service: str,
                  payment_method: str) -> str:
    import pandas as pd

    # Build a one-row frame from the inputs, one-hot encode it, and align
    # it to the training columns (unseen dummy columns are filled with 0)
    row = pd.DataFrame([{
        "tenure_months": tenure_months,
        "total_charges": total_charges,
        "contract_type": contract_type,
        "internet_service": internet_service,
        "payment_method": payment_method,
    }])
    row = pd.get_dummies(row).reindex(columns=feature_columns, fill_value=0)

    # Reuse the scaler fitted on the training data; never refit at inference
    row_scaled = scaler.transform(row)
    prediction = model.predict(row_scaled)[0]
    return "Churn" if prediction == 1 else "No Churn"

# Register the function as a permanent UDF. Snowpark pickles the captured
# model, scaler, and feature_columns and ships them with the function.
# This assumes a stage named @my_stage already exists, and that your local
# scikit-learn and pandas versions match the packages available in Snowflake.
session.udf.register(
    func=predict_churn,
    name="predict_churn",
    stage_location="@my_stage",
    packages=["scikit-learn", "pandas"],
    is_permanent=True,
    replace=True,
)

Now we can use our model directly in Snowflake SQL queries. Since tenure_months and total_charges are engineered features rather than columns of customer_data, we compute them inline:


SELECT 
    customer_id,
    predict_churn(
        tenure_days / 30,
        monthly_charges * tenure_days / 30,
        contract_type,
        internet_service,
        payment_method
    ) AS churn_prediction
FROM customer_data
LIMIT 10;

External ML Integrations

While Snowpark is powerful, sometimes you'll want to leverage a specialized ML platform. Snowflake integrates cleanly with several external ML tools; the following examples show how to connect it to two of them.

Snowflake + SageMaker

Amazon SageMaker is a fully managed ML platform. Here's how you can integrate it with Snowflake:

  1. Data Export: Use Snowflake's COPY command to export data to S3 (this assumes a storage integration or AWS credentials are configured for the bucket).

COPY INTO 's3://your-bucket/data/'
FROM (SELECT * FROM customer_data)
FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' COMPRESSION = 'GZIP')
HEADER = TRUE;

  2. Model Training: Use SageMaker to train your model on the exported data.
  3. Model Deployment: Deploy your model as a SageMaker endpoint.
  4. Integration: Use Snowflake's external function to call the SageMaker endpoint (after setting up the API integration sketched below).
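
Before creating the external function, you'll need an API integration object. On AWS, external functions call out through a proxy service such as Amazon API Gateway rather than hitting the SageMaker runtime URL directly. A minimal sketch, where the role ARN and gateway URL are placeholders for your own values:

CREATE OR REPLACE API INTEGRATION aws_sagemaker_api_integration
    API_PROVIDER = aws_api_gateway
    API_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-external-fn-role'
    API_ALLOWED_PREFIXES = ('https://abc123.execute-api.us-west-2.amazonaws.com/prod/')
    ENABLED = TRUE;

With the integration in place, the external function itself looks like this: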

CREATE OR REPLACE EXTERNAL FUNCTION
    predict_churn_sagemaker(tenure_months float, total_charges float, contract_type varchar, internet_service varchar, payment_method varchar)
    RETURNS VARIANT
    API_INTEGRATION = aws_sagemaker_api_integration
    -- The API Gateway resource that proxies requests to your SageMaker endpoint
    AS 'https://abc123.execute-api.us-west-2.amazonaws.com/prod/predict-churn';

Now you can use this function in your SQL queries:


SELECT 
    customer_id,
    predict_churn_sagemaker(
        tenure_days / 30,
        monthly_charges * tenure_days / 30,
        contract_type,
        internet_service,
        payment_method
    ) AS churn_prediction
FROM customer_data
LIMIT 10;

Snowflake + Google Cloud AI Platform

Google Cloud AI Platform is another robust ML platform. Here's a quick integration guide:

  1. Data Export: Use Snowflake's COPY command to export data to Google Cloud Storage (again assuming a storage integration is configured for the bucket).

COPY INTO 'gcs://your-bucket/data/'
FROM (SELECT * FROM customer_data)
FILE_FORMAT = (TYPE = 'CSV' FIELD_DELIMITER = ',' COMPRESSION = 'GZIP')
HEADER = TRUE;

  2. Model Training: Use AI Platform to train your model.
  3. Model Deployment: Deploy your model as an AI Platform endpoint.
  4. Integration: Use Snowflake's external function to call the AI Platform endpoint (via the API integration sketched below).
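
As on AWS, the external function needs an API integration first. For Google Cloud, Snowflake routes external function calls through Google Cloud API Gateway via the google_api_gateway provider. A minimal sketch, where the audience value and gateway URL are placeholders:

CREATE OR REPLACE API INTEGRATION gcp_ai_platform_api_integration
    API_PROVIDER = google_api_gateway
    GOOGLE_AUDIENCE = 'your-managed-audience-value'
    API_ALLOWED_PREFIXES = ('https://your-gateway-abc123.uc.gateway.dev/')
    ENABLED = TRUE;

The external function then references this integration: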

CREATE OR REPLACE EXTERNAL FUNCTION
    predict_churn_ai_platform(tenure_months float, total_charges float, contract_type varchar, internet_service varchar, payment_method varchar)
    RETURNS VARIANT
    API_INTEGRATION = gcp_ai_platform_api_integration
    -- The API Gateway route that proxies requests to your AI Platform model
    AS 'https://your-gateway-abc123.uc.gateway.dev/predict-churn';
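
As with the SageMaker function, you can now call it directly in SQL, computing the engineered features inline:

SELECT 
    customer_id,
    predict_churn_ai_platform(
        tenure_days / 30,
        monthly_charges * tenure_days / 30,
        contract_type,
        internet_service,
        payment_method
    ) AS churn_prediction
FROM customer_data
LIMIT 10;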

Best Practices and Considerations

When leveraging Snowflake for ML, keep these tips in mind:

  1. Data Privacy: Snowflake's data sharing and secure views can help maintain data privacy when working with sensitive information.
  2. Performance Optimization: Use Snowflake's result caching and materialized views for frequently accessed data.
  3. Cost Management: Monitor your credit usage, especially when running compute-intensive ML workloads.
  4. Version Control: Use Snowflake's Time Travel feature to version your data and track changes (see the sketch after this list).
  5. Scalability: Take advantage of Snowflake's auto-scaling to handle varying workloads efficiently.
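
As an example of the versioning tip, Time Travel lets you pin a training run to the exact data it saw, which doubles as lightweight dataset versioning. A quick sketch (the table names are illustrative, and the offset must fall within your retention window):

-- Query customer_data as it existed one hour ago
SELECT * FROM customer_data AT(OFFSET => -3600);

-- Freeze that snapshot as a named training set using zero-copy cloning
CREATE TABLE customer_data_train_v1 CLONE customer_data AT(OFFSET => -3600);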

Real-world Impact

Let's look at a couple of case studies where Snowflake's ML capabilities made a difference:

  1. E-commerce Giant: A large online retailer used Snowflake and Snowpark to build a real-time product recommendation engine. By processing user behavior data directly in Snowflake, they reduced their model update time from hours to minutes, leading to a 15% increase in click-through rates.
  2. Financial Services Firm: A multinational bank leveraged Snowflake's integration with SageMaker to develop a fraud detection model. By combining transactional data in Snowflake with ML capabilities in SageMaker, they were able to reduce false positives by 30% while maintaining high fraud detection rates.

The Road Ahead

Snowflake continues to evolve its ML capabilities. Recent announcements hint at native support for more ML algorithms and tighter integrations with popular ML frameworks. As the line between data warehousing and ML platforms blurs, we can expect even more powerful and user-friendly ML features in Snowflake.

Wrapping Up

Snowflake's support for ML through Snowpark and external integrations offers a flexible and powerful approach to building ML applications. Whether you're a data scientist who loves to code or an analyst who prefers using specialized ML platforms, Snowflake provides the tools you need to turn your data into insights.
