Learn to track trial-to-paid conversion rates in product-led growth SaaS. Complete guide with SQL examples, cohort analysis, and actionable insights for improving PLG conversion rates.
Trial-to-paid conversion is the most critical metric for product-led growth (PLG) SaaS companies. Unlike traditional sales-led models where conversion happens through human touch, PLG relies entirely on the product experience to drive paid conversions.
Why trial-to-paid conversion matters for PLG: It directly measures how well your product delivers value during the trial period. A healthy trial conversion rate (typically 15-25% for B2B SaaS) indicates strong product-market fit, effective onboarding, and clear value demonstration.
This guide shows you how to properly measure trial-to-paid conversion using cohort analysis, track the activation events that matter most, and build dashboards that help product and growth teams optimize the trial experience.
1. Connect your user database, subscription system, and product analytics platform to Airbook.
2. Identify the key events that define your trial period and paid conversion.
3. Create views that group trial users by sign-up period to track conversion rates over time.
4. Build visualizations that help different teams understand and optimize trial conversion.
Here's a comprehensive SQL query to calculate trial-to-paid conversion rates with cohort analysis. This query assumes you have user, subscription, and event tracking tables.
-- Trial-to-Paid Conversion Analysis with Cohort Breakdown
WITH trial_cohorts AS (
SELECT
u.user_id,
u.trial_start_date,
u.acquisition_channel,
u.utm_source,
u.utm_campaign,
DATE_TRUNC('week', u.trial_start_date) AS cohort_week,
DATE_TRUNC('month', u.trial_start_date) AS cohort_month,
-- Calculate trial end date (typically 14 days)
u.trial_start_date + INTERVAL '14 days' AS trial_end_date
FROM users u
WHERE u.trial_start_date IS NOT NULL
AND u.trial_start_date >= '2024-01-01'
),
activation_events AS (
SELECT
e.user_id,
COUNT(DISTINCT CASE WHEN e.event_name = 'project_created' THEN e.event_id END) AS projects_created,
COUNT(DISTINCT CASE WHEN e.event_name = 'invite_sent' THEN e.event_id END) AS invites_sent,
COUNT(DISTINCT CASE WHEN e.event_name = 'integration_connected' THEN e.event_id END) AS integrations_connected,
COUNT(DISTINCT CASE WHEN e.event_name = 'first_dashboard_view' THEN e.event_id END) AS dashboards_viewed,
MIN(CASE WHEN e.event_name = 'project_created' THEN e.event_timestamp END) AS first_project_created_at,
COUNT(DISTINCT DATE(e.event_timestamp)) AS active_days_in_trial
FROM events e
JOIN trial_cohorts tc ON e.user_id = tc.user_id
WHERE e.event_timestamp BETWEEN tc.trial_start_date AND tc.trial_end_date
AND e.event_name IN ('project_created', 'invite_sent', 'integration_connected', 'first_dashboard_view')
GROUP BY e.user_id
),
conversions AS (
-- Earliest active subscription per user (Postgres DISTINCT ON keeps the first row per user_id)
SELECT DISTINCT ON (s.user_id)
s.user_id,
s.subscription_start_date AS first_payment_date,
s.plan_type AS first_plan_type,
s.billing_cycle AS first_billing_cycle,
s.mrr AS initial_mrr
FROM subscriptions s
WHERE s.subscription_status = 'active'
AND s.subscription_start_date IS NOT NULL
ORDER BY s.user_id, s.subscription_start_date
),
cohort_analysis AS (
SELECT
tc.user_id,
tc.cohort_week,
tc.cohort_month,
tc.acquisition_channel,
tc.utm_source,
tc.utm_campaign,
tc.trial_start_date,
tc.trial_end_date,
-- Activation metrics
COALESCE(ae.projects_created, 0) AS projects_created,
COALESCE(ae.invites_sent, 0) AS invites_sent,
COALESCE(ae.integrations_connected, 0) AS integrations_connected,
COALESCE(ae.dashboards_viewed, 0) AS dashboards_viewed,
COALESCE(ae.active_days_in_trial, 0) AS active_days_in_trial,
ae.first_project_created_at,
-- Conversion metrics
c.first_payment_date,
c.first_plan_type,
c.first_billing_cycle,
c.initial_mrr,
-- Calculate conversion status and timing
CASE
WHEN c.first_payment_date IS NOT NULL THEN 1
ELSE 0
END AS converted_to_paid,
CASE
WHEN c.first_payment_date <= tc.trial_end_date THEN 1
ELSE 0
END AS converted_during_trial,
CASE
WHEN c.first_payment_date > tc.trial_end_date THEN 1
ELSE 0
END AS converted_after_trial,
-- Time to conversion in days
CASE
WHEN c.first_payment_date IS NOT NULL
THEN (c.first_payment_date::date - tc.trial_start_date::date)
ELSE NULL
END AS days_to_conversion,
-- Activation score (weighted combination of key events)
(COALESCE(ae.projects_created, 0) * 3 +
COALESCE(ae.invites_sent, 0) * 2 +
COALESCE(ae.integrations_connected, 0) * 4 +
COALESCE(ae.dashboards_viewed, 0) * 1) AS activation_score
FROM trial_cohorts tc
LEFT JOIN activation_events ae ON tc.user_id = ae.user_id
LEFT JOIN conversions c ON tc.user_id = c.user_id
)
-- Final aggregated results
SELECT
cohort_month,
acquisition_channel,
-- Cohort size and basic conversion
COUNT(*) AS trial_users,
SUM(converted_to_paid) AS paid_conversions,
ROUND(100.0 * SUM(converted_to_paid) / COUNT(*), 2) AS conversion_rate_pct,
-- Conversion timing breakdown
SUM(converted_during_trial) AS converted_during_trial,
SUM(converted_after_trial) AS converted_after_trial,
ROUND(AVG(days_to_conversion), 1) AS avg_days_to_conversion,
-- Activation metrics for converted vs non-converted
ROUND(AVG(CASE WHEN converted_to_paid = 1 THEN projects_created END), 2) AS avg_projects_converted_users,
ROUND(AVG(CASE WHEN converted_to_paid = 0 THEN projects_created END), 2) AS avg_projects_non_converted_users,
ROUND(AVG(CASE WHEN converted_to_paid = 1 THEN active_days_in_trial END), 1) AS avg_active_days_converted,
ROUND(AVG(CASE WHEN converted_to_paid = 0 THEN active_days_in_trial END), 1) AS avg_active_days_non_converted,
ROUND(AVG(CASE WHEN converted_to_paid = 1 THEN activation_score END), 1) AS avg_activation_score_converted,
ROUND(AVG(CASE WHEN converted_to_paid = 0 THEN activation_score END), 1) AS avg_activation_score_non_converted,
-- Revenue metrics
SUM(initial_mrr) AS total_initial_mrr,
ROUND(AVG(initial_mrr), 2) AS avg_mrr_per_conversion,
-- Plan mix
ROUND(100.0 * SUM(CASE WHEN first_billing_cycle = 'annual' THEN 1 ELSE 0 END) / NULLIF(SUM(converted_to_paid), 0), 1) AS annual_plan_pct
FROM cohort_analysis
WHERE cohort_month >= '2024-01-01'
GROUP BY cohort_month, acquisition_channel
ORDER BY cohort_month DESC, conversion_rate_pct DESC;
Cohort analysis is essential for PLG SaaS because conversion rates vary significantly based on when users sign up, how they discovered you, and what features they use during the trial.
Activation events are user actions during the trial that strongly predict paid conversion. Identifying and optimizing these events is crucial for improving your trial-to-paid rate.
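Before committing to a specific set of activation events, it helps to measure how much each candidate event actually separates converters from non-converters. A minimal sketch, assuming you have saved the cohort_analysis CTE from the main query above as a view named cohort_analysis (the view name is an assumption; point it at wherever you materialize that output):

-- Conversion rate with vs. without each candidate activation event (view name cohort_analysis is assumed)
SELECT
  ROUND(100.0 * SUM(CASE WHEN projects_created > 0 THEN converted_to_paid ELSE 0 END)
    / NULLIF(SUM(CASE WHEN projects_created > 0 THEN 1 ELSE 0 END), 0), 2) AS conv_rate_with_project,
  ROUND(100.0 * SUM(CASE WHEN projects_created = 0 THEN converted_to_paid ELSE 0 END)
    / NULLIF(SUM(CASE WHEN projects_created = 0 THEN 1 ELSE 0 END), 0), 2) AS conv_rate_without_project,
  ROUND(100.0 * SUM(CASE WHEN invites_sent > 0 THEN converted_to_paid ELSE 0 END)
    / NULLIF(SUM(CASE WHEN invites_sent > 0 THEN 1 ELSE 0 END), 0), 2) AS conv_rate_with_invite,
  ROUND(100.0 * SUM(CASE WHEN invites_sent = 0 THEN converted_to_paid ELSE 0 END)
    / NULLIF(SUM(CASE WHEN invites_sent = 0 THEN 1 ELSE 0 END), 0), 2) AS conv_rate_without_invite
FROM cohort_analysis;

Events with a large gap between the "with" and "without" rates are your strongest activation candidates.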
Create a weighted score based on user actions during the trial to predict conversion likelihood:
-- Activation Score Calculation
WITH scored AS (
  SELECT
    user_id,
    trial_start_date,
    -- Weight events by conversion impact
    (integrations_connected * 4 +  -- Highest impact
     team_invites_sent * 3 +       -- High impact
     projects_created * 3 +        -- High impact
     feature_adoptions * 2 +       -- Medium impact
     help_articles_viewed * 1 +    -- Lower impact
     active_session_days * 1) AS activation_score
  FROM user_activation_summary
  WHERE trial_start_date >= CURRENT_DATE - INTERVAL '30 days'
)
SELECT
  user_id,
  trial_start_date,
  activation_score,
  -- Predict conversion probability (the score is computed in the CTE above so the alias can be referenced here)
  CASE
    WHEN activation_score >= 15 THEN 'High (>40%)'
    WHEN activation_score >= 8 THEN 'Medium (20-40%)'
    WHEN activation_score >= 3 THEN 'Low (5-20%)'
    ELSE 'Very Low (<5%)'
  END AS conversion_likelihood
FROM scored;
Effective visualization of trial-to-paid conversion helps different teams understand trends, identify opportunities, and make data-driven decisions to optimize the PLG funnel.
Example: weekly trial-to-paid conversion rate by acquisition channel.

| Cohort Week | Organic | Paid Ads | Referral | Product Hunt |
|---|---|---|---|---|
| Week 48 | 24.2% | 18.7% | 31.4% | 12.1% |
| Week 49 | 22.8% | 19.3% | 28.9% | 11.7% |
| Week 50 | 25.1% | 21.4% | 33.2% | 15.8% |
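A table like this can be produced by pivoting weekly cohort conversion rates by channel. A minimal sketch, again assuming the cohort_analysis output from the main query is available as a view and that acquisition_channel holds values such as 'organic', 'paid_ads', 'referral', and 'product_hunt' (the exact channel values are assumptions):

-- Weekly trial-to-paid conversion rate pivoted by acquisition channel
SELECT
  cohort_week,
  ROUND(100.0 * SUM(converted_to_paid) FILTER (WHERE acquisition_channel = 'organic')
    / NULLIF(COUNT(*) FILTER (WHERE acquisition_channel = 'organic'), 0), 1) AS organic_pct,
  ROUND(100.0 * SUM(converted_to_paid) FILTER (WHERE acquisition_channel = 'paid_ads')
    / NULLIF(COUNT(*) FILTER (WHERE acquisition_channel = 'paid_ads'), 0), 1) AS paid_ads_pct,
  ROUND(100.0 * SUM(converted_to_paid) FILTER (WHERE acquisition_channel = 'referral')
    / NULLIF(COUNT(*) FILTER (WHERE acquisition_channel = 'referral'), 0), 1) AS referral_pct,
  ROUND(100.0 * SUM(converted_to_paid) FILTER (WHERE acquisition_channel = 'product_hunt')
    / NULLIF(COUNT(*) FILTER (WHERE acquisition_channel = 'product_hunt'), 0), 1) AS product_hunt_pct
FROM cohort_analysis
GROUP BY cohort_week
ORDER BY cohort_week DESC;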
Understanding what your trial-to-paid conversion data means is crucial for making the right product and growth decisions. Here's how to interpret common patterns and take action.
Benchmarks (using the typical 15-25% range for B2B PLG SaaS):
Below 15%: below average - immediate action needed
15-25%: average - opportunity for optimization
Above 25%: excellent - scale what's working
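If you want this benchmarking directly in a dashboard, a simple CASE over monthly conversion rates works. A sketch, assuming a hypothetical monthly_conversion view with cohort_month and conversion_rate_pct columns (for example, the output of the main query above aggregated across channels):

-- Bucket each monthly cohort into the benchmark tiers above (monthly_conversion view is assumed)
SELECT
  cohort_month,
  conversion_rate_pct,
  CASE
    WHEN conversion_rate_pct >= 25 THEN 'Excellent - scale what is working'
    WHEN conversion_rate_pct >= 15 THEN 'Average - opportunity for optimization'
    ELSE 'Below average - immediate action needed'
  END AS benchmark_tier
FROM monthly_conversion
ORDER BY cohort_month DESC;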
Pattern: Conversion rates are trending up.
What it means: Product improvements, better targeting, or improving product-market fit are working.
Action: Identify what changed and amplify those improvements. Document learnings for future iterations.
Pattern: Conversion rates are declining.
What it means: Product changes, increased competition, or targeting issues may be causing problems.
Action: Immediately review recent product changes, check competitor activity, and analyze cohort performance by acquisition channel.
Pattern: A large share of conversions happen after the trial ends.
What it means: Users need more time to see value or understand the product.
Action: Improve onboarding, reduce time-to-first-value, or consider extending the trial period for complex products.
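To check whether this pattern applies to your product, measure what share of conversions land after the trial end date and how long converters typically take. A minimal sketch against the cohort_analysis output from the main query (view name assumed):

-- Share of conversions that happen after the trial ends, plus median time to convert
SELECT
  cohort_month,
  COUNT(*) AS paid_conversions,
  ROUND(100.0 * SUM(converted_after_trial) / COUNT(*), 1) AS pct_converted_after_trial,
  PERCENTILE_CONT(0.5) WITHIN GROUP (ORDER BY days_to_conversion) AS median_days_to_conversion
FROM cohort_analysis
WHERE converted_to_paid = 1
GROUP BY cohort_month
ORDER BY cohort_month DESC;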
Pattern: Conversion rates vary widely by acquisition channel.
What it means: Different acquisition channels bring users with varying intent and fit.
Action: Focus marketing spend on high-converting channels and optimize messaging for underperforming ones.
Trial-to-paid conversion is a critical metric for PLG companies, but different teams need different views and update frequencies to be effective.
Product teams: focus on user experience and activation.
Growth and marketing teams: focus on acquisition and conversion optimization.
Leadership: focus on business impact and strategic decisions.
| Metric | Daily | Weekly | Monthly | Quarterly |
|---|---|---|---|---|
| Overall Conversion Rate | ✅ | ✅ | ✅ | ✅ |
| Channel Performance | ✅ | ✅ | ✅ | - |
| Activation Events | ✅ | ✅ | - | - |
| Cohort Analysis | - | ✅ | ✅ | ✅ |
| Competitive Benchmarking | - | - | - | ✅ |
Common pitfalls that can skew your trial-to-paid conversion analysis and lead to poor decisions. Avoid these mistakes to ensure accurate measurement and effective optimization.
The Problem: Using account creation date instead of actual trial activation date.
The Fix: Track when users actually start using the product, not just when they sign up. Some users delay activation by days or weeks.
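One way to do this is to derive an activity-based trial start from your event stream instead of the sign-up timestamp. A sketch using the same users and events tables as the main query; the created_at column on users is an assumption:

-- Activity-based trial start vs. account creation date (users.created_at is an assumed column)
SELECT
  u.user_id,
  u.created_at AS account_created_at,
  MIN(e.event_timestamp) AS activity_based_trial_start,
  -- Gap between signing up and first meaningful product usage
  MIN(e.event_timestamp) - u.created_at AS signup_to_activation_gap
FROM users u
LEFT JOIN events e
  ON e.user_id = u.user_id
  AND e.event_name IN ('project_created', 'integration_connected', 'first_dashboard_view')
GROUP BY u.user_id, u.created_at;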
The Problem: Mixing free users, trial users, and direct purchases in conversion calculations.
The Fix: Clearly define what constitutes a "trial user" and filter your data accordingly. Only include users who went through your trial flow.
The Problem: Trial periods calculated incorrectly due to UTC vs. user time zones.
The Fix: Standardize all timestamps to UTC and clearly define trial period calculation logic.
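In Postgres-style warehouses, this usually means storing event timestamps with a time zone and converting explicitly whenever you need a calendar date. A small sketch, assuming event_timestamp is stored as timestamptz:

-- Normalize event timestamps to UTC before computing trial windows
SELECT
  user_id,
  event_timestamp AT TIME ZONE 'UTC' AS event_time_utc,
  DATE(event_timestamp AT TIME ZONE 'UTC') AS event_date_utc
FROM events;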
The Problem: Missing important insights by not segmenting data by channel, time, or user characteristics.
The Fix: Always break down conversion rates by key dimensions: acquisition channel, time period, user segments, and activation behavior.
The Problem: Comparing conversion rates across different trial lengths or types without normalization.
The Fix: Segment analysis by trial type (14-day, 30-day, unlimited) and normalize time periods when possible.
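A sketch of that segmentation, assuming a hypothetical trial_length_days column on the users table (14, 30, or NULL for unlimited trials) and the cohort_analysis view from the main query:

-- Compare conversion rates only within the same trial type (trial_length_days is an assumed column)
SELECT
  COALESCE(u.trial_length_days::text, 'unlimited') AS trial_type,
  COUNT(*) AS trial_users,
  ROUND(100.0 * SUM(ca.converted_to_paid) / COUNT(*), 2) AS conversion_rate_pct
FROM cohort_analysis ca
JOIN users u ON u.user_id = ca.user_id
GROUP BY COALESCE(u.trial_length_days::text, 'unlimited')
ORDER BY conversion_rate_pct DESC;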
The Problem: Not tracking how conversion rates change over time for similar user groups.
The Fix: Implement proper cohort analysis to track how product changes affect new vs. existing trial user behavior.
The Problem: Focusing solely on conversion rate without considering customer quality or lifetime value.
The Fix: Balance conversion rate optimization with customer quality metrics like retention, expansion, and LTV.
The Problem: Implementing trial experience changes without A/B testing or measuring impact.
The Fix: Always test trial optimization changes against a proper control group and check for statistical significance before rolling them out.
The Problem: Reducing trial length to increase urgency without understanding user adoption patterns.
The Fix: Analyze time-to-activation and time-to-conversion patterns before adjusting trial periods. Some products need longer evaluation times.
The Problem: Different teams using different definitions of "conversion" leading to conflicting reports.
The Fix: Document clear definitions: Does conversion mean first payment, subscription activation, or plan upgrade? Ensure consistency across all reports.
The Problem: Reporting headline conversion rates without context or actionable breakdowns.
The Fix: Include context like historical trends, channel breakdowns, and specific recommendations for improvement in every report.
The Problem: Reporting conversion rates before allowing enough time for users to convert, creating artificially low numbers.
The Fix: Wait at least trial period + 7 days before reporting final conversion rates for a cohort. Use predictive models for real-time estimates.
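The simplest guard is to exclude cohorts that have not yet had the full conversion window. A sketch for a 14-day trial plus a 7-day buffer, applied to the cohort_analysis output from the main query (view name assumed):

-- Only report cohorts old enough for the full 21-day conversion window
SELECT
  cohort_week,
  COUNT(*) AS trial_users,
  ROUND(100.0 * SUM(converted_to_paid) / COUNT(*), 2) AS conversion_rate_pct
FROM cohort_analysis
WHERE trial_start_date <= CURRENT_DATE - INTERVAL '21 days'
GROUP BY cohort_week
ORDER BY cohort_week DESC;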
Set up comprehensive trial conversion tracking with cohort analysis and activation scoring in minutes. Build dashboards that help your product team optimize the trial experience for maximum conversion.
What is a good trial-to-paid conversion rate for PLG SaaS?
Answer: A good trial-to-paid conversion rate for B2B PLG SaaS is typically 15-25%. Consumer SaaS often sees lower rates (5-15%) due to different user intent and price sensitivity.
How long should I wait before reporting trial conversion rates?
Answer: Wait at least your trial period length plus 7 additional days before reporting final conversion rates. For 14-day trials, wait 21 days total to account for delayed payment processing and decision-making.
Which activation events should I track during the trial?
Answer: The best activation events are those closest to your product's core value. Most PLG SaaS companies should track 3-5 key events that represent meaningful product engagement.
How do I measure trial conversion by acquisition channel?
Answer: Segment your trial users by acquisition channel (organic, paid ads, referral, etc.) and calculate conversion rates separately. This reveals which channels bring the highest-quality trial users.
-- Channel-specific conversion rate calculation
SELECT
acquisition_channel,
COUNT(*) as trial_users,
SUM(CASE WHEN converted = 1 THEN 1 ELSE 0 END) as conversions,
ROUND(100.0 * SUM(CASE WHEN converted = 1 THEN 1 ELSE 0 END) / COUNT(*), 2) as conversion_rate_pct
FROM trial_user_cohorts
WHERE trial_start_date >= '2024-01-01'
GROUP BY acquisition_channel
ORDER BY conversion_rate_pct DESC;
Should freemium and free-trial conversion be tracked differently?
Answer: Yes, freemium and free trial models require different conversion tracking approaches because user behavior and intent patterns are fundamentally different.
What is the optimal trial length?
Answer: The optimal trial length depends on your product complexity and time-to-value. Most B2B SaaS products see the best results with 14-day trials, but complex products may benefit from 30-day trials.
How can I improve my trial-to-paid conversion rate?
Answer: Focus on reducing time-to-first-value, improving activation event completion, and personalizing the trial experience based on user intent and behavior.
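Time-to-first-value can be tracked from the same data by measuring the gap between trial start and the first completion of your core activation event. A sketch using the first_project_created_at field from the main query's cohort_analysis output (swap in whichever event represents first value in your product):

-- Average time-to-first-value and share of trials reaching it, by cohort
SELECT
  cohort_month,
  ROUND(AVG(EXTRACT(EPOCH FROM (first_project_created_at - trial_start_date)) / 86400.0)::numeric, 1) AS avg_days_to_first_project,
  ROUND(100.0 * COUNT(first_project_created_at) / COUNT(*), 1) AS pct_reaching_first_project
FROM cohort_analysis
GROUP BY cohort_month
ORDER BY cohort_month DESC;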
Real-world examples of how different PLG SaaS companies approach trial-to-paid conversion measurement and optimization.
Example 1 (analytics product): 14-day trial users weren't connecting data sources quickly enough to see value before trial expiration. Result: conversion rate improved from 12% to 22% after reducing time-to-first-chart from 6 days to 2 days.
Example 2 (collaboration product): Individual sign-ups weren't inviting team members, missing the core collaboration value proposition. Result: team trial conversion rate increased from 8% to 28%, with invited users showing 3x higher individual conversion rates.
Example 3 (developer API product): Developers were signing up but not implementing the API, leading to low trial conversion despite high sign-up volume. Result: first API call completion increased from 23% to 67%, driving trial conversion from 9% to 19%.