CAPI — Conversions API (server-side tracking)
Definition: Meta's server-to-server event API. Sends conversion events from your server directly to Meta, bypassing the browser. Complements (doesn't replace) the browser Pixel.
2026 state: Essentially mandatory for any brand spending >$5K/mo on Meta. Without CAPI, iOS 14.5+ ATT blocks most pixel events on Apple devices (~57% of US traffic). CAPI restores visibility by sending events server-side.
Match quality: Meta scores each event 1-10 on signal completeness. An event carrying fbc, fbp, IP address, user agent, hashed email (em), hashed phone (ph), and external_id scores around 7.2/10, versus roughly 5.1/10 for pixel-only. Higher match quality means better attribution and a cheaper effective CPM, because Meta allocates delivery toward brands it can measure.
Implementation: Shopify Plus has native CAPI. Standard Shopify needs a connector (Stape, Elevar). WooCommerce, Magento, custom sites need direct implementation — typically 2-4 engineering days.
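A direct implementation mostly reduces to POSTing hashed identifiers to Meta's /{PIXEL_ID}/events endpoint. A minimal Python sketch of the payload shape, assuming the field names listed above (em, ph, fbp, fbc); the pixel ID, token, API version, and sample values are placeholders, not real credentials:

```python
import hashlib
import time

def sha256_norm(value: str) -> str:
    """Normalize (trim, lowercase) then SHA-256 hash, as Meta requires for em/ph fields."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

def build_capi_event(email, phone, client_ip, user_agent, fbp, fbc, order_id, value):
    """Assemble one Purchase event for POST /{PIXEL_ID}/events."""
    return {
        "event_name": "Purchase",
        "event_time": int(time.time()),
        "event_id": order_id,  # reuse the browser Pixel's eventID so Meta can deduplicate
        "action_source": "website",
        "user_data": {
            "em": [sha256_norm(email)],
            # phone: reduce to digits (with country code) before hashing
            "ph": [sha256_norm("".join(ch for ch in phone if ch.isdigit()))],
            "client_ip_address": client_ip,   # sent raw, not hashed
            "client_user_agent": user_agent,  # sent raw, not hashed
            "fbp": fbp,  # from the browser Pixel's _fbp cookie
            "fbc": fbc,  # derived from the fbclid click parameter
        },
        "custom_data": {"value": value, "currency": "USD"},
    }

event = build_capi_event(
    email=" Jane@Example.com ",
    phone="+1 (555) 010-2030",
    client_ip="203.0.113.7",
    user_agent="Mozilla/5.0",
    fbp="fb.1.1700000000000.123456789",
    fbc="fb.1.1700000000000.AbCdEf",
    order_id="order-1001",
    value=89.00,
)
# Then POST {"data": [event], "access_token": TOKEN} to
# https://graph.facebook.com/v19.0/{PIXEL_ID}/events
```

The hashing step is where most hand-rolled implementations lose match quality: unnormalized emails and formatted phone numbers hash to values Meta can't match.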
Meta Pixel
Definition: Browser-based JavaScript tracker that fires events (PageView, ViewContent, AddToCart, Purchase) to Meta.
2026 reliability: ~68% of conversions captured (down from ~92% pre-iOS 14.5), degraded further by Safari ITP, Firefox Enhanced Tracking Protection, and content-blocker adoption. No longer a primary measurement tool; it now runs alongside CAPI to enrich match quality, adding the fbp cookie signal that CAPI alone can't produce.
Do not remove the Pixel when you add CAPI; use both. The browser Pixel sets the fbp cookie and captures fbclid (sent to CAPI as fbc), and sharing an event_id between the browser and server copies of an event is what lets Meta deduplicate them.
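The dedup rule can be modeled in a few lines: when two events share the same event_name and event_id, one is kept and the redundant twin (browser vs server) is dropped. An illustrative sketch of that rule, not Meta's actual implementation:

```python
def dedupe(events):
    """Keep the first event seen per (event_name, event_id) pair;
    drop the redundant twin that arrived via the other channel."""
    seen, kept = set(), []
    for e in events:
        key = (e["event_name"], e["event_id"])
        if key not in seen:
            seen.add(key)
            kept.append(e)
    return kept

browser = {"event_name": "Purchase", "event_id": "order-1001", "source": "pixel"}
server  = {"event_name": "Purchase", "event_id": "order-1001", "source": "capi"}
other   = {"event_name": "Purchase", "event_id": "order-1002", "source": "capi"}

kept = dedupe([browser, server, other])  # order-1001 counted once, order-1002 kept
```

This is why a missing or mismatched event_id inflates reported conversions: without the shared key, the browser and server copies count as two purchases.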
Attribution Window
Definition: Time window Meta credits a conversion to an ad. Format: X-day click + Y-day view.
2026 defaults: 7-day click + 1-day view. Older option (28-day click) removed post-iOS 14.5. Available alternatives: 1-day click only, 7-day click only (no view-through), 1-day click + 1-day view.
Why it matters: the attribution window determines what Meta takes credit for. 7-day click is Meta's default and attributes the most conversions, so it reports the lowest CPA; it is also the window most analyses use. 1-day click reflects tighter causality. The delta between 1-day click and 7-day click shows how much attributed revenue relies on multi-day click journeys (a sign of creative + offer strength).
Best practice: optimize on 7-day click + 1-day view; sanity-check against 1-day click for tighter causality signal. Compare Meta-attributed ROAS against Shopify last-click — the gap reveals over-attribution.
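Both sanity checks above are simple ratios. A worked sketch with hypothetical revenue and ROAS figures:

```python
def click_window_delta(rev_1d_click: float, rev_7d_click: float) -> float:
    """Share of 7-day-click revenue that landed on days 2-7 after the click."""
    return (rev_7d_click - rev_1d_click) / rev_7d_click

def attribution_gap(meta_roas: float, last_click_roas: float) -> float:
    """How many times larger Meta's self-attributed ROAS is than the source-of-truth number."""
    return meta_roas / last_click_roas

delta = click_window_delta(80_000, 100_000)  # 0.2: 20% of revenue relies on multi-day journeys
gap = attribution_gap(3.0, 2.0)              # 1.5: Meta claims 1.5x the last-click figure
```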
View-Through Conversion
Definition: Conversion credited to a viewed-but-not-clicked ad within the view window (Meta default 1 day).
2026 reliability: Directionally useful, often over-credited. View-through assumes the impression caused the conversion — in reality, view-through conversions are a mix of (a) real brand-impact-driven conversions and (b) coincidental overlap between audience and purchase.
When it matters most: brand campaigns, awareness-stage funnels, high-AOV products with long consideration cycles. Not ideal for direct-response optimization where you want the tightest signal.
MTA — Multi-Touch Attribution
Definition: Attribution methodology that assigns fractional credit across multiple touchpoints in a conversion path (e.g., 30% Meta, 30% Google, 20% email, 20% direct).
2026 state: Severely constrained post-iOS 14.5 because cross-site tracking (the foundation of MTA) collapsed. Platform-native MTA (Google's data-driven attribution, Meta's algorithmic attribution) operates within each ad network but not across them.
Practical 2026: treat MTA as supplementary insight, not primary attribution. Cross-channel MTA providers (Rockerbox, Northbeam) combine statistical modeling with first-party identity stitching, often fed by a CDP such as Segment. Useful at $200K+/mo spend, overkill below.
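The fractional-credit arithmetic itself is trivial; what collapsed post-iOS 14.5 is the cross-site path data feeding it. A sketch of linear (equal-split) attribution, one common MTA rule, over hypothetical conversion paths:

```python
from collections import defaultdict

def linear_mta(paths):
    """Linear multi-touch attribution: each touchpoint in a conversion path
    receives an equal fraction of that conversion's value."""
    credit = defaultdict(float)
    for touchpoints, value in paths:
        share = value / len(touchpoints)
        for channel in touchpoints:
            credit[channel] += share
    return dict(credit)

paths = [
    (["meta", "google", "email", "direct"], 100.0),  # 4 touches, $25 each
    (["meta", "direct"], 50.0),                      # 2 touches, $25 each
]
credit = linear_mta(paths)  # meta: 50.0, google: 25.0, email: 25.0, direct: 50.0
```

Other rules (first-touch, position-based, algorithmic) only change how `share` is computed per touchpoint; the missing ingredient in 2026 is complete paths, not better math.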
MMM — Marketing Mix Modeling
Definition: Statistical regression on aggregated marketing spend vs revenue. Decomposes revenue contribution by channel without tracking individual users.
2026 state: The dominant 2026 substitute for MTA. Privacy-robust (no user-level tracking), platform-independent (works across Meta, Google, TikTok, affiliate, email, organic). Requires 12+ months of weekly or daily data.
Providers: Haus, Recast, Rockerbox, Ladder, Measured. Cost $20K-80K/year depending on provider and spend tier.
When it pays off: $500K+/mo spend across 3+ channels. Typical result: 15-25% budget allocation improvement, 2-4 month payback.
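At its core, MMM is a regression of aggregate revenue on channel spend. A toy sketch with synthetic weekly data (all numbers invented; real MMMs add adstock, saturation curves, and seasonality on top of this):

```python
import numpy as np

rng = np.random.default_rng(0)
weeks = 52

# Synthetic weekly spend ($K) on three channels: meta, google, tiktok
spend = rng.uniform(10, 100, size=(weeks, 3))
true_roi = np.array([2.0, 1.5, 0.8])  # invented ground-truth $ revenue per $ spend

# Revenue = baseline (organic) + channel contributions + noise
revenue = 200.0 + spend @ true_roi + rng.normal(0.0, 5.0, weeks)

# Decompose by OLS: intercept recovers baseline, slopes recover per-channel ROI
X = np.column_stack([np.ones(weeks), spend])
coef, *_ = np.linalg.lstsq(X, revenue, rcond=None)
# coef[0] ~= 200 (baseline), coef[1:] ~= [2.0, 1.5, 0.8]
```

This is also why 12+ months of data is the floor: with few weekly observations, the spend columns are too collinear and noisy for the decomposition to be trustworthy.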
Incrementality Testing
Definition: Holdout experiment that compares conversion rates for users who saw ads vs a randomized control group who didn't. The only truly causal ROI signal.
Platform options: Meta native conversion lift studies (simplest; requires $10K+/mo spend for 30 days to achieve significance). Third-party: Haus, INCRMNTL, Measured for cross-channel incrementality.
Typical finding: Incremental ROI is 30-60% of platform-attributed ROI for mature DTC brands. If Meta reports 3× ROAS, real causal ROAS is probably 1.5-2×. This gap is where growth teams lose money — they scale based on 3× attributed, discover true ROI is 1.8×, pull back, repeat.
Cadence: Run incrementality tests before major budget increases (+30% on a channel) and quarterly for major channels. Don't run on every campaign — too noisy and expensive.
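The lift math behind a holdout test is a rate difference. A sketch with hypothetical exposed/holdout counts, illustrating how incremental conversions end up a fraction of platform-attributed ones:

```python
def incremental_lift(exposed_n, exposed_conv, holdout_n, holdout_conv):
    """Conversions caused by the ads: exposed rate minus holdout rate,
    scaled to the exposed group. Returns (incremental conversions,
    incremental share of observed conversions)."""
    exposed_rate = exposed_conv / exposed_n
    holdout_rate = holdout_conv / holdout_n
    incremental = (exposed_rate - holdout_rate) * exposed_n
    return incremental, incremental / exposed_conv

inc, share = incremental_lift(100_000, 1_000, 100_000, 500)
# ~500 incremental conversions; about half the observed conversions were caused by the ads
```

With a 50% incremental share, a platform-reported 3x ROAS implies a causal ROAS near 1.5x, which is the gap the section above describes.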
Practical 2026 measurement stack
- CAPI + Pixel for signal capture (both, not either); target match quality 7+/10.
- Platform reporting (Meta Ads Manager 7-day click) for daily optimization decisions.
- Platform conversion lift quarterly for Meta-specific incrementality check.
- MMM quarterly for aggregate budget allocation (channels, dayparts, geographies). Only if $500K+/mo spend.
- Shopify/source-of-truth ROAS for board-level reporting (last-click usually) — the most conservative number.
Related pillars
- Core Metrics Pillar
- Audience Targeting Pillar
- Testing & Optimization Pillar
- Creative Strategy Pillar