What the platform sees — and what it's been missing
What determines competitive position in AI-era marketing automation?
The surface answer is better AI features. But features copy. The structural answer: the platform with the deepest understanding of customer intent sets the ceiling for what AI can do.
Most platforms are building 2028 AI capabilities on 2018 data infrastructure. They're adding intelligence features on top of systems that capture transactional events — purchases, email opens, page views — while ignoring the behavioral signals that reveal intent.
This is the opportunity.
Most people think about data wrong. They think: more data → better decisions. That's linear thinking. The actual dynamic is different across three dimensions:
Point 1: Depth of understanding, not volume of data
If Omnisend knows a customer searched "sustainable yoga mat," viewed 3 products in that category, spent 6 minutes on reviews, checked the return policy twice, and has a historical AOV $22 higher than their current cart — that's not "more information." That's a categorically different understanding of that customer.
The email you send with that understanding is fundamentally different.
Compare to what current systems know: "Abandoned cart, $45, yoga mat." The email from that understanding: "Hey, you left something behind! Here's 10% off."
Same customer. Completely different intervention. The second email treats them like a price-sensitive abandoner. They're actually a research-oriented buyer with a specific objection. The discount might actually hurt conversion because it signals the product isn't worth full price.
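The routing decision above can be sketched as a simple rule: rich behavioral signals select a message strategy, while thin transactional data can only fall back to the discount. A minimal sketch; every signal name and threshold here is hypothetical, not any platform's actual schema:

```python
# Choose an abandoned-cart message strategy from behavioral signals.
# Signal names and thresholds are illustrative, not a real platform schema.

def pick_strategy(signals: dict) -> str:
    # Research-oriented buyer: repeated policy checks plus long review time
    # signal a risk objection, so emphasize the guarantee instead of price.
    if signals.get("return_policy_views", 0) >= 2 and signals.get("review_seconds", 0) >= 120:
        return "guarantee_emphasis"
    # Expressed search intent: lean into the category they told you about.
    if signals.get("search_terms"):
        return "category_recommendations"
    # Transactional data only ("Abandoned cart, $45, yoga mat"): generic 10% off.
    return "generic_discount"

rich = {"search_terms": ["sustainable yoga mat"], "review_seconds": 360,
        "return_policy_views": 2, "historical_aov_delta": 22}
thin = {}  # current systems carry none of these signals

print(pick_strategy(rich))  # guarantee_emphasis
print(pick_strategy(thin))  # generic_discount
```

The point is structural: the second branch can never fire on a platform that never captured the search query.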
Point 2: Accumulated data can't be copied backward through time
If Omnisend starts capturing search queries today, in 24 months they'll have 24 months of search intent data. That data doesn't exist in Klaviyo for those same customers. Klaviyo can't retroactively capture what customers searched for 18 months ago.
This is the asymmetry that matters. Features can be copied. Data accumulation cannot be copied backward through time.
A brand that's been on Omnisend for 2 years with full behavioral tracking has an accumulated intelligence layer: two years of search queries, category interests, and hesitation signals for every active customer.
Move that brand to Klaviyo and that intelligence layer vanishes. It's not in an export file. It doesn't migrate. The new platform starts from zero. That's a switching cost that gets more expensive the longer someone stays. Traditional switching costs (workflow rebuilding, team retraining) are one-time. Data switching costs compound annually.
Point 3: The advantage is proprietary data, not the foundation model
The "AI" most platforms ship is thin. It's prompt engineering on top of GPT-4. "Generate a subject line for this email." "Write abandoned cart copy." Any platform can do this because the intelligence is in the foundation model, not the platform.
Real AI advantage comes from proprietary training data. If Omnisend has behavioral patterns across thousands of merchants showing that customers who check return policy twice convert 40% better with guarantee-emphasis messaging, they can build that into their systems. Klaviyo can't build that model because they don't have that signal.
The foundation models are commoditizing. Everyone has access to GPT-4, Claude, Gemini. The differentiation moves to: what proprietary data can you feed them? What patterns can you detect that others can't?
Data creates switching costs that appreciate rather than depreciate.
Omnisend currently captures only a handful of signals, roughly 5–6: `product_added_to_cart`, `checkout_started`, `checkout_completed`, and partial browse events.
The most valuable behavioral signals go uncaptured. Not because the infrastructure doesn't exist — Shopify broadcasts these events — but because the listening layer wasn't prioritized.
Shopify's standard events are documented, available, and broadcasting right now.
Gap 01: Standard e-commerce events going uncaptured
| Event | Why It Matters | Segmentation Unlocked |
|---|---|---|
| `search_submitted` | Direct intent signal | "Users who searched for [term] but didn't buy" |
| `collection_viewed` | Category interest | "Users interested in [category]" |
| `product_removed_from_cart` | Hesitation signal | "Users who removed items" (intervention opportunity) |
| `cart_viewed` | Purchase consideration | "Users who viewed cart 3+ times" (high intent) |
| `checkout_shipping_info_submitted` | Commitment level | "Got to shipping but dropped" |
| `payment_info_submitted` | Highest intent | "Entered payment but didn't complete" (highest priority) |
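Once these events land in an event stream, the segments in the right-hand column reduce to set operations. A sketch of the first row ("searched for [term] but didn't buy"), assuming a flat list of event records rather than any particular platform's storage model:

```python
# Illustrative event stream; record shape is an assumption, not a real schema.
events = [
    {"customer": "c1", "name": "search_submitted", "query": "yoga mat"},
    {"customer": "c1", "name": "product_viewed", "product": "mat-01"},
    {"customer": "c2", "name": "search_submitted", "query": "yoga mat"},
    {"customer": "c2", "name": "checkout_completed"},
]

def searched_but_did_not_buy(events, term):
    # Customers who expressed search intent for the term...
    searched = {e["customer"] for e in events
                if e["name"] == "search_submitted" and term in e.get("query", "")}
    # ...minus customers who completed a checkout.
    bought = {e["customer"] for e in events if e["name"] == "checkout_completed"}
    return searched - bought

print(searched_but_did_not_buy(events, "yoga mat"))  # {'c1'}
```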
Gap 02: Micro-interaction events
| Event | Why It Matters | Segmentation Unlocked |
|---|---|---|
| Clicked size guide | Needs reassurance | Include size chart in follow-up email |
| Clicked reviews section | Social proof seeker | Include testimonials |
| Clicked return policy | Risk-averse | Emphasize guarantee |
| Filter changes | Preference signals | "Users who filtered by [color/size/price]" |
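These micro-interactions map almost directly onto follow-up content. A sketch of that mapping; the event names and content-block labels are illustrative, not a real schema:

```python
# Map micro-interaction events to follow-up email content blocks,
# mirroring the table above. Names are illustrative.
CONTENT_RULES = {
    "size_guide_clicked": "size_chart",          # needs reassurance
    "reviews_section_clicked": "testimonials",   # social proof seeker
    "return_policy_clicked": "guarantee_banner", # risk-averse
}

def follow_up_blocks(session_events: list) -> list:
    # Iterate rules (not events) so the email layout stays deterministic.
    seen = set(session_events)
    return [block for evt, block in CONTENT_RULES.items() if evt in seen]

print(follow_up_blocks(["reviews_section_clicked", "return_policy_clicked"]))
# ['testimonials', 'guarantee_banner']
```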
Gap 03: Computed behavioral traits
| Trait | How It's Calculated | Use Case |
|---|---|---|
| Session Depth Score | Pages viewed × time on site | Identify high-engagement sessions |
| Comparison Behavior | Products viewed in same category | "Active shoppers" segment |
| Hesitation Index | Cart views + policy checks + sessions without purchase | Users who need a push |
| Price Sensitivity | Sale item interaction, discount code field focus | Dynamic offer personalization |
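The computed traits above are simple arithmetic once the underlying events are captured. A sketch of the first and third rows, with the caveat that real weightings would be tuned against conversion data rather than left unweighted:

```python
def session_depth_score(pages_viewed: int, minutes_on_site: float) -> float:
    # "Pages viewed x time on site", as defined in the table above.
    return pages_viewed * minutes_on_site

def hesitation_index(cart_views: int, policy_checks: int,
                     sessions_without_purchase: int) -> int:
    # Simple additive form; production weighting would be fitted, not equal.
    return cart_views + policy_checks + sessions_without_purchase

print(session_depth_score(7, 6.0))  # 42.0
print(hesitation_index(3, 2, 2))    # 7
```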
A customer journey as it actually happens, and what your platform actually captures from it.
A report built from that behavioral detail is believable because it's specific. It proves the agency understood something the brand couldn't see themselves.
Several changes make this viable now that weren't true before 2023, starting with the commoditization of capable foundation models.
But here's what hasn't changed: the collection infrastructure. Most platforms are still capturing what they captured in 2019. They updated their AI features but not their data foundation.
It's like upgrading to a Ferrari engine while keeping bicycle tires. The engine can do 200 mph, but the tires limit you to 30.
Shopify built the Web Pixels API for this use case. The events are documented, integration paths are clear. This isn't building a CDP from scratch. It's deciding whether to listen to signals already being broadcast.
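The listening layer can start as little more than a normalizer behind a collection endpoint that a custom Web Pixel posts events to. A server-side sketch in Python; the payload fields used here (`name`, `clientId`, `timestamp`, `data`) follow Shopify's documented event shape, but treat the exact structure as an assumption to verify against the Web Pixels API reference:

```python
import json

# The previously uncaptured events from the gap tables above.
CAPTURED = {"search_submitted", "collection_viewed", "cart_viewed",
            "product_removed_from_cart", "checkout_shipping_info_submitted",
            "payment_info_submitted"}

def normalize(raw: str):
    """Keep only events we listen for; flatten to a storable record."""
    evt = json.loads(raw)
    if evt.get("name") not in CAPTURED:
        return None  # drop events outside the listening set
    return {"customer": evt.get("clientId"), "name": evt["name"],
            "ts": evt.get("timestamp"), "data": evt.get("data", {})}

rec = normalize('{"name": "search_submitted", "clientId": "c1", '
                '"timestamp": "2025-01-01T00:00:00Z", "data": {"query": "yoga mat"}}')
print(rec["name"])  # search_submitted
```

The design choice is deliberate: filter at ingestion so storage grows only with signals you intend to segment on.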
Shift 01: From reactive to anticipatory
Current systems react to events. Customer abandons → trigger recovery flow.
With behavioral signals, systems detect abandonment patterns before the event: third session, same products viewed repeatedly, no checkout progression. Intervention fires before abandonment, not after.
This isn't possible without session-level behavioral data. Platforms limited to transactional events structurally cannot build anticipatory systems.
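The anticipatory trigger described above is expressible as a predicate over session-level data. A sketch with illustrative thresholds; a production system would learn these from historical abandonment patterns rather than hard-code them:

```python
# Flag sessions matching the pre-abandonment pattern: third or later session,
# repeated views of the same products, no checkout progression.
# Thresholds are illustrative, not tuned values.

def at_risk(session_count: int, product_view_counts: dict,
            reached_checkout: bool) -> bool:
    repeated = any(views >= 2 for views in product_view_counts.values())
    return session_count >= 3 and repeated and not reached_checkout

print(at_risk(3, {"mat-01": 4}, False))  # True: intervene before abandonment
print(at_risk(1, {"mat-01": 1}, False))  # False: not enough signal yet
```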
Shift 02: From single-brand to cross-merchant intelligence
Single-brand behavioral data enables better messaging for that brand.
Cross-merchant behavioral data — patterns across thousands of stores — enables intelligence no single brand could develop: customers who check return policy twice convert 40% better with guarantee emphasis; customers who compare 4+ products respond to social proof over discounts; search-first customers have higher intent than browse-first customers.
These patterns become proprietary models. Competitors without the underlying data across thousands of merchants cannot replicate them.
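Mechanically, a cross-merchant pattern like "return-policy checkers convert 40% better" is a conversion-lift computation over pooled behavioral records. A sketch on fabricated rows, shaped only to reproduce that 40% figure for illustration:

```python
# Conversion lift for a behavioral signal across pooled merchant data.
# Rows are fabricated for shape; "policy_x2" is a hypothetical signal name.

def conversion_rate(rows):
    return sum(1 for r in rows if r["converted"]) / len(rows)

def signal_lift(rows, signal):
    with_sig = [r for r in rows if r["signals"].get(signal)]
    without = [r for r in rows if not r["signals"].get(signal)]
    return conversion_rate(with_sig) / conversion_rate(without) - 1.0

rows = (
    [{"signals": {"policy_x2": True}, "converted": True}] * 7    # 70% convert
    + [{"signals": {"policy_x2": True}, "converted": False}] * 3
    + [{"signals": {}, "converted": True}] * 5                   # 50% baseline
    + [{"signals": {}, "converted": False}] * 5
)
print(f"{signal_lift(rows, 'policy_x2'):+.0%}")  # +40%
```

No single brand has enough rows to estimate this reliably; the platform pooling thousands of merchants does, which is what makes the resulting model proprietary.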
| Criterion | Score | Notes |
|---|---|---|
| Impact | ⭐⭐⭐⭐⭐ | Foundation for all other AI features. Without this, nothing else works. |
| Technical Feasibility | ⭐⭐⭐⭐ | Shopify APIs exist and are well-documented. Main challenge is data infrastructure at scale. |
| Resources Required | Medium-High | Backend engineers for event processing, data engineers for storage/retrieval, frontend for new UI components. |
| Long-term Sustainability | ⭐⭐⭐⭐⭐ | Becomes more valuable over time as data accumulates. Creates switching costs. |
| Fit with Agency ICP | ⭐⭐⭐⭐⭐ | Agencies want differentiated value. Rich data = better stories to tell clients. |
The capabilities in subsequent pillars — intelligent segmentation, adaptive automation, predictive modeling — depend on behavioral data existing.
Micro-segments require signals to segment on. Intent-aware automation requires intent signals. Predictive models require historical patterns to learn from.
This is infrastructure. Not the feature users see — the capability layer that determines the ceiling for every feature built on top.