Mobile Attribution Windows Explained: How to Choose the Right Settings

Master attribution windows for mobile marketing. Learn click-through vs view-through windows, platform defaults, and how to optimize for your goals.

Senni

Attribution windows are one of the least understood but highest-impact settings in mobile marketing. Get them wrong, and you'll underestimate campaign performance, make poor optimization decisions, and waste budget on low-confidence data. Get them right, and you'll accurately measure ROI, understand true campaign impact, and optimize effectively.

Yet most marketers treat attribution windows as a one-time setup, never revisiting them. Your attribution window settings should evolve with your business, campaign types, and audience behavior. This guide explains how attribution windows work, how different platforms handle them, and most importantly, how to choose the right settings for your goals.

What Are Attribution Windows and Why They Matter

An attribution window is the time period during which a user's action (app install, in-app purchase, subscription, etc.) can be credited to an ad impression or click. Without attribution, every install would look organic, and you wouldn't know which ad or campaign drove it.

The window concept exists because conversions are often delayed. A person might see your ad on Monday, not install immediately, go about their week, and then install on Thursday. The question becomes: do you credit the Thursday install to the Monday ad, or do you require the install to happen within a shorter timeframe?

Attribution window decisions directly impact your reported metrics:

Longer windows capture more attributed installs, inflating reported performance but potentially crediting the wrong touchpoint.

Shorter windows miss delayed conversions, underreporting true campaign impact but ensuring accuracy for direct responses.

This is especially critical in mobile because App Store optimization, organic discovery, and paid campaigns all interact. A user might see your ad, visit your app store page organically, and then install days later. Which touchpoint gets credit?

The answer depends on your window settings and your attribution model (last-click, first-click, multi-touch, etc.).

Click-Through Attribution Windows vs View-Through Attribution Windows

These two window types work differently and serve different purposes.

Click-Through Attribution Windows

Click-through attribution (CTA) windows measure the time between a user clicking your ad and taking a conversion action (installing your app, making a purchase, etc.).

Standard CTA windows by use case:

  • Direct response campaigns: 1-7 days (typical: 1-3 days)
  • Brand awareness campaigns: 7-28 days
  • Re-engagement campaigns: 1-3 days
  • App install campaigns: 7-30 days (platform-dependent)

When a user clicks your ad and installs within the window, that install is attributed to that campaign. If they click but don't install until after the window closes, that install is unattributed.

Example scenario:

Monday 2:00 PM: User clicks your TikTok ad
Friday 3:15 PM: User installs your app (4 days, 1 hour, 15 minutes later)

With 7-day CTA window: Install attributed to TikTok campaign
With 1-day CTA window: Install is unattributed
With 3-day CTA window: Install is unattributed

Click-through windows are deterministic: either the install happened within the window or it didn't, and the click itself gives you device-level data to match against.
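That deterministic check can be sketched in a few lines. This is a minimal illustration, not any particular MMP's API; the function name and dates are invented for the scenario above.

```python
from datetime import datetime, timedelta

def is_click_attributed(click_time: datetime, install_time: datetime,
                        window_days: int) -> bool:
    """Return True if the install falls inside the click-through window."""
    delta = install_time - click_time
    return timedelta(0) <= delta <= timedelta(days=window_days)

# Scenario from the text: click Monday 2:00 PM, install Friday 3:15 PM
click = datetime(2024, 1, 1, 14, 0)     # Monday 2:00 PM
install = datetime(2024, 1, 5, 15, 15)  # Friday 3:15 PM (4d 1h 15m later)

print(is_click_attributed(click, install, 7))  # True  -> attributed
print(is_click_attributed(click, install, 3))  # False -> unattributed
print(is_click_attributed(click, install, 1))  # False -> unattributed
```

The same delay produces three different attribution outcomes depending only on the window setting, which is the whole point of the example.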

View-Through Attribution Windows

View-through attribution (VTA) windows measure the time between a user seeing (but not clicking) your ad and taking a conversion action.

VTA is probabilistic, not deterministic. The ad was served, but there is no click signal, so you're estimating that some percentage of post-impression installs were influenced by that impression.

Standard VTA windows:

  • Typical range: 1-30 days
  • Common defaults: 7 days, 14 days, 30 days
  • Aggressive: 1-3 days

View-through windows are less precise than click-through because you're relying on timing and audience overlap, not explicit user actions.

Example scenario:

Monday 2:00 PM: User sees your Meta ad impression (doesn't click)
Thursday 5:00 PM: User installs your app (3 days later)

With 7-day VTA window: Install may be attributed to Meta campaign (probabilistic)
With 1-day VTA window: Install is unattributed
With 30-day VTA window: Install more likely attributed to Meta campaign

When to Use Each Window Type

Click-through windows: Use for all campaigns where you have click data. Direct response campaigns, retargeting, most performance campaigns.

View-through windows: Use to capture awareness-driven installs, where users see your ad but don't click immediately. View-through is less reliable, but it captures delayed impression-to-conversion paths.

Best practice: Set aggressive click-through windows (1-7 days) for performance campaigns, and use view-through windows (7-30 days) supplementarily to capture brand-driven installs.
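Running both window types together implies a precedence rule: an in-window click outranks an in-window view. The sketch below shows last-touch attribution under that rule; the touchpoint shape and campaign names are invented for illustration.

```python
from datetime import datetime, timedelta
from typing import Optional

def attribute_install(install_time: datetime, touchpoints: list,
                      click_window_days: int = 7,
                      view_window_days: int = 1) -> Optional[str]:
    """Last-touch attribution: any in-window click beats any in-window view.
    Touchpoints are (timestamp, kind, campaign) with kind "click" or "view"."""
    def in_window(ts, days):
        return timedelta(0) <= install_time - ts <= timedelta(days=days)

    clicks = [t for t in touchpoints
              if t[1] == "click" and in_window(t[0], click_window_days)]
    if clicks:
        return max(clicks)[2]  # most recent qualifying click wins
    views = [t for t in touchpoints
             if t[1] == "view" and in_window(t[0], view_window_days)]
    if views:
        return max(views)[2]
    return None  # no qualifying touchpoint: organic / unattributed

install = datetime(2024, 1, 5, 12, 0)
touchpoints = [
    (datetime(2024, 1, 1, 9, 0), "click", "tiktok_ua"),  # 4 days before
    (datetime(2024, 1, 5, 8, 0), "view", "meta_brand"),  # 4 hours before
]
print(attribute_install(install, touchpoints))  # tiktok_ua: click outranks view
```

Note that the four-day-old click still beats the four-hour-old view, which is exactly the click-priority behavior most last-touch models apply.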

Attribution Window Settings by Platform

Each ad network has default settings and customization options. Understanding platform defaults is essential.

Meta (Facebook, Instagram, Audience Network)

Meta allows separate configuration for click-through and view-through windows.

Default settings:

  • Click-through: 28 days
  • View-through: 1 day

Customization: You can adjust both windows in Meta Ads Manager.

Campaign Settings > Optimization
- Attribution Window: Click 28d / View 1d (default)
- Can change to:
  - 1d / 1d (aggressive, direct response only)
  - 7d / 7d (balanced)
  - 28d / 28d (generous, brand campaigns)

Recommendation: For mobile UA campaigns, use 7-day click-through, 1-day view-through unless you specifically target brand awareness.

Google Ads

Google offers two primary attribution models: Last-Click and Data-Driven. Window configuration varies by model.

Default settings:

  • Last-Click model: 30-day click, 30-day view
  • Data-Driven model: 90-day click, 90-day view

Customization: Available in campaign settings under "Conversion tracking."

Campaign Settings > Conversion Tracking
- Attribution Model: Last-Click or Data-Driven
- Windows: 1, 7, 14, 30 days (click); 1, 7, 14, 30 days (view)

Recommendation: For app install campaigns, use 7-day click window and 1-day view window. Google's defaults (30+ days) are too aggressive for most mobile scenarios.

TikTok Ads

TikTok offers attribution windows in their conversion tracking settings.

Default settings:

  • Click-through: 7 days
  • View-through: Not separately configurable; integrated into window setting

Customization: Limited. You can set 1-day or 7-day windows.

Campaign Settings > Conversion Tracking
- Attribution Window: 1 day or 7 days

Recommendation: Use 7-day attribution for most TikTok campaigns. TikTok's defaults are appropriate for mobile UA.

Snap Ads

Snap provides simple window configuration.

Default settings:

  • Attribution Window: 28 days

Customization: Available in campaign setup.

Campaign Settings > Attribution
- Window: 1, 7, or 28 days

Recommendation: Use 7-day windows for performance campaigns, 28-day for brand awareness.

Reddit Ads

Reddit offers attribution windows similar to other platforms.

Default settings:

  • Attribution Window: 28 days

Customization: Limited; typically 1, 7, or 28 days.

Recommendation: Use 7-day windows for direct response, 28-day for awareness campaigns.

X Ads (formerly Twitter)

X's attribution window settings are less flexible than competitors.

Default settings:

  • Attribution Window: 30 days (or platform default)

Customization: Limited; check X Ads documentation for current options.

Recommendation: Use shortest available window for performance campaigns.

Unity Ads

Unity Ads (targeting games) offers customizable attribution windows.

Default settings:

  • Typical window: 28 days

Customization: Configure in campaign settings.

Recommendation: For game installs, 7-day windows are standard.

How Attribution Windows Affect Your Reported Metrics

Window settings directly change your reported performance data. This is critical to understand because different windows create different "truths."

Same Campaign, Different Windows = Different Results

Let's say you run a TikTok campaign driving app installs:

Total raw events: 5,000 clicks, 850 installs

With 1-day click window:
- Attributed installs: 340 (install within 24 hours)
- CPA: $300 per install
- Reported ROAS: Appears weak

With 7-day click window:
- Attributed installs: 680 (install within 7 days)
- CPA: $150 per install
- Reported ROAS: Appears much stronger

With 28-day click window:
- Attributed installs: 760 (install within 28 days)
- CPA: $134 per install
- Reported ROAS: Appears strongest

Same campaign. Same spend. Same installs. Three different "truths" based on window settings.
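The three "truths" above all come from one event log. The sketch below recomputes them from a synthetic list of click-to-install delays, constructed to match the figures in the text (roughly $102,000 of spend behind the stated CPAs); the delay distribution is illustrative, not real campaign data.

```python
# Recompute attributed installs and CPA under different windows from the
# same raw install-delay data (days between click and install, per install).
spend = 102_000
install_delays_days = ([0.5] * 340    # same-day installers
                       + [4.0] * 340  # install within a week
                       + [20.0] * 80  # install within a month
                       + [40.0] * 90) # beyond any window (850 installs total)

for window in (1, 7, 28):
    attributed = sum(1 for d in install_delays_days if d <= window)
    cpa = spend / attributed
    print(f"{window:>2}-day window: {attributed} installs, CPA ${cpa:,.0f}")
```

Running this reproduces the 340 / 680 / 760 attributed-install counts and the $300 / $150 / $134 CPAs from the table, from identical underlying events.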

This is why comparing campaigns across platforms with different attribution windows is misleading. Meta's 28-day default makes Meta campaigns appear to perform better than TikTok campaigns measured on a 7-day default, even if they're equally effective.

The Danger of Misaligned Windows

Imagine you have:

  • TikTok campaigns: 7-day attribution window
  • Meta campaigns: 28-day attribution window

You'll systematically overestimate Meta performance and underestimate TikTok performance. Your optimization will skew budget toward Meta, even if TikTok is actually more efficient. This is a critical mistake in multi-channel campaigns.

Best practice: Standardize attribution windows across all platforms for accurate comparison.
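One way to enforce that standardization is a single source-of-truth config checked against each platform's settings. This is a sketch under assumed names: the platform keys, settings shape, and `None` for TikTok's non-configurable view window are all illustrative.

```python
# One source of truth for windows, compared against per-platform settings.
STANDARD_WINDOWS = {"click_days": 7, "view_days": 1}

platform_settings = {
    "meta": {"click_days": 28, "view_days": 1},    # still on the 28d default
    "tiktok": {"click_days": 7, "view_days": None},  # view not configurable
    "snap": {"click_days": 7, "view_days": 1},     # already standardized
}

out_of_policy = {}
for platform, settings in platform_settings.items():
    drift = {key: (settings.get(key), target)
             for key, target in STANDARD_WINDOWS.items()
             if settings.get(key) != target}
    if drift:
        out_of_policy[platform] = drift  # (actual, expected) per setting

print(out_of_policy)
```

Here Meta is flagged for its 28-day click window and TikTok for its missing view window, while Snap passes; an audit like this is easy to rerun each quarter.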

SKAdNetwork Attribution Windows

Since iOS 14, Apple's SKAdNetwork has introduced its own attribution window concept, separate from MMP attribution windows.

SKAdNetwork operates with fixed windows:

SKAdNetwork default windows:

  • 2-day window: High confidence (most accurate)
  • 24-day window: Low confidence (probabilistic)

Apple applies these windows server-side. Your MMP also applies its own attribution windows. Both work in parallel.

How SKAdNetwork windows work:

Scenario 1: Install within 2 days of ad impression
SKAdNetwork: 2-day window applies, high confidence signal
Your MMP: Your click-through window applies (e.g., 7 days)
Result: Both systems attribute the install

Scenario 2: Install 3-10 days after ad impression
SKAdNetwork: 24-day window applies, low confidence signal
Your MMP: Your click-through window applies (e.g., 7 days, so attributed)
Result: Your MMP attributes; SKAdNetwork counts as low confidence

Scenario 3: Install 15 days after ad impression
SKAdNetwork: 24-day window applies (might attribute)
Your MMP: 7-day window doesn't apply; unattributed
Result: Discrepancy between SKAdNetwork and MMP data

This discrepancy between SKAdNetwork windows and MMP windows is a major source of confusion in post-ATT mobile marketing.
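The three scenarios above reduce to classifying one install delay under two systems at once. The sketch below uses the 2-day and 24-day values as described in this article (not taken from Apple's documentation) alongside a 7-day MMP click window.

```python
def classify(delay_days: float, mmp_window_days: int = 7):
    """Classify an install delay under the article's SKAdNetwork windows
    (2-day high confidence, 24-day low confidence) and an MMP click window."""
    if delay_days <= 2:
        skan = "high-confidence"
    elif delay_days <= 24:
        skan = "low-confidence"
    else:
        skan = "unattributed"
    mmp = "attributed" if delay_days <= mmp_window_days else "unattributed"
    return skan, mmp

print(classify(1))   # ('high-confidence', 'attributed')   -- Scenario 1
print(classify(5))   # ('low-confidence', 'attributed')    -- Scenario 2
print(classify(15))  # ('low-confidence', 'unattributed')  -- Scenario 3
```

Scenario 3 is the discrepancy case: SKAdNetwork may report an install your MMP never attributes, so the two dashboards will disagree by design.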

Managing SKAdNetwork Windows

You can't control Apple's windows directly, but you should align your MMP windows to SKAdNetwork's reality:

  • Set click-through window to at least 7 days to capture most SKAdNetwork-attributed installs
  • Recognize that low-confidence SKAdNetwork data (beyond 2 days) may not perfectly align with your MMP data
  • Use probabilistic attribution layers (like Audiencelab's signal engineering) to bridge SKAdNetwork and first-party data gaps

Choosing the Right Attribution Window for Your Goals

The optimal attribution window depends on your business model, user behavior, and optimization goals.

Decision Framework

Ask yourself these questions:

1. What's my conversion type?

  • App installs: 7-28 days (depending on campaign type)
  • Subscription: 7-14 days
  • In-app purchase: 3-7 days (users who convert quickly are highest value)
  • Content engagement: 1-3 days

2. What's my campaign objective?

  • Direct response (performance): 1-7 days
  • Brand awareness: 7-28 days
  • Re-engagement/retention: 1-3 days

3. What's my typical delay?

  • Immediate converters (e.g., puzzle games): 1-3 days
  • Medium delay (e.g., RPGs, social games): 3-7 days
  • Long delay (e.g., productivity apps): 7-14 days

4. What's my user acquisition stage?

  • High-intent users (search retargeting): 1-3 days
  • Mid-funnel users (lookalike audiences): 3-7 days
  • Broad awareness audiences: 7-14 days

Recommended Windows by App Category

Games (Casual, Puzzle):

  • Click-through: 3-7 days (fast converters)
  • View-through: 1-3 days
  • Rationale: Game players decide quickly; delays indicate less engaged users

Games (RPG, Multiplayer, Hardcore):

  • Click-through: 7-14 days (heavier apps, longer evaluation period)
  • View-through: 3-7 days
  • Rationale: Complex games warrant longer consideration

Social Apps:

  • Click-through: 7-14 days (social apps have network effects; users take time)
  • View-through: 1-7 days
  • Rationale: Viral products see delayed adoption as users invite friends

Productivity/Utility Apps:

  • Click-through: 7-14 days (high consideration)
  • View-through: 3-7 days
  • Rationale: These apps need deliberation; spontaneous installs are rare

Subscription Apps (Premium):

  • Click-through: 7-14 days (high consideration)
  • View-through: 1-3 days
  • Rationale: Subscription decisions take time; view-through captures awareness only

Finance/Banking Apps:

  • Click-through: 14-28 days (very high consideration)
  • View-through: 1-7 days
  • Rationale: Trust-building requires time; installs far from exposure are less reliable

Testing Different Windows

Rather than guessing, test empirically:

  1. Run duplicate campaigns with identical targeting and creatives
  2. Set different attribution windows on each campaign
  3. Run for 4-6 weeks to gather sufficient data
  4. Compare cost per high-value conversion (e.g., subscription, in-app purchase >$5)
  5. Choose the window that most reliably predicts LTV

The "right" window is the one where attributed installs show the highest LTV, not the one that inflates install volume.
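That comparison can be reduced to one metric per variant: cost per high-value conversion. The figures below are invented to illustrate the calculation; "high-value" here stands in for a subscription or a $5+ purchase as in the text.

```python
# Compare window variants by cost per high-value conversion.
variants = {
    # window_days: (spend, attributed_installs, high_value_conversions)
    3: (10_000, 400, 50),
    7: (10_000, 650, 70),
    14: (10_000, 720, 68),
}

for days, (spend, installs, high_value) in variants.items():
    print(f"{days:>2}d: CPI ${spend / installs:,.2f}, "
          f"cost per high-value conversion ${spend / high_value:,.2f}")

best = min(variants, key=lambda d: variants[d][0] / variants[d][2])
print(f"Best window: {best} days (lowest cost per quality install)")
```

In this synthetic data the 14-day window attributes the most installs but the 7-day window wins on cost per high-value conversion, which is exactly the "don't optimize for inflated install volume" point above.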

Privacy Implications of Attribution Windows

Longer attribution windows increase user privacy risks. Here's why:

Privacy trade-off: Longer windows require more persistent user-level data matching to function, and in a privacy-first environment that matching becomes increasingly impractical.

  • 1-day window: Can work with minimal cross-domain tracking
  • 7-day window: Requires cookies or device IDs to persist for a week
  • 14-28 day windows: Require persistent identification across sites and apps

In post-ATT iOS, longer windows become less reliable because Apple limits cross-domain tracking. This is why modern attribution increasingly relies on probabilistic signals rather than deterministic windows.

Best practice for privacy: Use shorter windows (7 days or less) when possible. Longer windows increasingly require third-party data partnerships that may not scale in privacy-restricted environments.

Practical Implementation Checklist

  1. Audit current windows: Document attribution windows for each platform
  2. Standardize across platforms: Set consistent windows (e.g., 7-day click-through across all platforms)
  3. Test variants: Create A/B tests comparing 3-day vs 7-day windows
  4. Track LTV, not volume: Measure cost per high-value action, not raw installs
  5. Review quarterly: User behavior changes; windows should evolve
  6. Communicate to stakeholders: Explain why windows matter for accurate reporting

FAQ: Attribution Windows

Q: Why do my MMP numbers differ from platform native reporting? A: Different attribution windows. MMPs often use shorter, more conservative windows than platform defaults.

Q: What if my window is too short and I'm undercounting installs? A: Better to undercount than overcount. Underestimating performance leads to conservative optimization; overestimating wastes budget on low-quality traffic.

Q: Should I use the same window for all campaigns? A: No. Different campaign types warrant different windows. Awareness campaigns use longer windows; performance campaigns use shorter windows.

Q: How do SKAdNetwork windows interact with my MMP windows? A: Both apply in parallel. Use MMP windows that align with SKAdNetwork's 2-day and 24-day windows for best data alignment.

Q: What's the optimal window for subscription apps? A: 7-14 days for click-through (high consideration). Installs beyond 14 days rarely convert to subscriptions.

Q: Can I change attribution windows mid-campaign? A: Yes, but it changes how historical data is reported. Make changes deliberately, not constantly.

Q: How do I measure if my window choice is correct? A: Compare cost per high-LTV action (subscription, $5+ purchase) across different windows. The window showing lowest cost per quality install is optimal.

Next Steps: Optimizing Your Attribution Windows

Attribution windows are foundational to accurate mobile measurement. Getting them right requires intentionality and testing.

  1. Standardize windows across platforms (start with 7-day click-through)
  2. Run window comparison tests for your app type
  3. Measure impact on LTV metrics, not just install volume
  4. Adjust based on real performance data

Audiencelab's web-to-app platform works across attribution windows, using probabilistic signal engineering to provide accurate metrics even as deterministic attribution becomes less reliable in the privacy-first era. Understanding attribution windows helps you interpret Audiencelab's signals correctly and optimize campaigns for real impact.

Ready to master attribution and optimize for quality installs? Join Audiencelab to layer probabilistic attribution insights on top of your attribution window strategy and unlock better cost per quality install across all channels.