Status: Proposed 2026-04-24 (Kristerpher — conversational thread during PR #268 review)
Supplement to: ADR 0017 — E2E with shadow-analytics posture
Related PR: #268
Deciders: Kristerpher Henderson
ADR 0017 locked Option B: E2E encryption as base invariant, opt-in client-side shadow-analytics pipeline for consenting users. The 2026-04-24 review of PR #268 confirmed the surrounding decisions (k=20 anonymity, k-anonymity only for v1, analytics store as a separate service rather than a Raptor Blueprint, no DP-noise at v1, DPIA ack).
One question surfaced during that review that 0017 did not resolve: what happens when a user who previously opted out tries to opt back in? The initial instinct was "don't let them re-opt-in — it's simpler." On reflection, Kristerpher flagged that this answer was reached without a clear picture of what the analytics data is actually for. Different goals imply different consent shapes, and the re-opt-in call is downstream of goal selection.
This ADR exists to articulate the data goals, rank them by near-term likelihood, and let the goal ranking settle the consent-UX open questions that 0017 left partially open (re-opt-in, single vs granular toggles, friction level at opt-in, default-copy framing).
Four candidate goals follow. Each is a real reason Raxx might want anonymized aggregate signal flowing into the analytics store, and they differ in product value, consent weight, and timing.
Goal 1: Product improvement

Purpose: Understand which UX flows break, which strategies confuse users, which features go unused, which error paths users hit most.
What the data looks like: Error-type histograms. Page-dwell-time buckets. Strategy-construction-abandoned-at-step counts. Feature-visit frequency.
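To make the bucketed shape concrete, here is a minimal client-side sketch. The bucket edges and the `dwell_bucket` helper are illustrative assumptions, not decided values; the point is that only coarse histogram labels, never raw values, leave the client.

```python
from collections import Counter

# Illustrative bucket edges (seconds); not a committed spec.
DWELL_BUCKET_EDGES = [5, 30, 120, 600]  # last bucket is open-ended

def dwell_bucket(seconds: float) -> str:
    """Map a raw dwell time to a coarse bucket label, so only the
    bucket label enters the shadow-analytics pipeline."""
    lo = 0
    for hi in DWELL_BUCKET_EDGES:
        if seconds < hi:
            return f"{lo}-{hi}s"
        lo = hi
    return f">{lo}s"

# An error-type histogram is built the same way: count per coarse label.
dwell_samples = [3.2, 12.5, 47.0, 900.0]
histogram = Counter(dwell_bucket(s) for s in dwell_samples)
```

The same count-per-label pattern covers the other Goal-1 shapes (error-type histograms, abandoned-at-step counts, feature-visit frequency).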
Consent framing: "Help us fix what's broken and build what's useful." Users generally say yes.
Re-opt-in stakes: Low. If a user opts out and later opts in, the product still improved in the interim thanks to the aggregate signal from other users; re-opt-in simply resumes their own contribution.
Goal 2: Strategy recommendations

Purpose: Power recommendations grounded in what actually works for cohorts. "Users in your investor-profile + regime-mix + risk-tolerance bucket had strong results with cash-secured puts on high-implied-vol single-names" — sourced from anonymized aggregate performance of similar users.
What the data looks like: Strategy-family × regime × win-rate-bucket × hold-period-bucket aggregates. Never individual trades.
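ADR 0017's k=20 anonymity threshold gives these aggregates a concrete publishability rule. A minimal sketch, with illustrative cell keys and a hypothetical `publishable_cells` helper: only cells backed by at least 20 distinct users survive.

```python
from collections import defaultdict

K_MIN = 20  # k-anonymity threshold locked in ADR 0017

def publishable_cells(rows):
    """rows: iterable of (user_id, cell_key) pairs, where cell_key is a
    (strategy_family, regime, win_rate_bucket, hold_period_bucket) tuple.
    Returns {cell_key: distinct_user_count} for cells meeting K_MIN;
    all thinner cells are suppressed, never published."""
    users_per_cell = defaultdict(set)
    for user_id, cell_key in rows:
        users_per_cell[cell_key].add(user_id)
    return {cell: len(users)
            for cell, users in users_per_cell.items()
            if len(users) >= K_MIN}
```

Suppression (rather than noise) matches the "k-anonymity only, no DP-noise at v1" decision confirmed in the PR #268 review.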
Consent framing: "Share your signal to get better recommendations. The more users who opt in, the better your recommendations get." Reciprocal benefit.
Re-opt-in stakes: Medium. A user who opts out sees neighbors (friends, community) getting recommendations while their own feed is generic. Some will want back in. Locking them out creates permanent product friction.
Goal 3: Data product

Purpose: Raxx's shadow-analytics store is itself a valuable asset — anonymized aggregate trading behavior at retail-options scale is useful to market researchers, academics, or institutional buy-side. Raxx could monetize access (subscriptions, research licenses) or publish reports.
What the data looks like: Same aggregate signal as Goal 2, but exposed via a separate B2B product surface (API, published research, licensed dataset).
Consent framing: Heaviest. Under GDPR this is "sharing with third parties for commercial purpose" — different consent category than "product improvement." CCPA "do not sell" language applies in CA. Distinct opt-in required; cannot bundle with Goal 1/2 consent.
Re-opt-in stakes: Extreme. Withdrawal of consent here must be permanent and auditable — Raxx must be able to prove, per-user, which consent state was active at each moment data was used.
Timing: Genuinely interesting (per Kristerpher's PR #268 review comment: "there might be a value opportunity there in the future"), but not a v1 concern.
Goal 4: Social / gamification

Purpose: Powers features like #211 (Founders referral leaderboard), future strategy leaderboards, community-insight widgets. User performance visible in aggregate + opt-in peer comparisons.
What the data looks like: Cohort-aggregate performance per leaderboard slice. Individual-user signal only surfaces if the user specifically opts into being visible.
Consent framing: Similar to social-media profile visibility. "Show your stats (anonymized / pseudonymous) in this leaderboard?" Users expect granular control.
Re-opt-in stakes: Medium-low. Turning leaderboard participation off and back on is expected behavior, like hiding/unhiding a social-media profile.
Timing: Matters when #211 ships and future leaderboards follow.
| Goal | v1 | v1.5 | v2+ | Consent weight |
|---|---|---|---|---|
| 1. Product improvement | ✓ primary | ✓ | ✓ | Light |
| 2. Strategy recommendations | ✓ secondary | ✓ primary | ✓ | Medium |
| 3. Data product | — | — | maybe | Heavy |
| 4. Social / gamification | — | ✓ (leaderboards ship) | ✓ | Medium |
v1 serves Goals 1 + 2. Goal 3 is future-proofed by the shadow pipeline architecture (data is already anonymized + separated), but is NOT a v1 consent ask. Goal 4 waits on leaderboard features shipping.
Consent UX at v1 covers Goals 1 + 2 together — a single coherent ask framed as "help improve the product + improve your recommendations." Goal 3 (data-product) is explicitly out of scope for v1 consent; if that product line is later pursued, users will be asked for a new, distinct consent.
Single bundled toggle for Goals 1 + 2 at v1. Not granular sub-toggles, because:
- The two goals are served by essentially the same aggregate signal at v1
- Multiple toggles mean multiple confusing choices; a single toggle is a clear yes/no decision
- Granular toggles can be introduced at v1.5 if user feedback requests them
Language on the toggle: something close to "Share anonymized usage signals so we can improve the product and your recommendations" + a link to a plain-language explainer of exactly what's shared.
Privacy toggles live in a dedicated, clearly-labeled Privacy panel within Settings (e.g. Settings → Privacy), NOT scattered across the app or buried inside unrelated settings pages. The panel is a durable, first-class destination — linkable, discoverable, and exhaustive (every toggle that affects what data Raxx sees or stores is listed here, not just the shadow-analytics toggle).
Rationale:
- Users looking for "what can I control about my data?" should find the answer in one place on the first try
- Scattered toggles create the perception (and reality) of hidden controls — a trust cost
- Centralization also simplifies engineering: any new consent control lands in one panel, with one consistent pattern
This principle applies not only to the shadow-analytics toggle but to any future privacy-affecting control (Goal 3 consent if ever shipped, leaderboard visibility toggles per Goal 4, telemetry-only-if-crash toggles, etc.).
Precedent references (apps Kristerpher has cited approvingly): Proton's Settings → Privacy, Apple's Settings → Privacy & Security, Firefox's Settings → Privacy & Security.
v1 Privacy panel contents (minimum):
- Shadow-analytics consent toggle (Goals 1 + 2 bundle, per this ADR)
- Link to the privacy-policy page
- Link to the data-export flow (GDPR Art. 20 right to data portability)
- Link to the account-deletion flow (GDPR Art. 17 right to erasure)
- Consent-history viewer (read-only; shows the user their own consent transitions over time)
Allow re-opt-in, with a confirmation screen. This reverses the initial "no re-opt-in" instinct from the #268 review: the goal ranking shows the stakes are low for Goal 1, medium for Goal 2, and medium-low for Goal 4, and blocking re-opt-in would permanently lock users out of better recommendations (Goal 2's friction cost). The heavy, permanently auditable withdrawal semantics belong to Goal 3, which is not a v1 consent ask.
Friction: Re-opt-in requires a second confirmation ("are you sure? you previously withdrew consent on [date]; opting in resumes data contribution from now forward. Past data is gone and will not be restored."). This is consistent with how reputable privacy-respecting products (Proton, Tuta) handle re-engagement.
Non-gaming protection: We track the count + cadence of opt-in/opt-out cycles per user. Rapid cycling (> 3 transitions in < 30 days) triggers a quiet flag and the user sees a different copy on their next opt-in attempt ("you've toggled this several times recently — let us know if the choice isn't clear"). No lockout, just a signal that UX needs work for them.
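The cycling check above is simple enough to sketch. This is a minimal illustration of the "> 3 transitions in < 30 days" rule; the function name and call shape are assumptions, not a committed API.

```python
from datetime import datetime, timedelta

CYCLE_LIMIT = 3                    # transitions allowed before the soft flag
CYCLE_WINDOW = timedelta(days=30)  # trailing window from this ADR

def needs_soft_flag(transition_times: list[datetime], now: datetime) -> bool:
    """True when the user has more than CYCLE_LIMIT consent transitions
    inside the trailing 30-day window. No lockout: callers only swap in
    the alternate opt-in copy, per this ADR."""
    recent = [t for t in transition_times if now - t < CYCLE_WINDOW]
    return len(recent) > CYCLE_LIMIT
```

The transition timestamps would come straight from the consent_history table, so no extra bookkeeping is needed.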
Open questions that re-open if the goal ranking changes: the re-opt-in policy, the single vs granular toggle split, the friction level at opt-in, and the default-copy framing.
Implementation note: a consent_history append-only table tracks every transition per user, sufficient to prove consent state at any moment for GDPR Art. 7 compliance. No separate re-opt-in-block table is needed.