This article explores why companies push people into apps and what that means for privacy.
The App Funnel Is Not an Accident
You see it everywhere. A website works, but the company still tells you to “use the app.” Banks push app-only alerts. Retailers hide deals behind app installs. Services make mobile checkout, account recovery, loyalty rewards, or support feel easier inside the app than on the web.
That pattern is not random. It reflects a simple business reality: apps give companies a more controlled environment for sign-ins, notifications, background updates, measurement, and repeat engagement. Regulators and platform documentation also make clear that apps can touch more sensitive data and involve more behind-the-scenes actors than the web.
> “The mobile environment poses greater risks … than the web.” — CNIL
Apps Give Companies More Control Than the Mobile Web
Native apps are not just miniature websites. They can be wired into the operating system in ways the browser usually is not. Apple’s developer documentation explicitly describes support for background notifications and background updates, while Android’s documentation explains how apps can request background location and notes that this access can also come through embedded SDKs or libraries.
That does not mean the web is harmless. Browsers can also access location, camera, microphone, notifications, and on-device storage when users grant permission, and UK ICO guidance makes clear that tracking and storage rules apply in browsers, mobile apps, and connected devices alike. The difference is not “web safe, app dangerous.” The real difference is that apps often combine permissions, persistent app sessions, operating-system hooks, and third-party SDKs into a denser, harder-to-see surveillance surface.
The Privacy Problem Starts With Permissions — and Gets Worse With SDKs
When people think about app privacy, they usually think about the permission pop-up. That is only the front door. The bigger issue is what happens after access is granted and how many parties sit behind the app itself.
CNIL’s 2025 guidance says mobile apps can access more varied and more sensitive data, including real-time location, photos, and health data, and notes that many stakeholders can be involved in a single app. Android’s developer guidance is even more direct: if an SDK or library accesses location, that access is attributed to the app. In plain English, an app can look like one service while quietly acting as a bundle of publisher code, analytics tools, ad-tech software, and third-party infrastructure.
That matters because privacy harm is rarely just one company “taking data.” It is often a chain: permission, collection, enrichment, sharing, measurement, targeting, retention, and disclosure. The longer that chain gets, the harder it becomes for users to know who has what, why they have it, and where it goes next.
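The attribution point above is easy to see in a toy model. The sketch below is illustrative only (the `App` and `Sdk` classes and all names are invented, not any real platform API): every data access made by embedded SDK code is logged under the host app’s identity, so the user-visible answer to “who accessed my location?” collapses several parties into one.

```python
from dataclasses import dataclass, field

@dataclass
class Sdk:
    vendor: str    # e.g. an analytics or ad-tech provider
    wants: list    # data types this SDK reads

@dataclass
class App:
    name: str
    sdks: list = field(default_factory=list)
    access_log: list = field(default_factory=list)

    def run(self):
        # First-party access is attributed to the app...
        self.access_log.append((self.name, "location"))
        # ...and, per the attribution rule, so is every access
        # made by embedded SDK code.
        for sdk in self.sdks:
            for data in sdk.wants:
                self.access_log.append((self.name, data))

app = App("WeatherNow", sdks=[
    Sdk("analytics-vendor", ["location", "device_id"]),
    Sdk("ads-vendor", ["location", "app_activity"]),
])
app.run()

# The OS-level view shows a single accessor...
accessors = {who for who, _ in app.access_log}
print(accessors)            # {'WeatherNow'}

# ...even though three distinct parties actually receive data.
parties = 1 + len(app.sdks)
print(parties)              # 3
```

The gap between `accessors` (one name) and `parties` (three organizations) is the chain-length problem in miniature: the user sees the front of the chain, not its depth.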
The Business Incentive Is Obvious
The strongest evidence is not just technical. It is economic. A major study of Apple’s privacy labels found that the average app in the sample collected 24 data items across 16 data types, and that 80% of those collected data items were used for purposes unrelated to app functionality. After Apple made those practices easier to see, the average iOS app experienced a 14% drop in weekly downloads and a 15% drop in revenue relative to its Android counterpart.
That is the part many companies would rather not say out loud. Data collection is not just a side effect of the app economy. For many apps, it is part of the product logic. The more an app can personalize, measure, retarget, and re-engage, the more commercial value it can extract from the device in your pocket.
This Is a Global Privacy Problem, Not a Local One
The pattern shows up across markets, legal systems, and business models.
- France / EU: CNIL says mobile apps pose greater confidentiality and security risks than the web and launched a 2025 enforcement push around app privacy, permissions, and consent.
- United States: In January 2025, the FTC finalized an order against Gravy Analytics and Venntel, prohibiting them from selling, disclosing, or using sensitive location data except in limited circumstances tied to national security or law enforcement.
- India: Research discussed by Digital Impact Alliance, drawing on Arrka’s State of Privacy 2022 study, reported that 42% of Indian apps declared collecting exact location data while 76% were actually collecting it.
- China / WeChat ecosystem: Citizen Lab found that all WeChat Mini Programs in its study were enrolled in usage tracking, that the most fine-grained activity tracking occurred during Mini Program execution, and that permission boundaries between Mini Programs and the host platform were unclear.
Different countries have different laws, different platforms, and different levels of enforcement. The common thread is the same: the mobile ecosystem keeps creating more opportunities to collect, combine, and operationalize user data than most people can realistically track.
Push Notifications Are Useful — and They Also Create a Trail
Push notifications are sold as convenience. Sometimes they are. They can keep apps updated in the background, surface account alerts, and pull users back into the service at the exact moment a company wants attention. Apple’s own developer material promotes notifications and background updates for timely, relevant content.
But push infrastructure also creates records. Apple says that when a user allows an app to receive notifications, an APNs token is generated and registered to that developer and device, and that push-token requests from governments generally seek identifying details tied to the Apple account associated with the token. Apple’s U.S. law-enforcement guidelines say the Apple ID associated with a registered APNs token and associated records may be obtained with a 2703(d) order or a search warrant. Apple’s transparency page also shows worldwide push-token requests rising from 70 in 2022 H2 to 393 in 2025 H1.
That does not mean every notification is a surveillance event. It does mean the system creates another layer of metadata and another point where identity, app use, and legal process can intersect. Senator Ron Wyden’s 2023 letter made that concern public by pressing for transparency about government demands for push-notification records.
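The token lifecycle described above can be sketched as a minimal registry model. This is a conceptual illustration only (`PushRegistry` and its field names are invented, not Apple’s APNs implementation): allowing notifications mints a token bound to a developer and a device, and a later lookup on that token can resolve the account registered behind it.

```python
import secrets

class PushRegistry:
    """Toy model of a push-notification token registry."""

    def __init__(self):
        self._records = {}

    def allow_notifications(self, developer, device_id, account_id):
        # Granting the permission mints a token tied to this
        # developer/device pair and the account behind it.
        token = secrets.token_hex(16)
        self._records[token] = {
            "developer": developer,
            "device": device_id,
            "account": account_id,
        }
        return token

    def legal_lookup(self, token):
        # A valid legal request citing a token resolves the
        # identifying record registered with it (or None).
        return self._records.get(token)

registry = PushRegistry()
token = registry.allow_notifications(
    "news-app-dev", "device-123", "user@example.com"
)
record = registry.legal_lookup(token)
print(record["account"])    # user@example.com
```

The point of the sketch is the linkage, not the cryptography: once a token exists, it is a durable join key between an identity, a device, and an app, which is exactly what makes it useful both for delivering notifications and for answering legal process.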
What Companies Gain — and What Users Risk
In practice, the tradeoff often looks like this:
| Company goal | App-side advantage | Privacy cost to the user |
|---|---|---|
| Higher retention | Push notifications, background updates, persistent sessions | More behavioral signals and more opportunities for re-engagement |
| Better measurement | Embedded analytics SDKs and device-linked events | More third parties in the data chain and less visibility into sharing |
| More personalization | Access to location, camera, contacts, and app activity | A richer profile of habits, movement, and relationships |
| Stronger ad targeting | Linked identifiers and detailed event data | Easier profiling, segmentation, and downstream monetization |
| Tighter platform control | App-only flows for loyalty, checkout, support, or identity | Fewer privacy-friendly alternatives and less user choice |
This is not a claim that every app uses every invasive tool. It is a map of the incentives the mobile ecosystem keeps rewarding. The more the product depends on growth, attribution, and re-engagement, the more pressure there is to ask for access, add SDKs, and keep the user inside the app funnel.
The Red Flags Are Usually Easy to Spot
The warning signs are not subtle.
- The website works, but the company still hides key features behind the app.
- The app asks for notifications, contacts, or location before it has earned trust.
- The service says consent is optional, but the experience is degraded unless you agree.
- The privacy notice is broad, vague, or written to describe almost any future data use.
- The app’s “convenience” features happen to line up neatly with more tracking, more identifiers, and more background access.
When those signs stack up, the privacy issue is usually not accidental. It is structural.
Convenience Is Real. So Is the Tradeoff.
None of this means apps are inherently evil. Some app features are genuinely useful. Some are safer than browser flows. Some need device-level access to work at all. The problem starts when companies use that truth as cover for excessive permissions, unnecessary app-only design, opaque SDK chains, and data practices that outrun what the service actually needs.
The blunt version is this: companies keep funneling people into mobile apps because apps are better for business control. That control can improve convenience, but it can also expand tracking, increase dependency, and make data collection harder to see and harder to refuse. If a service keeps pushing you into the app, the question is not just whether the app is faster. The real question is what the company gains once you stop using the open web and start living inside its software.