June 23, 2021

The CheapBit of Fitness Tracker Apps

Yan Shvartzshnaider (@ynotez) and Madelyn Sanfilippo (@MrsMRS_PhD)

Fitness trackers are “[devices] that you can wear that [record] your daily physical activity, as well as other information about your health, such as your heart rate” [Oxford Dictionary]. The increasing popularity of wearable devices offered by Apple, Google, and Nike has inadvertently led cheaper versions to flood the market, along with the emergence of alternative non-tech but fashionable brand devices. The cheaper versions ostensibly offer similar functionality for one-tenth of the price, which makes them very appealing to consumers. On Amazon, many of these devices receive overall positive feedback and an average of 4-5 star reviews. Some are even labeled “Amazon’s Choice” or “Best Seller” (e.g., Figure 1), which reinforces their popularity.

In this blog post, we examine privacy issues around these cheaper alternative devices, specifically focusing on the ambiguities around the third-party apps they use. We report our preliminary findings on a few apps that seem to dominate this market segment. Note that fashion brands also rely on third-party apps, such as WearOS by Google, but those apps tend to be more recognizable and subject to greater consumer protection scrutiny, which sets them apart from the apps paired with lesser-known devices.

Figure 1: The LETSCOM fitness tracker uses VeryFitPro, has over 13K reviews, is labeled as Amazon’s Choice, and is marketed to children.

Do consumers in fact pay dearly for the cheaper versions of these devices?

Privacy issues are not unique to cheaper brands. Any “smart device” that can collect, process, and share information about you and your surrounding environment can potentially violate your privacy. Security issues also play an important role. Services like Mozilla’s Privacy Not Included and Consumer Reports help navigate the treacherous landscape. However, even upholding the Minimum Security Standards doesn’t prevent privacy violations due to inappropriate use of information, as the Strava and Polar incidents show.

Given that most of the analysis is typically done by an app paired with a fitness tracker, we decided to examine the “CheapBit” products sold on Amazon with large numbers of reviews and answered questions, to see which apps they pair with. We found that the less-expensive brands are dominated by a few third-party apps, primarily developed by small teams (or individuals), that do not provide any real description of how data are used and shared.

But what do we know about these apps?   

The VeryFitPro app seems to be the choice of many users buying the cheaper fitness tracker alternatives. The app has 5,000,000+ installs, according to Google Play, where the listing provides a developer email and a website consisting of just a QR code to download the app. The app has access to an extensive list of permissions: SMS, Camera, Location, Wifi information, Device ID & Call information, Device & app history, Identity, Phone, Storage, Contacts, and Photo/Media/Files! The brief privacy policy appears to have been translated into English using an automatic translation tool, such as Google Translate.

Surprisingly, what appears to be the same app on the Apple App Store points to a different privacy policy altogether, hosted on a Facebook page! The app provides a different contact email () and the policy is even shorter than the one on the Play Store. In a three-paragraph policy, we are reassured that “some of your fitness information and sports data will be stored in the app, but your daily activities data will never be shared without permission.” This is followed by the traditional “We reserve the right, in our decision to change, modify, add or remove portions of this policy at any time. Please check this page periodically for any changes. Publish any changes to these terms if you continue to use our App future will mean that you have accepted these adjustments. [sic]” No additional information is provided.

While we found VeryFitPro to be common among cheap fitness trackers, especially highly rated ones, it is not unique. Other apps, such as JYouPro, which has access to the same range of permissions, offer a privacy policy that is just two paragraphs long and likewise reassures users that “[they] don’t store personal information on our servers unless required for the on-going operation of one of our services.” The Apple version offers a slightly longer policy. In it, we find that “When you synchronise the Band data, e.g. to JYouPro Cloud Service, we may collect data relating to your activities and functionalities of JYouPro, such as those obtained from our sensors and features on JYouPro, your sleeping patterns, movement data, heart rate data, and smart alarm related information.” Given that JYouPro is used by a large number of devices, its “Cloud Service” seems to be sitting on a very lucrative data set. The policy warns us: “Please note also that for the above, JYouPro may use overseas facilities operated and controlled by JYouPro to process or back up your personal data. Currently, JYouPro has data centres in Beijing and Singapore.”

These are, however, not the worst offenders. The developers behind apps like MorePro and Wearfit didn’t even bother to translate their privacy policies from Chinese!

Users’ privacy concerns

These third-party apps are incredibly popular and pervade the low-end wearable market: VeryFitPro (5,000,000+ installs), JYouPro (500,000+ installs), WearFit (1,000,000+ installs). With little oversight, they are able to collect and process a great deal of potentially sensitive information from a large number of users, thanks to access to contacts, camera, location, and other sensor data. Most of them are developed by small teams or little-known Chinese firms, which dominate the mHealth market.

A small portion of users on Amazon express privacy concerns. For one of the top-selling products, the LETSCOM Fitness Tracker, which uses VeryFitPro, has 4/5 stars, 14,420 ratings, and 1,000+ answered questions, and is marketed towards “Kids Women and Men,” we were able to find only a few questions about privacy. Notably, none of the questions was upvoted, so we suspect they remain unseen by the typical buyer. For example, one user asked “What is the privacy policy for the app? How secure is the personal information? [sic]”, to which another user (not the manufacturer) replied “A: This connects to your phone by bluetooth. That being said, I guess you could connect it only when you are in a secure location but then you wouldn’t have the message or phone notifications.” A similar concern was raised by another user: “What is this company’s policy on data privacy? Will they share or sell the data to third parties?”

Another popular product, the Lintelek Fitness Tracker with Heart Rate Monitor, which also uses VeryFitPro, has 4/5 stars and 4,050 ratings. Out of 1,000+ answered questions, only a couple mentioned privacy. The first user gave the product 1 star with the ominous warning “Be sure to read the privacy agreement before accepting this download”. Interestingly, the second user rated the product 5 stars and gave a very positive review that ends with “Only CON: read the privacy statement if you are going to use the text/call feature. They can use your information. I never turned it on – I always have my phone anyway.”

The fact that buyers of these devices do not investigate the privacy issues is troubling. Previous research has shown that consumers tend to assume that if a company has a privacy policy, it protects their privacy. It seems clear that consumers need help from the platforms: Amazon, Google, and Apple ought to better inform consumers about potential privacy violations. In addition to the consumer protection obligations of these platforms, regulators ought to apply increased scrutiny. While such software is not a conventional medical device, and hence not covered by HIPAA, some medical apps do fall under FDA authority, including apps that pair with wearables. Furthermore, as Figure 1 shows, these devices are marketed to children, so the apps should be subject to enforcement of children’s privacy standards like COPPA.

In conclusion, the lesser-known fitness tracking brands offer a cheaper alternative to high-end market products. However, as previous research has shown, consumers of these devices are potentially paying a high privacy price, and they are left to fend for themselves. In many cases, the cheaper devices belong to firms outside of US jurisdiction, which makes US and European regulations difficult to enforce. Furthermore, global platforms like Amazon, Google, Apple, and others seem to turn a blind eye to privacy issues and help to promote these devices and apps. They offer unhelpful and possibly misleading signals to consumers, such as Amazon’s “Best Seller” and “Amazon’s Choice” labels and the Google Play Store’s download counts and star ratings, which exacerbate an already global and complex issue. Lasting protection of users’ privacy requires proactive action by all parties, grounded in established societal norms and expectations.


We would like to thank Helen Nissenbaum for offering her thoughts on the topic.

Announcing the Princeton-Leuven Longitudinal Corpus of Privacy Policies

We are releasing a reference dataset of over 1 million privacy policy snapshots from more than 100,000 websites, spanning over two decades.

By Ryan Amos, Elena Lucherini, Gunes Acar, Jonathan Mayer, Arvind Narayanan and Mihir Kshirsagar.

Automated analysis of privacy policies has proved useful in several research efforts, leading to results such as interactive, deep-learning-based policy summaries and compliance detection. These studies have highlighted the need for more sophisticated methods and data.

The analyses so far have been limited to a single point in time, or to short spans of time, because researchers did not have access to a large-scale longitudinal dataset that could be used to study how privacy policies have changed over time.

To address this gap, we are releasing a dataset of over 1 million privacy policies collected from the Internet Archive’s Wayback Machine. To build this dataset, we developed a custom crawler that detects and downloads privacy policies from archived web pages. We processed the downloaded policies to clean up error pages, extract the text of the privacy policies, and filter out non-policy documents using machine learning.
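
For readers who want to see how such a collection pipeline might start, the sketch below uses the Wayback Machine’s public CDX API to enumerate and download archived snapshots of a page. It is a minimal illustration under our own simplifying assumptions, not the crawler described above: the example URL is hypothetical, and the privacy policy link detection, error-page cleanup, and machine-learning filtering steps are omitted.

```python
# Minimal sketch: list and fetch archived snapshots of a page from the
# Wayback Machine CDX API. Illustrative only -- not the project's crawler.
import requests

CDX_API = "http://web.archive.org/cdx/search/cdx"

def list_snapshots(url, year_from="2009", year_to="2019"):
    """Return (timestamp, original_url) pairs for archived captures of `url`."""
    params = {
        "url": url,
        "from": year_from,
        "to": year_to,
        "output": "json",
        "filter": "statuscode:200",   # keep only successful captures
        "collapse": "digest",         # skip captures whose content did not change
    }
    rows = requests.get(CDX_API, params=params, timeout=30).json()
    if not rows:
        return []
    header, captures = rows[0], rows[1:]
    ts, orig = header.index("timestamp"), header.index("original")
    return [(row[ts], row[orig]) for row in captures]

def fetch_snapshot(timestamp, original_url):
    """Download the raw archived page (the `id_` flag skips the Wayback toolbar)."""
    archived_url = f"https://web.archive.org/web/{timestamp}id_/{original_url}"
    return requests.get(archived_url, timeout=30).text

if __name__ == "__main__":
    # Hypothetical example page; a real pipeline would first locate each site's
    # privacy policy URL and then clean and classify the downloaded text.
    for timestamp, original in list_snapshots("example.com/privacy")[:3]:
        html = fetch_snapshot(timestamp, original)
        print(timestamp, original, len(html))
```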

Data Overview

This dataset contains 1 million English-language privacy policy snapshots from over 100,000 distinct websites chosen from the Alexa Top 100K from 2009-2019. In addition to sanitized privacy policy text and raw webpage HTML, the dataset includes metadata such as the archival time and the website URL that the policy belongs to. Although the dataset contains policies from as early as the late 1990s, more than 90% of the policies are from 2007 or later.

Obtaining access

Please send an email to stating your name and affiliation.

Since we are finalizing the data schema, format, and metadata, we would like to hear your specific requirements, if you have any.

Improving Protections for Children’s Privacy Online

CITP’s Tech Policy Clinic submitted a Comment to the Federal Trade Commission in connection with its review of the COPPA Rule to protect children’s privacy online. Our Comment explains why it is important to update the COPPA Rule to keep it current with new privacy risks, especially as children spend increasing amounts of time online on a variety of connected devices.

What is the Children’s Online Privacy Protection Act (COPPA)?

As background, Congress in 1998 gave the FTC authority to issue rules that govern how online commercial service providers should collect, use, or disclose information about children under the age of 13. The FTC issued the first version of the Rule in 2000, which requires providers to place parents in control over what information is collected from their young children online. The Rule applies both to providers of services directed to children under 13 and to those serving a general audience who have actual knowledge that they are collecting, using, or disclosing personal information from children under 13. The Rule was subsequently revised, after a period of public comment, in 2013 to account for technological developments, including the pervasive use of mobile apps. In 2019, the FTC announced it was revisiting the Rule in light of ongoing questions about its efficacy in a data-fueled online marketplace and solicited public comment on potential improvements.

Core Recommendations to Update the COPPA Rule

Our Comment makes three main points:

  • We encourage the FTC to develop rules that promote external scrutiny of provider practices by making each provider’s choices about how it complies with the Rule available in a transparent and machine-readable format.
  • We recommend that the FTC allow providers to rely on the exception that permits collecting or tracking information for “internal operations” without parental consent only under extremely limited circumstances; otherwise the exception risks swallowing the rule.
  • We offer some suggestions on how education technology providers should be responsive to parents and recommend that the FTC conduct further studies about how such technology is being used in practice. 

We elaborate on each point below.

Enabling Effective External Compliance Checks Through Transparency

One of the central challenges with the COPPA Rule today is that it is very difficult for external observers (parents, researchers, journalists, or advocacy groups) to understand how an online provider has decided to comply with the Rule. For example, it is not clear whether a site believes it is in compliance with the Rule because it argues that none of its content is directed at children, or because it has implemented mechanisms that seek appropriate consent before gathering information about users. Making a provider’s compliance choices transparent will enable meaningful external scrutiny of practices and hold providers to account.

Under the COPPA Rule, providers are responsible for determining whether or not a service is child directed by looking at a variety of factors. If the service is directed at children, then the provider must ensure they have verified parental consent before collecting information about users. If the audience is of mixed age, then the provider must ensure that it does not collect information about users under the age of 13 without parental consent.

The determination about whether a service is child directed, as the FTC explains, includes factors such as “its subject matter, visual content, use of animated characters or child-oriented activities and incentives, music or other audio content, age of models, presence of child celebrities or celebrities who appeal to children, language or other characteristics of the Web site or online service, as well as whether advertising promoting or appearing on the Web site or online service is directed to children . . . [and] competent and reliable empirical evidence regarding audience composition, and evidence regarding the intended audience.” If the service is child directed and children under the age of 13 are the primary audience, then it is “primarily child directed.” Services that are child directed but do not target children as the primary audience are “child directed, but mixed audience” services under the COPPA Rule.

If a mixed audience service seeks to collect information about users, it can choose to implement an age gate to ensure it does not collect data about underage users. An age gate is a mechanism that asks users to provide their age or date of birth in an age-neutral way.
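
As a rough illustration of what “age-neutral” means in practice (our own sketch, not language from the Rule or from our Comment), the prompt asks every user for a full date of birth without hinting at any cutoff, and only afterwards does the service compute whether the user is under 13:

```python
# Minimal sketch of the check behind an age-neutral age gate; illustrative only.
from datetime import date

def is_under_13(birth, today=None):
    """Compute age in whole years from a date of birth and compare it to 13.
    The user-facing prompt would not reveal that 13 is the relevant cutoff."""
    today = today or date.today()
    age = today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))
    return age < 13

# Example: a user born on 2010-01-15 is under 13 on 2021-06-23.
print(is_under_13(date(2010, 1, 15), today=date(2021, 6, 23)))  # True
```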

Our principal recommendation is that the COPPA Rule should be revised to explicitly facilitate external scrutiny by requiring providers to make their design choices more open to external review. Specifically, we suggest that the FTC should make sites or services disclose, in a machine-readable format, whether they consider themselves, in whole or part, “directed to children” under COPPA. This allows academic researchers (or parents) to examine what the provider is actually doing to protect children’s privacy. 
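
To make the idea concrete, a machine-readable disclosure might look something like the snippet below. This is purely a hypothetical illustration of ours: the field names, values, and the well-known path are assumptions, not part of the COPPA Rule or of our Comment.

```python
# Hypothetical machine-readable COPPA disclosure. The schema and the
# /.well-known/ path are illustrative assumptions, not an existing standard.
import json

coppa_disclosure = {
    "service": "https://example.com",
    "audience": "child_directed_mixed",   # e.g. "not_child_directed", "primarily_child_directed"
    "uses_age_gate": True,
    "age_gate_description_url": "https://example.com/coppa/age-gate",
    "parental_consent_method": "credit_card_verification",
    "internal_operations_identifiers": ["session_cookie"],
    "last_updated": "2021-06-23",
}

# A site could publish this at, say, https://example.com/.well-known/coppa.json,
# letting researchers, parents, and regulators audit compliance choices at scale.
print(json.dumps(coppa_disclosure, indent=2))
```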

We also recommend that the FTC establish a requirement that, if a website or online service is using an age gate as part of its determination that it is not child directed, it must publicly post a description of the operation of the age gate and what steps it took to validate that children under 13 cannot circumvent the age gate. 

In addition, drawing on our work on online dark patterns, we suggest that the FTC examine the verifiable parental consent mechanisms used by providers to ensure that parents are being given the opportunity to make fully informed and free choices about their child’s privacy. 

Finally, we suggest some ways that platforms such as iOS or Android can be enlisted by the FTC to play a more effective role in screening users and verifying ages.

Restrict Providers from Relying on the “Internal Operations” Exception

Another significant issue with current practices is that providers rely on an exception to the requirement to provide parental notice and obtain consent before collecting personal information when they use persistent identifiers for “internal operations.” The 2013 revisions to the Rule introduced this exception, but limited it to a narrow set of circumstances necessary to deliver the service. It appears many providers now use the exception for a wide variety of purposes that go well beyond what is strictly necessary to deliver the service. In particular, users have no external way to verify whether certain persistent identifiers, such as cookies, are being used for impermissible purposes. Therefore, our Comment urges the FTC to require providers to be transparent about how they rely on the “internal operations” exception when using certain persistent identifiers and to limit the circumstances in which providers are allowed to use such an exception.

Give Parents Control Over Information Collected by Educational Technology Service Providers

Finally, our Comment addresses the FTC’s query about whether a specific exception for parental consent is warranted for the growing market of providers of educational technology services to children (and their parents) in the classroom and at home. We recommend that the FTC should study the use of educational technology in the field before considering a specific exception to parental consent. In particular, we explain that any rule should cover the following issues: First, parents should be told, in an accessible manner, what data educational technology providers collect about their children, how that data is used, who has access to the data, and how long it is retained. Parents should also have the right to request that data about their children are deleted. Second, school administrators should be given guidance on how to make informed decisions about selecting educational technology providers, develop policies that preserve student privacy, and train educators to implement those policies. Third, the rule should clarify how school administrators and educational technology providers are accountable to parents for how data about their children are collected, used and maintained. Fourth, the FTC needs to clearly define what is meant by “educational purposes” in the classroom in considering any exceptions for parental consent.

* The Comment was principally drafted by Jonathan Mayer and Mihir Kshirsagar, along with Marshini Chetty, Edward W. Felten, Arunesh Mathur, Arvind Narayanan, Victor Ongkowijaya, Matthew J. Salganik, Madelyn Sanfilippo, and Ari Ezra Waldman.