December 5, 2020

The CheapBit of Fitness Trackers Apps

Yan Shvartzshnaider (@ynotez) and Madelyn Sanfilippo (@MrsMRS_PhD)

Fitness trackers are “[devices] that you can wear that [record] your daily physical activity, as well as other information about your health, such as your heart rate” [Oxford Dictionary]. The increasing popularity of wearable devices offered by Apple, Google, and Nike has inadvertently led cheaper versions to flood the market, along with devices from alternative non-tech but fashionable brands. The cheaper versions ostensibly offer similar functionality for one-tenth of the price, which makes them very appealing to consumers. On Amazon, many of these devices receive overall positive feedback and average 4-5 star reviews. Some are even labeled “Amazon’s Choice” or “Best Seller” (e.g., Figure 1), which reinforces their popularity.

In this blog post, we examine privacy issues around these cheaper alternative devices, focusing specifically on the ambiguities around the third-party apps they use. We report preliminary results on a few apps that seem to dominate this market space. Note that fashion brands also employ third-party apps such as WearOS by Google, but those apps tend to be more recognizable and subject to greater consumer-protection scrutiny, which sets these brands apart from lesser-known devices.

Figure 1: The LETSCOM fitness tracker uses VeryFitPro, has over 13K reviews, is labeled as Amazon’s Choice, and is marketed to children.

Do consumers in fact pay dearly for the cheaper versions of these devices?

Privacy issues are not unique to cheaper brands. Any “smart device” that can collect, process, and share information about you and your surrounding environment can potentially violate your privacy. Security issues also play an important role. Services like Mozilla’s Privacy Not Included and Consumer Reports help consumers navigate this treacherous landscape. However, even upholding the Minimum Security Standards doesn’t prevent privacy violations stemming from inappropriate use of information, as the Strava and Polar incidents show.

Given that most of the analysis is typically done by an app paired with the fitness tracker, we decided to examine the “CheapBit” products sold on Amazon with large numbers of reviews and answered questions, to see which apps they pair with. We found that the less-expensive brands are dominated by a few third-party apps, primarily developed by small teams (or individuals), that do not provide any real description of how data are used and shared.

But what do we know about these apps?   

The VeryFitPro app seems to be the choice of many users buying the cheaper fitness tracker alternatives. The app has 5,000,000+ installs according to Google Play, where the listing provides a developer email and a website containing just a QR code to download the app. The app has access to an extensive list of permissions: SMS, Camera, Location, Wifi information, Device ID & Call information, Device & app history, Identity, Phone, Storage, Contacts, and Photo/Media/Files! The brief privacy policy appears to have been translated into English using an automatic translation tool, such as Google Translate.

Surprisingly, what appears to be the same app on the Apple App Store points to a different privacy policy altogether, hosted on a Facebook page! The app provides a different contact email () and the policy is even shorter than the one on the Play Store. In a three-paragraph policy, we are reassured that “some of your fitness information and sports data will be stored in the app, but your daily activities data will never be shared without permission.” and offered the traditional “We reserve the right, in our decision to change, modify, add or remove portions of this policy at any time. Please check this page periodically for any changes. Publish any changes to these terms if you continue to use our App future will mean that you have accepted these adjustments. [sic]” No additional information is provided.

While we found VeryFitPro to be common among cheap fitness trackers, especially highly rated ones, it is not unique. Other apps, such as JYouPro, have access to the same range of permissions and offer a privacy policy that is just two paragraphs long, which likewise reassures users that “[they] don’t store personal information on our servers unless required for the on-going operation of one of our services.” The Apple version offers a slightly longer version of the policy. In it, we find that “When you synchronise the Band data, e.g. to JYouPro Cloud Service, we may collect data relating to your activities and functionalities of JYouPro, such as those obtained from our sensors and features on JYouPro, your sleeping patterns, movement data, heart rate data, and smart alarm related information.” Given that JYouPro is used by a large number of devices, their “Cloud Service” seems to be sitting on a very lucrative data set. The policy warns us: “Please note also that for the above, JYouPro may use overseas facilities operated and controlled by JYouPro to process or back up your personal data. Currently, JYouPro has data centres in Beijing and Singapore.”

These are however not the worst offenders. Developers behind apps like MorePro and Wearfit didn’t even bother to translate their privacy policies from Chinese!

Users’ privacy concerns

These third-party apps are incredibly popular and pervade the low-end wearable market: VeryFitPro (5,000,000+ installs), JYouPro (500,000+ installs), WearFit (1,000,000+ installs). With little oversight, they are able to collect and process a lot of potentially sensitive information from a large number of users, thanks to their access to contacts, camera, location, and other sensor data. Most of them are developed by small teams or unknown Chinese firms, which dominate the mHealth market.

A small portion of users on Amazon express privacy concerns. For one of the top-selling products, the LETSCOM Fitness Tracker, which uses VeryFitPro, has 4/5 stars, 14,420 ratings, and 1,000+ answered questions, and is marketed towards “Kids Women and Men”, we were able to find only a few questions about privacy. Notably, none of the questions was upvoted, so we suspect they remain unseen by the typical buyer. For example, one user asked “What is the privacy policy for the app? How secure is the personal information? [sic]”, to which another user (not the manufacturer) replied “A: This connects to your phone by bluetooth. That being said, I guess you could connect it only when you are in a secure location but then you wouldn’t have the message or phone notifications.” A similar concern was raised by another user: “What is this company’s policy on data privacy? Will they share or sell the data to third parties?”

Another popular product, the Lintelek Fitness Tracker with Heart Rate Monitor, also uses VeryFitPro and has 4/5 stars and 4,050 ratings. Out of 1,000+ answered questions, only a couple mentioned privacy. The first user gave the product 1 star with an ominous warning: “Be sure to read the privacy agreement before accepting this download”. Interestingly, the second user rated the product with 5 stars and gave a very positive review that ends with “Only CON: read the privacy statement if you are going to use the text/call feature. They can use your information. I never turned it on – I always have my phone anyway.”

The fact that buyers of these devices do not investigate the privacy issues is troubling. Previous research shows that consumers tend to assume that if a company has a privacy policy, it protects their privacy. It seems clear that consumers need help from the platforms: Amazon, Google, and Apple ought to better inform consumers about potential privacy violations. In addition to the consumer-protection obligations of these platforms, regulators ought to apply increased scrutiny. While such software is not a conventional medical device, and hence not covered by HIPAA, some medical apps do fall under FDA authority, including apps that pair with wearables. Furthermore, as Figure 1 shows, these devices are marketed to children, so the apps should be subject to enforcement of children’s privacy standards like COPPA.

In conclusion, the lesser-known fitness tracking brands offer a cheaper alternative to high-end market products. However, as previous research showed, consumers of these devices are potentially paying a high privacy price, and they are left to fend for themselves. In many cases, the cheaper devices belong to firms outside of US jurisdiction, and thus US and European regulations are difficult to enforce. Furthermore, global platforms like Amazon, Google, Apple, and others seem to turn a blind eye to privacy issues and help promote these devices and apps. They offer consumers unhelpful and possibly misleading labels, such as Amazon’s “Best Seller” and “Amazon’s Choice” badges and the Play Store’s download counts and star ratings, which exacerbate an already global and complex issue. Lasting protection of users’ privacy requires proactive action from all parties, one that incorporates established societal norms and expectations.


We would like to thank Helen Nissenbaum for offering her thoughts on the topic.

Every move you make, I’ll be watching you: Privacy implications of the Apple U1 chip and ultra-wideband

By Colleen Josephson and Yan Shvartzshnaider

The concerning trend of tracking users’ locations through their mobile phones has very serious privacy implications. For many of us, phones have become an integral part of our daily routine: we don’t leave our homes without them and take them everywhere we go. It has become alarmingly easy for services and apps to collect our location and send it to third parties without our awareness. Location tracking generally works poorly indoors: using current technologies like GPS, WiFi, and cellular triangulation, tracking services can infer your general location down to a building, but your movements inside can’t be precisely tracked. This level of obfuscation is about to disappear as a new radio technology called ultra-wideband communications (UWB) becomes mainstream.

In its recent iPhone launch, Apple introduced the U1 ultra-wideband chip in the iPhone 11. Ultra-wideband communications use channels that have a bandwidth of 500 MHz or more, with transmissions at low power. In this blog post, we give a brief introduction to the technology behind the chip and how it operates, and discuss some of its promises as well as its implications for our day-to-day activities.

Figure 1: UWB consumes a wide bandwidth, at 500+ MHz. In comparison, a broadband WiFi channel is 20 MHz.

Why would users want ultra-wideband? On the iPhone 11 Pro product page, Apple says, “The new Apple‑designed U1 chip uses Ultra Wideband technology for spatial awareness — allowing iPhone 11 Pro to understand its precise location relative to other nearby U1‑equipped Apple devices. It’s like adding another sense to iPhone, and it’s going to lead to amazing new capabilities”. For now, the features available to the U1 chip are restricted to “[pointing] your iPhone toward someone else’s, and AirDrop will prioritize that device so you can share files faster”. 

However, as the number of devices equipped with a UWB chip grows, it will enable a broad spectrum of applications. UWB is not a new technology, but we are seeing renewed interest due to vastly improved operational distance. Over the years, researchers have developed a variety of UWB applications such as estimating room occupancy, landslide detection, and human body position/motion tracking. Perhaps the leading use case for UWB technology has been precise indoor localization, with accuracies between 0.5 and 10 cm. Indoor localization is the process of finding the coordinates of a target (e.g., a phone) relative to one or more fixed-point anchors that also contain UWB radios. The relative coordinates are then mapped to a reference (e.g., blueprints) to provide an absolute location. High-accuracy localization is especially useful in contexts where traditional GPS is not accurate enough, or cannot reach. A number of other technologies have been explored for indoor localization, such as WiFi and Bluetooth, but the accuracy of these techniques is on the order of meters1, not centimeters.

The key to enabling centimeter-level localization is the wide bandwidth of UWB. Transmissions that occupy a broad bandwidth are short in duration and known as pulses or impulses. These short-duration impulses allow accurate measurement of time of flight (ToF): the time it takes for a signal to propagate from point A to point B. Radio frequency (RF) waves travelling through air have a velocity that is very close to the speed of light. This means that if we can accurately measure time of flight, then we know the distance between A and B. Similar to how bats use echolocation to sense their environment, UWB pulses can be used to sense distances between two transmitters. The shorter the duration of the impulse, the more precise the distance measurement will be. There are a few different ways to use this information for localization/positioning, but the most common for navigation is time difference of arrival (TDoA). This system relies on having three or more anchors that are also equipped with UWB chips and have synchronized clocks. To calculate the position of the phone, the anchors forward their timing measurements to a central service that knows the absolute locations of the anchors (e.g., mapped onto blueprints) and calculates where the phone is located relative to them.
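
To make the geometry concrete, here is a minimal, illustrative Python sketch of the TDoA idea described above. It assumes noise-free timestamps, perfectly synchronized anchors, and made-up anchor and phone coordinates; real systems must cope with clock error, multipath, and noise, and use far more efficient solvers than a grid search.

```python
import numpy as np

C = 299_792_458.0  # RF propagation speed in air is very close to the speed of light (m/s)

def simulate_arrival_times(tag_xy, anchors_xy):
    """Timestamps at which each anchor hears a single UWB pulse emitted by the tag at t = 0."""
    return np.linalg.norm(anchors_xy - tag_xy, axis=1) / C

def locate_by_tdoa(arrival_times, anchors_xy, extent=20.0, step=0.1):
    """Brute-force grid search: pick the point whose predicted arrival-time
    differences (relative to anchor 0) best match the measured ones."""
    measured_dt = arrival_times - arrival_times[0]
    best_xy, best_err = None, np.inf
    for x in np.arange(0.0, extent, step):
        for y in np.arange(0.0, extent, step):
            dists = np.linalg.norm(anchors_xy - np.array([x, y]), axis=1)
            predicted_dt = (dists - dists[0]) / C
            err = np.sum((predicted_dt - measured_dt) ** 2)
            if err < best_err:
                best_xy, best_err = np.array([x, y]), err
    return best_xy

# Hypothetical setup: three ceiling-mounted anchors (e.g., UWB-capable access points)
anchors = np.array([[0.0, 0.0], [18.0, 0.0], [9.0, 15.0]])
true_phone = np.array([6.3, 4.7])                 # the phone's actual (unknown) position
timestamps = simulate_arrival_times(true_phone, anchors)
print(locate_by_tdoa(timestamps, anchors))        # recovers approximately [6.3, 4.7]
```

Note that the phone only has to transmit a single pulse for this to work; all the measurement and computation happens on the anchor side.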

Figure 2: Time Difference of Arrival (TDoA) UWB localization system 

For now, indoor localization is not common, since most buildings do not have a UWB anchor infrastructure. However, in October 2019 it was announced that Cisco is teaming up with the Czech company Sewio to integrate UWB chips into wireless access points. This is a major step towards enabling ubiquitous indoor localization, as it will make it much more likely that any building with WiFi can also support indoor localization. The new Cisco access points will support IEEE 802.15.4z, an ultra-wideband communications standard that was designed by the UWB Alliance, an organization that receives input from members like Apple, Decawave, Samsung and Huawei. Apple’s U1 chip adheres to the same standard, so the U1 and the Cisco access points will be able to communicate. If an Apple U1 chip responds to ranging exchanges initiated by the Cisco access points, then it is a simple matter for the network’s owner to run a location calculation service and obtain the U1 chip’s position.

What makes the current generation of UWB chips stand out is that, for the first time, they will be deployed in mobile phones, which for a lot of people are an inseparable part of their daily routine. While Apple promotes it as just another sensor to “Share. Find. Play. More precisely than ever,” this technology has the power to disrupt existing societal norms. Suddenly, businesses will be able to track an individual’s location within their stores down to the centimeter, which gives them the power to track which products you look at in real time. Similar to the much-debated facial recognition technology, UWB localization offers a new capability to capture and ultimately profile the identities of users. Essentially, the new chip is a marketer’s dream in a box. Shops already track your purchases, leading to cases like the infamous 2012 incident in which Target unintentionally divulged a teen’s pregnancy to her father. When a store has UWB-enabled access points, it will be easy to monitor a phone’s location indoors and track what you considered purchasing in addition to what you actually purchase. Even without UWB, Cisco already has a feature that lets stores track your presence via phone WiFi, “to engage users and optimize marketing strategies”.

This WiFi tracking is possible even if your device is not associated with the network, because devices with the WiFi chip enabled periodically send out probe packets to discover which networks are available. A similar technique could be used with UWB to enable even more precise tracking throughout the store. This means that your location information could be used even if location permissions are closely monitored for apps on the phone. The Cisco/Sewio announcement mentions “location-based marketing in retail” as a potential use case right off the bat. In a mall-wide network setup, the routers could retain information that enables inferring your movements in other stores as well, essentially offering a physical-world analogue to web tracking. Companies like Five Tier, JCDecaux, and others use existing location-tracking technologies to display ads to users in the vicinity on nearby screens, even billboards. Current WiFi-based phone tracking lets retailers monitor which store you are in, but with UWB, companies will be able to monitor which products you are looking at. This information could be used to push targeted ads that follow you both physically and online. Imagine going to browse for jewelry, then seeing billboards for diamonds follow you as you drive home, and having that continue on your web browser and smart TV once you get home.
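
To illustrate how passive this kind of presence tracking already is, the sketch below uses the scapy library to log the transmitter address and timestamp of 802.11 probe requests overheard in the air, the same two ingredients an anchor network needs for tracking (see footnote 2). The interface name is a placeholder, the card must be in monitor mode, and this is only a minimal illustration of the principle, not any vendor’s actual system.

```python
from collections import defaultdict
from datetime import datetime

from scapy.all import sniff
from scapy.layers.dot11 import Dot11ProbeReq

# sightings: device MAC address -> timestamps at which a probe request was overheard.
# A retail analytics deployment would combine such timestamps from several anchors
# to estimate the device's position, without the device ever joining a network.
sightings = defaultdict(list)

def log_probe(packet):
    if packet.haslayer(Dot11ProbeReq):
        mac = packet.addr2                      # transmitter address, sent in the clear
        sightings[mac].append(datetime.now())
        print(f"{datetime.now().isoformat()}  probe request from {mac}")

# "wlan0mon" is a hypothetical WiFi interface already placed in monitor mode.
sniff(iface="wlan0mon", prn=log_probe, store=False)
```

The sketch only demonstrates presence detection; UWB ranging exchanges would layer centimeter-level position on top of this kind of passive observation.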

Historically, companies have opted to chase the marketing dream instead of respecting users’ privacy. Companies like Google and Facebook argue that they provide users with adequate privacy controls, but privacy researchers disagree. Furthermore, privacy choices are often eroded either by bugs or by misleading requests. One recent incident report by Brian Krebs details how Apple continues to collect location information despite location-based system services being disabled. According to Krebs, Apple’s response stated, “this behavior is tied to the inclusion of a new short-range technology that lets iPhone 11 users share files locally with other nearby phones that support this feature, and that a future version of its mobile operating system will allow users to disable it”. And even if location services are reduced or disabled, some apps constantly try to get users to turn these services back on. As Figure 3 shows, some messages are deceptive, causing users to believe that the app won’t work without re-enabling high-accuracy (WiFi-assisted) location. And even if the apps using location data are trustworthy, choosing to leave high-precision location services enabled can still allow stores with UWB infrastructure to closely track you without your explicit consent by using one-way ranging with probe packets2 (see 7.1.1.2 in Application of IEEE Std 802.15.4).

Figure 3: Some mobile phone apps repeatedly encourage users to turn on location permissions that are not actually necessary.

UWB technology could disrupt our preconceived privacy expectations about how our location data is shared and used. In a recent empirical study, Kirsten E. Martin and Helen Nissenbaum show “that tracking an individual’s place – home, work, shopping – is seen to violate privacy expectations, even without directly collecting GPS data, that is, standard markers representing location in technical systems.”

The technology can also offer potential benefits to the consumer. For example, we can envision a UWB localization service that helps you find a specific store inside a large mall, navigate underground tunnel systems such as those in Montreal and Seoul, or locate the precise spot where an item sits in a store. Nevertheless, given the current state of privacy policies, confusing controls, and privacy regulations that are poorly equipped to address violations of users’ privacy expectations in public places, there is, without proper oversight, a significant risk of these technologies being misused for nefarious purposes such as tracking and surveillance. As these technologies become pervasive, it becomes vital to fully consider their implications for our way of life, specifically their effect on established societal norms and expectations.

In this blog post we outlined what UWB is and how it can be used to track location with unprecedented accuracy. While accurate location tracking could be useful, users often find that their data is used in unexpected ways that only a close reading of dense legal agreements would reveal. This flow of information is legal, but it still violates users’ privacy expectations. These expectations are even more deeply violated when a phone’s location can be tracked despite carefully selected privacy settings on the device. Although this level of ubiquitous centimeter-level tracking is not yet a reality, the pieces are rapidly falling into place. Now is the time to act, before the norms of privacy erode further. Regulators, businesses, and end users need to work together to design a system that can benefit both businesses and customers without unexpected consequences for the latter.

We would like to thank Helen Nissenbaum for providing feedback on the early drafts.


Footnotes

1. Research projects in WiFi localization have achieved accuracies of 10-30 cm, but commercially available localization solutions are accurate to within meters.
2. Recall that probe packets are sent out periodically to let your device sense which networks you can join. All a retailer needs to do to track your location is collect the timestamps at which your device’s probes arrive at their anchors. Some users may erroneously believe that encryption protects them from this kind of tracking, but only packet payloads (not headers) are encrypted. Sequence numbers and source IDs are contained in the UWB standard packet headers.

PrivaCI Challenge: Context Matters

by  Yan Shvartzshnaider and Marshini Chetty

In this post, we describe the Privacy through Contextual Integrity (PrivaCI) challenge that took place as part of the symposium on applications of contextual integrity, sponsored by the Center for Information Technology Policy and the Digital Life Initiative at Princeton University. We summarize the key takeaways from the discussion that unfolded.

We welcome your feedback on any aspect of the challenge, as we seek to improve it as a pedagogical and methodological tool for eliciting discussion around privacy in a systematic and structured way.

See the Additional Material and Resources section below for links to learn more about the theory of Contextual Integrity and for the challenge instructions page.

What Is the PrivaCI Challenge?

The PrivaCI challenge is designed to evaluate information technologies and to discuss legitimate responses. It puts into practice the approach formulated by the theory of Contextual Integrity, which provides “a rigorous, substantive account of factors determining when people will perceive new information technologies and system as threats to privacy” (Nissenbaum, 2009).

At the symposium, we used the challenge to discuss and evaluate recent privacy-relevant events. The challenge included 8 teams and 4 contextual scenarios. Each team was presented with a use case/context scenario, which it then discussed using the theory of CI. This way, each contextual scenario was discussed by two teams.


PrivaCI challenge at the symposium on applications of Contextual Integrity


To facilitate a structured discussion we asked the group to fill in the following template:

Context Scenario: The template included a brief summary of a context scenario, which in our case was based on one of the four privacy-related news stories, with a link to the original story.

Contextual Informational Norms and privacy expectations: During the discussion, the teams had to identify the relevant contextual information norms and privacy expectations and provide examples of information flows violating these norms.

Example of flows violating the norms: We asked for each flow to be broken down into the relevant CI parameters, i.e., the actors involved (sender, recipient, subject), the attribute, and the transmission principle (see the sketch after this list).

Possible solutions: Finally, the teams were asked to think of possible solutions to the problem that incorporate previous or ongoing research projects of their teammates.
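
As referenced above, here is a small, purely illustrative Python sketch of how a single information flow can be written down in terms of the five CI parameters. The field names and example values are our own; at the symposium the teams filled in a text template rather than code.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InformationFlow:
    """One information flow, described by the five CI parameters."""
    sender: str
    recipient: str
    subject: str
    attribute: str
    transmission_principle: str

# Illustrative flow from the Uber scenario: the driver livestreams his passengers online.
violating_flow = InformationFlow(
    sender="Uber driver",
    recipient="anonymous internet viewers",
    subject="passengers",
    attribute="in-car video of the ride",
    transmission_principle="streamed publicly, without the passengers' knowledge or consent",
)
print(violating_flow)
```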

What Were The Privacy-Related Scenarios Discussed?

We briefly summarize the four case studies/privacy-related scenarios and discuss some of the takeaways from the group discussions.

  1. St. Louis Uber driver has put a video of hundreds of his passengers online without letting them know.
    https://www.stltoday.com/news/local/metro/st-louis-uber-driver-has-put-video-of-hundreds-of/article_9060fd2f-f683-5321-8c67-ebba5559c753.html
  2. “Saint Louis University will put 2,300 Echo Dots in student residences. The school has unveiled plans to provide all 2,300 student residences on campus (both dorms and apartments).”
    https://www.engadget.com/2018/08/16/saint-louis-university-to-install-2300-echo-dots/
  3. Google tracks your movements even if you set your settings to prevent it. https://apnews.com/828aefab64d4411bac257a07c1af0ecb
  4. Facebook asked large U.S. banks to share financial information on their customers.
    https://www.wsj.com/articles/facebook-to-banks-give-us-your-data-well-give-you-our-users-1533564049


Identifying Governing Norms

Much of the discussion focused on the relevant governing norms. For some groups, identifying norms was a relatively straightforward task. For example, in the Uber driver scenario, a group listed: “We do not expect to be filmed in private (?) spaces like Uber/Lyft vehicles.” In the Facebook case, one of the groups articulated a norm as “Financial information should only be shared between financial institutions and individuals, by default, AND Facebook is a social space where personal financial information is not shared.”

Other groups could not always identify the norms that were violated. For example, in the “Google tracks your movements, like it or not” scenario, one of the teams could not formulate which norms were breached. Nevertheless, they felt uncomfortable with the overall notion of being tracked. Similarly, a group analyzing the scenario where “Facebook has asked large U.S. banks to share detailed financial information about their customers” found the notion of an information flow traversing social and financial spheres unacceptable. Nevertheless, they were not sure about the governing norms.

The discussion that unfolded included whether norms usually correspond to “best” practices or due diligence. It might even be possible for Facebook to claim that everything was legal and no laws were breached in the process, but this by itself does not mean that no norm was violated.

We emphasized the fact that norms are not always grounded in law. An information flow can still violate a norm even if it is specified in a privacy policy, is considered legal, or follows a “best” practice. Norms are influenced by many other factors. If we feel uneasy about an information flow, it probably violates some deeper norm that we might not be consciously aware of; this requires deeper analysis.

Norms and privacy expectations vary within and across groups

The challenge showcased that norms and privacy expectations may vary: members within a group, and across groups, had different privacy expectations for the same context scenario. For example, in the Uber scenario, some members of the group expected drivers to film their passengers for security purposes, while others did not expect to be filmed at all. In this case, we followed the CI decision heuristic, which “recommends assessing [alternative flows’] respective merits as a function of their meaning and significance in relation to the aims, purposes, and values of the context.” It was interesting to see how, by explaining the values served by a “violating” information flow, it was possible to get the members of the team to consider its validity in a certain context under very specific conditions. For example, it might be acceptable for a taxi driver to record their passengers onto a secure server (without Internet access) for safety reasons.

Contextual Integrity offers a framework to capture contextual information norms

The challenge revealed additional aspects of the way groups approach the norm-identification task. Two separate teams listed the following statements as norms: “Consistency between presentation of service and actual functioning,” and “Privacy controls actually do something.” These outline general expectations and fall under the deceptive-practices prong of the Federal Trade Commission (FTC) Act; nevertheless, these expectations are difficult to capture and assess using the CI framework because they are not articulated in terms of appropriate information flows. This might also be a limitation of the task itself: due to time constraints, the groups were asked to articulate the norms in general sentences rather than specify them using the five CI parameters.

Norm-violating information flows

Once norms were identified, the groups were asked to specify possible information flows that violate them. It was encouraging to see that most teams were able to articulate the violating information flows correctly, i.e., by specifying the parameters that correspond to the flow. A team working on the Google location-tracking scenario could pinpoint the violating information flow: Google should not generate the flow without users’ awareness or consent, i.e., the flow may only happen under specific conditions. Similar violations were identified in other scenarios, for example, the case where an Uber driver was streaming live video of his passengers to an internet site. Here, too, the change in transmission principle and recipient prompted a feeling of privacy violation within the group.
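
Continuing the illustrative sketch from earlier, the Google scenario can be encoded in the same five-parameter form and compared against a norm that differs only in its transmission principle; the wording of both flows is our own, not the team’s exact formulation.

```python
# Hypothetical encoding of the Google location-tracking scenario in the five CI parameters.
norm = {
    "sender": "user's phone (Google apps and services)",
    "recipient": "Google",
    "subject": "user",
    "attribute": "location history",
    "transmission_principle": "only with the user's awareness and consent",
}
observed = dict(norm, transmission_principle="collected even after the user disabled Location History")

# Comparing the two flows parameter by parameter pinpoints where the violation sits.
print({k for k in norm if norm[k] != observed[k]})  # {'transmission_principle'}
```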

Finally, we asked the groups to propose possible solutions to mitigate the problem. Most of the solutions involved asking users for permission, notifying them, or designing an opt-in-only system. The most critical takeaway from the discussion is that norms and users’ privacy expectations evolve as new information flows are introduced; the merits of these flows need to be discussed in terms of the functions they serve.

Summary

The PrivaCI Challenge was a success! It served as an icebreaker for the participants to get to know each other a little better and also offered a structured way to brainstorm and discuss specific cases. The goal of the challenge exercise was to introduce a systematic way of using the CI framework to evaluate a system in a given scenario. We believe similar challenges can be used as a methodology to introduce and discuss Contextual Integrity in an educational setting, or even during the design stage of a product to reveal possible privacy violations.

Additional material and resources

You can access the challenge description and the template here: http://privaci.info/ci_symposium/challenge

The symposium program is available here.

To learn more about the theory of Contextual Integrity and how it differs from other existing privacy frameworks we recommend reading “Privacy in Context: Technology, Policy, and the Integrity of Social Life” by Helen Nissenbaum.

To participate in the discussion on CI, follow @privaci_way on Twitter.
Visit the website: http://privaci.info
Join the privaci_research mailing list.

References

Nissenbaum, H., 2009. Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.