The global wearable tech market is expected to double, from 526 million devices in 2016 to more than 1 billion by 2022. We can expect that number to keep growing as our cities become more connected and make the switch from 4G to 5G.
Fitbit came onto the scene 10 years ago looking like a digital pedometer crossed with a money clip. Now it’s the market leader in fitness trackers, and its name has become synonymous with personalized health monitoring technology. Recently, Google announced its purchase of Fitbit for $2.1 billion. But in 2016, Fitbit had a security breach on its hands: hackers used email addresses and passwords stolen from third-party sites to break into user accounts and fraudulently order replacement parts under the owners’ warranties. The company addressed the issue right away and has since taken multiple steps toward protecting its users’ data.
But the question remains: How secure is the personal information stored on these devices, which require certain details to set up a profile and can hold data as sensitive as a woman’s ovulation cycle?
How anonymous is anonymized data?
Fitbit says that users largely control what information they provide and what is shared, whether through the company itself or via third-party applications.
“To the extent that information we collect is health data or another special category of personal data subject to the European Union’s General Data Protection Regulation, we ask for your explicit consent to process the data,” the company says. “We obtain this consent separately when you take actions leading to our obtaining the data, for example, when you pair your device to your account, grant us access to your exercise or activity data from other services or use the female health tracking feature.”
Fitbit does acknowledge that some data might be aggregated and anonymized and shared with third parties, “including public reports about exercise and activity. Data collected and shared with consent is kept for as long as the account is active.”
Anonymized data is generally scrubbed of any information that could be used to identify a specific individual. It’s meant to be used more broadly, to identify trends or habits among groups in a given area, whether that’s a community or a country. But recent developments raise concerns about who has access to this data and how it’s being used.
It’s also possible to reverse course and use anonymized data to identify someone, for example through the fitness tracker data some employers collect as part of corporate wellness programs. Given the starting and ending points of a device wearer’s day and where that person’s activity takes place, it’s possible to drill down and determine who the data represents within a specified group.
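To make that concrete, here is a minimal, hypothetical sketch of such a re-identification: given an “anonymized” set of activity records and the home and office addresses an employer already holds, a simple join on start and end points can narrow a record down to a single person. All tokens, names and coordinates below are invented for illustration.

```python
# Hypothetical sketch: re-identifying a person inside an "anonymized"
# corporate-wellness dataset by joining on daily start/end locations.

# Anonymized activity records: the user ID is gone, only a random token remains.
anonymized_runs = [
    {"token": "a91f", "start": (40.7128, -74.0060), "end": (40.7306, -73.9866)},
    {"token": "c302", "start": (40.6892, -74.0445), "end": (40.7580, -73.9855)},
]

# Side information an employer might already hold: home and office locations.
employees = {
    "alice": {"home": (40.7128, -74.0060), "office": (40.7306, -73.9866)},
    "bob":   {"home": (40.6892, -74.0445), "office": (40.7580, -73.9855)},
}

def near(p, q, tol=0.001):
    """Crude proximity check (roughly 100 m at this latitude)."""
    return abs(p[0] - q[0]) < tol and abs(p[1] - q[1]) < tol

# If exactly one employee's home/office pair matches a run's start/end
# pair, the "anonymous" record has been re-identified.
for run in anonymized_runs:
    matches = [name for name, loc in employees.items()
               if near(run["start"], loc["home"]) and near(run["end"], loc["office"])]
    if len(matches) == 1:
        print(f"token {run['token']} is almost certainly {matches[0]}")
```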
Additionally, Google has entered into relationships with two large health organizations, Ascension and the Mayo Clinic. The Mayo Clinic treats some very difficult diseases, and now all of its records are being stored on the Google Cloud platform. Ascension, meanwhile, is the second-largest health system in the United States and is working with Google on Project Nightingale, in which personal health data is shared via Google Cloud to help develop artificial intelligence-based services for medical care providers.
HIPAA permits the sale and sharing of anonymized data “to help the covered entity carry out its health care functions,” and, in this case, Ascension makes treatment suggestions, orders tests and offers procedures based on the information it has. And while Google says the data it hosts would not be shared outside the scope of its contract with Ascension, the limits of that scope are unclear. Fitbit has also stated that any health and fitness data collected will not be used in Google ads.
Device vulnerability

The “Stop Marketing and Revealing The Wearables and Trackers Consumer Health Act,” or Smartwatch Data Act, was introduced by U.S. Senators Bill Cassidy and Jacky Rosen in November 2019. The bill would ensure that data from fitness trackers, smartwatches and other health apps, like Strava, could never be sold or shared without the consumer’s approval. It applies to non-anonymized consumer data that could be sold or shared for any purpose to “domestic information brokers, other domestic entities, or entities based outside the United States unless consent has been obtained from the consumer.”
Zhiqiang Lin, a computer science professor at Ohio State University, has studied security issues in Bluetooth devices like fitness trackers, and says it’s possible to “sniff” their signal from much greater distances than the typical few hundred meters using simple digital amplification.
“There is a fundamental flaw that leaves these devices vulnerable – first when they are initially paired to a mobile app, then again when they are operating,” he said at the Association for Computing Machinery’s Conference on Computer and Communications Security in 2019.
“While the magnitude of that vulnerability varies, we found it to be a consistent problem among Bluetooth low-energy devices when communicating with mobile apps… At a minimum, a hacker could determine whether you have a particular Bluetooth device, such as a smart speaker, at your home, by identifying whether or not your smart device is broadcasting the particular universally unique identifier (UUID) identified from the corresponding mobile apps.”
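To illustrate the kind of passive reconnaissance Dr. Lin describes, here is a rough sketch using the open-source bleak Python library (assuming version 0.19 or later): it scans nearby BLE advertisements and flags any device broadcasting a particular service UUID. The target UUID below is a made-up placeholder, not a real product’s identifier.

```python
# Sketch: scan BLE advertisements and check whether any nearby device
# broadcasts a particular service UUID, revealing what hardware is present.
import asyncio
from bleak import BleakScanner

TARGET_UUID = "0000fff0-0000-1000-8000-00805f9b34fb"  # hypothetical example

async def main():
    # discover() with return_adv=True yields (device, advertisement) pairs
    found = await BleakScanner.discover(timeout=10.0, return_adv=True)
    for address, (device, adv) in found.items():
        if TARGET_UUID in (adv.service_uuids or []):
            # Merely seeing this UUID tells an observer that a specific
            # kind of device is at this location.
            print(f"Device of interest at {address} (RSSI {adv.rssi})")

asyncio.run(main())
```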
Dr. Lin and his team identified more than 1,400 vulnerable apps offered via the Google Play store; their study did not cover apps from Apple.
Chet Wisniewski, a research scientist at Sophos, agrees, adding that fitness trackers might leak information shared via Bluetooth, but “it’s reasonably difficult to capture, and it’s of little value to attackers. As you move up to things like smartwatches, the risk increases, but mostly due to… theft, not so much interception. A found smartwatch within a few meters of the paired smartphone could be used to steal emails and contacts.”
But fitness trackers aren’t the real target: it’s the device they’re connected to. By accessing a piece of Bluetooth-enabled wearable technology even briefly, hackers could first manipulate small bits of data, like the number of steps taken in a day, and then drop in malicious code that can infiltrate the user’s smartphone, computer or email contacts.
The FBI even issued a warning this month advising people to be aware of the information they share online and the data collected by the corporations behind IoT devices, and to adjust privacy settings to prevent possible intrusion by hackers.
Device encryption and data protection
There are some solutions to these issues. Integrated Circuit Metric (ICMetric) technology might be the key to protecting both wearable tech users and the companies that make the devices from hackers. ICMetric technology uses the accelerometer and gyroscope sensors to generate a cryptographic security system that can identify an unrelated device trying to infiltrate a given group and raise an alarm. In other words, if a hacker uses a digital tool to glean information from a fitness tracker, the tool won’t be able to provide the correct encrypted code identifying itself as part of the group. It’s like not having the secret password to get into a club.
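The actual ICMetric scheme involves more robust feature extraction and error tolerance, but a toy Python sketch can convey the core idea: quantize stable sensor characteristics (such as accelerometer and gyroscope bias measured at rest) into a device key, then prove group membership with a challenge-response. The bias values and quantization step below are invented.

```python
# Toy illustration of the ICMetric idea: derive a device key from stable
# sensor characteristics, then prove group membership with an HMAC
# challenge-response. Not the actual ICMetric algorithm.
import hashlib
import hmac
import os

def derive_key(sensor_bias_readings, step=0.01):
    # Quantize each bias reading so small measurement noise maps to the
    # same bucket, then hash the buckets into a 256-bit key.
    buckets = [round(x / step) for x in sensor_bias_readings]
    return hashlib.sha256(repr(buckets).encode()).digest()

# A genuine tracker and the hub it pairs with both compute the same key
# from the device's characteristic sensor bias (made-up values here).
device_key = derive_key([0.0123, -0.0047, 0.0091])

# Challenge-response: the hub sends a nonce; only a device whose sensors
# yield the right key can produce a matching tag.
nonce = os.urandom(16)
tag = hmac.new(device_key, nonce, hashlib.sha256).digest()

hub_key = derive_key([0.0123, -0.0047, 0.0091])
expected = hmac.new(hub_key, nonce, hashlib.sha256).digest()
print("device accepted" if hmac.compare_digest(tag, expected) else "alarm: unknown device")
```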
Device manufacturers need to ensure their technology is properly encrypted and safeguarded to reduce the risk of hacking. Security updates should be provided regularly, and white hat hackers should be employed to identify back-door access issues that could compromise a device’s integrity. Encryption should cover not just the device but also the Bluetooth signal it sends to and receives from the tablet, computer or smartphone it interacts with, with required PIN codes or passwords as another layer of security. Further, companies would be smart to incorporate a remote erase function, so data can be deleted if it’s found to be compromised, done in a way that still leaves users with a protected, saved copy of that data.
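As a sketch of that PIN layer, using Python’s cryptography package: derive a key from a user PIN with PBKDF2 and encrypt the payload exchanged over Bluetooth with AES-GCM, so that sniffed traffic is useless without the PIN. The PIN value, payload and salt handling below are illustrative, not a production pairing protocol.

```python
# Sketch: PIN-derived key protecting a payload sent over Bluetooth.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def key_from_pin(pin: str, salt: bytes) -> bytes:
    # Stretch the short PIN into a 256-bit key; high iteration count
    # slows down brute-force guessing.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return kdf.derive(pin.encode())

salt = os.urandom(16)             # stored once at pairing time
key = key_from_pin("4821", salt)  # user-chosen PIN (example value)

nonce = os.urandom(12)            # must be unique per message
payload = b'{"steps": 8412}'      # e.g., a step-count update
ciphertext = AESGCM(key).encrypt(nonce, payload, None)

# Receiver side: the same PIN and salt recreate the key and decrypt.
plaintext = AESGCM(key_from_pin("4821", salt)).decrypt(nonce, ciphertext, None)
assert plaintext == payload
```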