Every breath you take: Data privacy and your wearable fitness device
Vol. 72, No. 2 / March-April 2016
Not so very long ago, the only way to track your blood pressure, glucose levels, heart rate, and sleep patterns was to visit a doctor’s office where a physician would employ state-of-the-art medical instruments and then offer a diagnosis based on the results. Today, you need look no further than your smartphone.
The advent of health-surveillance tools and mobile fitness applications has ushered in a new era of consumer health care that holds enormous promise. Individuals are more empowered than ever to take control of their health, and it is possible to provide real-time tracking and reporting of critical information about fitness to physicians, swiftly and across vast distances.
Yet all this potential poses a challenge in terms of health data privacy and security that rivals – if not surpasses – the threats associated with financial data. For more than a decade, consumers have battled the prying eyes of data brokers and hackers to protect their financial information. Now, the battlefront has shifted to include health and lifestyle information that could prove even more sensitive and consequential if hacked.
“Health data is more vulnerable in general as a data set than financial data because you can’t replace it like you can a credit card,” says Michelle De Mooy, deputy director of the Consumer Privacy Project at the Center for Democracy & Technology, a Washington, D.C.-based nonprofit that advocates for civil liberties and human rights on the Internet. “When you have a diagnosis, it’s something that’s a part of your medical history for life. When people are victims of medical identity theft or their medical records have been hacked, there are very few good remedies for those situations. They are unprotected, and sometimes their whole families are unprotected.”
That matters, De Mooy says, because wearables such as fitness wristbands and monitors have become so sophisticated, designed to track user activity and share their data with a multitude of applications and devices, with few if any restrictions. These tools are capable of measuring brain activity, calorie intake, miles walked and run, swimming strokes, blood oxygen and blood sugar levels, and heart rates. They are both fitness coach and a proverbial “black box” for a consumer’s health. They also are a gateway to the lives of their users.
Up to now, informed consumers have been willing to sacrifice a little privacy to gain the benefits associated with fitness trackers and smartwatches: improved wellness, vanquishing unhealthy eating habits, and feeling more liberated to manage their health care. No question these are worthy goals, but legal experts believe Americans may be reaching a critical juncture on health and fitness data because now it is under more threat than ever. Where it was once both taboo and illegal for hackers and corporations to poke around in certain types of health records, this data is being viewed by some as the missing piece in consumer profiles.
A Federal Trade Commission (FTC) study released in May 2014 revealed that 12 mobile health applications and devices transmitted information to 76 different third parties, and some of the data could be linked back to specific users. In addition, 18 third parties received device-specific identifiers, 14 received consumer-specific identifiers, and 22 received other key health information.
“What we have is this vast amount of information that is being created, and, as the Federal Trade Commission has found, not a lot of attention yet being paid by consumers to how it’s being used and shared,” says Kristi Wolff, special counsel at Kelley Drye & Warren, LLP. “A lot of companies are coming out with innovative products, but they’re new and not taking some of the necessary precautions that more established companies would take in terms of data privacy and security.”
With so much at stake, privacy advocates are debating how best to protect consumer wellness and fitness information. Medical or health data generated by doctors, hospitals, and other clinicians is covered by the Health Insurance Portability and Accountability Act (HIPAA), which limits access to patient health records and punishes those who violate the protections. But the information generated through fitness trackers, smartphones, and mobile applications is generally not covered by HIPAA regulations.
Industry officials argue that additional government regulation, if it comes, would be slow and stifling to an enterprise that relies on innovation at the speed of light. Any rules would likely be outdated the moment they were adopted, solving yesterday’s problems rather than forecasting the privacy concerns of tomorrow’s devices.
Many believe the best way forward is to encourage companies to adopt a best-practice model on data privacy for fitness trackers and applications. Best practices in privacy policies might focus on keeping health data on the device rather than in the cloud, letting the user choose with which applications to share their information, and prohibiting the stats from being sold to data aggregators for behavioral advertising.
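As a rough illustration of the opt-in sharing principle described above, the sketch below shows what a device-side policy might look like in code. This is a hypothetical example, not any vendor’s actual API; all class and field names here are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class FitnessRecord:
    """A single reading captured by a hypothetical fitness tracker."""
    steps: int
    heart_rate: int

@dataclass
class DevicePolicy:
    # Apps the user has explicitly approved for sharing; empty by default,
    # reflecting the "keep data on the device" best practice.
    approved_apps: set = field(default_factory=set)

    def share(self, record: FitnessRecord, app: str) -> dict:
        # Data leaves the device only for apps the user has opted in.
        if app not in self.approved_apps:
            raise PermissionError(f"{app} is not authorized to receive health data")
        return {"steps": record.steps, "heart_rate": record.heart_rate}

policy = DevicePolicy()
policy.approved_apps.add("my_coach")          # user opts one app in
print(policy.share(FitnessRecord(steps=8200, heart_rate=64), "my_coach"))
```

The design choice worth noting is the default: with no approvals, nothing is shared, which inverts the opt-out model common in the consumer apps the article describes.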
What is indisputable is that there is a great deal of flux and uncertainty in how consumers and technology companies operate within the new digital economy and digital ecology, suggests Jennifer S. Geetter, a partner at McDermott Will & Emery, LLP. Businesses, consumers, lawyers, regulators, policymakers, and investors are not only operating under a different set of rules, but also a different set of possibilities.
“We are in the midst of an important dialogue about our digital world, and one vitally important piece of that is our digital health world,” Geetter says. “I expect we’re going to continue to see volatility and flexibility because the technologies are changing. People always talk about innovation on the technology side, but we’re going to have to be legal innovators as well.”
Small Devices, Big Business
Most consumers are familiar with wearable devices, if they’re not wearing one now. Many look like space-age watches and have the functionality of desktop computers. They are a part of the Zeitgeist where people compare the number of steps they’ve taken or flights of stairs climbed around the “water cooler” or on Facebook. Many predict that one day these devices will be as ubiquitous as cell phones.
Among wearable devices, Fitbit captured the public’s imagination first and to the greatest extent with its trim, multi-colored plastic wristbands that resembled the Livestrong yellow bracelets, but without the controversy. What began as a homely, hyped-up pedometer in 2007 has turned into a handsome tracking device, monitoring sleep habits, blood glucose levels, calories, heart rate, distances traveled, and routes used.
But Fitbit isn’t alone in the vast frontier of health-tracking tools. Today there is a long list of competitors producing nearly identical lollipop-colored wristbands and mind-boggling applications. Its rivals range from Garmin to Jawbone, Under Armour to Google, and Xiaomi to Misfit. Even jewelry company Swarovski has entered the fray with its bejeweled Shine. Of course, casting a long shadow over the entire category is the Apple Watch, part designer adornment, part lifestyle computer. The Apple Watch, like most Apple products, became a wearable industry leader even before it was shipped to stores in April 2015.
Sales of wearable gadgets – smartwatches, smart eyewear, and fitness-tracking devices – have exploded in recent years, and they are expected to register some 30.9 million units in sales in 2015, according to the U.S. Consumer Electronics Sales and Forecasts, the semiannual industry report of the Consumer Electronics Association (CEA). The forecast, which was released in January 2015 and updated in July, estimated that health and fitness trackers would lead sales among wearable devices with projected sales of 20 million units, and revenues reaching $1.8 billion in 2015, an 18 percent increase over the previous year.
International Data Corporation (IDC), a global marketing research firm, reported that consumer spending on wearable devices tripled in 2014 compared to 2013. IDC predicts that production of health trackers will jump from 20 million units in 2014 to more than 120 million units in 2019. Meanwhile, the NPD Group, a sales tracking company, reported that more than 25 percent of U.S. consumers already use fitness apps.
Observers believe wearable tech sales will remain robust as long as the applications market follows suit. The availability of applications from third-party developers will ensure the long-term growth of the tracker and smartwatch markets, but it also creates a tsunami of new health information and privacy concerns.
Applications such as Runkeeper, FitStar Personal Trainer, Nike+ Training Club, and Fitnet allow consumers to use their smartphones to set their physical regimen and monitor their fitness goals. (Fitnet even uses a smartphone’s camera to evaluate whether users are exercising the right way.) A 2014 report released by ACT: The App Association, an industry group for application makers, shows that the number of health and medical apps doubled between 2012 and 2014, and analysts are expecting revenues to reach $26 billion by 2017.
“It’s one of those odd confluences in that everyone is interested in the same thing, which is improved patient outcomes,” says Morgan Reed, executive director of ACT. “What we’re really trying to do is have healthier and more able people to be in charge of their own lives and be able to connect to a doctor when they want.”
Connecting to doctors presents another series of opportunities and hurdles. Nearly 100 million wearable remote patient monitoring (RPM) devices – such as pulse oximeters, blood pressure cuffs, ECG monitors, and continuous glucose monitoring tools – are expected to be produced over the next four years, according to ABI Research, which follows global connectivity and emerging tech trends. Already, Apple, Google, and Samsung have signaled that they plan to produce RPM devices.
RPM devices, most often recommended by physicians and covered under HIPAA, are transitioning some health care activities away from the doctor’s office into people’s homes. By collecting data from a variety of devices and applications, and sharing it securely over the Internet, patients and physicians can be more closely connected in designing treatments, experts say. They also can provide even broader community benefits.
One such device is Propeller Health’s inhaler, which has built-in sensors, connects through Bluetooth to smartphones, and lets individuals respond to asthma attacks while also tracking where those attacks occur. Working with Propeller Health, the city of Louisville, Kentucky, is moving beyond the individual to the community to find asthma hotspots. Launched back in 2012, Louisville deployed a network of air sensors, along with giving away 500 inhalers, to map out areas that coincided with individual asthma attacks. The data was used to monitor air pollution and then devise treatment plans for local asthma sufferers.
The whiz-bang qualities of these devices are many, but they still suffer from the fair-weather nature of consumers. Much like other once-popular tech gadgets, 3D glasses and PDAs among them, fitness trackers are plagued by the short attention spans of their owners. The NPD Group reports that 40 percent of activity-tracker owners stop using the devices within six months of purchase. That may account for forecasts that smartwatch sales could eventually overtake fitness tracker sales, as it is easier to remain loyal to a watch than to a monitor whose sole purpose is to remind consumers to keep their New Year’s resolutions.
The Internet of Things
Fitness-tracking devices live in a much larger world of technology than health care, one that technologists have described as The Internet of Things (IoT). The IoT is a place where devices are linked and communicate with each other, an amalgam of sensors, programs, and connectivity. It is a universe where physical objects, from coffee pots to furnaces to automobiles, come to life at their owners’ bidding. The FTC estimates that some 25 billion connected objects and devices will be online in 2015.
Last January, FTC Chair Edith Ramirez expressed concerns about the privacy risks posed by the IoT, and she encouraged the tech industry to address those concerns or risk losing the confidence of consumers and the full adoption of IoT’s promise. “IoT has the potential to provide enormous benefits for consumers, but it also has significant privacy and security implications,” warned Ramirez in her remarks at the 2015 Consumer Electronics Show, which has become an annual pilgrimage for tech geeks.
“Connected devices that provide increased convenience and improve health services are also collecting, transmitting, storing, and often sharing vast amounts of consumer data, some of it highly personal, thereby creating a number of privacy risks,” Ramirez added.
Attorneys believe that the three primary privacy challenges for the IoT, which includes health tracking devices, are the ubiquitous data collection that exposes a deep well of personal information; the potential for unexpected uses of consumer data by everyone, from employers to insurance companies, and the adverse consequences that could arise from those uses; and heightened security risks from hackers who may be tempted to commit larceny by the sheer volume of data.
“It’s inevitable that we move to an Internet of Things approach,” says Elliot Golding, counsel at Crowell & Moring LLP. “It’s hard to see how a washer talking to a refrigerator becomes an application that excites us, but there is a place where integrated and automated devices will improve the quality of life. But when you have all these devices talking to each other, the least secure device becomes the security level for all your devices.”
Given that reality, FTC’s Ramirez encourages companies to “bake” privacy into their devices or applications from the start. The strategy would push companies to build devices with privacy elements such as passwords and encryption; to reduce the amount of data that devices collect and store; to make data as anonymous as possible; and to increase company transparency with additional consumer notices on devices – particularly if companies plan to share the data with third parties – along with the ability to opt in or out of data collection.
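Two of the measures Ramirez names, data minimization and anonymization, can be sketched concretely. The example below is a minimal illustration under stated assumptions, not a production anonymization scheme: it pseudonymizes the user identifier with a salted hash and drops fields (GPS coordinates, device serial) that the downstream recipient does not need. The field names and the salt are hypothetical.

```python
import hashlib

def minimize(reading: dict, salt: bytes) -> dict:
    """Return only the fields needed downstream, with the user ID pseudonymized.

    A salted SHA-256 hash replaces the raw identifier so records can still be
    grouped per user without exposing who the user is. Rotating the salt
    periodically limits long-term linkability.
    """
    pseudo_id = hashlib.sha256(salt + reading["user_id"].encode()).hexdigest()[:16]
    return {
        "user": pseudo_id,
        "steps": reading["steps"],
        # Precise GPS coordinates and device identifiers are dropped entirely:
        # data that is never transmitted cannot be breached in transit.
    }

raw = {"user_id": "alice@example.com", "steps": 9500,
       "gps": (38.90, -77.03), "device_serial": "FB-123"}
print(minimize(raw, salt=b"rotate-me-periodically"))
```

Note that salted hashing alone is weak pseudonymization, not true anonymization; it is shown here only to make the “collect less, identify less” principle tangible.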
“In my mind, the question is not whether consumers should be given a say over unexpected uses of their data; rather, the question is how to provide simplified notice and choice,” Ramirez said.
Privacy of What?
Observers say that what constitutes health data these days changes as quickly as the technology that creates it, which makes it hard to figure out what to do with that data and whether it needs to be protected. After all, the number of steps a person takes daily isn’t the same data as one’s white blood cell count, or is it?
The answer to that question will be an important factor in deciding where to draw the line on privacy with newly generated information from fitness trackers, mobile apps, and smartwatches. It’s a question government regulators are already trying to answer. Industry is also wrestling with its desire to avoid tough privacy regulations while touting the health benefits of their applications.
Most people can agree that information collected about patients by health care providers – doctors, hospitals, or health clinics, to name a few – to guide medical treatment decisions and care qualifies as health data. And that information is and should be protected under HIPAA and HITECH, the Health Information Technology for Economic and Clinical Health Act, which was enacted to promote the adoption and use of health-information technologies. But can consumers really expect privacy when their tracker sends off their GPS coordinates at the same time as it shares their steps in the cloud?
The jury’s out on whether a person’s stair or step count could be included under HIPAA or state regulations governing health privacy or security breaches. Part of the problem for government and industry has been balancing the benefits of these gadgets against the myriad ways user information is collected and shared with third parties, such as advertising firms and app developers.
“In most cases, HIPAA is not going to apply to the devices or the information,” Crowell’s Golding says. “What rules are going to govern how we handle very sensitive information is an issue that we’ll need to carefully think about. How do you protect privacy and security in a smart way, both at the outset of creating a device and on an ongoing basis?”
Because privacy and security policies tend to be reactive, their use will likely depend on the evolution of how the public defines health information over the next few years. An individual’s heart rate or blood glucose levels certainly seem to qualify as health data, but are they exempt from health privacy regulations because people create that data on their smartphones? It is a muddle.
“There’s a huge debate right now about what to do with all this health care information that’s being gathered outside of the existing HIPAA regulatory structure,” says Kirk Nahra, a partner at Wiley Rein LLP. “There’s an increasing consensus that we do something and no consensus on what we do.
“Information we would normally think of as health information is getting collected and analyzed outside the normal hospital and doctor’s office settings,” Nahra adds. “It doesn’t necessarily mean the companies in this business are doing bad things with the data, but they could.”
At the moment, however, the majority of consumers seems largely disconnected from and unfazed by health privacy concerns. No horrendous breach of consumer-generated health data has captured the public’s attention. In fact, consumers are more likely to marvel at the cool devices or the space-age apps than be creeped out by the thought of some unseen data broker collecting their information and selling it.
Legal experts say the disconnect reflects a common pattern in public discussions about privacy over the last few decades, and especially during the Internet era. It is always an evolving process as society comes to better understand what are reasonable expectations of privacy for consumers, and how those match with the public’s interest in accessing these devices.
"I think people really do care about privacy, but they don’t really know what to do about it,” says De Mooy of the Center for Democracy &Technology. “The data sharing that’s happening isn’t visible to them. There’s a lot of backend to these technologies that people don’t know about. Companies need to be educated on how to be transparent, and people need to be educated on who they can and should trust.”
Expectations of privacy tend to evolve over time, and they are definitely being transformed with these new technologies, likely faster than the public knows. There are innumerable societal factors at play here because the devices serve multiple roles for many different people, and their expectations about the applications are countless.
Partly there is a generational divide, where Millennials may feel perfectly comfortable sharing health data while the stodgy Baby Boomers fear the specter of Big Brother over their shoulder. There’s also the reasoning of some that people who don’t have anything to hide can ignore the potential of privacy breaches. After all, they’re healthy, active, and trim, so what would it matter if an insurance company or their employer acquired their fitness records?
Partly the situation may be the result of ignorance. It is unclear whether people understand how their data is used and how it moves between fitness trackers to the Internet and beyond. For example, few consumers know that hidden deep within most privacy policies is a clause that allows companies to turn over data to the government for valid law enforcement queries and other legal requests.
Attorneys say privacy policies are useless in the face of nefarious hackers, which is why there is so much emphasis on building stronger hardware and software security functions into devices and the systems that govern them. Privacy policies only come into force when dealing with a company’s choices about collecting and sharing consumer data.
The marketplace for data is a fierce one, and data brokers buy information as quickly as it is created. They in turn sell it to any number of companies, from insurers looking to determine premiums or sell policies to employers doing background checks on new hires. Consumer expectations of privacy cannot keep pace with this kind of technology, experts say, and that could pose some real problems for health and fitness data generated by and about them.
“This is an area that is simultaneously extremely exciting and kind of scary,” Golding says. “In my view, wearables have the potential to be a truly disruptive technology that can help society and help people. Yet, there’s so much danger with them in dealing with the potentially sensitive information they create.”
Corporate Best Practices
Industry officials acknowledge the dangers of unrestricted disclosures of fitness and well-being information, but they fear that an aggressive approach by government to limit the collection and sharing of data could prove a wet blanket on innovation. It could thwart industry and consumers from realizing the benefits of the IoT, especially the potential of health-monitoring devices.
“One reason that we’re where we are with these new technologies is they’ve grown out of an open environment,” says Anna L. Spencer, a partner at Sidley Austin LLP. “I’m very hesitant to say and impose requirements on a technology that is nascent and that could really transform health care, which desperately needs transforming.”
While the FTC’s Ramirez appreciates the importance of keeping the door open for innovation to advance technology and the U.S. economy, she seems particularly unwilling to give the industry a fully open field in which to operate. “I question the notion that we must put sensitive consumer data at risk on the off chance a company might someday discover a valuable use for the information,” she noted.
Industry has responded in the past year by pressing for a corporate best-practices model in addressing privacy concerns, hoping to avoid the heavy hand of federal and state regulation. Corporations already have adopted policies that better communicate privacy information to consumers, that focus on more thoughtful approaches to the security architecture and data discipline, and that vow not to sell or share data with third parties. And app makers have begun to offer free and paid versions of apps where consumers can get more privacy restrictions if they want to pay for them.
The best-practices approach received its heartiest endorsement in September 2014, when Apple Chief Executive Officer Tim Cook wrote an open letter to consumers that effectively set the standard for corporate transparency on privacy, data collection, and data sharing. Cook’s letter – and Apple’s accompanying consumer privacy Web site – was a response to the hacking of iCloud accounts and complaints that Apple hadn’t done enough to protect customer data.
“Our business model is very straightforward: We sell great products,” Cook wrote. “We don’t build a profile based on your email content or web browsing habits to sell to advertisers. We don’t ‘monetize’ the information you store on your iPhone or in iCloud. And we don’t read your email or your messages to get information to market to you.”
Legal experts believe that corporations such as Apple, Microsoft, and dozens of other top-line technology companies are defining best practices with their very public privacy policies, and they believe it is the best way forward to guard consumer privacy while balancing the desire to pursue innovation.
“The leaders in the industry are going to look to the importance of their brand,” Kelley Drye’s Wolff says. “Consumers have impressions of brands, and they ask themselves whether they can trust a company. They recognize there are always going to be risks, but I think there is value in the consumer mind and the company mind in terms of offering a good product with robust consumer protections.”
Adopting a regulatory scheme that relies on industry best practices also has the advantage of timeliness. “Nobody wants technology at the speed of government,” ACT’s Reed says. “That is especially true here in the personal health information space. I think that industry best practices are going to be the first step for highlighting and clarifying for consumers what’s going on.”
While the FTC has provided some guidance for companies, outside groups are also stepping in to offer up sample privacy foundations. In August 2015, the Online Trust Alliance (OTA), a nonprofit think tank, issued a privacy and security framework that companies could adopt for IoT devices, although it initially focused on home automation and health-wearable technologies. The principles underlying the recommendations are transparency and data security, which are based on the Fair Information Practice Principles (FIPPs), the FTC’s widely accepted guidelines for privacy-focused data collection practices.
“Security and privacy by design must be a priority from the onset of product development and be addressed holistically,” the OTA noted in releasing the framework, which covers everything from consumer access to privacy policies to breach response and consumer notification plans. “It must be a forethought versus an afterthought, focusing on end-to-end security and privacy.”
A Role for Government
The push for a best-practices approach to privacy protections has won wide support within the industry, and even some privacy advocates say there are good reasons to give companies a chance to solve the most egregious problems with tighter internal controls over information. There is a general fear that an aggressive regulatory scheme is the last thing these technologies need to grow.
Yet, technological innovation is always going to force a conversation about what is reasonable expectation of privacy for consumers, especially in a fast-changing technological age, and that may be a question that is best resolved by government, some say. While personal-generated health information may not rise to the level of HIPAA, it is closely watched by government agencies, from the FTC to the U.S. Food and Drug Administration (FDA) to the U.S. Department of Health and Human Services (HHS) and its agencies to state regulatory bodies.
For some attorneys, the fuss about adding new privacy restrictions on device usage and information sharing is little more than window dressing. They believe there are enough statutes and regulations on the books today to successfully enforce privacy rights and protections in the United States.
“It’s not that there is a complete absence of law,” says Sidley’s Spencer. “Just because HIPAA doesn’t apply, that doesn’t mean it is a free-for-all. It’s not. There are restrictions such as enforcement of the FTC Act, which prohibits unfair and deceptive trade practices. The FTC has, through its enforcement, focused on privacy and ensuring companies comply with their privacy notices. There are a myriad of unfair and deceptive practice statutes that state attorneys general use all the time to fight businesses engaging in practices that harm consumers.”
Still, for some, the best solution is a government one. “If companies of fitness devices have the ability to sell personal health data to insurers, employers, and others, users should be alerted and given the opportunity to decline. The FTC should require fitness devices and app companies to adopt new privacy measures that will help conceal the identity of individuals and develop policies to protect consumer information in the event of a security breach,” said U.S. Senator Charles Schumer (D–N.Y.) in a written statement in August 2014.
Many observers believe the FTC does a pretty good job of using the tools at hand to monitor and enforce consumer privacy protections guarded by the FTC Act. It regularly brings legal actions against companies that violate privacy rights or fail to maintain security. And it frequently applies section 5 of the Act to ensure companies are not employing unfair and deceptive practices in their dealings. Even friends of the FTC say that its approach to regulation could be strengthened, as could its penalties, to send a more forceful message to tech companies about the importance of privacy.
In January 2015 the FDA issued a draft guidance paper, titled “General Wellness: Policy for Low Risk Devices,” indicating that it doesn’t plan to regulate “low risk general wellness products” as “medical devices” under the Federal Food, Drug, and Cosmetic Act. Many expect the FDA, however, to keep an eye on the growth of these devices and their health claims.
Some have suggested that the privacy of new data being created could be addressed by expanding HIPAA to cover consumer-generated health and fitness information. It would have the benefit of not creating an entirely new statute, and it would stake out clearly the parameters of what “health data” means today.
“When we wrote the HIPAA rules, we defined the health care industry by law in a certain way that most accurately reflected it at the time. It has become less accurate over time,” Wiley Rein’s Nahra says. “You could change HIPAA to have it cover all health information regardless of where it’s coming from. Right now, it has to come from the right place for it to be health information and protected.”
HHS has already proven that it is open to changing what constitutes a covered entity under HIPAA. In January 2013 it issued new rules under HIPAA and HITECH that included “business associates,” or contractors or subcontractors for covered companies, to capture the increasing use of third-party cloud companies for storage of protected medical information. The same could be done for consumer-generated devices and their corporate developers.
Regulating the devices and applications more broadly would likely prove complex, however. Small changes in functionality, or in how a product is offered, can shift regulatory jurisdiction from the FTC to the FDA, adding to consumer confusion about expectations of privacy. It can also be difficult for innovators who are trying to invent new devices but aren’t sure where to look for guidance on how best to protect consumers in the process.
“I think the FTC will have to decide for itself what it feels is the best way to convey to the commercial community what its regulatory expectations are and a way to communicate to consumers what their reasonable expectations of privacy can be,” McDermott’s Geetter says.
Meanwhile, there has been interest at the state level to adopt new regulations for health-tracking devices. Already, 47 states have separate data breach notification statutes, and California, Florida, and Texas have expanded their laws recently to widen the scope of information that qualifies as personal, such as medical history, treatment, or condition. In Texas, the Texas Medical Records Privacy Act covers any person who comes into possession of personal health data, and it bans the sale of that information without authorization.
Not surprisingly, industry leaders are loath to face a medley of new state regulations governing the privacy of wearable devices and imposing new limits on how consumer-generated information can be shared. “The problem is when you end up with a patchwork of state regulation . . . you end up with harm to smaller, innovative companies that are trying to do something innovative in the market,” ACT’s Reed says.
The FTC dipped back into the public debate at its November 2015 workshop examining the privacy issues raised by advertising, marketing, and the tracking of consumer activity across different devices. Some of the complexity stems from the fact that consumers now interact with a variety of platforms, applications, and software whenever they use their smartphones and wearable devices.
“More consumers are connecting with the Internet in different ways, and industry has responded by coming up with additional tools to track their behavior,” said Jessica Rich, director of the FTC’s Bureau of Consumer Protection, in a statement. “With the advent of new tracking methods, though, it’s important to ensure that consumers’ privacy remains protected as businesses seek to target them across multiple devices.”
Privacy as Collateral Damage
Talk to most attorneys about the IoT or wearable devices, and you will detect a sense of inevitability: these remarkable technologies will eventually fail to protect consumer privacy and all the information they generate. This brave new world of technology is unforgiving when it comes to data, because data is its currency.
“A few years ago, users of Internet services began to realize that when an online service is free, you’re not the customer,” wrote Cook in his famous letter last year. “You’re the product.”
Certainly, attorneys, privacy advocates, and government officials believe that best practices can go a long way toward protecting data. All are reluctant to bring the full force of government to bear on the problem, because doing so always comes at a cost. Most likely the price will be paid in innovation, as privacy requirements limit the choices of designers and developers working on the next generation of applications and devices.
“Privacy protections are about choices; as you add privacy protections you, by definition, restrict how data is used and disclosed,” Geetter says. “Finding the right balance between what we want to do, who we want to share the data with, and how to protect it is the exciting challenge.
“We don’t know what we’re going to invent tomorrow,” Geetter adds. “We know we’re going to invent something. The challenge for our privacy regulation landscape is to try to accept that we need a model that can spring into action as innovation occurs. We cannot afford to develop a privacy and security framework that works only for today and is out of date tomorrow.”
All this effort by tech companies and government is designed, in part, to protect consumers from themselves. Everyone loves these new devices and the free tools found in applications. The public has grown accustomed to services like Gmail and Yahoo arriving on their virtual doorstep at no charge. In fact, some would argue that this has made individual consumers lazy advocates for their own privacy, and all too willing to discard hard-won protections.
Last year a Canadian court became one of the first courts worldwide to accept data from a Fitbit as evidence in a personal injury case. The plaintiff was injured five years ago, and once the case goes to trial, her lawyer plans to analyze the data from the Fitbit to prove her daily activity levels were below those of someone her age and profession. She was a personal trainer.
While it is certain fitness trackers will end up as part of litigation in U.S. courts as well, it is not clear whether they will help or hurt a case. A lack of standardization; users' failure to wear, charge, or sync their devices; and the unreliability of logged activity, which can be generated with a wave of the hand, all make wearable devices unreliable witnesses.
What’s important here, observers say, is not so much the device's reliability in court as consumers' willingness to trade away their privacy protections for convenience. Many individuals are more than willing to open up their lives for expediency: in court, in the workplace when employers offer bonuses or discounts for wearing health trackers, or with life insurance companies that reduce premiums for customers who wear smart watches.
“Our longing for convenience means we’ve created a matrix that can and will be used against us,” wrote software analyst R. “Ray” Wang in his Harvard Business Review article, “Beware Trading Privacy for Convenience,” in June 2013. “Most of us just don’t know it yet.”
This article appeared in the December 2015 issue of Washington Lawyer, the official publication of the District of Columbia Bar, and is reprinted with permission.