Thursday, October 8, 2015

The dark side of wearables: How they're secretly jeopardizing your security and privacy

The seductive lure of activity and health wearables makes it easy to forget, or ignore, the inherent security and privacy risks involved.

The gentle buzz of a wearable device vibrates on your wrist at 7 am. You sync the device with your smartphone to see how well you slept. The result: poor. You feel groggy, so you drink extra coffee, which increases your heart rate, a data point also recorded by your wearable. You trudge through your morning routine more slowly than usual, so you skip that half hour on the elliptical and head for the office.

Along the way, you run into traffic, which boosts your stress level. You finally arrive, and head to your desk. You're crunched for time at lunch and you opt to have a submarine sandwich delivered. By the time you get home, the earlier hope of an evening workout is forgotten. All you want to do is crash on your couch with a cold beer and a takeout pizza. A few beers later, it's time for bed, and you turn out the light and hope for a better night's sleep.

Imagine the impact after a few weeks of this behavior. Your poor sleep quality is triggered by alcohol and lack of exercise. These, combined with a growing sleep deficit, slowly affect your overall health, leading to weight gain, increased blood pressure, and other problems. And all of this data is stored in your wearable device—or, more accurately, in its cloud software.

The fact that so much data is collected through a wearable device, such as an activity tracker, a smartwatch, or a pulse tracker, means that there are tangible risks involved, according to Conan Dooley, a senior security engineer with Box, and previously a senior security analyst with Bishop Fox.

If that data were carelessly stored, then stolen in a breach by a malicious third party and sold to unscrupulous organizations that want to use it to assess your health risks, you could one day face steep increases in health insurance premiums, or even a policy cancellation. The risk is real enough that some companies are buying data breach insurance to protect themselves in case consumer information falls into the wrong hands.

If you've willingly shared this data with your health insurer, through discount programs at work, you may already be facing rising insurance costs without any data breach at all: many employers offer significant "good health" discounts on health insurance to employees who stay within set weight and exercise parameters.

These are significant repercussions for simply wearing a device on your wrist that tells you how many steps you take a day and your average resting heart rate. It's up to consumers to weigh the risk they're willing to take against the benefit they get from their wearable devices.

By the end of 2015, there will be an estimated 200 million wearable devices on the market, according to ABI Research. By the end of 2018, that number is expected to reach 780 million. That gives hackers plenty of opportunities to steal sensitive data and profit from it.

What consumers need to know

As more consumers purchase wearable tech, they unknowingly expose themselves to both potential security breaches and ways that their data may be legally used by companies without the consumer ever knowing.

"There is an opaque bubble around all of this data and what we do with it, and this thing has grown more personal as a result. We need to give people more access to their data and, frankly, the option to delete it," Dooley said. "But the complexity of the infrastructure means that deleting data is often very difficult to do, because of the interconnected nature of databases and the need for historic reference. Your website can't just break. If a person goes to a picture that's no longer available ... there is a complexity around the deletion of data."


Just because you agree to share your data with a company, or the government, doesn't mean that company will still be in business next year, or that new laws won't be passed that change who can access the data you willingly gave up your privacy to share.

"Really we're entering this world where everything is cataloged and everything is documented and companies and governments will be making decisions about you as an individual based on your data trail. If you want to be considered an individual and not just a data point, then it's in your interest to protect your privacy," said Josh Lifton, MIT Media Lab Ph.D. and CEO of Crowd Supply.

And if a company files for bankruptcy, what does that mean for the data they've collected?

Consider the litigation involving RadioShack, said Tatiana Melnik, an attorney who works in healthcare IT and data security. "As part of their bankruptcy they were trying to sell all of the consumer data they had collected over the years. Apple stepped in and said you can't sell data that was collected in conjunction with an iPhone user," she said.

"Enforcement, aside from the actual data aggregation, is going to become a risk. How are you going to keep companies from redistributing that data? What rights do consumers have to their own information?" Melnik asked.

As reported previously on ZDNet, the mass collection of data on US citizens spurred the Federal Trade Commission to send a report on data brokers to Congress in May 2014, asking for legislation allowing people to know what data is being collected about them and by whom. Data brokers have collected an average of 3,000 data segments on nearly every US consumer, according to the FTC report. That figure doesn't even include the data being collected by wearable devices.

As wearable devices make their way into the workplace and corporate networks, they bring a host of security and privacy challenges for IT departments and increase the amount of data that data brokers have to sell about an individual.

Jeff Jenkins, chief operating officer and co-founder of APX Labs, talked about the security and privacy of wearables during a panel interview with Tech Pro Research at CES 2015. Because wearable devices are designed to be small and portable, Jenkins said, "you have to make sure you're thinking security first and you're thinking about the information that's being generated by them. You have situations where it's no longer just personal data that may be exposed or compromised, but also potentially operational data, that could be sensitive in nature."

The driver behind these security breaches is that personal data is extremely valuable. Gary Davis, chief consumer security evangelist at Intel Security, said, "The information that's contained on your wearable that's stored either on your smartphone or stored downstream on a cloud [service] is worth ten times that of a credit card on the black market."

"Credit card companies have gotten so good at being able to detect fraud that if there's another high profile retail breach, they typically say, 'Okay, here is when the breach took place, let's cancel everything done during that breach.' Done. An extremely short life on the black market.

But this information being stored on these wearable devices doesn't go away. You can't change your Social Security number, you can't change your date of birth. This is personally identifiable information that you can't change," Davis said.

With health information, it goes a step further. "This person had this injury, let's process a claim for a fraudulent pain prescription and go sell it on the black market. It's hard to clamp down on that because of HIPAA. There's a reason why you hear about all these mega breaches going after healthcare companies. Hackers realize this is high value stuff," Davis said.

What manufacturers need to know

Part of the problem with the security of these devices is that wearable makers are rushing to beat their competitors and get their products to market first.

"If my challenge is to get my device out there as soon as I can and make it as convenient as I can ... They're basically putting out these devices that are extremely vulnerable to attack. That's true for wearables," Davis said.

"It's all about land grabbing right now. They're all trying to be first to market. The challenge for security people is that it's hard enough to get consumers to update their apps on their smartphones, or update their operating system and apply the right security patches, even though that's pretty straightforward through the app store. Doing it on a wearable device is significantly more complex. It will be harder to apply security patches once these devices are out en masse. Users won't go to the time or effort to make these devices more secure," Davis said.

Faster is not always better, even in the technology world.

Melnik said, "Getting a piece of technology out quickly is costly. Not in terms of money, but you sacrifice privacy and security and other considerations in order to get technology out as quickly as possible because you want to be first on the market.

"Unfortunately for companies, that's shortsighted, because if you build privacy and security into your development process you actually save money long term. When something happens with your technology, which is inevitable, fixing those errors and dealing with the investigation and dealing with regulators is significantly more costly than if you had done it right the first time."

To reduce the risk, companies need to build privacy and security into their existing development process, Melnik said.

John Dixon, director of marketing for Freescale Semiconductor, said that wearables have the same fundamental challenges as Internet of Things (IoT) devices. Wearables can provide a wealth of data on an individual, including information on their location.

There are things for people to consider before buying a wearable device. "A number of people will know all of your personal data. Do you care if people know your pulse and movement? There may be situations where that is really important. People like Apple and Samsung and these other bigger companies, I think for the phones [they design], they are big enough companies that they have huge teams looking at device security," Dixon said.

The problem is that this security isn't part of many IoT devices, he said. "The challenge with some of the IoT watches is that if you're paying $500 for a watch the manufacturer can afford to include it, but if you're buying a pulse or an activity tracker it does not include it, most likely. You're counting on the vendors like ourselves having security measures in place," Dixon said.

For wearables with "price points under $300 to $400, you're relying on the semiconductor," he said.

Freescale is focusing on off-the-shelf solutions for startups developing wearable devices. The company supports a startup incubator, where up to 100 companies at a time are given baseline security measures to implement in their devices, Dixon said.

"We've created WaRP. The WaRP platform is an open source smartwatch which allows manufacturers to add any individual functions. There are so many functions you can add to a wearable watch. You can write your own software ... to create whatever functionality you want. The i.MX 6, which is what it's based on, is one of our most secure platforms.

"It's not just about the first product, it's about iterations of the product. Once they switch to a new product the cost is quite high. If they don't have security in the first product, they can put it in the second. This platform is field upgradable. If you have a smartwatch you can have the ability to do it remotely and add security," Dixon said.

Fitbit and inherent risks

One well-known manufacturer of wearable tech, Fitbit, is the first wearable tech-focused company to go public. Fitbit filed a successful IPO in June, with CNBC reporting the stock opening 52% above its IPO price at $30.40, and by August 5, 2015 it closed at a high of $51.64.

This shows the interest in wearable tech, but Fitbit's SEC filings also show some of the risks that manufacturers face. In the company's S-1 filing with the SEC on May 7, it outlined the risks, including the following:

"If we are unable to successfully develop and timely introduce new products and services or enhance existing products and services, our business may be adversely affected. We must continually develop and introduce new products and services and improve and enhance our existing products and services to maintain or increase our sales. The success of new or enhanced products and services may depend on a number of factors including, anticipating and effectively addressing consumer preferences and demand, the success of our sales and marketing efforts, timely and successful research and development, effective forecasting and management of product demand, purchase commitments, and inventory levels, effective management of manufacturing and supply costs, and the quality of or defects in our products. The development of our products and services is complex and costly, and we typically have several products and services in development at the same time."

Again, because there is so much competition, it's important for manufacturers to try to be first to market. Some companies release user agreements that promise security and privacy practices they cannot live up to. "Companies in the wearable space need to be sure that the things they are telling customers are actually true," Melnik said. If a company's user agreement includes something the company doesn't actually do, then it is being "risk negligent."

Data breach insurance

Though not directly related to wearables, it's worth noting that for several years companies have been purchasing cyber liability insurance to deal with data breach risks and the potential for consumer litigation, Melnik said.

But the insurance companies are starting to fight back.

In May, Columbia Casualty became the first insurance company to challenge its liability after its client, Cottage Health System, suffered a data breach caused by a lack of encryption on its network servers, which left confidential patient information accessible on the internet. The insurer paid $4.125 million to settle the plaintiffs' class action suit against the healthcare provider, but has now filed to recoup those funds, claiming the company misrepresented its security controls.

The outcome of this case could make it even more urgent for companies to not only protect the data they are collecting via wearables, but also make sure that everything they're promising to deliver security-wise in their end user license agreements (EULAs) is true, Melnik said.

The overreaching nature of EULAs is a concern to Josh Waddell, vice president of mobile solution management at SAP.

"If there's ever a big problem, the EULAs are not enforceable. They can write all that stuff in there, 'We have the power to use your data. We're going to take your pulse every second and then we're going to send it to your health insurance company.' They can write that in a EULA but if that got out, no judge is going to say, 'You wrote that in your EULA so that's fine,'" Waddell said.

Privacy vs. security

Even the actual health information, without a breach, can pose a problem since there's so much personal data being gathered. This dips into privacy issues rather than security.

A safeguard to privacy is needed, said Ian Chen, marketing manager for Freescale Semiconductor's sensor solution division.

"Companies give you a discount on health insurance if you wear a device. Then you look at the data the wearable is giving you. Is it fair if they say if you don't go to the doctor in the next three months your insurance will go up? What if they can mine the data and find out you're an aggressive driver and raise your insurance rate?" Chen said.

But consider who has access to the data. "Everyone says Apple and Google has all the data. People forget that Verizon and AT&T have it, too," Chen said. "Verizon and AT&T can rent the data out."


To solve this problem, Chen said, "I think we should have a privacy protocol that devices have to communicate to each other, to say, 'I'm requesting this level of privacy and I'm instructed to protect it up to that level.'"
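No such protocol exists today; Chen is describing an idea. Purely as an illustrative sketch of what he seems to have in mind (the level names, types, and negotiation rule below are all invented for this example, not any real or proposed standard), a device-to-device exchange could pair a requested privacy level with an explicit commitment from the receiver:

```python
from dataclasses import dataclass
from enum import IntEnum

class PrivacyLevel(IntEnum):
    """Hypothetical privacy levels, ordered from least to most restrictive."""
    PUBLIC = 0      # data may be shared freely
    AGGREGATE = 1   # only anonymized aggregates may leave the device
    PERSONAL = 2    # data stays on paired devices; no third parties
    SENSITIVE = 3   # data never leaves the device unencrypted

@dataclass
class PrivacyRequest:
    data_type: str              # e.g. "pulse", "steps", "location"
    required_level: PrivacyLevel

def negotiate(request: PrivacyRequest, receiver_max_level: PrivacyLevel):
    """Return the level the receiver commits to enforce, or None to refuse.

    The receiver only accepts if it can enforce at least the requested
    level; it never silently downgrades the sender's requirement.
    """
    if receiver_max_level >= request.required_level:
        return request.required_level
    return None  # receiver can't honor the request, so the sender keeps the data

# A heart-rate tracker asks a phone app to treat pulse data as SENSITIVE,
# but the app can only guarantee PERSONAL handling: the request is refused.
granted = negotiate(PrivacyRequest("pulse", PrivacyLevel.SENSITIVE),
                    receiver_max_level=PrivacyLevel.PERSONAL)
print(granted)
```

The key design point in Chen's quote is that the data holder states a requirement up front and the recipient must affirmatively commit to it, rather than data flowing by default and privacy being a matter of after-the-fact policy.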

Privacy rules need to be instituted quickly, because the amount of data being collected is growing at an astronomical rate. "By 2025, there will be more data generated from sensors and devices than all of the data being generated today from every source," Chen said.

As for what consumers should do, the problem is that many consumers aren't particularly knowledgeable about technology; they don't always pay attention to the internet-connected devices in their homes, nor are they aware of what can happen, Melnik said.

The key turning point will be the first big litigation on the subject, Melnik predicted. The FTC has ongoing litigation regarding data privacy, but none yet involving wearables.

Dooley also believes it will take a major lawsuit to spur manufacturers to better protect the privacy and security of the data they are collecting.

"I honestly don't know where we really go from here with that because I feel like unfortunately we're going to have to have that Ford Pinto moment. I think that's a particularly interesting event in retrospect because Ford Pintos were never significantly shown to be more dangerous in the long run than any other car on the road at the time. Any car with a rear mounted gas tank was as likely to cause a similar incident in a similar situation," Dooley said.

After Ford Pintos were labeled a fire hazard, standards were set for the auto industry, Dooley said.

Insurance fraud and solving crime

Wearable devices can also be used to support insurance companies' cases against fraudulent claims.

Karhrman Ziegenbein, CEO of Toonari Corp., said his company works with police and insurance companies to solve crime and fraud, using data from wearables and mining social media.

"There's a lot of insurance fraud out there. A lot of people are using wearable tech. If a person got hit by a car and says, 'I can't walk anymore,' but the person is using a device like that, then you can get the data from this device. Not everybody plans these things out really well. If the person is wearing the device at the deposition, the attorney can see it. Or the attorney can ask under oath, 'Do you wear a Fitbit?'"

Once it's known that the defendant uses a wearable device, it's considered transactional data. "You can subpoena this data or get it through discovery."

Since distance walked and elevation are part of the data collected, it gives a good indication of someone's fitness level. It can prove fraud, and it can also benefit the user, if they're truthful, because it would show that they're not able to walk and they're not as active as they were before the accident, he said.
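The analysis Ziegenbein describes boils down to comparing average daily activity before and after a claimed injury. A hypothetical sketch of that comparison (the step counts and the 50% drop threshold are invented for illustration, not drawn from any real case or tool):

```python
def activity_dropped(steps_before, steps_after, drop_threshold=0.5):
    """Return True if average daily steps fell by at least drop_threshold
    (e.g. 0.5 = a 50% drop), which would be consistent with a claimed
    loss of mobility; False if activity stayed roughly the same."""
    avg_before = sum(steps_before) / len(steps_before)
    avg_after = sum(steps_after) / len(steps_after)
    return avg_after <= avg_before * (1 - drop_threshold)

# Hypothetical daily step counts obtained from a tracker via discovery.
before = [9500, 11000, 8700, 10200]   # days before the accident
after = [9800, 10500, 9900, 11200]    # days after the claimed injury

# Activity is essentially unchanged, which would undercut an
# "I can't walk anymore" claim; unchanged low activity, by contrast,
# would support a truthful claimant.
print(activity_dropped(before, after))
```

The same comparison cuts both ways, as the article notes: a genuine drop in steps and elevation gain corroborates the claimant, while flat numbers invite hard questions at deposition.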

"The Android devices are much worse than the Apple devices. There are many more things the Android apps have access to. For us it's not as regulated as it is with Apple. That's why you have more challenges. It's nice because you can do more things with it," Ziegenbein said.

Regulation vs. compliance

Part of the debate involves whether the manufacturers should regulate themselves, or if the government should get involved.

Crowd Supply's Josh Lifton said, "Regulation can work, it can also be a complete failure so I wouldn't put all my eggs in that basket. Regulation is a reflection of public sentiment, or it should be. I think it may be effective without regulation. I would welcome regulation. I think privacy and security of data is a fundamental right. I think this is one of the most important topics to be discussed right now."

Too many people are willing to give up their data without measuring the cost.

Basil Hashem, senior director of end user computing strategy at VMware, said, "I think we're in a world I call the Uber-fication of our lives. You ask someone point blank for their location and credit card number and they say no. You say, 'Sign up here and I'll pick you up,' and they give it to you. In our lives, convenience seems to trump privacy every time."

To resolve this, Hashem said, "It's incumbent on the industry to police themselves. I think in certain cases it makes sense to have government regulations. There are a lot of things we can do to educate consumers, and device manufacturers can show what data they collect."

Alan Dabbiere, chairman of AirWatch by VMware, said, "You don't want the government invasive but you don't want them late to react to this new world."

Fred Steube, senior director of emerging technology for Cox Target Media, said, "I think the privacy breaches will continue to increase but we'll need guarantees from these companies that they won't resell, reuse or otherwise share our data. Ultimately either on the hardware side or the marketing side they'll try to self impose best practices so it doesn't become creepy and weird and a negative experience for consumers.

"I think that will happen. Otherwise I just see people getting upset, Congress leading the way, and the industry being regulated by the government," Steube said.

Dooley said the solution would be a collective group of regulators, combining government and manufacturers. "There is a need to have a balance. Where that balance is struck depends a lot on individuals' comfort with privacy in general. The fact is most folks aren't even ready to have that conversation yet."

Weighing the risks

Wearable devices will continue to grow in popularity, as consumers appreciate the immediate access to fitness tracking, health tracking, and other convenient measurements. So far, there have been no well-publicized data breaches involving the data collected by health and fitness wearables and smartwatches, so there hasn't been a public outcry about the privacy and security risks.

But numerous experts say that will eventually happen, because the data is worth much more than, say, stolen credit card numbers. Security options are being offered by some vendors, such as Freescale, but they are few and far between at this point.

Until solid regulations are in place, either through the government or private industry, or a combination of both, there will be inherent security and privacy risks involved with wearable devices. Meanwhile, it will remain up to the consumer to determine if the risks of wearing that trendy Apple Watch or Misfit Shine are worth the gain.
