
Liz Brown

The most recent Bentley Research Colloquium focused on Big Data and the broad range of issues surrounding it. This series highlights some of the topics examined or suggested by colloquium presenters.

Imagine a world in which your boss knows how much sleep you are getting. A world where the fate of a job offer could hang on how high your heart rate rises during the interview.

We are immersed in technological advances that promise to make our lives easier and healthier. Countless mobile applications and devices will monitor your exercise level, sleep patterns, and heart rate. Yet we often unknowingly place ourselves at risk by using them.

These programs collect all kinds of personal information that we trust will not be abused. But the truth is there are few, if any, regulatory protections to keep that material private. The legal strictures with the power to stop employers or anyone else from gaining access to your name, email address, age, height, weight, and health updates are, for the most part, not in place.

All of us are likely to find a reason for alarm here. As a Bentley University professor of business law and former partner in an international firm specializing in intellectual property, my primary concern is discrimination. We have developed an unprotected realm of information with the potential to damage the rights and opportunities of a multitude of people.

In today’s workplace, employee-monitoring practices are widespread. Businesses use software to track computer use and phone conversations, and even conduct off-site surveillance. Unprotected health data provides an additional source of information. An employer with such access could withhold a raise from, or decline to hire, an individual who does not get enough sleep, shows signs of depression, or records evidence of poor cardiovascular health. The possibilities go on.

Simply by using digital tools to aid in self-care, we become vulnerable to mistreatment. An analysis of 43 popular wellness apps by nonprofit Privacy Rights Clearinghouse found they often transmit unencrypted information over insecure network connections. Only 13 percent of free apps and 10 percent of paid apps encrypted all data connections between the app and the developer’s website. The analysis concluded that users should not assume any of their data is necessarily private.

Current laws do not offer the necessary solace. The medical information we share with a doctor or hospital is covered by the privacy provisions of the Health Insurance Portability and Accountability Act, or HIPAA. However, a device that records your health data is covered only if it was prescribed or provided by your doctor.

The Civil Rights Act outlawed discrimination based on race, color, religion, sex, or national origin. And the Americans with Disabilities Act prohibits discrimination against individuals with disabilities in all areas of public life, including jobs, schools, transportation, and all public and private places that are open to the general public.

Yet now we have the potential for new forms of prejudice against which citizens are not yet safeguarded. If health data, for instance, leads to unjust bias due to your weight, there is no legal protection. That kind of discrimination does not fall under gender or ethnicity or any of the protected areas. You are on your own.

Our laws evolve with technology but often lag behind. The Genetic Information Nondiscrimination Act, for example, was passed about two decades after it became possible to use genetic information to unfairly discriminate against or stigmatize individuals on the job.

Right now there is a lack of regulatory clarity for both the public and the makers of these mobile applications and devices. The government should issue clear guidelines on what data can be collected, how it can be sold, and what notice users must receive, and when. It would benefit everyone to have a clear set of rules.

Fortunately, we do not need to wait for new laws to know where we stand. Employers can create policies that protect personal health data. App makers can adequately describe the risks of using their products and develop privacy guidelines of their own. Right now more than a quarter of the free apps, and 40 percent of the paid apps, have no privacy policy at all.

We lawyers rely on the definition of what constitutes a reasonable expectation of privacy. As a society, we must decide what that means. We know the millennial generation has a lower expectation of privacy than any other. Are they going to say, “Privacy, what’s that?”

As a lawyer, I want to say that all we need are laws and transparency. But the truth is I cannot tell you when or if we’ll start noticing there is a problem at all.

Liz Brown teaches Business Law at Bentley University. 

Big Data Series

The Promise and Threat of Big Data: Inside Bentley's Research Colloquium
Digital Health Data Matters for Cancer Survivors
Are Wearables Destroying Your Privacy? 
When Googling Goes Bad
Finding the Signal in the Noise of Big Data
The Trouble with Big Data When It Comes to Women on Corporate Boards
Is Your Data Wearing a Black Hat? 