
Mary K. Pratt

Anyone who has ever searched for something on the Internet knows that the computer remembers what you did. Click on that shirt for sale, check details on those airline seats, look at the hotel price, and ads for those items will keep popping up. The computer is, indeed, keeping tabs.

And it’s tracking much more than your search history. Organizations of all kinds are gathering all sorts of details about you and your neighbors via computers, intent upon studying you under the banner of big data and analytics. So be warned, says Bentley professor and user experience expert William M. Gribbons: “Anytime someone is interacting with technology, someone else is collecting data.”
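To see what that collection can look like under the hood, consider a minimal sketch of interaction logging. Everything here is illustrative: the log_event function, the field names and the visitor ID are invented for this example, not drawn from any real analytics product.

    import time

    events = []  # in a real system, a database or analytics pipeline

    def log_event(user_id, action, item):
        """Record one interaction as a timestamped row tied to a user."""
        events.append({
            "user": user_id,
            "action": action,  # e.g. "view", "click", "search"
            "item": item,      # e.g. a product page or a search term
            "ts": time.time(),
        })

    log_event("visitor-123", "view", "blue-oxford-shirt")
    log_event("visitor-123", "search", "aisle seats BOS to LAX")
    # An ad system can later query these rows to retarget "visitor-123".

Multiply that handful of rows by every page view on every site, and the scale of the tabs being kept comes into focus.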


Awash in Algorithms

Sure, humankind has been keeping records since the days of cave painting. But technology has exponentially increased the amount and types of information available on each and every individual, along with the ability to access, analyze and use that information.

“Companies use algorithms to know who you are, your past purchases, whether you want the latest thing, whether you want it now, and what kind of price you’ll pay,” explains Gribbons, director of the MS in Human Factors in Information Design (HFID) program and founder and senior consultant with Bentley’s User Experience Center (UXC). “The next step is that you’ll walk into a store and the sales clerk will be getting information about you and what it will take to get you to buy something.”
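As a rough illustration of the kind of scoring Gribbons describes, here is a toy profile-based heuristic. The fields, thresholds and discounts are invented for this example; real retail systems rely on far richer data and learned models rather than two hand-written rules.

    profile = {
        "past_purchases": ["running shoes", "fitness tracker"],
        "days_since_last_purchase": 12,
        "clicked_new_arrivals": True,
    }

    def likely_to_buy_now(p):
        """Crude heuristic: recent, engaged shoppers look ready to buy."""
        return p["days_since_last_purchase"] < 30 and p["clicked_new_arrivals"]

    def suggested_discount(p):
        """Offer a smaller discount to shoppers who seem ready to buy anyway."""
        return 0.05 if likely_to_buy_now(p) else 0.15

    print(likely_to_buy_now(profile))   # True
    print(suggested_discount(profile))  # 0.05

A sales clerk's tablet flagging "this customer will likely pay full price" is essentially this logic with better data behind it.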

The question now is, where do we draw the line?


Lessons from History

The line dividing “can” and “should” is familiar territory for W. Michael Hoffman, founder and executive director of the Center for Business Ethics at Bentley.

“A lot of ethical issues have opened up in trying to deal with new technologies and how they affect people’s rights, particularly around security and privacy,” explains Hoffman, who has seen similar challenges unfold in medicine, finance and a host of other industries during his decades-long career.

The Hieken Professor of Business and Professional Ethics cites specific areas of concern that need to be addressed. To start, Hoffman says, people must ask who is collecting and accessing all this data — and who is keeping track of all the individuals charged with monitoring data. Is anyone vetting those workers who can access an organization’s collection of data on individuals?

“Finding out these different facts to target consumers in more direct and relevant ways . . . there is good that can come out of this, not only for the corporations but for people who are looking for specific goods and services,” he says. “The darker side is that data could be used maliciously and in a way that could harm people.”


Right to Be Forgotten

Some are already trying to limit the sting of this data revolution.

Deanna (Brown) Duplak ’82 is among those who believe that links to information surfaced by a search of one’s name should not remain available to anyone, indefinitely.

The former Computer Information Systems major has drafted a “right to be forgotten” bill that’s now in the Massachusetts Legislature. The bill would allow removal of links to data found through a name search; it’s meant to protect people from inaccurate, hurtful and prejudicial information popping up anytime someone searches their name online.

“Why should anyone, with the click of a button, be led to information that is irrelevant or out of date?” asks Duplak, noting that Europe already has such a law in place. She sees efforts in the U.S. gathering support as people learn how much data can be found by entering one’s name in a search engine such as Google, Bing or Yahoo.
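Mechanically, a de-listing law like the one Duplak proposes amounts to a suppression step applied before name-search results are returned. The sketch below is hypothetical; it shows the idea, not any search engine's actual implementation.

    # Pairs of (name, URL) that a person has successfully had de-listed.
    suppressed = {
        ("jane doe", "https://example.com/old-arrest-report"),
    }

    def filter_results(query_name, results):
        """Drop de-listed links from results for a search of that name."""
        name = query_name.lower()
        return [url for url in results if (name, url) not in suppressed]

    hits = [
        "https://example.com/old-arrest-report",
        "https://example.com/jane-doe-profile",
    ]
    print(filter_results("Jane Doe", hits))
    # -> ['https://example.com/jane-doe-profile']

Note that the underlying page stays online; only the link returned for a name search disappears, which is the shape of the European rule as well.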

Read more about Duplak and her bill in her article: When Googling Goes Bad.


Whose Mandate?

Duplak, like Hoffman, believes that society must determine who can collect what data, on whom, for what purposes.

“The questions need to be asked and the concerns need to be raised,” she says. “There won’t be any quick answers. But you do have pockets of people and organizations starting to look at the issues.”

For example, the Council for Big Data, Ethics, and Society launched in 2014, prompted by a request from the National Science Foundation. Members are researchers and professionals from numerous fields who bring social and cultural perspectives to big data initiatives.

But so far, neither the council nor any other group has received a clear mandate to develop society-wide ethical standards.

“I suspect there will need to be some sort of regulatory body to oversee the use of big data and the ethical decisions that have to be made,” says Hoffman, who points to panels and review boards established to consider ethics questions in the field of medicine.


Cause for Optimism

The pending issues are many and complex, according to Niek Brunsveld MBA ’13, senior policy adviser for research and innovation at the University of Amsterdam and a visiting lecturer in the Bentley MBA program.

“Even at a very basic level, big data analysis rests on many, many presumptions that contain value judgments,” he says. “But, of course, it also has a huge potential to improve our lives and the lives of those who are in dire circumstances — which is an ethical imperative as well.”

Brunsveld expects that, in the coming decades, society will rise to the challenge.

“We will be drawing up programs, at our organizations, that make us aware of our values, help us develop more shared values, and make us better appreciate new pathways, new technologies from an ethical perspective,” he says. “If societal stakeholders — academia, businesses, governments and NGOs — work together from the start to the finish of the innovation process, we will come to agreement on where we do and where we do not want technological innovations in big data to go.”