Emerging technologies are creating new ethical challenges for UX designers

Bill Gribbons

New technologies have always produced unintended consequences. But as our interaction with and dependence on technology grow, user experience (UX) designers and engineers face a number of new ethical challenges.

UX designers’ primary job is to improve usability and extend productivity. But they also have a responsibility to address the unintended consequences of new technologies, some of which have a clear ethical dimension. What follows is a look at some of the principal ethical quandaries that UX designers will face and must handle responsibly.

Human costs and de-valuing work

Much of the UX discipline’s early effort was driven by the desire to improve human performance and productivity while reducing errors. Few questioned the value of these gains, achieved by optimizing system design, augmenting human ability, and automating work, especially where automation eliminated dangerous, repetitive, or tedious tasks. Think of the assembly-line factory jobs that in past decades injured and maimed scores of people.

But some forms of automation come at the cost of diminishing the work’s intellectual and emotional value. Consider the levels of automation found in fast-food restaurants or warehouse fulfillment centers, where work is de-humanized, worker growth is diminished, and the value of rewarding work is stripped away. Undoubtedly these issues were at play with the spate of protests and suicides by distraught Foxconn workers in recent years.

The question for the UX professional who designs these work experiences then is: at what point must efficiency and optimization yield to human concerns?

"De-skilling"

Over the past two decades, there have been tremendous advances in powerful support systems that augment human intelligence in demanding environments. Some aircraft, such as the Boeing Dreamliner and the F-35 Lightning II, have become so complicated that they challenge the human capacity to fly them without help from an “intelligent” assistant. This technology can reduce error and improve safety.

At the same time, UX researchers must examine the possibility that automation allows skilled operators to be replaced by less-skilled ones. (On a mainstream level, think of losing the ability to navigate without the aid of GPS, or more simply the ability to do math without a calculator.)

In some cases, the gains from technology will outweigh the loss of skills. In others, the level of support and automation might warrant reconsideration. Whatever the outcome, it is critical that UX designers initiate this conversation, so that users can make informed choices about the extent and consequences of automation.

Influencing user behavior

We’ve gotten quite good at subconsciously influencing and altering behavior (through nudging, for one), which creates a vexing ethical conundrum for UX designers. For every product created with the “best intention,” there will be another that deliberately nudges the user toward ends not in the user’s best interest. UX professionals thus recognize, on the one hand, that human behavior often results in sub-optimal choices and actions; on the other, that they have the potential, through design, to affect that behavior in other ways, both positive and negative.

So how do UX professionals define their ethical responsibilities as they subconsciously influence users’ decisions or actions? The case of producing negative outcomes is clear; less clear is who determines what is “positive,” and the line between the two is often not well defined. Take, for instance, the Medicare prescription drug plan finder tool on the medicare.gov site, which navigates this dilemma well. It guides and supports the user in an unbiased fashion toward the plan that best aligns with his or her health needs, a great improvement over the site’s early support efforts.

The erosion of privacy

With the best intentions, technologies have been developed to remotely monitor the activities of the elderly: what and how much they eat, where they are located, even when they take their prescriptions. Similarly, products like VueZone or Car Connection allow parents to monitor every movement of their children: what they’re doing at home, how fast they are driving, where they are at 2 a.m.

The benefits of such technologies are real: they allow the elderly to live independently and give parents confidence in the safety of their children. Yet such constant monitoring can also have the opposite effect, leaving one feeling stripped of highly valued privacy and dignity. With each new capability come added consequences.

The dangers of distraction

The convergence of technologies taxes our attention in ways that threaten the limits of human capability. One case is the increasing integration of communication, navigation, and entertainment technologies in automotive design. We now have GPS screens, entertainment monitors, hands-free cellphone use, and advanced stereo systems with various control mechanisms.

While these technologies deliver unquestionable value and pleasure to the driver and passengers, they indisputably divide the operator’s attention, distracting him or her from the primary task of driving and leading to life-threatening situations (and that’s not even counting texting while driving). The problem has become so severe that the National Highway Traffic Safety Administration has created a website to address it.

So what responsibility do UX professionals have in these situations? The likelihood of distraction and its consequences should become an area of intense focus in the UX discipline’s research agenda.

At the end of the day, UX professionals must increasingly consider where their responsibilities lie: with the organization that reaps financial gains from the technology sold, or with the user who may suffer negative or even life-threatening consequences from these products.

Bill M. Gribbons is professor of information design and corporate communication and director of the graduate human factors program at Bentley University.

This article first appeared on Gigaom.com.