The Stand.

Stories from UOW

Are we becoming increasingly enslaved to technology and in the thrall of its power, without fully understanding the consequences?

In Middletown, Ohio, a city of about 50,000 people in the United States Midwest, local police requested an unusual search warrant: for the data in a suspect's pacemaker. Police were suspicious of conflicting statements made by Ross Compton, 59, after a fire at his house.

He’d claimed to have been able to pack a few belongings in a suitcase, smash a glass window and escape the house and the inferno inside.

The problem was that Compton had an artificial heart implant. As Ohio paper the Journal-News reported, police used a search warrant to access the data stored in the device, which included “his heart rate, pacer demand and cardiac rhythms prior to, during and after the fire”.

A cardiologist called as an expert witness reviewed the data and concluded there was no way the suspect “would have been able to collect, pack and remove the number of items from the house, exit his bedroom window and carry numerous large and heavy items to the front of his residence during the short period of time he has indicated due to his medical conditions,” the paper reported.

The tell-tale heart: data from a pacemaker was used in criminal proceedings.

It wasn’t supposed to be this way. At least the people in the labs who came up with the ideas and conducted the research, from the pioneering science through to the clinical trials, didn’t intend it to be this way.

Here was the coming together of multiple strands of medicine, biology, chemistry and engineering with physicians, business leaders and the patients themselves to develop the first implantable pacemaker in 1958.

A device that has saved thousands of lives around the world. And in 2017 a descendant of that device is ‘testifying’ in court, helping to build the case against an alleged arsonist.

The pace of change

Professor Katina Michael, from the University of Wollongong’s Faculty of Engineering and Information Sciences, says new technologies have the potential to make us smarter, healthier, and even happier than ever before.

But the speed of development is outpacing our understanding of how these technologies influence our lives.

We cannot – nor should we wish to – stop the march of time. With possibility comes responsibility: this is the paradox of technological potential.

Professor Katina Michael

Away from the hype of product launches that tap into the power of celebrity and pop star fandom, there are the email and cloud storage hacks, the debates about data privacy and rampant speculation about technology’s capacity to act and think for us.

The resurgence of films and TV shows about the dark underbelly of the technology that has become inseparable from daily life, Professor Michael says, is a result of a growing awareness of the need for a balanced discussion about technology. About its allure, appeal and potential for good, as well as the risks, unintended or otherwise.

“Technology has become all pervasive,” Professor Michael says. “It’s right next to us, it’s wearable in many cases or within arm’s reach. The smartphone has become a Swiss Army knife. I can write messages, I can take photos, I can look up navigational directions, I can call somebody, I can submit my university assignments, I can be on holiday and still have access to the internet.

“When we started to look at unintended consequences of technology, I put together an artwork in [the NSW South Coast holiday town] Tathra based on 100 responses from random people. When I asked people, ‘What are the unintended consequences of technology?’, only five people out of 100 actually provided benefits.”

They know the unknown knowns

The TV series Black Mirror, which debuted in the UK in 2011 and is currently enjoying popularity in Australia and the US through Netflix, is a critique of technology wrapped in social satire.

The disturbing thing about Black Mirror is that it’s not all that far-fetched.

Its episodes are for the most part grounded in technology we already know about. They depict a “future we might actually inhabit”, as Jenna Wortham wrote in the New York Times Magazine, “making the show a lot more effective as a critique of the tech industry’s trajectory – one that might make you think twice about which devices you buy and which services you use”.

The series draws on a simple premise: what could possibly go wrong … in the worst possible way. Professor Michael says that unlike the first pacemaker inventors, who could never have predicted such an outcome, many technologists have a good idea of the impact certain technologies could have.

“There’s this supposedly invalid category, that negative consequences cannot be intended but merely are unanticipated. This is where people intentionally refuse to acknowledge that they know that there is the potential for negative consequences.

“Examples of this are ‘stickiness’ drivers in gaming and certain apps that tap into our desires and encourage repeated use to the point of addiction. It may be good design from the perspective that you get more users, but it’s not good design given the mental health issues, the sleep deprivation issues, the anxiety issues we might be causing unsuspecting subscribers who opt in.

“Take the example of the pacemaker. We first wrote about embedded surveillance devices in the early 2000s and M G Michael coined the term ‘uberveillance’. We said embedded surveillance devices would be used like this, and here it is now. That’s just the trajectory of technology.”

Professor Michael says we have an uneasy relationship of trust with the technology platforms we use. We have no idea how robust their security is, or what they’re doing with our data, yet we’ll happily share photos and text from private, perhaps even intimate, moments of our lives.

Despite hundreds of data breaches exposing millions of user accounts worldwide – including at major companies such as Yahoo and Sony – mandatory breach notification for Australians arrived only last month (February 2017), when Parliament passed the Privacy Amendment (Notifiable Data Breaches) Bill 2016 into law.

Control + C

Researchers frame the intent behind a new technology, and how users adopt it, through the three Cs: care, convenience and control – with control being the dominant feature or function. What might start with the intention of making life better, either for the wearer or for those who are wary (or weary) of the wearer, can end up having a control dimension.

Like the GPS-enabled trackers for people with Alzheimer’s, which alert carers and family members if the wearer has gone wandering. Or the anklets and tracking software used by justice departments in Australia and abroad to keep tabs on the movements of criminal offenders, such as paedophiles.

“It’s always a trade off between these things,” Professor Michael says. “The Wander Alert for the Alzheimer’s patient means they have limits, they can’t go outside. Yes, it’s for their good, but at its heart it’s a form of control.”

Consider the supposed convenience of the tap-and-go payment methods.

“What do they want? More transactions. The credit card companies get commissions on transactions. Tap-and-go means that people don’t have to put in a PIN so they’re less conscious about their spending. That’s control – attempting to control spending behaviour.

“Credit card companies have chosen near field communication for contactless payment for supposed ease of use, despite it being the most insecure technology in the world. How does that make sense when you are trying to crack down on fraud?”

We as humans seek stimulation or have needs that can be at least momentarily fulfilled via many online activities.

Professor Frank Deane

There must be an example of a technology development that is 100 per cent, no-holds-barred, hand-on-heart good for humanity.

Life saving medical implants?

“They can and have failed, causing serious injury, illness or death. And many of them are connected to data receivers or cloud services. Who’s accessing that data?”

Elderly people using tablets to video call family, overcoming isolation?

“Many don’t have the media literacy to avoid email scams or to understand how in-app purchases work.”

The ability to communicate wherever, whenever, with whomever?

“If you can broadcast out, someone, with the right tools, can peer in.”

Greater access to a world of knowledge via the internet?

“During the US presidential election, 19 million bot accounts were tweeting in support of either Donald Trump or Hillary Clinton, working to sway public opinion and spreading fake news stories.”

The human factor

The element missing from the discussion thus far is the human. A human hell-bent on causing grief can take any device or tool and turn it to destructive use. The problems – social media addiction, for example – are traced back to the human.

It’s not the widget that makes you narcissistic, anxious, addicted – or whatever ailment you choose. It’s that the human user is narcissistic, anxious or prone to addiction with or without the widget.

“Much of this is tapping into natural human processes,” explains Professor Frank Deane from UOW’s School of Psychology. “We as humans seek stimulation or have needs that can be at least momentarily fulfilled via many online activities.

“When we find what we want we get a little pleasure hit, for want of a better word, and we seek more of it. We’re programmed to do that and in many cases that process can be used to reinforce positive behaviours.”

Professor Deane says there are subtle technology effects that contribute to conditioning and potentially addiction. The notification sound on a phone can itself have the desired effect. “When you hear that sound, that signals that you may receive a small reinforcement, such as hearing from a friend or getting the latest gossip.

“From a research point of view, we’re only just beginning to try to understand the real effect on our social relationships and long-term, we still don’t really know what those effects are. This is in part because it’s such a rapidly changing environment.”

Yet, technology is almost inseparable from our daily lives, and forgoing the convenience seems an unnecessarily backward step. Professor Michael says that we as users of technology need to take responsibility. Control the tool, don’t let the tool control you, she urges.

“We need to become aware of the potential consequences, firstly, and responsible for ourselves and our behaviour,” she says. “We need to set limits in all facets of our life. Most of this is nothing new. If I’m overeating, I’ve got to eat less. If I drink, I can’t drive because I could harm others.”

Street artist Banksy explored the connection between technology and life in his mural dubbed ‘Mobile Lovers’.

Control the tool, don’t let the tool control you

Professor Michael uses technology every day, teaches technology at university, researches technology, edits research-based publications about technology – yet is regularly pigeon-holed as a naysayer, a harbinger of technological doom.

“The first reaction is silence after I speak,” she says with a laugh, revelling in the idea of disrupting the popular discourse. “Very few approach me immediately after I deliver presentations but by the afternoon, it’s sunk in. You can see people are really thinking about it and continue the conversation well after the conference is over.”

She subscribes to the view of “cautious optimism” espoused by Ann Cavoukian, the former Information and Privacy Commissioner for Ontario, Canada.

“We cannot – nor should we wish to – stop the march of time,” Professor Michael says. “With possibility comes responsibility: this is the paradox of technological potential. It’s possible to develop technology that is privacy enhancing and celebrates humanity.

“We have to be optimistic no matter how dystopian these shows and movies are, no matter how apocalyptic some of these things might seem. However dark things might sometimes appear, hope must never be diminished.

“We have to have belief in the ability of the human spirit to triumph over unintended consequences.”
