Intense AI push is “irresponsible engineering”

The widespread push to develop artificial intelligence products is “irresponsible” because of the enormous amount of physical resources required to support AI, says a leading ethics and computer science scholar.

“The picture is that we're investing a huge amount of our engineering expertise, capital (and) resources into developing this computing that requires a huge amount of material infrastructure to support,” said Batya Friedman. “What I want to argue is that this is an untenable direction for us…. Computation of this sort across the board, at the kind of scale that is being advocated for right now, I want to argue is untenable – it's irresponsible engineering.”

Friedman is a U.S. computer scientist who studies the ethics of technology. She was the guest speaker at a recent Critical Tech Talk, a series hosted by the University of Waterloo’s Critical Media Lab with support from Communitech.

A pioneer in the field of “value sensitive design,” Friedman laid out her thesis for why engineers and other designers of technology should do more at the start of the design process to consider the human and environmental impact.

She introduced the concept with a few straightforward examples, such as the need to consider patients and caregivers when designing health-care technology.

Friedman then introduced more complex examples: first, nuclear energy; second, plastics; and third, artificial intelligence and machine learning.

To properly assess the impacts of such technologies, she said engineers and scientists should consider three key factors that shape the potential harmfulness of a product: structure, scale and time.

Using these factors to assess nuclear energy and its waste products yields a number of clear but disturbing facts: “… it is unsolved deadly residue in which a small amount is extremely harmful and for an exceedingly long period of time, 1,000 years to 24,000 years,” said Friedman.

“What I want to say is that any technology which has that characterization for structure, scale and time, we shouldn't build. It's simply irresponsible technology.”

Applying the same analysis to the production of plastic products leads to a related but somewhat different conclusion, Friedman said.

Structurally, plastics do not decompose and can be harmful – clogging oceans and landfill sites, and breaking down into toxic microplastics that harm living things when ingested. Regarding scale, the global plastics industry has produced over eight billion tons of plastic since the 1950s and continues to produce more than 367 million tons each year.

“If the trends continue, by 2050 the plastic industry could account for 20 per cent of the world's total oil consumption. And (regarding) time, there's no known end; plastic does not decompose. All the plastic that's ever been produced ends up in the environment and is still present there in one form or another.”

But unlike nuclear waste, Friedman argued, there are some important uses for plastics – such as in medical technology – for which there are no substitutes.

“Plastics is a more complicated case,” she said. “I would not want to go so far as to say for those applications for which we don't have other materials, that we stop developing those. Maybe we look for alternatives, but while we still don't have those alternatives perhaps we need to continue (using plastics).”

Friedman then applied her structure, scale and time analysis to artificial intelligence, machine learning and “computing writ large.”

Although we talk about “the cloud” as if it were some sort of ethereal space, Friedman noted that it consists of huge amounts of hardware housed in enormous server farms and data centres. Building this physical infrastructure – and then powering it and keeping it cool – requires extracted minerals, electricity, water and many other resources.

While that may be manageable on a small scale, the cumulative impact of the “massive growth” in AI – all the hardware and energy production to create and run these computing systems – is “untenable,” said Friedman.

“None of this is new news but it’s the magnitude of it all that we need to take into account,” she said. 

Friedman also urged her audience to “think outside of the AI-ML box.”

“We need to stimulate alternative moral and technical imaginations,” she said. “A real lesson to take away from nuclear power is, ‘Don't put all your eggs in one basket.’ We don't want to just develop AI and know how to think within an AI and machine learning frame, because then we won't be in a position to imagine other kinds of solutions.”

This is why value sensitive design is a crucial part of the design process, she said.

“When we hold ourselves accountable to ‘responsible innovation’ we need to be able to do these kinds of analyses and then, on the basis of these analyses, make assessments as to whether or not these are technologies to explore further or to back off from.”

Over the course of her talk, Friedman offered a number of suggestions about how to insert value sensitive design into daily practice and engineering education.

  • Include human values as criteria for evaluating system performance alongside other criteria such as reliability.
  • Think long-term and consider the impact of scale.
  • Include more discussions about “materiality” – the kind of materials used and their impact over time and scale – in the design process, and make it a key part of engineering education.
  • Remember that our planet is finite yet regenerative, and engineer within this constraint.

Perhaps Friedman’s most challenging suggestion was to say “no” to some technology solutions.

“Have the courage not to build, to just say no,” she said. “That's a huge responsibility that we have to say to policymakers and industry and the public – ‘No, this kind of engineering solution is not a good one for our society going forward.’ We need to say no to it, we can build it but it's not taking us in livable directions. That needs to be part of the training and the mindset that we give to our young engineers going forward.”

The comment attracted an observation and a question from Carl Tutton, one of two UW PhD students who were invited to participate in Friedman’s Critical Tech Talk.

Tutton, who is researching end-of-life electronic waste policy, said he often hears design engineers say how hard it is to say “no” to the demands of employers and clients, especially when so much time and money has been invested in existing products such as mobile phones and laptops.

Friedman acknowledged the challenge. But she said the answer lies in using science and research to explore more sustainable options.

“So, ‘use your imaginations’ is what I really want to say,” said Friedman. “That’s saying yes, that's not saying no. It's saying yes to what the alternatives are that you would really like to see.” 