Shannon Vallor: Technology and AI are human all the way down – Digitalmunition





Published on August 29th, 2020


Shannon Vallor: Technology and AI are human all the way down

Professor Shannon Vallor explains why she wants to fuse the best of hi-tech advances and ethics to create “a future worth wanting.”

Saturday, 29th August 2020, 11:34 am

Updated Saturday, 29th August 2020, 11:36 am

We must avoid technological determinism – the idea that technology leads and society merely follows.

That’s a lie, and a convenient evasion of responsibility for those building their values into these technologies.


Technology and Artificial Intelligence are human all the way down – built to promote, optimise or systematise, to create power and realise specific values in the world.

Humans are the creators of this technology and we need to ensure that human accountability isn’t lost.

My work at the Centre for Technomoral Futures (part of Edinburgh Futures Institute) starts with the premise that technology and morality, or technology and ethics, are intimately related.

That’s because all technologies reflect and enable human powers, choices and values.

So we have to reject the artificial, damaging split between technology and society. Doing technology right is no different to doing society right. Technology does not live outside our social world; it’s interwoven.

I want to figure out, using a blend of data-driven and humanistic tools, which forms of expertise – technological and moral – can design and manage systems and ways of living that work better for people, to build better futures.

It’s not the tools themselves that can build those better futures, it’s people and the moral and social intelligence they use.

To help achieve that, we need to reunite forms of expertise currently cleaved off from each other in universities and encouraged to develop in a relationship of antagonism.

At present, we see ethicists telling technologists where they have gone wrong, or technologists telling ethicists that they are deluded or irrelevant.

How can we intervene earlier, create something which isn’t a battle between technology and ethics but something truly collaborative?

I want to take the energy and desire out there and give them a path to action at the Centre – to support people who want to use their technical and moral intelligence together, and to bring both more fully into their work.

How can we use data and AI in socially and politically constructive ways to build systems and institutions that actually support people?

The digital environments we have built are not conducive to the kind of community, democratic structures and types of leadership most of us want for our futures. We have to address their systemic harms, such as disinformation on a huge scale, which corrodes the social virtue of honesty as respect for truth.

What digital environments, platforms, processes and systems do we need to enable a future worth wanting, where we can flourish together?

Humanity needs to make progress in step with technological progress – and the clock is ticking ever faster – to make those transformational changes to the way technology and other elements of society interact.

Science and technology should be unleashing human opportunities at every turn to allow us to be sustainable and flourishing. Yet for many people on this planet, their opportunities to create new and better ways of life are shrinking thanks to political and environmental destruction. That’s a fundamental crisis.

We have to move quickly to use our technological and moral intelligence to remove obstacles to a sustainable and flourishing future. The window of opportunity is here, but will not be open indefinitely.

Professor Shannon Vallor is Baillie Gifford Chair of Ethics in Data and AI at the University of Edinburgh
