Tech-savvy investigators are ready to put algorithms under the microscope — if companies let them | CBC News

Much as companies already have outsiders review their finances and the security of their computer systems, they might soon do the same with their decision-making code.

In the nascent field of algorithm auditing, researchers evaluate the behaviour of decision-making code

Matthew Braga · CBC News

Decades before you could buy a plane ticket on your phone, there were computerized reservation systems (CRS). These were rudimentary information systems used by travel agents to book customers' flights. And they had one devious flaw.

By the early 1980s, 80 per cent of travel agencies used the systems operated by American and United airlines. And it didn't take long before the two airlines realized they could use that monopoly to their advantage — namely, by writing code designed to prioritize their own flights on CRS screens over those of their competitors.

Naturally, U.S. aviation regulators weren't pleased, and the companies were ordered to cut it out. But the case — described in a 2014 paper from researcher Christian Sandvig — lives on today as one of the earliest examples of algorithmic bias.

It's a reminder that algorithms aren't always as neutral or well-intentioned as their creators might think — or want us to believe — a reality that's more evident today than it's ever been.

In U.S. courts, reports generated by proprietary algorithms are already being factored into sentencing decisions — and some have cast doubt on the accuracy of the results. Sexist training sets have taught image recognition software to associate photos of kitchens with women more than men.

And perhaps most famously, Facebook has been the target of repeated accusations that its platform, which serves content according to complex algorithms, helped amplify the spread of fake news and disinformation, potentially influencing the outcome of the 2016 U.S. presidential election.

Yet, given the important role algorithms play in so many parts of our lives, we know incredibly little about how these systems work. It's why a growing number of academics have established a nascent field for algorithmic audits. Much like companies already have outsiders review their finances and the security of their computer systems, they might soon do the same with their decision-making code.

Algorithmic auditors

For now, it's mostly researchers operating on their own, devising ways to poke and prod at popular software and services from the outside — varying the inputs in an effort to find evidence of discrimination, bias or other flaws in what comes out.
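
This kind of outside-in probing can be sketched in a few lines of code. The example below is a purely hypothetical illustration, not any real auditing tool: the `hypothetical_credit_score` function stands in for whatever opaque system is under audit, and the audit simply varies one attribute while holding every other input fixed, then compares the outputs.

```python
# A minimal sketch of a black-box "input perturbation" audit.
# Everything here is hypothetical: the model, the attribute names
# and the planted flaw exist only to show the technique.

def hypothetical_credit_score(applicant):
    """Stand-in for an opaque decision system being audited."""
    score = 600 + applicant["income"] // 1000
    # A hidden flaw of the kind auditors look for: the score
    # quietly depends on where the applicant lives.
    if applicant["postal_prefix"] == "M5":
        score += 25
    return score

def audit_attribute(model, base_applicant, attribute, values):
    """Query the model once per value of a single attribute,
    holding all other inputs constant, and collect the outputs."""
    results = {}
    for value in values:
        probe = dict(base_applicant)   # copy, so the base stays fixed
        probe[attribute] = value
        results[value] = model(probe)
    return results

base = {"income": 50000, "postal_prefix": "M5"}
outcomes = audit_attribute(hypothetical_credit_score, base,
                           "postal_prefix", ["M5", "H2", "V6"])
spread = max(outcomes.values()) - min(outcomes.values())
print(outcomes)  # differing outputs reveal the attribute's influence
print(spread)
```

If the attribute genuinely doesn't matter to the system, the spread should be zero; any systematic difference is a lead worth investigating — which is essentially what researchers do, at scale, with real services.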

Some of the field's experts envision a future where crack teams of researchers are called in by companies — or perhaps on the order of a regulator or judge — to more thoroughly evaluate how a particular algorithm behaves.

There are signs this day is fast approaching.

Last year, the White House called on companies to evaluate their algorithms for bias and fairness through audits and external tests. In Europe, algorithmic decisions believed to have been made in error or unfairly may soon be subject to a "right to explanation" — though how exactly this will work in practice is not yet clear.

A Harvard project called VerifAI is in the early stages of defining "the technical and legal foundations necessary to establish a due process framework for auditing and improving decisions made by artificial intelligence systems as they evolve over time."

Harvard is one of a handful of schools — including Oxford and Northwestern — with researchers studying algorithmic audits, and a new conference devoted to the subject will kick off in New York next year.

Outside academia, consulting giant Deloitte now has a team that advises clients on how they can manage "algorithmic risks." And mathematician Cathy O'Neil launched an independent algorithm consultancy of her own last year, pledging "to set rigorous standards for the new field of algorithmic auditing."

Scrutinizing secret code

All of this is happening amidst rising political backlash against some of the most powerful tech companies in the world, whose opaque algorithms increasingly shape what we read and how we communicate online with little external scrutiny.

One of the challenges, says Solon Barocas, who researches accountability in automated decision-making at Cornell University, will be determining what, exactly, to scrutinize and how. Tech companies aren't regulated the same way as other industries, and the mechanisms that are already used to evaluate discrimination and bias in areas such as hiring or credit may not easily apply to the decisions that, say, a personalization or recommendation engine makes.

And in the absence of oversight, there's also the challenge of convincing companies there's value in letting in algorithmic auditors. O'Neil, the mathematician and a well-known figure in the field, says her consulting firm has no signed clients — "yet."

Barocas thinks companies "actually fear putting themselves in greater risk by doing these kinds of tests." He suggests some companies may prefer to keep themselves — and their users — in the dark by not auditing their systems, rather than discover a bias they don't know how to fix.

But whether companies choose to embrace external audits or not, greater scrutiny may be inevitable. Secret and unknowable code governs more parts of our lives with each passing day. When Facebook has the power to potentially influence an election, it's not surprising that a growing number of outside observers want to better understand how these systems work, and why they make the decisions they do.

ABOUT THE AUTHOR

Matthew Braga

Senior Technology Reporter

Matthew Braga is the senior technology reporter for CBC News, where he covers stories about how data is collected, used, and shared. You can contact him via email at matthew.braga@cbc.ca. For particularly sensitive messages or documents, consider using Secure Drop, an anonymous, confidential system for sharing encrypted information with CBC News.

Related Stories

  • Facebook announces staffing increase as it turns over thousands of Russia-linked ads to U.S. Congress
  • Twitter hands over ads from Russian TV network to congressional investigators
  • Analysis When algorithms go bad: Online failures show humans are still needed
  • When do Canadian spies disclose the software flaws they find? There's a policy, but few details