I never thought I’d hear a CEO talk about “technological determinism” on the main stage, but Tim Cook uttered those very words last week at IAPP’s Global Privacy Summit held in Washington, DC.
This is how Cook opened his keynote: “But we know, too, that technology is neither inherently good nor inherently bad; it is what we make of it. It is a mirror that reflects the ambitions and intentions of the people who use it, the people who build it, and the people who regulate it.”
Cook’s statement evokes historian Melvin Kranzberg’s famous first law of technology: “Technology is neither good nor bad; nor is it neutral.” But Cook stopped short of that important last turn.
As I’ve been interviewing folks in the emerging responsible and ethical technology field, the question of tech’s neutrality hangs over every discussion. Companies like Thoughtworks lead with the premise that “responsible technology begins with acknowledging that technology is not inherently neutral.”
Even Instagram’s Adam Mosseri has admitted, “We’re not neutral. No platform is neutral. We all have values, and those values influence the decisions we make.”
Engineers and tech leaders talking about how technology is not neutral represents a significant shift in thinking in the industry
This framing draws on decades of scholarship that acknowledges humans’ role in shaping the development, application, and uses of technology.
For the policy and privacy professionals in Cook’s IAPP audience, this is a familiar claim — they’re tasked with regulating data’s collection and use to mitigate potential harms to consumers.
For technologists and engineers, the idea that data and algorithms embed values has been harder to swallow. Historically, such arguments have been countered with claims that “data is objective,” “algorithms are just math,” and “technology is just a tool.”
But in recent years, former Google engineers like Tristan Harris have advocated for humane technology based on the idea that “we shape technology, and technology shapes us.” And this shift isn’t just a response to concerns about social media and big tech.
Companies like VMware are tying digital ethics to their environmental, social, and governance efforts. They recognize that “tech can’t be neutral because it’s created by people. And people aren’t neutral. Acknowledging this is the first step in our industry getting real about what we owe a world we’re fundamentally reshaping by the day.”
Data and AI changed the conversation
With the rise of big data and the artificial intelligence that followed, research and frameworks addressing the ethical use of AI have revealed just how many human inputs go into a purportedly autonomous system.
How might training data perpetuate historical systemic biases? What outcomes are your models optimizing for? Who gets to determine the conditions of fairness? What began as a wave of concern over AI’s unchecked autonomy has uncovered all the human decisions and design choices at every step of the way. That has helped more people see how all technologies, including data itself, embed human values and assumptions and therefore shape or constrain behavior through the affordances they create.
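To make one of those questions concrete, here is a minimal, hypothetical sketch in Python of how even a “fairness check” forces a human choice. The column names, group labels, and the 0.8 ratio threshold are illustrative assumptions for this post, not an industry standard.

```python
# Hypothetical sketch: even a "fairness check" encodes human choices.
# Column names, group labels, and the 0.8 threshold are illustrative
# assumptions, not a standard.

def selection_rate(records, group):
    """Share of applicants in `group` that the model approved."""
    in_group = [r for r in records if r["group"] == group]
    return sum(r["approved"] for r in in_group) / len(in_group)

def passes_demographic_parity(records, groups, ratio_threshold=0.8):
    """One possible fairness criterion: approval rates across groups must
    fall within `ratio_threshold` of one another. Picking this criterion
    (and this threshold) over alternatives is a value judgment, not math."""
    rates = [selection_rate(records, g) for g in groups]
    return min(rates) / max(rates) >= ratio_threshold

# Toy, made-up loan decisions scored by a model.
decisions = [
    {"group": "A", "approved": 1}, {"group": "A", "approved": 1},
    {"group": "A", "approved": 0}, {"group": "B", "approved": 1},
    {"group": "B", "approved": 0}, {"group": "B", "approved": 0},
]
print(passes_demographic_parity(decisions, groups=["A", "B"]))  # False: 2/3 vs. 1/3
```

Swap demographic parity for equalized odds or calibration and the verdict can flip; which definition of fairness a team adopts, and who gets to decide, is exactly the kind of human input those questions point to.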
If technology isn’t neutral, who’s responsible for its impacts?
Tim Cook rallied around this call to action: “Those of us who create technology and make the rules that govern it have a profound responsibility to the people we serve. Let us embrace that responsibility.”
That echoes positions we’ve heard repeatedly throughout our research. Industry leaders have begun articulating principles and values, standing up review boards, and appointing C-suite executives to deliberately acknowledge and account for that responsibility.
Responsible technology practices make values explicit
Whether you’re building a data strategy, designing an enterprise architecture, or determining which KPIs and objectives and key results (OKRs) to measure and optimize against, every element of technology is laden with human decisions and values.
What responsible and ethical tech efforts attempt to do is articulate those values and assumptions, make deliberate and informed decisions, and account for their impacts on every stakeholder.
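As a small illustration of what making values explicit can look like in practice, here is a hypothetical Python sketch of a composite objective. The metric names and weights are invented; the point is simply that writing the weights down turns implicit values into something a team can review and debate.

```python
# Hypothetical sketch: an optimization target with its value judgments written down.
# Metric names and weights are invented for illustration.
OBJECTIVE_WEIGHTS = {
    "revenue_per_session": 0.5,   # business value
    "time_well_spent": 0.3,       # user-wellbeing proxy: a deliberate value choice
    "complaint_rate": -0.2,       # penalize harm signals instead of ignoring them
}

def composite_score(metrics):
    """Weighted KPI: every weight here is a human decision that can be
    reviewed, debated, and changed, rather than an unstated default."""
    return sum(w * metrics.get(name, 0.0) for name, w in OBJECTIVE_WEIGHTS.items())

print(composite_score({"revenue_per_session": 1.2,
                       "time_well_spent": 0.8,
                       "complaint_rate": 0.1}))  # 0.82
```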
Technology is not neutral
We make bold claims at Forrester, and this one lays the foundation for an awakening that is driving tech firms and tech executives to examine their responsibilities in the design, development, deployment, and use of technology. How is your organization approaching responsible and ethical technology practices? Reach out to me at [email protected] to contribute to this ongoing research.
First published on the Forrester blog.