From techno-optimism to techno-realism: What it means to innovate responsibly
As part of Grace Hopper Celebration week, I shared a few things I’ve learned over the years about how to innovate responsibly when building for a diverse and global community. Here’s an adapted excerpt from my keynote.
Optimism is crucial to innovation; you can’t innovate sustainably without it. But over the years, we as an industry have learned, sometimes the hard way, that unbridled techno-optimism — the belief that technological advancements are inevitably net positive for the world — isn’t a healthy approach to responsible innovation.
That being said, it’s also not productive to go to the other end of the spectrum toward extreme techno-pessimism; humans have a habit of rejecting new technology because it represents unsettling change, even when it ends up being net positive in hindsight.
As is true with most things in life, the truth lies somewhere in the middle. We need to balance that deep well of optimism with the right levels of skepticism and pessimism to achieve what I believe is the path toward sustained responsible innovation. We need to move from techno-optimism to techno-realism.
I still believe in technology’s profound capacity to create good in the world, but here’s what’s changed: I no longer believe that good is inevitable, and I also understand that it is not evenly felt across all communities.
The seeds of my techno-realism were planted in me long ago by my graduate school mentor, Red Burns, who founded the Interactive Telecommunications Program at NYU. Red fundamentally believed that technology should empower people and be developed collaboratively with them. That it must be done by leveraging a diverse set of perspectives. And that we should see technology not as an end in and of itself, but as a means to lift people up. One of her standard axioms was that we shouldn’t look at the world as a market, but rather as a place that people live in. We are designing for people, not machines.
Red passed away a few years ago, but her notions around human-centered responsibility and techno-realism still inspire me to this day. Responsible Innovation is complicated work, especially on global platforms like Facebook, where we are looking to support and protect an incredibly diverse community of nearly 4 billion people. Because we all want to ensure that tech is a force for good in the world, we should be sharing all we can, to help each other best support the communities we serve.
That’s why I want to share a couple of important lessons I’ve learned on key ways we can all broaden our viewpoints as technologists and work toward a more equitable future:
Cultivate societal awareness
A key prerequisite to designing for social platforms is proactively cultivating societal awareness, in ourselves and in our work — that is, an understanding of how preexisting prejudices, biases, and systems of oppression can make their way onto tech platforms. Sadly, humans do not check these things at the door when we log in to an app. What we design and build and put out into the world may not create these dynamics, but it may unintentionally reinforce or amplify them.
Here’s an example: In designing new voices for Facebook Assistant, the product team engaged with our Responsible Innovation team to consider the potential unintended consequences of assigning names and gender to voice assistant products.
As you probably know, most voice assistants on the market are stereotypically “female” by default, and this is because research shows that, often, people are more comfortable with a stereotypical female voice in the role of an assistant. We helped the team question this standard practice and explore how it may also unintentionally reinforce gender stereotypes and even encourage sexism.
Ultimately, the Assistant team decided to roll out four new Assistant voices on Portal and Oculus to give people the option to select from a variety of voices and personalize their experience. While the voices are varied in tone and tenor, they are not assigned a name or gender and are instead labeled by their tonal qualities — for example, “low, warm” or “medium, musical.”
This exercise in reflection early in the design process allowed us to question the status quo: Why should voice assistants be gendered in the first place? Sometimes the responsible approach is to go beyond what some people may prefer or feel comfortable with in the interest of dismantling existing norms that could cause harm. Bringing a societally aware lens to our work can help us understand when we should question common practices to work toward a more equitable future.
At Facebook, we are trying to increase this awareness of bias in our world by educating technologists about the systems of oppression that exist in society. Once you are trained to see these systems, it becomes more intuitive to remember that people are not coming to our platforms with the same set of life experiences, opportunities, or challenges. And ultimately, it’s about using that awareness to make different decisions that result in more equitable and positive outcomes.
[If you’re interested in conducting equity training at your organization, the National Equity Project, an Oakland-based nonprofit, offers resources and training on this and other equity-related topics.]
Expand our view beyond “users”
The term “users” has long been scrutinized, in part because it reduces the identity of humans to being solely about their use of a given product or service. I’ve tried to avoid the term and instead use a fairly obvious alternative: people.
But the bigger issue with the term “users” is that it can limit our purview of the people we feel a responsibility toward when we’re designing and developing new products or technology. It’s so important that we take a broad view of who we are building for and the outcomes that our work might produce. This takes some unlearning, in fact. Many of us were trained to think about the “target users” we intentionally design for. But we need to go beyond considering only the people who use a product and also consider the people who are affected when others use it.
So how can we expand our traditional view of “users”? Investing in UX research is a crucial baseline, but it isn’t sufficient on its own to see the whole picture of impact on people and society. We can also do a more expansive stakeholder analysis to help reveal whole new communities that may have a vested interest in the decisions we make. This practice is common in environmental impact reports, which take into account not just the people who will live and work in a new building being constructed, but the surrounding community, the plants and animals, and the air and water around it.
But there’s a broader reframing of the practice of design that is important to consider as well. We can find ways to go from designing and building for people and communities to designing with them. This isn’t just semantics; it’s called codesign, and it’s a participatory approach where community members are treated as equal collaborators in the design process.
At Facebook, teams are leveraging the practice of codesign in a variety of contexts. One example I love is from our New Product Experimentation team. In its work to develop a product for young activists to connect with friends and seek community involvement, the team wanted to find a way to be more proximate to its audience and move toward a model of shared investment.
So the team reached out to the community and gathered a panel of advisers, whom it meets with in weekly one-on-one and group workshops and pays a meaningful honorarium for their engagement. It also created a channel where advisers can collaborate with one another and with the product team in real time.
These kinds of collaborations push us to not just study and observe communities, but bring them into the development process as true stakeholders.
Technology’s path toward a more equitable future
Right now, our optimism is under siege. Society seems so broken by so many challenges. The environmental crisis. War. Wealth disparity. The sexism, racism, homophobia, transphobia, and many other forms of discrimination that oppress so many members of our human family.
Technology has a crucial role to play in addressing these complex societal challenges, but we need to be very intentional about ensuring that it improves the status quo instead of reinforcing it or making it worse. This requires us to significantly rethink and expand the scope of technologists’ professional responsibility. In a sense, we all need to be functioning as a new kind of digital urban planner. That means enriching the hard skills of coding and designing with a study of history, anthropology, sociology, and psychology, and deeply engaging and collaborating with the communities we serve.
The path forward is not always clear, but one thing I know is essential: women, people of color, and members of other underrepresented groups must have the opportunity to lead us through these challenges. My mentor Red Burns knew this. It’s why she always prioritized diversity when building each class — she knew it would enrich the collaborations that organically formed among the students. She knew it would make us all more globally minded in how we approach the development of technology. And she knew it would cultivate one of the most critical resources any technologist needs to have to innovate responsibly: empathy.
I’m so grateful that my path crossed with Red’s years ago. In honor of Grace Hopper, I want everyone to take a moment and think about a woman who has challenged you or helped you believe in yourself and what you are capable of. It could be a relative, a teacher, a coach, a manager, a colleague, a friend. If the woman you are thinking about right now is still with us today, I encourage you to take time to reach out and thank her. She may not know how she impacted your life.
As I get older, I get more clarity about the things I want to do, the things I want to build, the legacy I want to leave. And a lot of it relates to being more of a techno-realist than I used to be. But as it turns out, I’ve found myself more and more optimistic as I’ve allowed some skepticism and pessimism to creep into my view of technology. And thinking about the community of women and emerging designers everywhere who are pushing the boundaries of where tech can and should go, I can’t help but be hopeful for the future.
This article was originally published on Facebook’s tech@ blog.