AI and Collective Decision-Making Processes, with Katherine Milligan

April 5, 2022

In this "Artificial Intelligence & Equality" podcast, Senior Fellow Anja Kaspersen and Katherine Milligan, director of the Collective Change Lab, explore what we can learn from the social impact and entrepreneurship movement about governing the potential impact of AI systems. What are systems change and collective sense-making? Why do they matter for rethinking ethics in the information age?

ANJA KASPERSEN: I'm thrilled to welcome a dear friend today whose work I have been following with great interest for many years. Katherine Milligan is a director of the Collective Change Lab and an adjunct faculty member at both the Graduate Institute of International and Development Studies in Geneva and Fordham University in New York, where she teaches undergraduate and graduate-level courses on social innovation and social entrepreneurship. Katherine is regarded as one of the pioneers in social entrepreneurship. Previously she was the executive director of the Schwab Foundation for Social Entrepreneurship, the sister organization of the World Economic Forum, which supports the largest community of late-stage social entrepreneurs in the world.

Katherine, it is such a delight to share this virtual stage with you to talk about social impact and hopefully during the course of our conversation to demonstrate why understanding systems change at a deeper level and learning from the social entrepreneurship ecosystem is important for any discussion on grappling with the societal and ethical implications of embedding AI systems into our day-to-day-lives.

KATHERINE MILLIGAN: Thank you so much for having me.

ANJA KASPERSEN: To get us started, Katherine, could you quickly explain what the Collective Change Lab is?

KATHERINE MILLIGAN: Yes, I would be happy to. The Collective Change Lab is a cross between a think tank and a capacity builder. Our goal is to catalyze a global movement towards more transformational ways of doing the work of systems change, and the way we go about doing that is really twofold: Research and then storytelling—how do we tell stories that are more reflective of how systems change actually happens? We do that in deep partnership with our practitioner partners; and then capacity building—how do we work with practitioners? By "practitioners" I mean collective impact leaders, social innovators, and collective change leaders. How do we work with practitioners to help them deepen their own practice in community with other social change leaders from other geographies and working in other systems on a variety of topics but really always focused on the how? No one person, none of us individually, has all the answers, so what we do in communities of practice is facilitate the emergence of answers together through collective sense-making processes.

ANJA KASPERSEN: Thank you, Katherine, for explaining what the Collective Change Lab is all about.

Now let's spend a little bit of time on your personal and professional journey for our listeners to get to know you a bit better. I know you have been working in this space of social entrepreneurship, social impacts, and systems change that you just alluded to for more than two decades. I am wondering where and when did it all start. What sparked your interest initially? If you do not mind, it would also be helpful if you could define what "social entrepreneurship" is and maybe provide even a few examples for our listeners to benefit from.

KATHERINE MILLIGAN: My personal journey into the world of social entrepreneurship I can trace back even to my undergrad university years. I had some formative experiences on study abroad programs in Kenya and in Europe. I also had an incredible mentor and teacher in Dana Meadows, a systems dynamics expert, who guided me, who was always available for my questions, and who helped me to navigate into my first job out of university. Collectively those experiences awakened me to both the profound global inequities facing our world as well as the beauty of the diversity in our global human family. At the end of my senior year, when the recruiters—the investment banks and professional service firms—came onto campus, I knew that was not the path for me.

But what is the path? What was the path then, 20 years ago? There were not well-worn social impact career paths, not like today, in social innovation, corporate social responsibility, and social impact. Those things were not even taught in universities back then, and there certainly were not specialized firms like Devex and PCDN to help young professionals find their way into social impact careers.

I carved my path through trial and error, or zigging and zagging as I sometimes like to think of it. I had experience in non-governmental organizations and public sector agencies in Washington, DC. I went abroad and worked for a grassroots community-led development organization in West Africa. There I got a close-up view of how the aid sector works, and there are a lot of dysfunctional power dynamics in the aid sector, as in every sector.

That led me to study public policy at Harvard. I was really interested in the nexus of trade and development, particularly how global trade rules affected smallholder farmers in Africa, where I lived. I didn't have the language for it at the time, but in hindsight what I realize now is that I was studying a system, and that again was a formative experience for me. It taught me a few things.

I spent a year traveling around on a post-Master's scholarship and did more than 200 interviews. I went into that year with a whole bunch of assumptions and hypotheses, and I walked out of that year with a much deeper and more nuanced understanding of the realities. Those learnings too I have carried with me in the remainder of my career, first how from the outside it is so easy to categorize people, like the "good guys" and the "bad actors." We love to oversimplify things in our mind, but the truth is rarely so simple. That experience helped me hold together multiple truths and stitch together perspectives to see the whole.

The other thing that it taught me—and this I see as a thread running throughout my career ever since—is that many of our most complex problems defy simple technical solutions. Our systems are complex, dynamic, and constantly adapting. Even knowing where to intervene in a system isn't necessarily straightforward, and the effects of an intervention, those unintended consequences, can really surprise us. Those are learnings that I have carried forward with me over two decades, even as I realized that becoming highly specialized in international trade rules was not my path, and so I zigged or zagged again, whichever one it was, into the world of social entrepreneurship, and I have been in love with it ever since.

To answer your question, Anja, about what social entrepreneurship is, there is no one definition, but the one that resonates the most with me is one that I was part of co-creating about a decade ago with impact investors, social entrepreneurs, scholars, and academics, again stitching together multiple perspectives, to come up with a definition that would resonate, and it is: "The application of innovative, pragmatic, market-based, and sustainable solutions to social and environmental problems with an emphasis on low-income, vulnerable, or marginalized populations." In other words, social entrepreneurship is about changing the status quo and designing models that are sustainable over time and that have a big, scalable impact on a large group of people, particularly underserved customers and groups.

A couple of examples: Bioregional is a social enterprise that was founded around the year 2000 in the United Kingdom. At that time they teamed up with city planners to create the UK's first large-scale mixed-use sustainable community, and they did it on a plot of land that had once been used for sewage sludge. The housing community that they built, called BedZED, was designed to be free of fossil fuel consumption. It radically reduced residents' climate emissions, and they also used locally sourced or reclaimed building materials, water-saving appliances, etc. That is Bioregional rolling up their sleeves and learning how to do it: How are different, sustainable communities possible? Through that learning process they created and codified the "One Planet Living" principles, and they received numerous awards. To this day they have delegations visiting from around the world.

Then they faced a decision: How do they replicate that model globally? Rather than deciding to build a second BedZED and a third and a fourth and maybe expand to other markets, they actually embarked upon a strategy to integrate One Planet Living principles into the projects of major real estate developers. In the two decades since their creation they have advised everyone from municipal authorities to multinationals to real estate developers—including Disney Paris—on about $30 billion of housing developments and commercial real estate projects, designing a novel and innovative approach and then thinking about how it actually shifts the status quo and has a much more scalable impact on a large population.

A second example that feels very pertinent today is REFUNITE. This social enterprise was founded by two Danish brothers who are very close. During their travels they met a young man in Africa who had fled civil war and been separated from his brother. He did not even know if his brother was still alive.

The Mikkelsen brothers, the co-founders of REFUNITE, knew how inconceivably painful it would be had they themselves been separated by such a horrific event, not even knowing whether the other was still alive, so they decided to help this young man find his brother. That took them into the bizarre, byzantine process of family reunification procedures in the UN system: You fill out a paper form and provide identifying information, which, if you are fleeing for your life, you might not have on your person for obvious reasons. Then maybe your family members in another refugee camp somewhere else will fill out their forms, and just maybe, sorting through all of this paperwork, the United Nations will be able to make a match. Again, going back to social entrepreneurship, this is an innovative, pragmatic approach to solving an old social problem.

REFUNITE works with more than 20 mobile operators and tech companies. Think about them as sort of a hybrid between the worlds of technology, business, and a nonprofit. They are a 501(c)(3), and what they did that was so disruptive is they created an anonymous service powered by artificial intelligence (AI) and voice recognition that is now the world's largest missing persons platform for refugees and forcibly displaced populations. It has more than 1 million registered users. They partner with telecoms in countries with the largest refugee populations to create awareness about REFUNITE via SMS, and the service itself is free.

Over the years they also began engaging with the United Nations and worked with them to shift the status quo of how refugees are processed and how family members are reunited. So, from a few hundred, maybe a few thousand, family members reunited per year, REFUNITE has reconnected 50,000 family members since its creation. Those are just a couple of examples of what those principles look like in practice.

ANJA KASPERSEN: Thank you for sharing those powerful stories of social entrepreneurship. You mentioned using technology in some of these applications. How, in a social entrepreneurship type of setting, would you go about addressing the ethical considerations?

KATHERINE MILLIGAN: Technology plays a vital role for many social sector organizations. It significantly drives down costs. It can enable real-time or close-to-real-time decision-making on everything from what is happening with homelessness in a certain city to how to improve patient adherence to HIV medication. It can also be put in the service of refugees, as we saw with REFUNITE, or provide access to remote communities that was not previously possible, as in the case of Zipline in Rwanda delivering blood and other emergency medical supplies by drone.

Just as importantly, technology can facilitate communication and collaboration across organizations, which is essential because this is one of the grand challenges of our day. No single organization can "solve" a social problem. They all require collaboration across organizational boundaries, whatever system we're working in—the health care system, the food system, or education.

Yet, as Jim Fruchterman, a veteran social entrepreneur whom I respect greatly and the founder and CEO of Tech Matters, commented, "The social sector is decades behind on tech infrastructure." That is partly because many social sector organizations are starved of the general operating funds out of which they could make those investments, even though social change leaders themselves want to deploy software and data to make their organizations more efficient and effective, and they are actively exploring the role of tech like AI, machine learning, and blockchain to support the communities they serve.

Technology can play a supporting role, but technology alone won't get us there. Ultimately, when we talk about systems and the work of systems change, we need to hold a fundamental truth front of mind, which is that systems are made up of people. Sometimes we lose sight of that truth. So ultimately what we need to do, if we want to make our systems work more inclusively for everyone, is to support people working in those systems to be able to change in fundamentally consciousness-altering ways, and unless we do that, the systems that they are a part of aren't really going to significantly change either.

ANJA KASPERSEN: I would like to repeat one sentence you said which I find very important: "Sometimes we lose sight of the simple truth about systems. They are made up of people." This is such an important point, especially when discussing the impact of AI and algorithmic technologies and how systems in our day-to-day life change as a result, and too often a debate in the technology space and the AI research space is presented as if the technology will somehow evolve on its own separate trajectory, which—as you argue as well in several of your articles—is not the case. It is human all the way.

To borrow another phrase from your work, "Systems do not transform until the people in them do." Can you explain what you mean by this statement? I would also like your view on systems change, a concept that, as many of our listeners will be familiar with, has been loved and hated in equal parts in social science for many years. Why, in your view, has it come to the fore in recent years, and why is it relevant, I would gather, both to any discussion of providing leadership and ethical oversight over the embedding of new technologies and to how you scale and generate social change through entrepreneurship?

KATHERINE MILLIGAN: You talked about this shift. Let's start there. It is true. Of course the concept has been around since the 1960s. This isn't new, and yet I would say that within the last ten years or so many of the major institutions that support intermediary organizations in the social innovation/social entrepreneurship ecosystem have really galvanized around this term. There is a lot of energy, and many different actors are coalescing around it.

What is it? How to interpret it, how to define it? Again, lots of definitions out there. The Academy for Systems Change, which is of course a reference for all practitioners, has a very academic definition of systems change as "altering the rules and standards that make a system work the way it does as well as the goals, norms, and beliefs that, if left unchallenged, can prevent systems from working more inclusively." They go on to say that that involves deep shifts in mental models, relationships, and taken-for-granted ways of operating. For the wonky among us, that's fantastic. For the social change leaders and founder CEOs and entrepreneurs that I know, it's a little hard to wrap your mind around that.

Another definition of systems change that I love because I think it's very simple, accessible, and evocative, comes from Social Innovation Generation in Canada, and that definition is: "Systems change is shifting the conditions holding a problem in place." That is very useful and very evocative, this notion that problems are being "held in place" by conditions. What are those conditions?

A few years ago one of my colleagues, John Kania, set out with a couple of his collaborators, Peter Senge and Mark Kramer, to articulate those conditions in a seminal paper called "The Water of Systems Change." I would encourage listeners to read that paper. I can't tell you how many times I have heard from social change leaders that that framework has really helped to inform their strategy or give them a language for things that they didn't have a language for.

We don't have time to go into the details here, but at a high level I do think it's useful to just touch on what each of those conditions is. It is sometimes referred to as the "six conditions" framework. They are: policies, practices, and resource flows; relationships, connections, and power dynamics; and mental models. It is often drawn as an upside-down triangle, or sometimes it is shown in the iceberg model. The point that I would like to illustrate for the listeners is that most of the practitioners I know focus all of their time and energy on shifting the structural conditions of systems: How do we alter resource flows? What about new regulations? "If only we achieve this legislative victory or this policy reform." There are a lot of reasons for that—because it's quantifiable, because we can see it and measure it, because it's the key performance indicators our funders and our stakeholders are asking for, and because we also know how to do that work, that mobilization and those efforts.

But power and relationships? That is much harder to translate into an organizational strategy or into a set of activities. How is that someone's job description, to work on building relationships between all of the actors in a system, or to think about how you are transforming power imbalances in dealing with the issue of homelessness, for example, or in the criminal justice system? That second level of systems is visible to some—power dynamics are certainly seen by those who have the least power—but it is not visible to everyone, and it is unclear how it makes a system work the way it does.

That last level of systems, that deepest level, is mental models. These are really our worldviews, our assumptions about the root causes of social or environmental problems and about how we make progress in addressing them. They are always there. They are operating in the background. Rarely do they actually enter our conversations or our strategy discussions with our funders and with our partners.

The point with this framework is that all six conditions are interlinked and mutually reinforcing. If you think about that for a minute, you know intuitively that it is true. A shift in one can catalyze a shift in another. A shift in legislators' mental models can trigger a policy reform, or a change in power dynamics can alter how resources are allocated. We can all think of examples of this from our lives. Yet those of us working to shift systems to make them more inclusive and more just don't often work at all of these levels simultaneously: at the structural level and at the deeper, so-called "cultural" levels of systems, namely power, relationships, and mental models.

ANJA KASPERSEN: One of the issues you mentioned was power and power dynamics. There are many initiatives focused on the ethics of building and deploying AI systems, which also include coming to grips with how AI not only represents power and the powers that be but also holds the potential to radically alter power dynamics, what it means to be human in the information age, and decision-making transparency, and with how to make sure that we build and educate leadership, especially on the regulatory side. What would be your thoughts on power dynamics, and have you looked specifically at how technology can be a bit of a game changer in power dynamics in ways that go beyond the regular models of systems change?

KATHERINE MILLIGAN: I think one of the big issues for me is around agency and voice. Again, there are lots of different definitions of power, but one interpretation is the power to set the agenda, the power to make decisions, and the power to shape how the narrative is told. I have seen over and over again very well-meaning and well-intentioned efforts that go into kind of consultation mode with a community to engage the community voice. Conducting a focus group is not authentic engagement of community and voice. For me this really comes down to decision-making and not creating for but creating with, and there is a huge difference there in how you do that. Again, it is human all the way. You said it earlier, Anja.

What I mean by that is that you don't parachute in and do a two-hour focus group and, great, we've ticked that box, we have this perspective, and we can parachute out again. Why would a community that has been researched, where outsiders have come and gone and people have been traumatized and re-traumatized, have any relationship of trust with the consulting body or entity? I think this gets lost a lot of times: the notion of doing the deep relational work, really building relationships where people feel safe and can be vulnerable, where they can share their lived experience and their perspectives because there is so much wisdom in them, but shared in a way that is not re-triggering or re-traumatizing.

That is a difficult skill set, to be able to do that, to be able to hold that space, build that trust—"This work moves at the speed of trust" is the cliché, but it's very true—and then together in co-creation mode facilitate the emergence of answers together with people who are impacted by the problem, who are experiencing what it means to be at the blunt end of that system on a daily basis, and who have lots of ideas, creativity, wisdom, and experience to bring to bear on how to make systems work better for everyone.

ANJA KASPERSEN: It reminds me of Virginia Dignum, who is a professor of AI and a very well-recognized AI researcher. She recently published a paper about relational AI, so looking at the power dynamics more through a relational lens, and what was interesting in her paper was she applies the word Ubuntu to how to do more responsible AI research and responsible innovation.

Then I found this preface that was written by Nelson Mandela in a book called Mandela's Way: Fifteen Lessons on Life, Love, and Courage, which encapsulates many of his interpretations of Ubuntu, where he speaks about it as an African concept that means: "The profound sense that we are human only through the humanity of others, that if we are to accomplish anything in this world it will in equal measure be due to the work and achievements of others." Some of what you just said brings back this notion of Ubuntu, this collectiveness, which I also heard you refer to initially when you were explaining what the Collective Change Lab is all about.

KATHERINE MILLIGAN: Beautifully expressed. Let's draw the link from that beautiful expression and interpretation of Ubuntu to the systems change framework that we were talking about earlier. There is something around this: if we acknowledge that all of these conditions are interdependent and mutually reinforcing, how do we do the work of shifting systems towards equity and justice at those deeper levels of systems? How do we build relationships across power divides and between different actors in the system who only see their small piece and maybe even feel territorial, or that they own the issue and don't want to collaborate with the other actors working in that same space? And what about our own mental models? How do you do that work?

I don't have the answer. I don't know that any one of us has the answer, but I think the practitioners who are experimenting with transformative practices are bringing people in the system into deep relationship together, building a container for sometimes safe conversations, sometimes also really difficult conversations, conversations that surface tension and past painful experiences and hardship, in ways that help us feel a deep sense of connection and a sense of our shared humanity. I think that's when transformation becomes possible.

When we are not solely in the realm of the technical and analytical ("Let us look at this data set") but in deep relationship with other humans, when we feel a sense of connection to them and go through a shared experience together, new ideas, creativity, and energy can flow from that in a way that merely transactional business relationships cannot. Think about the way we typically collaborate, the partnerships we form so that we can all issue the press release and the bosses are happy. That is really what we're trying to advocate for at the Collective Change Lab. We are not going to get to more radical outcomes unless we start with more radical practices, more radical ways of working together, and by "radical" we simply mean being in deep relationship with each other and creating experiences and conversations that help us connect to our shared humanity.

ANJA KASPERSEN: Do you think that this newfound realization of the value of deep engagement with each other, those deep relationships, actually comes out of the alienation that technology has in some way brought to us? In recent podcasts we have been discussing: Can you code gut feelings? Can you actually train a machine to identify human emotions? Can you code empathy, and what is the difference between empathy and compassion when you are relating to software and algorithms? Maybe you can code empathy, but you can't code compassion because that is such a uniquely human feeling.

All of these discussions are going on in the machine learning space because AI and machine learning take all these different approaches, from just relating to data sets, as you were alluding to, to looking at the bigger contextual piece, realizing that the data set may not give you all the information you need to create a good enough system; you need to have the symbolics, the values, embedded into the algorithmic process.

Then of course there will always be the question: Can you really try to replace any type of human emotions? In some ways when I am listening to you I am getting this sense that we have moved so far away from our core that the radical idea is now saying, "Hey, sit down, engage, talk, relate." Am I understanding you correctly?

KATHERINE MILLIGAN: Yes, Anja, exactly. Again, if we acknowledge and recognize that systems are made up of people, then the outcomes they produce are a reflection of the individuals and the relationships between those individuals in that system, and what we firmly believe is that when you bring people in a system together into deep authentic relationships, when you create a safe space for vulnerability and even spaces for healing and feeling connected to something larger than just ourselves, that is what produces shifts in both our individual consciousness and the consciousness of that group or that collective powerful enough to transform systems.

In other words, we have to get out of our heads and into our hearts. It's not an either/or. We need both. We need to bring all of our analytical rigor to bear, but we also need to be able to engage with our full humanity as well.

ANJA KASPERSEN: I find it very interesting because of course we are entering an age in which it is all about optimizing: optimizing our humanism, optimizing our connectivity, optimizing production lines, optimizing results, outputs, and impacts, and the social entrepreneurship world is not exempt from this. There are also a lot of issues around how to use technology to optimize social impact. But do you worry, on a personal level, that we are moving so far into this optimization space, without really defining or becoming clear with ourselves about what we are optimizing towards, that we are losing some of that humanness that could contribute to the radical transformation or radical systems change you were alluding to earlier?

KATHERINE MILLIGAN: That reminds me of a conversation I had the other day with a social entrepreneur, who said to me: "In all this talk about systems where are the people? Why aren't we putting people front and center in all of these conversations around systems change?"

I guess I would have the same question in all this drive towards optimization. If you just take this notion of relationships and relationship-building, that takes time. You have to invest in building relationships. A drive towards optimization, what would that exclude? What would that preclude us from even pursuing as a viable path to change?

Then there are of course all of the questions around who: Who is designing this? Who is setting the parameters? Whose ethics, whose morality? And are we properly engaging from the get-go and designing in safeguards against systemic bias? Because if we are designing these systems from the same mental models, then we are simply going to reproduce the same systemic biases that we see rampant in society today.

ANJA KASPERSEN: Those are very important points, Katherine. There is this risk of ivory tower-types of discussions that make it difficult for those who are most impacted by transformative change, by systems change, to actually engage with the very processes that are taking place and, like you said, with the power dynamics that come into play as well. So there is a collective responsibility for anyone working in this space to make sure that these concepts are relatable and that the design parameters that go into thinking around optimization and around change are actually tested on the ground, so as not to repeat or perpetuate some of these systemic biases that might lead to more inequality and less equity.

KATHERINE MILLIGAN: Very well put. I think for me it comes back to the who and the how.

ANJA KASPERSEN: On the website of the Collective Change Lab it says: "The prevailing view conceals a larger truth." Given current calamities and circumstances, this almost seems uncomfortably true of modern-day living. Can you explain a little bit what is meant by this phrase and also share your views on this notion of concealing a larger truth?

KATHERINE MILLIGAN: Rob Ricigliano, who is the internal systems guru and coach at The Omidyar Group, talks about the complexity spectrum and "clock problems" versus "cloud problems." At the clock end are targeted solutions. They address urgent localized needs. They require a technical approach. You have to be efficient. It is a known problem with a known solution. One example he offers is a population displaced by a natural disaster.

All the way on the other end of the spectrum is systems transformation, so improving the health of a system, the interactions of a system, and shifting the conditions in that system by affecting all the things that we have talked about—the mental models of people in that system, power dynamics, bringing actors in that system into relationship, and catalyzing shifts in those system structures.

So what we mean by "The prevailing view conceals the larger truth" is that many, again, well-intentioned, thoughtful, smart people in the social sector are trying to approach transforming our systems as if it is a technical problem, a known problem with a known answer, so it is all about the technical, doing the analysis, and studying it, and then, "Okay, what does the evidence base tell us?"

These things are all true. Those things all matter. Rational outcome-based approaches do generate impacts. So that is partially true, but there is a larger truth, and that larger truth is about our systems. Those are unknowns. They are complex. They are dynamic and adaptive. The pathway might not be known, and there is no one single silver bullet or solution that is going to solve it.

So there is this prevailing view that focusing on logic models and very straightforward, linear scaling strategies—evidence-based, focused on results—is going to shift our systems. That is where we see the disconnect. Those kinds of approaches on their own will never be able to transform systems.

What we are trying to point to is that the process is the solution. It is not looking at the evidence base, taking an evidence-backed solution off the shelf, and then implementing it. The process itself is the solution: how you design that process, who is sitting at the table, and what conditions you create for that group of people to be able to ideate and explore and address power imbalances and hold multiple truths and share different perspectives that are true based on where they sit in the system. For collectives, for groups of actors trying to work across sectors, and certainly across organizational boundaries, to shift systems, that is where we need a different truth. The technical, analytical, rational, evidence-based approach just isn't fit for purpose for addressing large complex adaptive systems.

If we think about it that way and explore this notion that the process is the solution, then again what that means is time, energy, resources, staffing devoted to building relationships and building trust and co-creating shared agendas, and then through that process trusting in emergence and allowing new ideas and new possible paths forward to emerge.

ANJA KASPERSEN: There are obviously amazing applications of technology that will deeply impact our ability to have social impact, for social entrepreneurs to deliver on their ideas of bringing change to societies, as you were referring to earlier. But there are other types of applications, and perhaps impacts we haven't accounted for, that we need to consider carefully before embarking on this route. Do you have any reflections on that?

KATHERINE MILLIGAN: A few, yes. As I mentioned, the potential is enormous, and there is huge interest among social change leaders to pursue and integrate these technologies into their organizations and approaches, but the risks and the drawbacks are also significant. It is not necessarily so straightforward. This I am sure has been said before and debated with your listeners, but of course there are issues around data accessibility and organizations' willingness to share their data, particularly those organizations that have large data sets, and obtaining them is very costly. Social impact organizations are essentially priced out of the market of being able to engage with that data and integrate it into their approaches, particularly those data sets that have business value and are commercially available for purchase.

I think another issue is finding the right talent. Competition is so fierce even in the private sector for professionals with this kind of expertise. It is challenging to see how social sector organizations are going to be able to keep pace with that. They are obviously not going to be able to compete on compensation packages, and many organizations I know actually struggle to find enough general operating funds to be able to hire a chief technology officer, just to show you how wide the gap actually is.

Then of course there is the problem of change management that is actually required inside the social change organizations themselves because it is not just about the shiny new toy or bringing that expertise in-house, it is then enabling the other departments or other teams of the organization to be able to integrate AI-powered solutions or other tech solutions, and that remains a huge cultural and change management obstacle for the social sector. There are a number of wonderful resources in this arena, including a study that came out recently on the promises and perils of applying artificial intelligence for social good and entrepreneurship.

ANJA KASPERSEN: Can you say a little bit more about this study, just the broad strokes?

KATHERINE MILLIGAN: The hiring issue is an obvious one. We are talking about this like there is any chance that these organizations can compete to hire someone away from Google. It's not going to happen.

The interesting part was—and it is of course intuitive when you think about it, but it was great how the use cases showcased it and brought it out—that it is not just about attracting and retaining tech talent inside the organization. It is also then the change-management process and the cultural shift that has to happen between the engineering/science team and the rest of the organization so that those products that are being created are actually integrated into the organization's overall strategy.

One of the key findings that the authors point to is that this isn't just about competing for talent, recruiting, and retaining top engineers and coders. It is also about the change-management process inside the organization. If you are developing an AI-powered solution, how do you then integrate it into the rest of what the organization does? There can be a culture clash. Often there is a lot of skepticism or reluctance to do things differently. It can be something where the left hand doesn't know what the right hand is doing, so ultimately this is about ways of working and managing a change-management process so that the technology gets embedded into how the organization does what it does.

ANJA KASPERSEN: Reflecting on what you have learned to date, zigzagging between trade development, social entrepreneurship, and systems change research, what would you advise anyone who finds himself or herself at the vanguard of change or even responsible for providing leadership and regulatory oversight to some of these bigger changes that we are witnessing going on? You mentioned that some of the collaborative models we have may not be fit for purpose for dealing with these adaptive, complex systems. Where do we go? What can we do?

KATHERINE MILLIGAN: A social change leader who I admire greatly working in the child welfare system remarked to me that one of the things that makes systems so rigid is the hierarchical and bureaucratic nature of the institutions in them. If you think about any system—the child welfare system is a great example. It attracts people who care, who want to make a difference in the lives of children and the lives of families, and yet when you rise through the ranks you become less and less accessible to the people who experience the problems, to those children, to those families, and there isn't any communication. There is simply a breakdown of relationships. So again, even though people come in with the best of intentions, they find over time that the gulf has become great and the disconnection becomes very hard to bridge.

I guess I would offer three things. The first is to go out today, tomorrow, but definitely this week and build relationships, not just with people who are different from you but people who are vulnerable, who are experiencing the problem. That is particularly important if you are in a position of power, or if you are designing or engineering a solution. It is not parachuting in and parachuting out. It is really building a deep relationship across hierarchy and across power imbalances at a fundamentally deep human level.

The second is—it is interesting. I was teaching a group of executives, and we were exploring these conditions of systems change, and when we got to mental models all their examples of how mental models show up in a system pertained to the people "out there," someone else, like the smallholder farmers or the homeless residents. It was really remarkable how much resistance there was to turning that lens inward and to be willing to explore: "What about our mental models? How do we show up in this system that we're working in? What are our biases, what are our blind spots, and how does this inform our worldview and the way that we construct meaning and what we look for and what we leave out?" Understanding that we are all part of the systems that we are seeking to shift and that when we are talking about mental models fundamentally we need to start with ourselves.

I think that leads into my last reflection, which is that there is nothing new under the sun. It is as old as time, but really it is that if we are trying to change the world around us, that change must begin from within. There are so many ways that inner change affects and shows up in the way that we do our work in the world: in our insecurities, in our wounds, in painful experiences that we have had in the past, even in our egos and what motivates us, what's driving us, and all of those things around prestige, status, credit, and advancement.

Doing the work of social change, the work of systems change—and again, you can pick any system that you want—is so complex and involves so many other actors that it is really important that each one of us brings awareness to what energy we're bringing into that space, into the collaboration. What energy am I bringing? What baggage am I bringing? What are my triggers? What if someone else is triggered, does that trigger me? There is this amplification and reverberation that happens among groups. We have all been part of it in a meeting with a boss who is going on an ego trip. We are all relating to each other. I think grounding the work that you do in the world in the work that you do on yourself is the most important place to begin.

ANJA KASPERSEN: Thank you, Katherine, for sharing your time with us, your zigzag story, and entrepreneurial insights about societal impact. This has been a marvelous conversation.

Thank you to our listeners for tuning in, and a special thanks to the team at the Carnegie Council for hosting and producing this podcast. For the latest content on ethics and international affairs, be sure to follow us on social media, @carnegiecouncil. My name is Anja Kaspersen, and I hope we earned the privilege of your time. Thank you.
