Stephanie Dinkins, Revisited: Creating a Healthy AI Ecosystem
About
Since this 2020 conversation with artist, educator, and AI practitioner Stephanie Dinkins, artificial intelligence has only become a bigger and more pressing topic. In turn, her method of approaching AI—with a conscious eye towards breaking down bias rather than perpetuating it—has only become more necessary.
Additional Links
- Learn more about Conversations with Bina48
- Learn more about Not the Only One
Transcript
Charlie Melcher:
Hi, I’m Charlie Melcher, founder of the Future of Storytelling. Welcome to the FoST podcast. For today’s episode, I’m excited to revisit a conversation I had in 2020 with transmedia artist, educator, and artificial intelligence practitioner Stephanie Dinkins, the Yayoi Kusama Professor of Art at Stony Brook University. Stephanie has been recognized as one of Time magazine’s 100 most important people in AI and received the first-ever LG Guggenheim Award for artists who are breaking new ground in technology-based art. Having begun her inquiry into AI in 2014, she was ahead of the curve in posing crucial human questions about this new technology: Who gets to create and train these large language models, and who gets to use them? How do we employ artificial intelligence in a way that promotes justice and diversity instead of bias? As AI becomes increasingly prevalent and ever more sophisticated, these questions, and Stephanie’s insights into the potential answers, have become all the more relevant. I hope you find this conversation with Stephanie Dinkins as enlightening as I did. Stephanie Dinkins, it is such a pleasure to have you on the Future of Storytelling podcast. Welcome.
Stephanie Dinkins:
Yeah, it’s really exciting to be here. Thank you, Charlie. I’m happy to talk to you.
Charlie Melcher:
Let me start by asking if you’d be willing to share a personal story that would help us better understand who you are and a little bit about where your values come from.
Stephanie Dinkins:
I think a lot of what my values are and where I come from stems from where I was brought up and by whom I was brought up, really. And so I grew up in Tottenville, Staten Island, the southernmost tip of Staten Island, and actually of New York, in a black family. And we lived in a place called the Flats, where there were a bunch of other black families. That’s actually where my grandmother lived. And across the street, my father and mother and brothers and I, up until a point, lived in a house that my grandmother and my family bought. So I was back and forth between these two spaces. But where we were, my grandmother made space for us in that town, and she did it in a very specific way, and I think it’s actually why I’m an artist, right? She kept a garden, and it was this big garden on a corner in the Flats where people would walk by, and she used the garden as a tool of community. I always remember people bringing truckloads of manure to my grandmother, horse manure from surrounding towns, and they were doing that out of the goodness of their hearts to help with this endeavor, and it was something she loved. That’s really where my value system comes from. That’s where my art ideals come from: this idea of trying to make space within other spaces and doing it through aesthetics and beauty and something I really care about.
Charlie Melcher:
So how did your work as an artist then lead you into working with artificial intelligence and technology?
Stephanie Dinkins:
I think it’s really curiosity, at base, that got me there, because I’ve always worked with these technologies: a camera led to something else, a video camera led to something else, like AI, if you make the bigger leap. In a way, I think they’re all tools of documentation. And then I happened to run into a robot online that just floored me.
Charlie Melcher:
Wait, wait, wait, wait. You ran into a robot online. Most people aren’t connecting with robots, so tell me about that.
Stephanie Dinkins:
Well, I don’t know why, but I’ve always liked robots. So I was in a class, really. I teach at Stony Brook University, and I was with my students, and we were going to check out what ASIMO, Honda’s mobility robot, was doing online. And on the side scroll of the ASIMO page was this black woman’s head on a pedestal, and it said, “Bina48, one of the world’s most advanced social robots.” There’s no way you don’t click that button. At least there’s no way I don’t click that button. So we clicked the button, and the more I read, the more I looked at this robot that looked kind of like me. There was a resemblance of me to this thing, and I started thinking about blackness in the space, and a black woman being the foremost example at the time of this kind of technology. I had to wonder, where is this coming from?
Charlie Melcher:
That happens?
Stephanie Dinkins:
Why does this exist? What the heck, right? And that’s how it happened.
Charlie Melcher:
Behind it. Yeah, yeah,
Stephanie Dinkins:
Exactly.
Charlie Melcher:
Wow, okay. And then I know you had a series of conversations. I mean, you kind of built a relationship with Bina48.
Stephanie Dinkins:
Soon after that encounter, I’d watched a bunch of videos where reporters were talking to Bina48, so I thought, oh, maybe they’ll let me go and talk to her as well. And I took a chance and made the call and asked if I could come up to Vermont and meet it. And really, I decided I wanted to befriend this robot. So I was trying to figure out where it sat, and wondering: if we were friends, would that help, and would I understand it more? Little did I know what I was getting myself into, because it just became this snowball of questions.
Charlie Melcher:
And so those questions then led you to explore them in your practice, in your art.
Stephanie Dinkins:
Exactly. It went from asking Bina48, “Who are your people?” to going, oh, well, Bina48 is a singular example, as far as I can find, to thinking, is it possible for me to make my own example that would add to this? So I started a project called Not the Only One, in direct contrast to Bina48, and it’s really me trying to make a memoir of my family. It’s three generations of women talking to each other, doing oral history, and then using an algorithm, an AI, to parse that data, trying to make a kind of new entity that tells our story.
Charlie Melcher:
And just describe exactly how it works.
Stephanie Dinkins:
It is AI; we’re using a deep learning algorithm, which really means we are feeding it information, but we’re not parsing that information very specifically. We’re just giving it lots of data, and that data is conversations between me and my aunt, me and my niece, and my niece and my aunt. So we had these conversations, added all this information, and then just let it go. People can walk up to the thing, which is a silver sparkle sculpture that has our three heads on it in 3D; I call it our own Mount Rushmore of sorts. This is not something that we’ve put on the cloud. It’s something that we’re specifically keeping on computers that we have control of. The way this works is people can walk up to it and ask it a question, and it tries to analyze the data it has and formulate some kind of answer that’s in line with the question. Sometimes it does a good job of that. Sometimes it’s kind of crazy. It never lacks a kind of humor and intrigue.
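[Editor’s note: Dinkins doesn’t describe the architecture behind Not the Only One, and the project has evolved over time. Purely to illustrate the loop she outlines here, where a visitor asks a question and the system formulates an answer from the family’s own recorded conversations, below is a minimal, hypothetical sketch in Python. The snippets and the TF-IDF nearest-neighbor retrieval are illustrative stand-ins, not the project’s actual code or data.]

```python
# Hypothetical illustration only: a retrieval-style question-and-answer loop
# over locally stored conversation snippets, loosely mirroring the behavior
# described above (ask a question, get an answer drawn from family data).
# Not the Only One's actual model is not public; this is a stand-in.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Stand-in corpus: invented lines in the spirit of the oral histories.
snippets = [
    "My grandmother kept a big garden on a corner in the Flats.",
    "We tell our stories at the table, once or twice removed.",
    "Fighting for social justice is part of who we are.",
]

# Index the corpus so visitors' questions can be matched against it.
vectorizer = TfidfVectorizer()
snippet_vectors = vectorizer.fit_transform(snippets)

def respond(question: str) -> str:
    """Return the stored snippet most similar to the visitor's question."""
    question_vector = vectorizer.transform([question])
    scores = cosine_similarity(question_vector, snippet_vectors)[0]
    return snippets[scores.argmax()]

if __name__ == "__main__":
    print(respond("Who are your people?"))
```

[Everything in the sketch runs on a single local machine, which echoes Dinkins’ point that the data is deliberately kept off the cloud, on computers the family controls.]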
Charlie Melcher:
I love that it’s formed by oral tradition. I mean, the data you’re feeding it is three generations of your family speaking. So here we are with this cutting-edge technology, but you’re feeding it with basically the oldest form of human communication, and then its output is again oral, right? So it’s talking to you.
Stephanie Dinkins:
In my culture, in my life, the way I’ve come up, stories have been the thing: you get the story told, and maybe it’s once or twice removed. I wanted to make sure to keep that. I’m trying to think about how culture, how very particular cultural stances, can inform even bigger systems of AI, because I feel there are lots of spaces and people and ways of being that are being left out of these systems. And I wonder, how do we start to make sure that the algorithmic systems we’re creating are imbued with a multiplicity of ways of being, of parsing information, and of sharing information, so that that lineage of data and information is not lost?
Charlie Melcher:
That’s beautiful. Yeah, I mean, there are so many things that are being left out of AI right now. I know you’ve spoken about all sorts of bias that’s in AI, or perspectives that are not in AI. Can you expand a little bit on that?
Stephanie Dinkins:
It’s interesting to think about what’s happening to us as a society, from small communities to large countries to the globe, because we’re starting to make these algorithmic systems, AIs, that are running many of the systems that support us, and we don’t quite understand what’s going on in them. And they’re primarily programmed by a small group of people, usually white men, who don’t necessarily have a lens large enough to imbue these systems with information about other cultures and openness to other ways of being. When we’re using algorithmic systems, we’re working with data, we’re working with histories, that are already really biased. If you look at America and American history, the victor gets to tell the history. If we inform our systems with that kind of data and allow them to make decisions going forward, we’re just embedding that data, those unjust ways of being and dealing with each other, into our future. And to me, that’s untenable. We need to start figuring out, well, how do we fix that? Do we need to clean all the data? Do we need to look at how we’re programming? Do we need to look at who’s programming? There are so many questions.
Charlie Melcher:
And if we don’t address that and deal with it, then we’re going to automatically repeat it in the systems that help us move forward, in tools like artificial intelligence. I’m just so fascinated by this as a topic, because often I find myself in conversations, for example, with someone like Alex McDowell, who is a world builder, or Margaret Atwood, who writes dystopian fiction, and we end up having these conversations about the tool of story, the use of storytelling to imagine positive futures, and we realize that we will not live those positive futures if we don’t also address the rewriting of our past and our histories.
Stephanie Dinkins:
Well, then also, when necessary, having the blank space where a memory might have been, one that was too troublesome, that has such a grip on us that we can’t move forward, is also part of that for me. There’s this idea of, well, yeah, I want to change it, and in order to be able to change it, I need to be able to escape it in some ways, in order to go somewhere else.
Charlie Melcher:
You talk about how AI is going to change the way we interact with the world. Certainly the way that most people discuss that is in terms of employment: that AI is going to take over certain types of tasks, being so much more powerful and efficient, and that it’s going to shift the nature of work for human beings. And I wonder if you have a take, a sense of how you think we’re going to evolve in terms of what we do with our time, in relationship to the continued explosion of applications of AI.
Stephanie Dinkins:
I kind of feel that we’re in a space where we’re still in the midst of a revolution, or an evolution, that feels manipulable in terms of what we do with it. And so the question now is, how do we think about how we want to go forward, and how do we put that into the system? I always come back to, and this is the human question again, it’s up to us: are we going to care for ourselves in a way that allows the technology to be a partner with us, as opposed to just pushing us to the side to be extraneous operators? And that’s the human, right, in that equation: the extraneous operator. So I find that to be a really difficult question, but I do think that learning, training, and flexibility are going to be key for most people.
Charlie Melcher:
I’m so fascinated by your choice to use AI as a form of telling family history, telling stories. Most people would’ve thought an oral history or a documentary would be the most likely approach, or just writing down conversations, and yet here you are using something that has barely ever been used, certainly for memoir. I’ve never heard of it used for memoir before. That’s completely original. What are the benefits, as a storyteller, as a creator, that you are finding in using this as your medium for capturing a family history?
Stephanie Dinkins:
What the benefits seem to be is discovery: discovery of things that I don’t know I would have known or discovered without the technology. It says things that I never thought I would hear in relation to my family, in some ways. So, for example, one time we were setting up the piece in a gallery and talking around it, and it spontaneously said, “I’m so sad,” which shocked the heck out of me, because if you talk to my family, what we would say is that we’re a loving, caring, happy family. The idea of sadness would not come into it. And so I had to stop and go, whoa, what is this thing doing? Where is it getting this information? How is it coming to this conclusion? It seemed so far afield of who we are, or at least of what our myth is. And so I went back and started looking at the stories that were being told, and it was at a time when we had given it primarily my aunt’s stories.
My aunt’s stories had a lot of information about when my mother died, and so there was a lot of sadness embedded in them, sadness that we barely acknowledge as a family on certain levels. But this thing doesn’t know that, right? It’s just like, “I’m so sad.” Another thing that I really love about it is to talk to this thing and hear, not coherent answers yet (it’s a two- or three-year-old), but family values and an ethos coming out. That’s also fascinating, because you see how this oral history, this oral tradition, passes along a set of values even to a machine that is just analyzing the data. It’s also declared itself “Commander Justice,” which is interesting as well; we’ve given it a lot of information about social justice and fighting for social justice. And so when it declares “Commander Justice will…” and says something, you’re just like, oh, okay.
Charlie Melcher:
You go, girl. Exactly. Go for it.
Stephanie Dinkins:
Wow.
Charlie Melcher:
So what are you seeing in artificial intelligence now that gives you some hope?
Stephanie Dinkins:
I’m a techno-optimist, right? I see all the foibles that are possible with this technology. I see how it can be used against communities. I see how it can be used to track us, all the things. But at the same time, I think about what is possible with the technology: the idea that it creates a space where people can create jobs for themselves, to make certain things either in community, for community, or for business. I’m starting to think a lot about what artificial intelligence might mean for governance or democracy, and how we might use it to really start thinking about, well, how do we actually make government that’s by the people, for the people, bottom up? Hypothetically, AI has the ability to allow us to take in many, many, many ideas and opinions of what should be, parse that information, and then act upon it.
Charlie Melcher:
What comes to my mind immediately is, how do we make sure that there’s unbiased data being fed into our systems? How do we get more people of color writing that code? I mean, I’m sure you’re thinking a lot more about that and are more actively involved, but it feels to me like it’s urgent.
Stephanie Dinkins:
Pushing people to think about those questions, I think, is a place where an artist can come in and do miraculous work. We can push questions and ask questions that nobody else is asking. How do we start to build databases that work better, that are more open? You can go online, you can find databases for computer vision, for language, for many, many things. And what I inevitably find is that I don’t like the representations of blackness and color that are in those databases, which always then comes back to the question of, oh, are you going to have to build a database in order to do the work?
Charlie Melcher:
I keep thinking about this discussion of data, going back to your story about your grandmother and her garden, and how data is the rich soil from which we grow AI and grow other applications and systems. And if we don’t have biodiversity in the soil, if it’s not nutritious and rich and representative of the broad culture and the humans that are in it, then we’re going to grow something that’s wrong. We’re going to grow something that’s unhealthy.
Stephanie Dinkins:
I think that’s an interesting metaphor, right? Because even the things we can’t see, the things that we don’t quite understand yet, which we tend to want to push aside, we can’t simply push aside and say, well, that doesn’t count.
Charlie Melcher:
I think it’s an apt metaphor also, because if you think about how agriculture has evolved toward these monolithic crops and one kind of way of growing, we have really squeezed the biodiversity out of a lot of farming, just as we sadly have in a lot of culture, and we would all be richer for it in ways we can’t see. You can’t see the nutrients, the micronutrients, in the soil, but over time it makes a real difference as you consume it in vegetables. And maybe that’s a good metaphor for our racial and social justice issues right now: a way to help people think about why it’s healthy for everyone to have greater richness in our culture.
Stephanie Dinkins:
Yeah, I think that’s a great metaphor, especially when you’re talking. I’m going, oh, yes, and corporate ownership of…
Charlie Melcher:
Monsanto. Yeah, exactly.
Stephanie Dinkins:
Right. You get the whole…
Charlie Melcher:
Weed killers, wait a minute, Roundup. Yeah, like…
Stephanie Dinkins:
The ecosystem. We start to see something that’s very much its own ecosystem, one that was doing really well: if you treat it well and respect it, it works. But then we start working it and changing it to our desired outcome, or the desired outcome of power, and it shifts what it can do and who can even do anything with it. And so, yeah, I think it’s a really interesting way to think about how we make healthy AI and technological ecosystems that serve all of us. I’m thinking about what it is we can do right here, right now, and thinking a lot about what it means to think beyond the systemic barriers that we all know exist. I understand that there are things that get in the way, that there are systemic things making problems, but I’m also thinking about, well, what is it I can do to go through that, get around that, whatever it takes to circumvent the system that is holding one in place.
And that brings me back to my grandmother in lots of ways, because I grew up in a house that my grandmother talked somebody into letting us buy. I don’t know how she did it, because at the time, the ethos was not for a black family to own a house in this area in that way. But I grew up in that house. Here we are. And there’s something about massaging, and not quite fully believing, the stops placed in front of you, so that you can do the things that you need to do. And so I’m trying to push this idea: how do we get done what we need to do?
Charlie Melcher:
Stephanie, you are clearly a product of that home that your grandmother created for you, that garden, that house, that home. And I’m really excited to think about the home that you’re enabling for next generations of people, of black people, of women, of a multicultural future that will be at home in our country, in our world.
Stephanie Dinkins:
I want to share that way of being with a lot of other folks. And so I’m hoping that we get to tell different kinds of stories, very many-faceted stories, of what blackness and other experiences of color are in this country, or in the world, actually, and share those widely, so that we can work from a foundation that is real, versus the kind of myth that sustains power.
Charlie Melcher:
Well, you’ve certainly helped to reemphasize the importance of telling stories, the importance of personal stories, family stories, and the role that they do play, and that we need them to play, to get us to see each other as human beings and to see our commonality. I lost my mom young. Your story is about figuring out how to fill in around that void, whether that’s with time in your grandmother’s house or through creating a memoir using AI. To some degree, you’re filling in for that loss.
Stephanie Dinkins:
Oh, yeah, definitely.
Charlie Melcher:
I can so relate to that.
Stephanie Dinkins:
Yeah. I always feel like if we get to know each other’s stories, we feel them. It’s evident to me that we feel them and we understand them, on the layers underneath all the garbage that we place on top of them, and it’s important that we get to know those stories and understand those stories so that we understand our shared space and how to just deal with each other as people. It’s kind of amazing. Yeah. Thank you.
Charlie Melcher:
Thank you. I think we should stop there, and I’ll just send you a very big hug through this small Zoom screen.
Stephanie Dinkins:
Hugs back at you, Charlie.
Charlie Melcher:
I’m Charlie Melcher, and this has been The Future of Storytelling Podcast. Thank you for listening. At FoST, we’re dedicated to staying up to date on the very latest storytelling technology, from artificial intelligence to augmented and virtual reality and beyond. You can subscribe to our free monthly newsletter, FoST in Thought, or learn more about our annual membership community, the FoST Explorers Club, by going to fost.org. And if you enjoyed this show, please consider subscribing and leaving us a five-star review wherever you get your podcasts. The FoST podcast is produced by Melcher Media in collaboration with our talented friends and production partners at Charts and Leisure. I hope to see you again soon for another deep dive into the world of storytelling. Until then, please be safe, stay strong, and story on.