In this episode, Rachel Strausman takes over the reins as host of Policy Chats from our previous host, Kevin Karami.
AllSides Co-Founder and CEO John Gable talks with students from the UC Riverside School of Public Policy about how understanding media bias and filter bubbles can help bridge the divides of political polarization.
FEATURING John Gable
July 7th, 2023
36 MINUTES AND 08 SECONDS
About John Gable:
John Gable received his B.A. in Philosophy, with an emphasis in Mathematics, from Vanderbilt University, as well as a Master of Business Administration from Duke University. He has worked in a variety of fields, having previously been an Executive Director for various political campaigns, a Product Manager at Microsoft, and the President of Stearns Ventures; he is now the Co-Founder and CEO of AllSides. Overall, John Gable is a high-technology executive focused on building, marketing, and monetizing products, online services, and teams that have a positive social impact.
Learn more about John Gable via https://www.allsides.com/news-source/john-gable
Podcast Highlights:
"The medium, the content, and the way we interact is driven a little bit by the medium itself. And what I thought about the internet was that it was mostly by metaphor: this is similar to that, and therefore you're a friend of a friend or I'm searching for something similar to what I'm trying to solve. I thought it would encourage us to think by metaphor or if you will, [in the extreme sense] stereotype."
- John Gable on how the internet is structured to make connections, which can initially be beneficial but can also lead to dangerous stereotypes.
"[With the way the internet works] we see an issue, and we only hear or get information that we already agree with, which might only be 10% of what we need to know about an issue. But we hear that 10,000 times, and so we're really absolutely confident with no doubt that we're correct. But we know less about the issue than we did before the Internet.”
- John Gable on the topic of how filter bubbles can limit our access to necessary information, despite the increased access to information the internet seems to provide.
“[What we need to do is] get people out of the information filter bubble. Get them out of the relationship filter bubble, and provide people the skills and confidence to disagree, to have a conversation and not agree with each other and recognize the differences and appreciate the differences that each of us bring to the table. With that, that's how it would get to a better place.”
- John Gable on the topic of how teaching people to understand the value in disagreement can help people be more aware of filter bubbles.
Guest:
John Gable (Co-Founder and CEO of AllSides)
Interviewers:
Rachel Strausman (UCR Public Policy Major, Dean’s Vice Chief Ambassador)
Divya Bharadwaj (UCR Public Policy Major, Dean’s Ambassador)
This is a production of the UCR School of Public Policy: https://spp.ucr.edu/
Subscribe to this podcast so you don’t miss an episode. Learn more about the series and other episodes via https://spp.ucr.edu/podcast.
-
Transcript
Rachel Strausman:
Thank you so much for joining us today, Mr. Gable. We are so glad to have you here. The field of technology has been growing for decades now, and with that, so has access to information and media. But with this rise in the accessibility of information comes a lot of untruthful, unproductive, and biased information, which can have great impacts. That is why we're so grateful to have you here today, Mr. Gable, to discuss all of this with us. Starting off, you are the founder of a very successful and very influential media company, AllSides. Take us through what AllSides is and what inspired you to found it.
John Gable:
Yes, the background you just described is what we were thinking about a long time ago. I was the lead product manager, the team lead, for Netscape Navigator, which a lot of your listeners may not remember because they're younger than I am - hopefully. Netscape Navigator was the first popular web browser. The Internet was not something that ordinary people used until Netscape Navigator hit; that was the first thing that caught on. Its code has since been handed off to what is Mozilla Firefox today, but it came before Chrome, before Internet Explorer and all their later versions, and before Safari. And in 1997, believe it or not, the same year I started at Netscape, I gave a speech thinking about: what will the Internet be? How will it impact society? I had just read a book called Amusing Ourselves to Death by Neil Postman. For people who are academically oriented, or geeks who read about media, that was a great book describing how news changes as a result of moving from the written word, where you can deal with past, present, and future, to a TV world, where the medium is all about emotional context, or just sensational emotional immediacy. He was very concerned that that would turn news into something that is basically infotainment - emotional and sensational versus valuable and useful. And he was really worried about that. He thought it would be bad for society, bad for democracy, bad for education. He would have hated Sesame Street. But I was thinking, well, what will the Internet be like? What will it do? Because his point is that the content and the way we interact are driven a little bit by the medium itself. And what I thought about the Internet was that it was mostly by metaphor: this is similar to that, and therefore you're a friend of a friend, or I'm searching for something similar to what I'm trying to solve.
I thought it would encourage us to think by metaphor or, if you will, stereotype. And the medium has proven to do that. The good side is that metaphor is not such a bad thing. Let's say I had a problem with an addiction - I drank too much or something - but a friend of mine did the same thing and got over it. I could learn through metaphor from them what they did to get over it, and that's great. It's not so great if a bunch of my friends are drunkards, if I'm in a little filter bubble where nobody's really gotten out of that. I was concerned that the Internet, by encouraging us and even training us - almost like you train an athlete to hit a tennis ball - to think in terms of metaphor, would make us more likely to think in terms of stereotype and metaphor. In some ways that's good; in other ways we would stereotype people more. Or, to use the terminology we have now, filter bubbles - where we only interact with people just like us or only know ideas that we agree with. And when that happens, when you're in a filter bubble, we all become much less tolerant of any other person or any other idea that's different from me or from the ideas that I have. In my mind, it's the technology that has been the driving force behind the huge polarization and the huge breakdown of information online, where you have really extreme bias and you have business models that encourage that bias. And that's actually the driving force behind the very problem we're talking about, because it's bias and wanting to believe one thing, just like my group, that encourages more bias and permits misinformation to grow. That's actually the core cause, and that's what led to the founding of AllSides. Ten years ago we thought it was so bad that we had to start creating technologies and business models that change the information flow and change the relationship flow.
So that we could get out of these filter bubbles and actually begin to understand different ideas, understand each other, understand how people and ideas are different but might still be valuable, so we can actually solve problems together.
Rachel Strausman:
That's very interesting, how you bring up that there was a book that predicted a lot of this before it really happened. I'm curious, and I know you said you read it, but was this book taken seriously? Or did people think it was just another one of those predictions that probably won't ever happen, and say, let's keep going with this whole new technology and not worry about the implications?
John Gable:
Well, the book was just about going from the written word to TV. I took that and imagined what was going to happen with the Internet. And though I was right about some of it, there's a lot of stuff I didn't expect that happened - the business models and the way we can manipulate people at such massive scale is something that never occurred to me. But those other ideas are the way I interpreted it back in '97, and that's what got me interested in thinking about this as a possible solution to a real problem we need to face. You've got to recognize that those of us who were early on the Internet would stay up crazy hours and work seven days a week. I remember the CEO at Netscape literally locked the doors one Sunday so people would go home, stop working, and take a break. That's how crazy we were about building this new thing called the Internet and the World Wide Web. The idea was that we'd be able to get better information from across the world and make better decisions, and that we'd be able to connect with people all around the world like you and I are doing right now. I'm in San Francisco; you're down in the LA area, around Riverside. But we're talking in a much more human way than was previously possible, because I can see you and see your reaction. A lot of that great stuff has come to happen, but a lot of the bad stuff that I described earlier has also happened. That's what we've been concerned about. So Neil Postman, when he wrote that book, understood how TV was impacting things, and then I theorized about what the Internet might do. In some ways I was right, and in some ways it's a lot worse than I ever imagined.
Divya Bharadwaj:
Your journey has been so interesting, from your early career to running your own company. Shifting to the present day, a lot of your career focuses on political polarization and media bias. Based on your experiences, has this issue been more recent and maybe unexpected, or more gradual over time?
John Gable:
It's interesting - we started this about ten years ago, when I did the first prototype. The company itself didn't start until 2016, so that's a little bit later. But before that, we would go around and explain this problem to people: how society was getting hyperpolarized, how it was causing problems for our democracy, for our families, for people's personal health, and how it's a big business problem as well. When we started doing that, I would have a 30-minute meeting with somebody and spend 25 minutes trying to show them that this was truly a problem and convincing them it was something to deal with. Now, if I spend more than 5 minutes trying to tell them this is an actual problem, they get impatient and want me to move on, because everybody understands this is a problem today. So I do think the problem has been there, and it's been growing and growing and getting exacerbated over time. I'm actually more excited today about addressing the problem and solving it, if you will, than I was eight or ten years ago, because now people understand there's a problem. Using a metaphor like the one I was using earlier about addiction: the first step to recovery is understanding you have a problem. Society now understands we have a problem. When algorithms started taking over the way information flowed online, the way we search for things, and the way we get information, some people worried about it, but not many. I thought we totally underappreciated the dangers of that. Today we have AI becoming more popular, and now everybody's worried about it. I actually think that's good news. I think there's a lot in AI that we can do - and we're actually working on it - that makes society better, that helps our democratic society, helps us individually, and enables us rather than replaces us.
But the fact that we're worried about it is a good thing. The fact that we now recognize that hyperpolarization is a danger to all of society, and even to the future of our democratic society at large - those are good things. So it's been there for a while. The problem has been growing and growing. Usually politicians and journalists are kind of the last to know, and by the time they got started on it, it had been exacerbated further. So it definitely has been growing, like you were saying, but it's been rooted into the fabric of the way our online and modern technology works today. Changing and evolving technology is something that happens all the time, and I think we have the opportunity to change it for the better.
Rachel Strausman:
That's very interesting - how we're finally shifting from not really thinking about or being concerned with this issue to finally being able to accept it and, as a result, address it. So building off of that - and I know this is something you mentioned earlier - what are filter bubbles? And are they driven by media bias, by the people who use the media, or by something else entirely?
John Gable:
In terms of what's driving it, it's all of the above, actually. But what a filter bubble really is, is when I only see a filtered version of my world. Think about the Internet: there's way too much information out there. In fact, in one day we create more content than human civilization created in its entire history up to the year 2000. So there's an overwhelming amount of information, and we do need technologies and systems to handle that overwhelming chaos. The first version of that was search engines, which use algorithms to try to break it down so you can find what you need. Unfortunately, most of those algorithms have been focused on an advertising model that basically says: let's give you something that you like, let's give you something that'll make you click more. Imagine you're at a grocery store, and just before you get to checkout there are all the Kit Kats and Mountain Dew - my favorite thing - all the junk food. They're impulse purchases. The business models are designed around impulse purchases. They want more clicks, and because they're only looking at that depth level, the business models encourage you to click on what you like. And what do you like? You like things you already agree with. You love things that convince you that you're a genius, and wise, and that whatever you thought or felt is actually true. And they want you to meet people just like you, because we like talking to people just like us. And that makes us confidently ignorant. We see an issue, and we only hear or get information that we already agree with, which might be only 10% of what we need to know about the issue. But we hear that 10,000 times, so we're absolutely confident, with no doubt, that we're correct. But we know less about the issue than we did before the Internet, frankly.
In the old days of libraries and microfiche - which you all probably don't even know what that is - the ways of looking at lots of old information were actually better in some respects for getting a breadth of perspectives than modern technology is. That's because so much of the technology has been either innocently created to filter, to give you what will make you happy or what you need, or not so innocently designed to manipulate you into believing something somebody wants you to believe, or to get you to be like my group, or join my party, or join my organization, or give me money, or donate to my cause, or vote for my cause. There's a lot of sophisticated technology focused on manipulating us for their ends. That stuff is vicious, and it exploits filter bubbles that arise naturally from technologies trying to help filter out the noise; sometimes from user behavior - I'm going to be a little bit lazy about this and just see what my friends think, or just see what's happening on social media, rather than proactively learning about an issue; and sometimes from manipulation by powerful outsiders who want you to believe or vote or act a certain way. That's really powerful and dangerous. For AllSides, our mission - and it's been our mission from day one - is to free people from filter bubbles so they can better understand the world and each other. There's a lot in there about understanding each other, not just the information. But yeah, that's what filter bubbles are, that's how they grow through technology - innocently, and sometimes, you might even say, maliciously - and that's why they're a problem.
Rachel Strausman:
It's interesting that you bring up that it happens both innocently and not so innocently. Looking toward the future a little bit, is it even realistic to get to a world where we can stop these filter bubbles entirely? Because in reality, you'll experience filter bubbles when you're browsing the web or when you're on a social media app - there are just so many different places that keep building up these walls around you. So is there a future where we can help people bring those walls down, and help people be more aware of what's happening?
John Gable:
Bringing them down, or giving people the opportunity to have part of their lives outside of a filter bubble, is the right idea. Getting rid of filter bubbles entirely is not. They can be fine - if I'm having lunch with my friends, maybe I don't need to be with a bunch of strangers or people who are different from me all the time. That's okay. The problem is that we can't escape when we want to. We can't escape when we need to. One of the great things happening today is that, if you look at the data, there is a huge pushback against slanted media: in some older data, 78% of Americans want to get news that doesn't have a biased slant to it, that isn't pushing a partisan narrative. That's more than three out of four people, and I've seen more recent data suggesting the number is much higher - people are tired of being inflamed and being manipulated into being ultra-partisan on one side or the other. What's happening has actually happened in human society before, when a technology changes things in ways that society has to grapple with. Think about history: there was this new technology called the printing press. When people first started having books small enough to carry on horseback and spread around society, and more people could read them, it changed history. We think of that and we think about the Reformation and the Enlightenment and all the things that came after. But we forget that the first hundred years or so after that were utter chaos. Books and pamphlets were being written inciting people to revolt against authority, whether the church or the government, and there was bloodshed. There was a lot of pornography, like the Internet today. There was a lot of completely made-up stuff - completely false misinformation, if you will. It was just overwhelming.
There were even people writing articles about how people's heads were so buried in their books - I'm looking at my phone right now - so buried in their books that they would lose the ability to interact with each other effectively. The analogies between the printing press and the adoption of the Internet and modern online technologies are striking. Now, we moved from that chaotic period with the printing press to a better place after about 100 years. After about 100 years, we had a whole new generation using it, and a media literacy came with that. They grew up with books. They began to understand that they shouldn't necessarily believe everything they read, and that they shouldn't necessarily respond to something that upsets them by throwing rocks, or literally storming the castle, or storming a church, or destroying things. People also built new institutions and technologies - more libraries, credible book publishers. Once you got some of these things, media literacy and new technologies, the great promise of the printing press began to impact our society. I think we're in the same place today. We just happen to have been born into a time period when radical technology is changing how society functions and how we interact with each other as human beings, and it's a big deal. I think it's our job - and when I say "our" I mean AllSides and a lot of other bridging organizations, civic organizations, and so on; there are over 5,000 bridging organizations in the US alone, according to Princeton - to reduce those 100 years down to about 20. And I think we're about ten years through that. We want to enable kids and later generations, both students and adults, to understand the media and have a little bit of media literacy, so they aren't sucked into these hate-mongering filter bubbles that actually create depression for them.
It's very much tied to the increased suicide rates among young people, particularly among younger women. It's bad for society and bad for themselves. Let's teach them how to use the new medium, the new technology, in a healthier way. Let's give them the technologies that make that easy: easy to get out of the filter bubble, easy to interact with people you don't know yet who are different from you, easy to understand issues across all different perspectives. That's part of the role that AllSides would like to play - to help make it easy for everybody, in every generation, to get out of these filter bubbles, understand the world and each other better, and start solving the problems that we're suffering with right now.
Rachel Strausman:
That's an interesting metaphor you bring up. I actually haven't heard that one before, and I think it provides a little bit of hope for the future that we're making progress, and that we can use the past as a learning opportunity. That's very interesting.
John Gable:
The other exciting thing to point to is the data we've been seeing over the last two or three years. The first step to solving a problem is having people recognize the problem and really think it's important to solve - and everybody now understands this is a problem. Among the younger generation coming out of colleges, there's a group that's been taught that disagreement is scary and should be avoided. That's really dangerous, because if you can't have disagreement, if you can't disagree, then you are stuck in a filter bubble. That's terrifying and self-righteous, and it actually makes you more depressed, because you can't be yourself - you can't be different. But there is also a growing percentage of people, particularly younger people, who say: that's ridiculous, that's stupid. Let's get out of these bubbles. Let's push back against this manipulation. And what we're finding - because we've done lots of work with Stanford, Princeton, nonprofits, USA Today, and other groups to experiment with how we can have conversations with each other en masse, online, with video - is that it can be easy and comfortable to disagree and still talk in a healthy way. There's a book called Healthy Conflict that talks about this a little bit, but that's part of what we need to overcome for our generation: get people out of the information filter bubble, get them out of the relationship filter bubble, and give people the skills and confidence to disagree - to have a conversation, not agree with each other, and recognize and appreciate the differences that each of us brings to the table. With that, that's how we get to a better place.
Divya Bharadwaj:
I have a follow-up to the filter bubble conversation, about AI, which you actually touched on a little earlier. Do you mind explaining the impact that AI has, and maybe will have, on filter bubbles, if any?
John Gable:
Yes, I think filter bubbles are the cause; AI is like a super-powerful tool that can make them worse or better. Most technology can make things worse or better. One of the ways people make filter bubbles worse - even before ChatGPT or the more popular use of AI - is by trying to figure out what would make you happy, or what information or experience will get you so angry or so involved that you'll give me money, or vote for me, or attack these other people. AI lets people do that even more effectively. AI enables the big, powerful manipulators in politics and business, and even in nonprofits or self-righteous groups, to inflame you further. That's scary as all hell. And it's going to be harder to identify - it'll be easier to lie more effectively, more convincingly. But misinformation and false information only work because people want to believe the false stuff, and they only work if you cut off conversation, if you cut off alternative points of view. The best way to deal with any kind of misinformation, whether AI-generated or otherwise, is to have an open forum where the people saying "that's a lie, here's proof that's a lie" are heard. The temptation, in a world overwhelmed with manipulation and misinformation, is to trust somebody outside to stop it - to forcibly prevent anybody from saying or delivering anything false. Historically, even in recent history, that leads to more problems, because whatever entity you trust will either be innocently wrong or manipulate you intentionally in really bad ways. The best solution, like the solution we have as a democratic society, is an open forum. The solution isn't a ministry of truth - the first danger I mentioned, where you have one trusted source. But it's also not anarchy, or laissez-faire, where everything is completely open.
Without any systems, whoever has the most money, the most power, or the most popularity wins, and then you're just being manipulated by a different group - whether one you chose or one you accidentally empowered. A ministry of truth or anarchy both lead to that. What you need is a system that enables people to hear the different sides and decide for themselves. We need to give people the ability to figure things out for themselves and see those different perspectives. I'd make the analogy to the difference between a dictatorship, a complete direct democracy - which is anarchy, really no government - and a democratic republic like we have in the United States, where powers are balanced and, ideally, you have conversation (though we have less of that now than before). And it works itself out - not in a perfectly beautiful way, but actually a beautiful way, because it's the best way of dealing with that breadth and huge diversity of opinions and people. I think we need that as a technology system as well, to combat misinformation and to combat AI used for bad - to propagate more misinformation or more manipulation. The only way to do that is to make things transparent and open, with systems that enable people to see reality.
Divya Bharadwaj:
Yeah, that's really interesting. And I know that AllSides lets you see the left, right, and center points of view. Going off of that, could you explain what causes confirmation bias, and cover some of the immediate and long-term effects of this phenomenon?
John Gable:
The simplest way to describe why confirmation bias exists is: don't we all love being right? It's a human nature thing. We like being right. There's a great book, The Righteous Mind: Why Good People Are Divided by Politics and Religion, by Jonathan Haidt. He's a social psychologist who breaks down the values of different groups of people. He notices that all people have pretty much the same set of values, but Republicans tend to put weight on all of the values, while people on the left tend to focus on a few more than others. Coming from there, you see why there are differences in both politics and religion - it's the reality of human nature. I'm a philosophy major who was one course short of also being a math major - my brother still can't believe I didn't take that last class - and I went to business school. I'm very much a geek and an intellectual guy; I love making things a reality. But the truth of the matter is that I, and all the rest of us, are driven mainly by our intuition and emotions, and then we use our big old brains to convince ourselves that we were right all along. That's how human psychology actually works. Now, if you are working with other people, if you're listening to other people, if you have grown emotionally to truly listen to others and get outside of your bubble, then you can combat your own biases, find out what's really going on, come up with better solutions, and live a more fulfilling life. Life is much more interesting with the diversity and beauty and inspiration of all the different ideas and people there are. But psychologically speaking, we're wired to just react emotionally and then convince ourselves we were right all along, and you need to grow beyond that. Some people worry about this: is this just human nature - are human beings doomed by it?
There are lots of good things and bad things about human nature. We are by nature tribal - we like to create our own little tribe, us versus them. That's a scary thing. But we're also wired to connect and to grow as people, individually, by connecting with others, particularly others who are a little different from us. It's really a matter of which part of your human nature you're going to lean into, and what we as a society can do to highlight the strengths of our nature and control and limit its negatives.
Rachel Strausman:
That's a good point you bring up, and one that isn't talked about a lot: the issue isn't just with technology and the way these things were created, but with the way we as people think and the way we as a community interact with each other. Going down that route - and you've touched on this a little - how important is connecting with people who have different beliefs? And how can people in this highly polarized, filter-bubbled society make those connections?
John Gable:
That's a great question, and that's what we're focused on. When I first started on AllSides, in that prototype era way before the current company, I was just thinking about information flow. Then I met Joan Blades, who heads Living Room Conversations. She's also the co-founder of MoveOn.org, a very left-oriented political group. I used to work for Mitch McConnell back in my early days in politics in the '80s - which makes me really popular in San Francisco - and he's the Republican leader of the Senate. So she and I are on completely opposite sides of the political spectrum. I was focused on changing information flow so people could see the different perspectives and decide for themselves. She was interested in fixing relationships across differences. She cares a great deal about climate change and some other causes that are generally more interesting to progressives, and she found that we could no longer make progress on them because people just wouldn't talk with each other. She was building systems to help all of us: conversation guides, based on mediation expertise and other things, that make it easy for anybody to talk with somebody completely opposite from them and have a conversation where they both begin to understand, listen to, and appreciate each other. She and I met at a conference in Reston, Virginia - I don't know, maybe six or eight years ago now. We both realized that we were incredibly complementary: everything we were doing was aimed at solving the same problem from two different contexts, and together anything was possible. She and I later did a joint TED Talk, and one of the things I say in the TED Talk is the moral of the story: if Joan Blades asks you to go on a walk, go on that walk. We went on a walk, and since then we've been talking every week and working on things together for years. As a result of that, there are tools available.
There are lots of organizations available now, like Braver Angels in the political world, and there are a lot of tools at livingroomconversations.org, which are simply conversation guides for you and somebody who thinks differently from you. The idea is that you invite a friend who has a different opinion on something, each of you invites two friends, and you have a little conversation with each other following the guide. There are over 100 guides covering almost any topic you can imagine, including how to approach Thanksgiving together, which is when families get particularly interesting. The way I always like to describe it is: people come wanting to talk about the issue, but by following these steps, you accidentally discover the humanity of the other person. Once we understand the humanity of the person and where they're coming from, even if we're on completely different sides, we realize that we care about the same things and we're trying to accomplish the same things. And the room and the tone and everything changes. There are a lot of people who just love that experience, and there are other people who see it as a practical matter: this is how we get things done. But it's very doable. What's interesting is that as human society has grown in technology, we've also grown in our knowledge of psychology. We actually know more today about how to get people who are on completely opposite sides of the spectrum and hate each other, people who were literally shooting at each other in the past, to talk to each other, listen to each other, appreciate each other, and work together. We know more about how to do that today than we ever have in the history of human civilization. But we also know how to separate people better than we ever did, and right now the technology is leaning into that separation, into tribal warfare, so that some people can have more power and more money. But we also have the tools to connect even across those divides.
Part of what we have to do is convince folks like you all, because I'm not sure how often you've had these conversations, that it's okay, that it's actually good. It can honestly be fun to have a discussion about a tough topic like racism or the border, or whatever topic you're concerned about. And really listen. Listen first to the other person to understand where they are. Once that other person hears and sees that you're truly listening to them and understanding who they are, nine times out of ten they'll return the favor and truly listen to you. If we go into the conversation saying "I want to be heard," you have a bunch of people who all want to be heard and aren't listening to anybody else. If you start by truly listening to understand the other person, that changes everything. We have at AllSides a range of technologies that we're using in schools, in a program called Mismatch, and we're going to roll it out more thoroughly throughout the nation as we develop the technology and make it more scalable. Making it easy for anybody to do that on demand is what we ultimately want: you might be upset about an issue, so you get online, talk to somebody different from you in a small group, and actually begin to understand it. That's what we're hoping to do and what we're planning on doing. I think that as people see the problem with our current environment, there will be more and more solutions, and more and more adoption of solutions like that, so we can understand each other.
Rachel Strausman:
Yeah, I think that's a great place to end: at the end of the day, technology is causing a lot of these issues, but so are we. The good thing is that we're making change and we're making progress. Hopefully all of this progress in engaging with people of different beliefs and becoming more aware of our filter bubbles can help prevent this from happening a third time. Who knows where technology is going to go, but if we experience another technology boom like this, hopefully we can have more of that awareness. I think your interactions with Joan, and the way you transitioned AllSides from being just about information to being about information and connection, are a testament to that: you can be going down one path, engage respectfully with people who hold completely different views from you, and, if you engage correctly, come away better off, with your eyes opened to doing even more good for the world. You were already doing so much good, and by getting to interact with Joan, and I'm sure the same goes for her with you, you got to really expand that. Thank you so much for joining us today. It has been an honor to speak with you, and we really appreciate you being here.
John Gable:
Thank you so much. This has been fun.