For our final episode of season 2, this time it’s Jim who sits on the other end. Our special guest host is his longtime mentee — and CEO of Nexleaf Analytics — Nithya Ramanathan.
What led to the decision to prioritize social impact over the stereotypical Silicon Valley (attempt at) wealth for its own sake? Guided by Nithya’s insightful questioning, Jim’s narrative takes us through the early experiences that shaped his sense of empathy, all without his ever losing his innate curiosity for science and technology. As they discuss the evolution of social entrepreneurship and the significance of data sovereignty, Jim and Nithya explore the foundational layers of tech that are needed for the future of systems change.
This episode is a call for all tech professionals and leaders to harness their skills in service of the world’s most pressing challenges. Join us as we close Season Two with a conversation that’s as much about looking forward as it is about looking back.
Transcript
Jim Fruchterman [00:00]
Welcome to Tech Matters, a weekly podcast about digital technology and social entrepreneurship. I’m your host, Jim Fruchterman. Over the course of this series, I’ll be talking to some amazing social change leaders about how they’re using tech to help tackle the wicked problems of the world. We’ll also learn from them about what it means to be a tech social entrepreneur, how to build a great tech team, exit strategies, the ethical use of data, finding money of course, and finally, making sure that when you’re designing software you’re putting people first.
Nithya Ramanathan [00:38]
Hello, everyone, I’m Nithya Ramanathan. And today I’m going to be your host. I wanted to do this interview with Jim because he’s actually been a mentor for me for decades. So actually, even long before he and I knew each other.
Jim and I are going to cover a lot of ground from Jim’s childhood to how Jim has actually been a mentor for the entire social sector, and of course, to covering big tech and AI. So stay tuned.
—
Nithya [01:11]
Thank you so much, Jim! I’m so happy to be here. And in addition to having the honor of being Jim’s board chair, I’m also a CEO over at Nexleaf Analytics, which is what has given Jim and I the opportunity to work together for so many years.
So I am so thrilled to be here to flip the tables on you, Jim, and actually get a chance for your audience to hear more about you and your views and kind of what got you here in the first place.
So as you know, Jim, I’m always interested in kind of understanding what motivates people, what really drives them. And I’m a big believer that a lot of that comes from our childhood. So I’d love to kind of hear from you a few of the experiences from your childhood that really kind of stand out to you as being formative.
Jim [02:09]
Well, I basically grew up at the public library, and so, you know, partly because I was a lot more comfortable with books than with people. And so, you know, I would be, you know, as a third grader reading the, you know, adult astronomy textbooks, and just, you know, diving deep, which made me the most obnoxious kid in the class, because, you know, in elementary school, I would be, you know, correcting my teacher, it’s like, well, you know, actually, at this moment, Pluto’s not the furthest planet — it was back when Pluto was a planet, and it was not appreciated.
And I think one of the more formative moments for me was in middle school, where I was walking home, and my path home led through what’s called a retention basin. And in the Midwest, it was like a depression in the ground where floodwaters would go rather than flooding your neighborhood.
And a group of kids held me down so that a kid who had a broken leg could throw a cinder block at me, which was not a very pleasant moment. And I remember running away, and there’s a circle of kids, you know, around me, and I remember being grabbed by this girl trying to keep me from running away. And I looked at her, and I didn’t even know her. And that kind of started me down a path of going, maybe being right all the time, and being the smartest kid all the time, maybe, maybe I should be thinking about, you know, other people a little bit more.
Nithya [03:47]
Yeah. Thanks for sharing that. And I think it’s really interesting that you came out of that experience and actually turned it around and said, hey, I need to think about others more. I think a lot of people might have gone into more of a self-protective mode from that experience.
And so I’m kind of curious, what do you think it was about that experience, or about you, that actually turned you outward?
Jim [04:14]
I don’t know, I just, I didn’t want to be hated. That was, you know, that was — that’s enough of a motivation I think for most human beings to go, hmm, I’m doing something where people are hating me. Maybe… I don’t think I thought, oh, I deserve it, but I thought, oh, maybe I could change my behavior a little bit. But it doesn’t mean that I changed overnight, right? I mean, you know, in high school I was reading a science fiction book a day, you know, cuz it was easier than getting dates.
And I went to a Catholic boys high school, and that was another key turning point for me because, you know, I wanted to become a scientist, you know, that was kind of my vision. And we had brothers and fathers and sisters running around this high school, and they were into computer software and physics and math. And so that made kind of an impact on me, the idea that you could both be dedicated to a life of service and, like, write software programs or geek out on calculus or something. And it was in that high school that I remember meeting the head of the math department, Father Purim, and he had the Caltech Beaver Cheer on the wall.
And I actually really hadn’t heard of Caltech. I’d heard of MIT, but Caltech? So that’s where I ended up going, and that was actually a deliberate choice, because I got into one of the Ivy Leagues and I thought, do I want to be three sigma weird — to pick the statistics term — at an Ivy League school, or do I want to be an average person at Caltech? And it turned out that actually I was above average in socialization at Caltech.
Nithya [06:03]
Yeah, this empathy plus engineering is a theme that I think comes up a lot in our conversations, and it’s what makes them stand out to me.
And, you know, so a lot of people don’t know this, but Jim, you were the person who actually inspired me when I was in a really similar turning point in my journey, where I knew I liked engineering, I worked in Silicon Valley, but found it pretty soulless and, you know, lacking the empathy part. And so when I went to grad school in computer science, I was still kind of searching for that North Star. And you really gave me that.
I mean, you didn’t know it at the time. And so, you know, as I was kind of like searching the early version of the Internet back then, right, like early 2000s, I came upon one of your earlier ventures, Benetech, and really kind of kept that as my inspiration.
And it really kept me going through grad school as I was, you know, building up time synchronization algorithms and writing driver code for actuators. I kept Benetech in mind. And I was like, that’s where I could end up. Like, that is a possibility in the world where there is somebody else who brings that engineering plus empathetic mindset. And so that is something that really kind of inspired me into what is now the tech for good field.
And of course, I didn’t know it, but you are really one of the major pioneers in kind of helping to define and really bring forward tech for good. And so I’d love to talk a little bit more about kind of your thinking and your practice when it comes to this field.
Jim [08:01]
Well, you know, at Caltech I learned about pattern recognition, the idea that you could have a computer recognize an object in the real world. It was very primitive technology at the time. And the example that the professor was using was making a smart bomb, that you had sort of a template of the target and a camera in the nose of the missile, you’d fire it, it would find it and blow it up.
And so years later, when I got invited in to talk about starting a startup that could make a machine that could use that pattern recognition, I went back to my college days and remembered that I thought the socially cool application of pattern recognition was helping blind people read, as opposed to making bombs.
And so I had the very first meeting of the three founders and said, hey, you know, if we build this, not only can we make money, but we can help blind people. And then, when my board vetoed the idea for, you know, great business reasons, but terrible social reasons, I thought, oh, I should start up something that does tech for good. And I actually called it initially “good tech”.
And then we settled on Arkenstone as the name of the nonprofit. This was the original name of Benetech, which was gonna make reading machines sublime, “the stone that sees”, again, you know, my geeky Tolkien connection came out. And I told my wife I would do this nonprofit thing for a year. I started a for profit at the same time. Actually, I started the for profit a month before Arkenstone. And then I ended up running them both for five years and they both grew. And eventually at one point I had to choose to be the CEO of one or the other, because they were both $5 million a year kind of organizations and they needed full-time CEOs.
I chose the nonprofit. I was having so much more fun running a nonprofit; the people were nicer, blind people were very appreciative of having a tool that helped them read independently, freeing them from depending on sighted people to read to them. So I just, I kind of stumbled into this as what I should do because, you know, I just thought I would set up this charity as a sideline and keep on working on making money in Silicon Valley.
Nithya [10:18]
What did that feel like at the time? Do you remember even?
Jim [10:22]
Well, you know, my wife and I had to talk about not making as much money. And my wife’s a musician. [laugh] So the idea of not making much money was already part of kind of, you know, her vision of some of the trade offs. And I think, frankly, I think she thought, you know, that I was happier working in the nonprofit sector than I’d been working in the for profit sector.
You know, running a Silicon Valley venture backed startup is very stressful. And I had some very miserable moments, you know, at the startup. I got demoted, finally, at one point, it was just… it was just really, not fun. And here I was doing something technical, but having a blast.
And so, yeah, I mean, we were gonna make a living either way, it’s just, you know, we weren’t gonna become billionaires. And, you know, many of my peers here in the Silicon Valley area just fundamentally did not understand. It just didn’t make any sense: “Okay, so you start this nonprofit, and then you take it public, and you get rich, right?” I’m like, oh no, we’re kind of already public. And no, I would go to jail if I tried to get rich off of it. [laugh] They were like… so why are you doing it?
Nithya [11:39]
[laugh] Yeah, yeah. How have you seen that conversation change over the years?
Jim [11:44]
It’s become more socially acceptable to talk about social good, right? And of course, who the heroes are has evolved over time. I mean, when I started here, Hewlett and Packard were the philanthropists that we looked up to, and then Gordon Moore of that sort of generation. And then Jeff Skoll and Pierre Omidyar, the two key leaders at eBay, you know, gave pre-IPO stock to the community foundation, and that became the foundation of their philanthropic efforts. And so over time, it’s gone from being like, well, we know two guys who do it.
Nithya [12:29]
So this is a good point — I’ve got a little surprise here for you. I have a really interesting quote that I’m going to read to you from somebody at Omidyar.
But before I do that, I just want to reflect something that is one of the great delights of being in conversation with you, Jim, is that you are able to kind of be doing something, but you’re able to observe the entire system. I think another way to put this is like, you’re able to be on the dance floor, but also simultaneously be up in the balcony looking at the dance floor as well.
And so we’re going to come back and talk about this theme, you know, but I wanted to just kind of set that stage before I read this really nice quote that was unsolicited, I should mention, but I did get his permission to share it publicly. I don’t know if he’ll realize how publicly we’re going to share this. [laugh] So I was talking with Anamitra Deb at Omidyar, and he said this really delightful thing about you that I found so true, which is that:
“Jim plays a senior mentor who other people look to as being a pioneer and who value his expertise. And he also brings strengths as a problem solver, and is brought in frequently to recommend solutions and advise others. And the value he brings is somebody who has, over the course of a couple of decades, built institutions and a body of work and a body of knowledge that other people find to be valuable.
And he’s generous with his time, with his insights. There’s lots of people who have built their legacy assuming a zero sum game. Jim is the absolute opposite of that. He has built his life to bring oxygen for everybody.”
That’s the end of the quote. So I really love what he said about you, because I think it just captures so much of the imprint of you that you are leaving on this world.
And I should mention, so Anamitra is the Managing Director for Responsible Tech, of course, at Omidyar. So he’s seen a lot of people like you, and yet you really stand out to him as this kind of tower in the space.
Jim [15:03]
Uh, it’s very nice. Brings tears to my eyes. So…
Nithya [15:09]
Good. It has the benefit of being true. [laugh]
Jim [15:14]
[laugh] Well, you know, it actually connects something to the geek in me, right? I’m always building mental models of how things work. And then testing those models, does it work this way? Does it work that way? You know, I often will role play in my head a meeting with someone important to try, okay, maybe it’ll go this way, or go that way, you know.
And the breakthrough shows up in a lot of different ways for me. The first breakthrough was, for 10 years I didn’t know I was a social entrepreneur. I didn’t know I was running a social enterprise. I thought I was the one lone weirdo in Silicon Valley not trying to become — at that time — worth $100 million. Now they all want to become billionaires. But you know, back then, it was only $100 million.
And then I met a guy who worked in homelessness in San Francisco, Jed Emerson. And people kept saying, Oh, you should go talk to Jed Emerson. What does he do? He works with homeless people. Oh, okay. And I wouldn’t go. And then finally, like, after the fourth or fifth person said that I should go, I went and I talked to Jed and it turned out that he was funding social entrepreneurs who were doing business and doing social good.
And that’s kind of what I was doing. And he introduced me to this whole crowd of people who were my peers, very few of them were tech people, but they were all like, you know, running a business-like operation and maybe employing people who are difficult to employ. And he pointed me to a Harvard Business Review article. And the idea that someone had written an HBR article about this field, I mean, that was eye opening, because suddenly I’m like, oh, that’s why that didn’t work, right? Because someone had, you know, basically synthesized a theory. And so I think it was at that moment, I went, boy, I should spend more time and figure out why things work and don’t work rather than just like, you know, trying things.
And what’s happened over the 20-plus years since I met Jed is that I’ve gotten to meet so many other great tech for good leaders. And so, you know, the insights that I’m bringing are mainly not from my own experience, they’re from all these people I’ve met and all these projects I’ve seen that, you know, again, fail — most of the time people fail — that’s normal in tech, right? And so with all those examples, both good and bad, you start building up a theory of what works and what doesn’t work on the average.
And I think that, you know, part of this podcast is, you know, not only to bring the theory, but also bring the personalities of these great founders in the hopes that, you know, like, like when I interviewed you, someone will see themselves in Nithya and go, Oh, I can be like Nithya when I grow up.
Nithya [18:05]
Mm-hmm. So, you know, that’s something that you and I have talked about, the ways in which tech, and also the platform that, you know, we have and have access to can help amplify voices. There are now, likely, thousands of social enterprises out there, and increasingly more of them are bringing tech to the fore in terms of what they do and how they do it. And so, in some ways, you’ve built this tech for good field.
And so, now as you kind of enter into this next, I’d argue next phase of your work, maybe we could start by just defining systems change. So, you know, I think this term, systems change, is thrown around a lot, but how do you define systems change?
Jim [18:57]
Well, you know, normally I’d go off and study up, and come up with a more carefully drafted definition. But this time we’re going to wing it. [laugh]
I think of… I kind of take a problem-centric sort of vision on systems change, right? It’s: There’s this social problem, whatever it is, you know, people with disabilities don’t get equal access to an education… the human rights of this minority group are relentlessly exploited… maybe to the point of genocide. So you pick this kind of problem, and I believe that what systems change is trying to do is take the current state of that system, where these bad things are happening and people are not getting the opportunities they need. And it’s not that people aren’t working on the problem, it’s just that they’re working on the problem using, let’s say, the current tools that they have. I think systems change is like going up to the next level.
The nonprofit sector is 10 or 20 years behind the times on technology. So for me, if there’s a drive for reform in an area where people are like, this isn’t good enough, and they have a vision of what would be much better, the idea is that I’d show up with some technology and say, hey, I think this technology can make that systems change effort a lot easier. So usually it’s not technology alone, but technology in partnership with a change in behavior throughout this field of working on this social problem, where they went from this kind of thing to that kind of thing.
Now that’s very theoretical. But there are lots of practical examples out there. So, when I started in the reading machine area, you know, there was a $40,000 reading machine, but very few people could afford it. So blind people were read to, like someone sat next to them, and they had to convince someone to sit next to them and read something to them. Or they got a book on tape, and there were very few books on tape. Maybe there were a few books in braille if they were a braille reader. And what we did is we created a reading machine where any blind person or their teacher or family member could scan in a book, and suddenly the number of books that were available was far greater. It cost a lot less to make an accessible version of a book.
And then the next thing we did is we made an online library where people could share those books. Effectively, this was the move from printed books, braille books, and books on tape to eBooks, digital books, because you can turn a digital book into anything. Suddenly, digital books are 50 times more cost effective than books on tape.
So, I mean, that’s a very easy example. It’s well-suited to tech, but the thing people don’t really realize is that so much of social good is not about delivering bags of rice and tangible objects, it’s about information and knowledge. And when you talk about moving information and knowledge around, well, then suddenly IT or a new invention — you know, a new drug, whatever it might be — can really move things. If you use the power of knowledge to do systems change, suddenly you might have a new vaccine that really is effective against a new disease. I mean, amazing things like that can happen.
Nithya [22:21]
So, okay, there’s so much that I want to dig into there. But what I’m interested in is something that you and I have talked about quite a bit, which is the critical infrastructure that’s needed. And, you know, you could call it the systems, the ecosystem, whatever you want, right, but the core aspects that are needed so that an entire sector can work better and leverage tools like AI and machine learning. And, you know, as engineers, I think this metaphor really works for us. We’ll see, you know, if we can come up with something a little better. But of course, you and I naturally think about the internet stack and how without TCP/IP there would be no HTTP. And without that, there would be no Google, no Uber.
So one of the things that I’m wondering about is as we now think about this next phase, what is that TCP/IP equivalent for the tech for good space?
Jim [23:26]
Well, the tech stack analogy is one I like, though I’m not sure how accessible it is to everybody. But the listeners to our podcast kind of have a tech bent. So, you know, if I was starting a for profit company right now, my tech stack would be 97 or 98% done, right? If I want to create an Uber, all this infrastructure works. We’ve got storage, we’ve got processing in the cloud.
But I think that because the nonprofit sector is kind of far behind, our tech stack’s more like 50 or 60% done. And when you have to build most of the tech stack, or half of it, instead of a couple percent, you spend an awful lot of time doing stuff that is prone to break and is expensive, and the nonprofit sector really can’t afford that.
So I often think, what’s the other missing infrastructure that the nonprofit sector wants, or needs, that doesn’t exist? And it turns out that we’re working on sort of two threads, right? One is, what is the software stack that you need to run, you know, in our case, an effective contact center for a helpline, right? So we built that. Or what’s the technology you need to collect data around climate change, climate adaptation, regenerative agriculture, all these sorts of things. So we’re busy building; in some cases we’re finding the plumbing already built and making it more accessible to people. That’s often a piece of our solution. And then we’re actually trying to build the solution itself, much as a for profit startup would, but we’re doing it in fields that aren’t going to make anyone a billionaire, because the market’s just not that big.
And then the other thing we’re also working on is how important data is in all of this. And it turns out that the data infrastructure in the social good sector is wildly deficient. And if that’s the case, then that means that, for example, all the AI of today is trained on data that’s more or less from middle class and above Americans, you know, Canadians and a few Europeans. But I mean, the thing is that it’s not representative of the people, it’s not representative of the planet. And yet, why? Because the data about them is just not considered important. Their languages are not considered important. Their neighborhoods and their fields are not considered as important as Iowa cornfields, right?
So overcoming that is all part of enabling any systems change effort in any field. We’ve got to, like, build the tech, and we have to build the data infrastructure. Oh, and then we have to come up with, well, what are we doing that’s better that the tech is helping enable? Because automating an existing broken system, and making it cheaper, just means you get to broken faster and cheaper. That’s not really systems change.
Nithya [26:21]
Mm hmm. So you’ve brought up kind of this data infrastructure, which if you think of it as like a layered cake, right, like, let’s call it the middle layer, this data infrastructure. And then there’s, of course, the applications, like once that data infrastructure is working, what that looks like is the data is available, it’s reasonable quality, it’s tagged in ways that can be used to feed training algorithms for AI or decision making, you know, business intelligence, like all the sort of ways that we know data can unlock change, like that top layer of the cake, all of that becomes possible once this kind of core data infrastructure is working.
Another thing that I think is really interesting about Uber, Meta, and, you know, we don’t need to throw anybody under the bus, but just any one of these kind of big tech companies, is that they’ve built their own privatized stack. And they can afford to do that because of the share market and the market capitalization that they’re able to access. And that’s something that we can’t afford in our sector, right? So, you know, whether it’s Tech Matters, or, you know, my organization Nexleaf, or the dozens of organizations out there, we can’t each be building our own entire data infrastructure. We can’t be building our own three layer cake.
Jim [27:40]
Mm-hmm.
Nithya [27:41]
And so… then what we are talking about is cooperation at a very large scale, and that cooperation needs to be seen at the donor layer — I mean, with the donors — with implementers, with doers, with communities, with countries. And yet that is probably the only way forward. And so, I guess I’m kind of thinking about how do we approach this next phase of work, where this type of core, shared, data infrastructure is going to be needed in order to enable the types of innovations in applications of machine learning and AI that, that we know are needed and can be transformative?
Jim [28:33]
Uh, you know, Benetech, Tech Matters… Benetech used to have its own servers in our offices. And then we had servers in a data center, and then we stopped having our own servers and we basically paid Amazon. And the reason that we pay Amazon for that critical data infrastructure is that it’s cheaper, and it’s better, and it’s more secure. And so, as a small organization, it makes far more sense to rent quality data infrastructure. And of course, Amazon has made a big business out of that, as has Google and Microsoft. And we believe that that infrastructure is trustworthy and that they’re not using it to hurt us. Maybe that’s naive. I don’t think so.
So we, we’ve now outsourced all of that infrastructure. Now this is great if you’re an American organization… other countries hate this. They hate that their data infrastructure is stored in Virginia.
Nithya [29:34]
And that’s justified!
Jim [29:35]
Oh, yeah. I mean, the U.S. government — we know that the U.S. government is regularly asking for data about non-Americans from American companies, right? And of course, we’re terrified that the Chinese are doing the same thing with TikTok that we’ve just been doing forever. But hey, we don’t have to go there.
So we have a data infrastructure problem. And then we have the questions of how open source are you, right? We build open source software, but often we’re building on top of Amazon Web Services, which is not open source software. And so, are we fully open source? Not exactly.
And then there’s the whole other issue that you and I have been talking about, which is, what’s the data norms infrastructure? Because the norm in surveillance capitalism is, I go to a website, a commercial website, and any one of seven thousand companies may be spying on me for Meta. So I may not use Facebook, but, you know, there’s still all these other companies collecting data and providing it to Facebook… So they have a profile on me, even if I’ve never used any of their products.
The fact that this infrastructure works as well as it does and makes a lot of people money is amazing. But the nonprofit sector doesn’t have anything like that, which means that instead of being concentrated in one of the 10 gigantic companies, the data is in a million tiny silos. And the power of data for machine learning, for insights, is when you have a whole bunch of it.
So, you know, one of the things that you and I worked on, Decolonizing Data, the idea that data ownership and control and the benefits of data should flow to the communities about whom the data is being collected, you know, just recently we launched the Better Deal for Data, which is to create kind of a legal and norms framework: hey, if you’ve got confidential sensitive data, it should be encrypted so that people can’t use it. If you’re going to use it for research, you should take out the personal information. If you do research, you should publish the research open access so anyone can get it for free.
So we’re not only working on the tech infrastructure and the plumbing. We’re also trying to set this norm where the control and the ownership of the data is shifted away from, let’s say, the tech industry and towards people and communities and countries. And also, ideally, making the data compatible as long as you’re going to use it for social good as opposed to selling the data to Meta.
Nithya [32:10]
You know I’m a huge fan of the work that you’re doing with Better Deal for Data. And I really love how, you know, where you and I started in terms of writing about decolonizing data and writing about the ways that the value from data had been extracted from communities, much like the sort of physical assets like oil and jewels and natural resources had been extracted from communities for decades.
And so, you know, we decided to call that data colonization. And of course others had written about it before us as well. But as we took a look at the tech for good space, you and I decided that we wanted to apply that same lens in looking at these different technical ventures. And I really love how you’ve now taken that more conceptual framework and built it into this very practical set of norms and guidelines so that people can now basically turn this into new behavior and practice.
So I guess on that note, I’d love to kind of hear from you as, like, kind of a final sort of conclusion here, as you work to bring in more people into this space, bring more solutions builders into this field, you know, what is your call to action? What do you see as the big opportunity? What is your kind of hope for the future here?
Jim [33:40]
I think that a lot of what I want to do at this stage in my career is to make it a lot easier for thousands of leaders and hundreds of thousands of tech people to go off and tackle the world’s biggest problems. Because I think that’s actually what tech people love doing, solving problems, and then they get pigeonholed.
And I think that society is waking up to the importance of technology and data. Technology and data have reshaped society extensively. I’m researching for a book, and I looked at the Skoll Award winners over the last five years: 60% of them have tech teams. Anybody who’s ambitious about systems change, anybody who wants to move the needle on any social problem… that involves changing the lives of millions of people. And how could you possibly do that without software and data, and maybe a technical innovation or two on the hardware side? It’s just unimaginable.
And so, how do we create the same kind of infrastructure, standards, norms of behavior, business models, and AI models that are more representative of the world? Trying to get all of those things in line will create a much more fertile space for innovators to go off and, instead of trying to solve half the technical problems that they’re tackling, focus on the two or three percent that is their magic, because all the other stuff just works. I’d like to have that happen.
Nithya [35:25]
I love that. Well, thank you Jim for letting me co-opt your podcast episode, and letting me kind of turn the tables on you. I’ve really enjoyed this conversation.
—
Jim [35:40]
Many thanks to Nithya, not just for being a great host, but also for helping us with the final interview of the season. That’s right, this episode concludes our amazing journey through season two, and it couldn’t have ended with a better guest host, who has also made amazing contributions to tech for good.
But the story doesn’t end here. We’ll be back with some brief updates next week. In the meantime, as always, please rate, review, and share this podcast with anyone who you think might relate to the story that we told today.
Thanks again for listening, and don’t miss out on our final epilogue to the season next week.