Redefining Society and Technology Podcast

Book | Manipulated: Inside the Cyberwar to Hijack Elections and Distort the Truth | Updated Edition, April 25, 2024 | A Conversation with the author, Theresa Payton
| Redefining Society with Marco Ciappelli

Episode Summary

In the age of technology and social media, the proliferation of misinformation and manipulation has become a pressing issue in our society. Theresa Payton's book, "Manipulated," dives deep into the intricacies of cyberwarfare, election hijacking, and truth distortion, shedding light on the intersection of politics, technology, and human behavior.

Episode Notes

Guest: Theresa Payton, Author & CEO Fortalice® Solutions LLC [@FortaliceLLC]

On LinkedIn | https://www.linkedin.com/in/theresapayton/

____________________________

Host: Marco Ciappelli, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining Society Podcast

On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/marco-ciappelli
_____________________________

This Episode’s Sponsors

BlackCloak 👉 https://itspm.ag/itspbcweb

Bugcrowd 👉 https://itspm.ag/itspbgcweb

_____________________________

Episode Introduction

Theresa Payton is one of America’s most respected authorities on Internet security, net crime, fraud mitigation and technology implementation. As White House Chief Information Officer (CIO) at the Executive Office of the President from 2006 to 2008, she administered the information technology enterprise for the president and 3,000 staff members. Prior to her time at the White House, Theresa Payton was a Senior Technology Executive in banking, spending 16 years providing banking solutions using emerging technologies. Payton founded Fortalice in 2008 and lends her expertise to government and private sector organizations to help them improve their information technology systems. In 2010, Security Magazine named her one of the top 25 “Most Influential People in Security.”


Exploring Technology's Impact on Society

In a recent conversation with Marco Ciappelli on the Redefining Society podcast on ITSPmagazine, Theresa Payton discussed the updated edition of her book, "Manipulated," which addresses the rapid evolution of technology and its implications for societal manipulation. With a background in cybersecurity and extensive experience in the field, Theresa brings a unique perspective to the challenges posed by advancements in AI and deepfake technology.

Unveiling Fictional Scenarios Inspired by Reality

One of the remarkable features of "Manipulated" is the incorporation of fictional vignettes that serve as cautionary tales about the consequences of unchecked manipulation. By weaving together fictional narratives with real-world threats, Theresa prompts readers to consider the potential implications of technology in shaping our social discourse and democratic processes.

The Power of Dialogue and Critical Thinking

Through her book, Theresa aims to inspire critical thinking and open conversations about the role of technology in shaping our perceptions of truth and reality. By encouraging readers to question biases and engage in dialogue with varying perspectives, "Manipulated" serves as a catalyst for deeper reflection on the impact of manipulation campaigns on society.

Advocating for Better Awareness and Regulation

As lawmakers grapple with the challenges of regulating social media platforms and combating misinformation, "Manipulated" calls for a more nuanced approach to safeguarding freedom of speech while preventing the spread of harmful manipulation tactics. By fostering awareness and advocating for stronger regulations, Theresa highlights the importance of collective action in combating the threats posed by digital manipulation.

Embracing a Multifaceted Audience

"Manipulated" is designed to resonate with a diverse audience, from policymakers and experts in the field to everyday individuals seeking to navigate the complexities of the digital landscape. By offering insights and perspectives accessible to all, Theresa invites readers to join the conversation and take a proactive stance against manipulation in all its forms.

This book emerges as a compelling narrative that illuminates the intricate relationship between technology, manipulation, and societal impact. As we navigate an increasingly digital world, Theresa Payton's insights serve as a reminder and guide, urging us to engage critically, foster dialogue, and advocate for a more informed and resilient society.

The updated paperback edition of her bestselling book "Manipulated" includes new information on real-world cases involving AI, ChatGPT, TikTok, and all the latest and greatest exploits of manipulation campaigns, and will leave readers both captivated and chilled to the bone.

About the Book

Cybersecurity expert Theresa Payton tells battlefront stories from the global war being conducted through clicks, swipes, internet access, technical backdoors and massive espionage schemes. She investigates the cyberwarriors who are planning tomorrow’s attacks, weaving a fascinating tale of Artificial Intelligence mutations carrying out attacks without human intervention, “deepfake” videos that look real to the naked eye, and chatbots that beget other chatbots. Finally, Payton offers readers telltale signs that their most fundamental beliefs are being meddled with and actions they can take or demand that corporations and elected officials must take before it is too late.

The updated paperback edition, including new information on real-world cases involving AI, ChatGPT, TikTok, and all the latest and greatest exploits of manipulation campaigns, will leave readers both captivated and chilled to the bone.

_____________________________

Resources

Manipulated: Inside the Cyberwar to Hijack Elections and Distort the Truth (Updated Edition, April 25, 2024): https://www.amazon.com/Manipulated-Inside-Cyberwar-Elections-Distort/dp/1538188651

____________________________

To see and hear more Redefining Society stories on ITSPmagazine, visit:
https://www.itspmagazine.com/redefining-society-podcast

Watch the webcast version on-demand on YouTube: https://www.youtube.com/playlist?list=PLnYu0psdcllTUoWMGGQHlGVZA575VtGr9

Are you interested in sponsoring an ITSPmagazine Channel?
👉 https://www.itspmagazine.com/advertise-on-itspmagazine-podcast

Episode Transcription

Book | Manipulated: Inside the Cyberwar to Hijack Elections and Distort the Truth | Updated Edition, April 25, 2024 | A Conversation with the author, Theresa Payton
| Redefining Society with Marco Ciappelli

Please note that this transcript was created using AI technology and may contain inaccuracies or deviations from the original audio file. The transcript is provided for informational purposes only and should not be relied upon as a substitute for the original recording, as errors may exist. At this time, we provide it “as it is,” and we hope it can be helpful for our audience.

_________________________________________

[00:00:00] Marco Ciappelli: Hello, everybody. Welcome back to another episode of, uh, ITSPmagazine podcast. And this in particular is Redefining Society, where we talk about society and technology. I'm very excited to have, uh, Theresa Payton back with me. It wasn't that long ago since we talked, and we touched on a few things related to the TikTok ban and, uh, I would say, what does it mean when politics and technology cross, and relationship, uh, international relationships and all of these, because there is somebody that studied this and practiced this for a long time and, uh, she wrote a book in 2020, if I'm not wrong, we talked about it on my show, and now there is an updated version of the book because, you know, four years in this world of technology is like 25 years, maybe even 30, uh, compared to what it used to be.
 

And so, she's here to present this new book, which is not really new. It's an updated version, but she brings in a lot of scenario, what-if situations that obviously, with all the things that have changed and the things that didn't change, uh, needed to be adjusted a little bit. So enough me. Theresa, welcome back.
 

[00:01:27] Theresa Payton: Hi, thanks for having me back.  
 

[00:01:29] Marco Ciappelli: Always a pleasure. Always a pleasure. So, as the book demands, a little introduction about yourself, even if maybe people already saw, not too long ago, the other episode, but, uh, we always assume there are new listeners. And, uh, we want to let them know who you are.
 

[00:01:48] Theresa Payton: Sure. Uh, real quick about me, Theresa Payton.
 

I spent 16 years in the financial services industry and, uh, just, you know, delivering cutting edge technology that customers would want to use, that enabled the bank, but also had to be secure, private, confidential. And then I had that opportunity to work for the White House under President George W. Bush.
 

And so I served during the second half of his second term, so 2006 to 2008. And while I was there, uh, realized that the country had invested a lot in training me on what the adversary's capabilities were. And I realized that I wanted to dedicate the rest of my career to protecting and defending the digital footprints of businesses, governments, and individuals.
 

So I had the opportunity to actually stand up, uh, my company, Fortalice Solutions, and I've got two partners in my firm, Melissa and Bridget, and, uh, we actually met at the White House. And so I've been, uh, running Fortalice Solutions and serving, you know, the Fortune 150 through 500, privately held large, um, enterprises, uh, midsize firms.
 

And, um, you know, we do a lot of government work as well as protecting, um, high net worth individuals and, and VIPs. So it's, uh, it's really great to be here with you, Marco.  
 

[00:03:16] Marco Ciappelli: And it's again, really great to have you. And I am extremely excited about this topic because I love everything sociology. I love the way that it changes people's, uh, minds, and technology. In the end, I'm realizing more and more on cybersecurity, it's all about the human brain. As much as it's, you know, some technology actually comes into place, but when it comes to, uh, conditioning people and what is called social engineering, it's, it's something that I think in the end, it's, uh, it's very relevant, and the human side of cybersecurity is more and more relevant. But in this case, you're, you're talking about this book called Manipulated, and it talks about cyber war, um, um, hijacking elections, distorting the truth, which is part of your title.
 

I was reading that. I'm not that good at remembering them. But uh I am curious to start with this. Number one, as I mentioned at the beginning, four years, you felt the need to update this book. So let's start from that. Why couldn't you just leave that book the way it was and write a completely new one?
 

[00:04:32] Theresa Payton: Yeah, I, it, some things are still exactly the same. 
 

And, you know, the, the first thing to mention that's still exactly the same is, hmm, disinformation and misinformation is still alive and well on the internet, on big tech and social media. So sadly, uh, a lot of the original book is the same. Maybe things are a little worse. Um, but I, I do have a lot of optimism based on the amount of awareness and conversation we've had the last four years.
 

So, but we definitely, I needed to update the book because generative AI was not out in the kind of public domain when the first version of the book came out. And now, what used to take manipulation campaigns a little while, um, you know, you had to have some technical know-how, but also you had to have some really great language skills or hire translators to really have a narrative pop and become viral within a certain demographic.
 

And now with generative AI, you can say, you know, I really want to reach the voters of Texas. And you don't even have to be a native Texan to sound like a Texan and not sound like, oh, you Hollywood Texan. So for example, instead of saying, um, you know, sounding like John Wayne, when you do your, um, your, your translation, you could actually say to generative AI, okay, I want this written in how Americans speak English.
 

As a matter of fact, model it after Matthew McConaughey, who is a, you know, he is from Texas. And, um, you know, make it sound like somebody from, and then you can pick different parts of Texas, and then generative AI will find the colloquialisms, the sayings, and integrate that into the conversation. So now the technical barriers are super low, um, very inexpensive.
 

And now the language barrier, as in the spoken language, uh, American English doesn't have to be your first language anymore. So you could speak whatever language, maybe not even speak, you know, British. Um, cause let's just, let's be real, Marco, right? So people in the UK speak English, people in America speak English, but I think it sounds very different.
 

So you could definitely get down to a town and get very, very localized in your speech. So now suddenly you've got this opportunity to do manipulation campaigns at a speed and a scale and a level of economies that wasn't available back in 2020.
 

[00:07:26] Marco Ciappelli: Translation. I was actually looking at some video the other day where I can take my face I'm talking, like having a speech like right now. 
 

Not only use my voice to translate in, I don't know, 25 different languages, 10 seconds each, but also adopt the movement of my mouth and my lips to match the translation. And so it's not just an overdub like you watch the movies and you're like, Oh, you know, that's not the original language, but it's, It's, it's, it's as real as the real can get. 
 

So scary. I mean, fascinating. And, great tool. What do you use it for? Right? So any other major changes that, that you think AI is actually affecting the scenarios? And are we using it also, the good people, the good side, to improve the detection of all of this?
 

[00:08:34] Theresa Payton: Yeah, you bring up a great point because generative AI and AI algorithms aren't inherently bad.
 

They're tools. It's just that when they're in the wrong hands of people who have, you know, kind of, um, nefarious intent, that's the misuse of that technology, right? So the technology itself isn't inherently bad. It's the lack of guardrails, and it's the lack of governance, that allows bad people to use the tool in a bad way.
 

Um, but I mean, I think, you know, the good news is, is there are deepfake, uh, tools where you can actually put a deepfake, either a photograph or a video, um, into the tool, and it'll tell you whether or not they deem it to be potentially a deepfake, or at least an edited photo or an edited video. So I think that's incredibly helpful.
 

Um, you do have where, uh, different nations have had conversations around how do we leverage this technology for good? And if it's being leveraged for bad purposes, how do we as a global community spot and stop that and make it easy for people to detect it and report it and have it taken down? Um, and, you know, I also want people to take away, like, there's such power with this technology because, and I talk about that in the book.
 

So even though I talk about the misuse of the technology, I also talk about how it can actually help us, right? So generative AI, you know, besides the fact that it can help us with training videos. And if you are onboarding employees or vendors in multiple countries and multiple languages, the opportunity to, um, in real time do translation for them.
 

That's, you know, pretty, pretty close, um, to what they're used to expecting and hearing, reading and seeing. Um, so there's really a lot of good that can come out of it, including healthcare. We could end up having individualized, customized health care that could be incredibly helpful. I just saw something, and I haven't decided how I feel about it, but I, I did just see something about the ability to clone an expert.
 

So if you almost think of it as a person who has a career of working in a particular field has a body of knowledge. And so how do you share that body of knowledge? You can do a one on one by having a conversation. You can do consulting. You can give a speech and maybe reach 50 or a hundred or a thousand people.
 

You could write a book. You can do a podcast like you do, Marco, or in this instance, you can start to train your body of knowledge into a data lake and train a large language model and AI algorithms and generative AI. So now you can have the wisdom of an expert at your fingertips. And so, you know, think about the possibility.
 

I'm a huge fan of Bob Iger, for example. And the opportunity to say, I'd love to have a one on one session with Bob Iger. And they say, well, you're not going to get Bob because he's kind of busy. And I doubt you could afford him, but Bob has trained for entrepreneurs like you. He has trained a very specific advisor model and where the advisor doesn't have the answer. 
 

Bob will, you know, follow up. And now you have the opportunity to get the wisdom of Bob Iger and do it in an interactive way. But Bob doesn't have to be present for that. I'm not sure exactly how I feel about that, but I think the technology is very compelling and very interesting. And, and again, I get back to the reason why we have the problems we have is we don't have the right guardrails in place. 
 

And, you know, just like cybersecurity, it, it always feels like everything rests very heavily on the shoulders of the user, right? So it's like, we, we bought all these things. We've got a security operations center. You think about cybersecurity, it's like, but it comes down to what? Don't click on a link and open an attachment.
 

Don't give away your multi factor authentication code. That's really not, uh, the authentication platform. You're, you're doing the wrong thing here and you be aware and be vigilant and report everything. And it's like, wait a minute, there's like a, a lot of technology going on here. Why are you relying on busy old me who's just trying to get my job done or live my life? 
 

It's really the same thing with these manipulation campaigns, Marco, is there's all this technology involved, but at the end of the day, right now, for the most part, we're saying to users, be aware, be vigilant, don't promote misinformation. If you see something, you should report it. And that didn't work in 2020 during major global elections. 
 

And that's not going to work in 2024.  
 

[00:13:42] Marco Ciappelli: And that's, I guess, the scary, the scary part. So, uh, you use, um, scenarios, like when you start talking about a certain topic, you have a what if: what if it works, what if it doesn't, and what could happen here. So you, you kind of go into creating some fictional, inspired by reality, um, situations.
 

So did you, uh, update all of these, many of these, in the, in the new book to reflect the new slash old reality?
 

[00:14:16] Theresa Payton: I did. And, um, you know, so I, I like creative writing and, uh, so it was really great to have the opportunity to open up or close a chapter with a fictional vignette. And I always let people know at the bottom, you know, this is fiction for now.
 

And the fictional vignettes that I added to this, um, update and new material in the book really present sort of, if we don't change course and do something different, this is what I believe could happen. But if we do change course, here's what we can do, and that it's not too late. And, uh, and oftentimes the changing course doesn't require a lot more effort. It's just thinking and talking about this differently. And, uh, yeah, so I do have an updated, so I've got, let's just say, Francesca, uh, from Italy plays a central role during a European election. There are very important European elections going on in 2024, and she happens to spot a manipulation campaign unfolding on Telegram.
 

And there's one scenario where we haven't made the investments we need to, and that plays out a certain way. And then there's another scenario where we make course corrections now, uh, globally. And it plays out a different way. So, um, you know, I don't want to spoil it for people who haven't read the new version of the book yet, but yes, I've updated those fictional vignettes. 
 

And there's also, um, a, an interview with a real person at the beginning of the book.  
 

[00:16:05] Marco Ciappelli: Really cool. I am a big fan of using stories and metaphors. I mean, I do have a podcast all about storytelling because I think we, we always tell a story no matter what. So the fact that you incorporate this into a, this is a very, very serious topic, like the one you cover in the book.
 

It's, uh, I think it will help the everyday person to, to understand more all of this. Um, so let's get to, to the point that you focus a lot on elections, and it happens that, whoops, it's election year again in the United States. It's election year somewhere in the world all the time. Um, I think there are plenty.
 

I think you focus on the United States one, but obviously this kind of manipulation, we can see it happen everywhere. Even probably some places, even more, I'm assuming, where the regime may not be as democratic as ours, we still are paying very dear consequences for the, the power of these, um, cyber criminals.
 

So, are we in a better spot than 2020 then? Or do you think that technology is giving advantages to the bad guys now?  
 

[00:17:25] Theresa Payton: I think from an awareness standpoint, we're in a better spot. But the manipulators don't give up. And so they're leveraging the newer technology to basically work around what few guardrails exist and you bring up a good point. 
 

So my, my book is focused on the manipulation of all kinds of social issues globally, to include how people think about voting, or not even showing up to vote in elections. And, you know, really, if social discourse is unhealthy, then politics will be unhealthy. So if we can't even have just a basic conversation around, you know, what constitutes being healthy or not being healthy, or what constitutes we're in a pandemic or not a pandemic, or, um, a boat crashing into a bridge and whether or not people choose to believe what the government says happened, uh, in an investigation.
 

So things that seem very simple on the surface can be manipulated at speed and scale. And now, all of a sudden, on just regular social discourse issues, it's chaotic. We don't get along. People aren't speaking to each other. And that plays out in democratic voting and elections. So what's interesting about kind of the conundrum that the United States and even countries in the EU find themselves in is, you brought up a point, Marco, about some countries are, you know, just constant manipulation campaigns.
 

And one of the things you have to look at is many of these countries perfect manipulation campaigns, disinformation and misinformation, on their own citizens. So in order to keep the peace, they may unplug them from the internet. In order to keep the peace and promote, we're a great country and the United States is a dumpster fire, or we're a great country and the EU countries are a dumpster fire, or look at the UK.
 

They promote this propaganda using social media, using traditional news media. And, uh, they perfect these disinformation, misinformation campaigns to control their citizen populace. And then they basically use those techniques and take that test bed of learnings and then deploy it against other countries.
 

So if you find, um, you'll notice a lot of times in the news media, when a disinformation or misinformation campaign is uncovered, a lot of times, not always, it does tie back to Russia, China, North Korea, Iran. And they do a lot of this to their own citizens. And, uh, again, for their part, all of these countries, by the way, Marco, deny that they do this.
 

Um, yes, they deny that they do this. Um, but I always say, if you deny that you do it and you don't condone it, then you should condemn it. So you should actually participate in an investigation that brings citizens on your soil to justice. If they are indeed found guilty. Um, but of course, nobody ever gets extradited for doing these campaigns. 
 

[00:20:59] Marco Ciappelli: Well, the thing that, it's what's at the core of this conversation, in my opinion, is, I mean, propaganda has always been happening, right? Now it's much more powerful, obviously, because of the Internet. It's not about flyers that come down from an airplane in a war zone, um, like during the Second World War or something like that.
 

Um, but it's still understandable. Definitely you don't want to see that happen, but you know, if the regime is not a democracy, but when you can have control in the democracy where by control, I mean, having a line, draw a line where you can really tell, um, people this is how I mean, this is the truth and this is not. 
 

And it blows my mind when you're like, well, I believe what I want and it's my right and, uh, and I don't care. Pigs fly and you can tell me what you want. I'm going to stick with this. It's very philosophical. I mean, you can impose, you can impose, like, impose on someone the truth. You know, one may wonder, from a philosophical perspective, what is the truth?
 

[00:22:28] Theresa Payton: Well, and that, that is a big question, which is, who is the arbiter of truth? And, and it used to be, okay, let me see it with my own eyes, hear it with my own ears. Let me read, let me experience, and then let me decide for myself. But in the age of deepfake audio and video, how can you believe what you see with your own eyes?
 

And we, we really missed it on this one, but we have the opportunity to kind of stand up those guardrails and, and get that back where people can know. So, for example, there's an opportunity here where news media, trusted, vetted news media, could be watermarking their audio and video so that the moment it is edited, the watermark changes, and it says this video has been edited.
 

So there is the technology that exists to let you know. I think about it, and it's going to have to change and evolve over time. So think about it like counterfeit money. So there is a way to mark things as, this is legit money, and to look for counterfeit. It's not foolproof, but it does stop a lot of the, uh, counterfeit money from being presented.
 

It has to evolve and change just as counterfeiters evolve and change, but there is a technology opportunity here, and policy and guardrails, to, to, to make a difference. The question is, is like, do we have the grit and, um, the tenacity to, to follow through with what needs to be done?
 

[00:24:14] Marco Ciappelli: On top of that, there is not, there's nothing harder than trying to make people understand something that they don't want to understand.
 

And they want to stick with that.  
 

[00:24:25] Theresa Payton: You know, that is one thing in doing my research for the book, uh, the first time around, and it was confirmed the second time around. I was fascinated to see researchers talk about the confirmation bias effect, even if you don't actually know that you're looking for confirmation of your biases and your beliefs, that there's something about kind of the, the visual audio effect of social media that plays to, you know, one, we know that if your, your post gets likes, there's an endorphin rush. And so we, we know that from the research. And so then the next piece is, you know, you're looking, and you know, I always remind people, the internet gives you more of what you came for.
 

You know, sometimes it'll test you out. It's an echo chamber, right? Yeah, it really is. It gives you more of what you came for. And I, like, I'll just do a quick segue, because, you know, I'm always looking for an opportunity to make sure when I go online that I get positive things sent to me in my feed, because fighting cyber crime can be fairly dark.
 

So I really don't want a lot of negativity if I'm going to go to social media, right? So I like Marco, for example, I follow people like you, because a lot of your posts talk about serious issues, but you're great. And a lot of the people you interview are just really great and interesting people. And they're trying to do really good work in the world. 
 

And so the internet will serve you more of what you want. And so I did this test with TikTok, where I don't have a TikTok account. I just went to tiktok.com and I just spent a few minutes looking at rolling pandas, because it's really hard to be sad if you just watch a few minutes of rolling pandas. I mean, they're hilarious, right?
 

And so I wasn't, you know, logged into TikTok because I don't have a TikTok account. I did have other tabs open, um, and then for the next week, every time I went to the internet, it served up more rolling pandas, which really made me very happy. So let's go back to confirmation bias. So you're looking, you're researching, you're trying to be a very discerning reader and thinker on issues that matter to you.
 

And then the challenge is, disinformation and misinformation, they'll take a kernel of truth. They'll take part of what you're searching for, and then they'll twist it. And then it gets sent out there. And then, because the internet's trying to give you more of what you're looking for, so that you'll stay on that particular platform longer, and you'll invite other people to come there, and you'll cross-post things from there.
 

It just keeps serving you up things. And you don't even realize, because you're like, see, I knew it. I knew I was right on this issue. And look, three different things came to me today telling me I was right on this issue. Now your biases have been confirmed and you don't even know it. You've literally been manipulated into thinking, I was right all along.
 

Now it's time to share this, this bad boy and this bad girl information with other people. And so they, that's the part that manipulators, people who do this for a living, that's the part they understand. They probably understand how algorithms hit any demographic better than the social media big tech platforms do themselves.
 

[00:27:46] Marco Ciappelli: I mean, let's say, I said that at the beginning, technology, yes, it helps, because now you can tell AI to say, hey, I want you to pinpoint the factors that cross-reference data and tell me, you know, how people react to these. And I'm sure, you know, a lot of social media, they do just that. But it comes down to psychology.
 

I mean, all of this is psychology, and, and I think to, to, to talk about the truth a little bit more, is that truth, it's, it takes time and effort. And I remember studying sociology of communication back in the days when the internet was not for everybody, was not commercial at that point, or not enough, and, uh, the professor of journalism would say, read three newspapers.
 

One from the left, one from the right, one independent in the middle, and then make your own conclusions. People now, they just want the headline, and that's the only news that they get. So it's kind of made things easy for the manipulators, I'm afraid. Um, let's, let's get into the book itself. Um, one of the questions I like to ask, when I'm on Audio Signals, when we talk about stories and storytelling, is who do you have in mind?
 

When you write this book, who is it for? Your audience? Is it experts? Is it politicians? Is it the everyday people? What's your message?  
 

[00:29:14] Theresa Payton: Yeah, I have a little something for everybody in there. I really want lawmakers to read this book. It's a very complicated issue to understand, and there's a lot of jazz hands going on around freedom of speech, and I take freedom of speech very seriously. 
 

And around how you regulate social media if they're, quote, not news organizations. We have to do better. We deserve better as a generation, and we, our children, and our grandchildren deserve better. 

So just throwing up our hands and saying, in the name of freedom of speech, and in the name of, no, they're not really news organizations, that there's nothing more that can be done other than be aware, be vigilant, people, and report it, and parading big tech and social media up on the Hill, is incredibly ineffective and actually quite frustrating. And again, I'm just going to say: lawmakers, we deserve better from you. And I mean that for citizens all across the globe. 
 

You can protect freedom of speech while also protecting us. You just have to study the issue and get deeper into it. And so that's why I wrote this book for everybody. It's not a super technical book. And I tell a lot of people: look, this book may not be for you personally, but I want you to read it. 
 

And if you have a friend, a family member, a colleague who is falling prey to misinformation and disinformation over and over again, this is for you to give to them. Ask them to read it, show them love, space, and grace, and then open up a conversation based on the book. I hate hearing that people no longer speak to somebody because, you know, "they believe these conspiracies and I can't get through to them." 
 

And so I'm hoping the book can actually open up a dialogue. I would like teachers to teach this in their civics and history classes; this does not have to be a technology-class book. The generation coming up, they don't have the protections around them that they need, and we need to give those to them. Marco, I had sort of the same thing. 
 

I had an economics professor who would tell us we had to bring in articles on both sides of the issue, and it had to be from both national and international newspapers. And so I had to old-school go to the library and photocopy articles on both sides of the issue, and then I had to write an executive summary of what I read. 
 

And that was really good, because it got me in the habit of saying: well, that's an interesting point of view. 
 

I wonder what somebody with an opposing point of view would say, and what research supports their point of view. But Marco, the younger generation, they don't have that opportunity. They're just bombarded, bombarded with algorithm-driven information, and they deserve better. And we need laws. 
 

We need guardrails. I'm typically a fewer-laws-are-better person, but on this, we need help, and Superman's not coming to save us. So I wrote the book so that, while lawmakers struggle to figure this out and give us better laws to help us with this, we can each help each other, we can support each other, we can be friends and colleagues, and we can agree to disagree on social issues, but at the same time, with love and grace and space, not allow each other to be manipulated.  
 

[00:33:08] Marco Ciappelli: Yep. Well, I'm into that. Agree to disagree and still be able to have that conversation. I think that's what democracy is about, right? 
 

If 51 percent wins, you can't assume that the other 49 percent are losers and 51 is the winner. You're still trying to accommodate everybody. Of course, you cannot have the democracy of everybody, like in the small polis in Greece where democracy was invented; that's just impossible. But if you can pursue that final goal, I think books like this should make people think. It's better to have this conversation than to just shut down, cover your ears, and go la la la as loud as you can because you don't want to hear, and then go on your social media so that everybody's thinking the same way you are. 
 

Then we are just living in different bubbles, but that's not reality. So, I could talk to you all day long; this is the kind of topic I get really, really passionate about. And I definitely suggest people listen to it, or, I don't know, is there an audio version of this?  
 

[00:34:27] Theresa Payton: We did not get a chance to update the audio book. 
 

There is an audiobook, and for many Amazon Prime subscribers it's free right now. So you can listen to the audiobook now, and the paperback and ebook will be out by April 22nd. So then, you know, you can just catch up on the new material.  
 

[00:34:48] Marco Ciappelli: Would you actually suggest that people listen to or read the first one and then the second, to see what changed? 
 

[00:34:57] Theresa Payton: Yeah, definitely. If you're a big podcast listener, which, if you're listening to Marco, you must be, I think so. We recorded the audio version of the book very much in a podcast-style format, so it should be fairly easy for people to listen to. I'm actually reading most of the book, but where there need to be voice characters that are male characters, there's a voice actor who joins me for those pieces. 

And so that can be a great way, while you're waiting for the paperback and the ebook to come out, to listen to it on your drive to work or to the grocery store, or in the background while you're getting ready for your day. 
 

[00:35:40] Marco Ciappelli: Sounds good. I think it's a great idea. So, for everybody, there will be links to connect with Theresa: her website, the book, the link to the Amazon page or wherever the book is sold. And for everybody else, stay tuned and subscribe. There are going to be a lot more stories, and I'm sure Theresa will come back, because I know there may be an actual fiction book coming up. 
 

So I don't know if you want to throw an idea out there. You kind of mentioned a character earlier.  
 

[00:36:16] Theresa Payton: Oh, well, she's in Manipulated. In the new material of Manipulated, you'll read about Francesca, a young reporter who's been trained to spot manipulation campaigns, and it's her work in a race against time to save elections in the EU. 
 

[00:36:38] Marco Ciappelli: So a little bit of adventure, and fiction based, unfortunately, I'm going to say, on something that is very real. So again, everybody, stay tuned. We'll talk again with Theresa, and there are many more stories coming up very soon on Redefining Society. Thank you.  
 

[00:37:01] Theresa Payton: Thanks, Marco. Be safe out there. 
 

Good to be with you.  
 

[00:37:04] Marco Ciappelli: Same to you.