IE 11 is not supported. For an optimal experience visit our site on another browser.

Meet the Press - January 1, 2023

Frances Haugen, Sen. Amy Klobuchar (Minn.), Rep. Mike Gallagher (Wis.), Carlos Curbelo, Jeh Johnson, Cecilia Kang and Elizabeth Nolan Brown

CHUCK TODD:

This Sunday: democracy disrupted.

PROTESTORS:

Stop the steal.

CHUCK TODD:

Social media platforms are reshaping our politics.

MARK ZUCKERBERG:

We try to do what's best for our community and the world, acknowledging that there are difficult trade-offs.

CHUCK TODD:

The algorithms of Silicon Valley are now some of the most powerful forces fighting for our attention.

SUNDAR PICHAI:

We approach our work without political bias, full stop.

CHUCK TODD:

It's technology that critics say is fueling misinformation and polarizing content for clicks and profit.

FRANCES HAUGEN:

A safer, free speech-respecting, more enjoyable social media is possible.

CHUCK TODD:

This morning, we're going to look at how the technology companies built platforms connecting the world that are now challenging the very foundations of democracies.

FRANCES HAUGEN:

I am very scared about the upcoming generation.

CHUCK TODD:

Frances Haugen, a data scientist who became known as the “Facebook whistleblower,” joins me to discuss the solutions.

CHUCK TODD:

Plus:

SEN. LINDSEY GRAHAM:

Social media is out of control.

SEN. RICHARD BLUMENTHAL:

Big Tech now faces that Big Tobacco, jaw-dropping moment of truth.

CHUCK TODD:

Congress is concerned about how Big Tech controls what content we see.

SEN. MARSHA BLACKBURN:

You have used this power to run amok. You have used it to silence conservatives.

CHRISTOPHER WRAY:

We do have national security concerns, obviously, from the FBI’s end about TikTok.

CHUCK TODD:

Democratic Senator Amy Klobuchar of Minnesota and Republican Congressman Mike Gallagher of Wisconsin will discuss what Congress can do to regulate social media. Joining me for insight and analysis are: New York Times technology reporter Cecilia Kang, former Republican Congressman Carlos Curbelo, former Homeland Security Secretary Jeh Johnson, and Elizabeth Nolan Brown, Senior Editor at Reason. Welcome to Sunday and a special edition of Meet the Press.

ANNOUNCER:

From NBC News in Washington, the longest-running show in television history, this is a special edition of Meet the Press with Chuck Todd.

CHUCK TODD:

Good Sunday morning, happy New Year, 2023 is here. This morning we are taking a deep dive into the social media platforms that profit from grabbing onto and monetizing our attention to the tune of billions of dollars a year and with almost no regulation. A majority agree: social media's influence on our democracy and our national security is a big problem. Sixty-four percent of Americans believe social media has been a bad thing for our democracy — two-thirds of the country — creating polarization and division, and eroding civility in our politics, all of this according to a new Pew survey. It's an attention economy whose business model depends simply on persuading you that you and your way of life is somehow under attack in order to buy your time and attention, as whistleblowers from these companies have come to Capitol Hill to warn us about.

[START TAPE]

PEITER ZATKO:

I’m reminded of one conversation with an executive when I said, “I am confident that we have a foreign agent,” and their response was, “Well, since we already have one, what does it matter if we have more, let’s keep growing the office.”

BRIAN BOLAND:

And rather than address the serious issues raised by its own research, Meta leadership chooses growing the company over keeping people safe.

FRANCES HAUGEN:

During my time at Facebook, I came to realize a devastating truth: Almost no one outside of Facebook knows what happens inside of Facebook.

[END TAPE]

CHUCK TODD:

Eighty-five percent of Americans say social media makes it easier to manipulate people with misinformation. We’ve seen it: from Russian efforts to influence the presidential election, to QAnon, in fact one 2019 report tracking a dummy social media account set up to represent an anonymous conservative mother in North Carolina found that Facebook’s recommendation algorithms led her to QAnon in less than a week. And then there's the thriving anti-vaccine groups that the president himself called out last year.

[START TAPE]

PETER ALEXANDER:

On Covid misinformation, what's your message to platforms like facebook?

PRESIDENT BIDEN:

They’re killing people.

[END TAPE]

CHUCK TODD:

Facebook was used by members of Myanmar's military in a systematic campaign as a tool for genocide, and social media platforms from Facebook to Twitter were "gasoline on the fire" of the Capitol attack on January 6th. A whopping 79% of Americans say the internet and social media have made Americans more politically divided. Growing shares of both Republicans and Democrats say members of the other party are more immoral, dishonest and closed-minded than other Americans. Perhaps it’s because they only hear about the other party via social media, and not normal interactions like we used to have in the pre-social media world. And social media companies are profiting off of Americans' anger online. Starting in 2017, Facebook’s ranking algorithm treated angry emoji reactions as five times more valuable than “likes.” Why? Well, anger generates clicks and clicks generate profit. What's happening on social media is the equivalent of using the same pipes for your drinking water and the sewer system.

[START TAPE]

TRISTAN HARRIS:

The better you are at innovating a new way to be divisive, we will pay you in more likes, followers and retweets. Has partisanship in television and radio pre-existed social media? Yes. Have we ever wired up the most powerful artificial intelligence in the world, pointed it at your brain stem to show you the most enraging content on a daily basis and the longer you scroll, the more you get? We have never done that before.

[END TAPE]

CHUCK TODD:

We are experimenting on brains, America. And the business has never been bigger. When Pew began tracking social media adoption in 2005, just five percent of American adults used at least one of these platforms. Now, that number is 72%. 82% percent of Americans are on YouTube, 70% are on Facebook— and ready for this? — four of these companies have more than a billion worldwide users, that’s more than the population of every country in the world but two. None of these companies has a financial incentive to change: social networking sites in the United States brought in more than $72 billion dollars last year.

[START TAPE]

MARK ZUCKERBERG:

The reality is our country is deeply divided right now and that isn't something that tech companies alone can fix.

EVAN SPIEGEL:

For us, it’s much more important for us to look at, like, the big ideas that might influence the way that tech evolves in the future and more importantly, to build a strategy that does not rely on government intervention for our success.

ELON MUSK:

Twitter has become the de-facto town square. So it's just really important that people have both the reality and the perception that they're able to speak freely within the bounds of the law.

[END TAPE]

CHUCK TODD:

So we invited Meta, Twitter, Google, Snap and TikTok on the broadcast to defend their practices and simply have a conversation about the future of their platforms and what can be done here. All of them declined. We did receive a statement from TikTok and we got links to previously written blog posts from the other companies. The last real legislation that spells out who is legally responsible for content on the internet was signed into law 27 years ago — the last century! In 1996, only a fifth of Americans had ever booted up the World Wide Web. Section 230 as it's known says: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." In other words, these companies are not to be held liable for harmful or inaccurate content a user posts on their sites — and can't be sued. Of course, the question is, The minute they use an algorithm, do they actually become a publisher? This law was written before the algorithms had taken hold. Now, in Washington, Section 230 is under more scrutiny than ever, with more than 30 bills proposed on social media during the last Congress, but despite all those bills proposed, none have passed. Twenty-four years ago, 46 states and Big Tobacco reached the largest settlement of civil litigation claims in U.S. history, and tobacco companies changed their marketing practices and paid states more than $200 billion dollars in restitution. When we realize products are toxic for us, we pass laws to change them or we hold companies accountable to force the change. Facebook whistleblower and former data scientist Frances Haugen became one of the greatest sources this century when she turned over thousands of confidential company documents, sharing them with regulators, journalists and with lawmakers.

[START TAPE]

FRANCES HAUGEN:

When we realized Big Tobacco was hiding the harms it caused, the government took action. When we figured out cars were safer with seat belts, the government took action. And when our government learned that opioids were taking lives, the government took action. I implore you to do the same here.

[END TAPE]

CHUCK TODD:

And Frances Haugen joins me now. Welcome to Meet the Press.

CHUCK TODD:

And Frances Haugen joins me now. Welcome to Meet The Press.

FRANCES HAUGEN:

Thank you for having me.

CHUCK TODD:

I want to start with why Facebook is so afraid of any government intervention. And I say this because they have helped kill all those bills I showed. And none of them came into law. They – and they're not alone, but they had a lot of lobbyists kill those bills. What are they afraid of?

FRANCES HAUGEN:

When you look at the history of Facebook stock price - and I did this before I came out. Over the course of five years before the Facebook – the Facebook disclosures began to become public, Facebook stock only declined versus the Nasdaq by more than 5% about 25 times – 27 times. Overwhelmingly those events when stock price declined were when something came out that demonstrated Facebook was going to have to spend more money on safety. Facebook is scared that if we actually had transparency, if we actually had accountability, they would not be a company with 35% profit margins. They'd be a company with 15% profit margins.

CHUCK TODD:

That’s dramatic.

FRANCES HAUGEN:

No. They'd be one of the most profitable companies in the world.

CHUCK TODD:

15% profit margin –

FRANCES HAUGEN:

Yeah.

CHUCK TODD:

– is pretty good.

FRANCES HAUGEN:

It's amazing for a company--

CHUCK TODD:

15% return on any savings would be amazing.

FRANCES HAUGEN:

I know.

CHUCK TODD:

So it's just simply, it wouldn't be as profitable.

FRANCES HAUGEN:

It wouldn't be as profitable.

CHUCK TODD:

So you took this job in the civil integrity team.

FRANCES HAUGEN:

Uh-huh (AFFIRM).

CHUCK TODD:

You had a specific motivation to do it. Tell me about it.

FRANCES HAUGEN:

Back in 2016, I had a very close friend who had helped me relearn to walk. So I was very ill. I was paralyzed beneath my knees. And this person was originally an assistant who became a dear friend. And over the course of the middle of 2016, after Bernie Sanders lost both the primaries, he fell down a rabbit hole where he became more and more radicalized. And watching him drift away, at the same time I was working on the algorithms at Pinterest. You know, I was the lead product manager for ranking at Pinterest. And as I began to look at the internet, particularly at Facebook, I would see these glaring deficits. They would have these things like a carousel under every post that would show you other posts. And you could tell that those posts were ranked based on clicks because they were always the most extreme version of whatever you saw. So this is things like you click on an article about the election, the carousel shows you a posting, "The Pope endorses Donald Trump." And it turned out that it was a whole Macedonian misinformation factory going on. There was a cottage industry and these little blogs that would make these fake news stories. And Facebook was just asleep at the wheel. And so when I got offered a chance to work on civic misinformation, I thought back on that experience of watching these deficits, of watching this person who I really cared about spiral into a world of alternate facts. And I said, "This is my chance to do something."

CHUCK TODD:

So let's talk about an algorithm. Let me put up a definition here. Facebook algorithm: A system that decides a post position on the news feed based on predictions about each user's preferences and tendencies. And what your disclosures found is how often they change the algorithm.

FRANCES HAUGEN:

Uh-huh (AFFIRM).

CHUCK TODD:

And to me it shows you they know what's happening. Like, they can do this. But you were in the Civil Integrity - did you feel as if they wanted you to succeed?

FRANCES HAUGEN:

When I initially was hired, I came in with a lot of optimism. You know, Facebook had built this center of excellence inside the company, actually one of the best civic responsibility units available in the industry. And it wasn't until they dissolved that unit immediately after the 2020 election that I realized the company wasn't committed to this enterprise. But if you want to have successful change in an enterprise you have to appoint a vanguard. You have to have executives say, "These people are the future. They're going to lead us in the right direction." And when Facebook dissolved Civic Integrity, I saw that they weren't willing to make that commitment anymore.

CHUCK TODD:

You said something in your statement to Congress. You say you saw “Facebook repeatedly encounter conflicts between its own profits and our safety. And Facebook consistently resolved these conflicts in favor of its own profits.” Give me an example.

FRANCES HAUGEN:

One of the most effective things for reducing misinformation is a very simple intervention. And it's actually free-speech respecting. It’s if – if you look at a chain of reshares, so Alice writes something. Her friend Bob reshares it. Her friend of friend Carol reshares it. It lands in Dan's news feed. Alice doesn't know Dan. Dan doesn't know Alice. Alice could be a misinformation agent. You're outside of that social context. If you said, "Hey, Dan, if you want to share this, you can. You're going to have to copy and paste. You know, we're going to gray out that reshare button." You have to make a choice. You can't just, like, knee-jerk reshare this.

CHUCK TODD:

They have to make a little bit –

FRANCES HAUGEN:

Make a choice. Yeah, yeah.

CHUCK TODD:

– of an effort.

FRANCES HAUGEN:

Be intentional, intentionality in sharing. That has the same impact on misinformation as the entire third-party fact-checking system. And it doesn't choose which ideas are good or bad. It just says, "Let's have humans make choices, just not reflexes make choices." But at the same time, that reduces the amount of content spread in the system. It decreases profits very slightly. And Facebook declined to make that choice.

CHUCK TODD:

All of their – all of their business model is about you and me spending time –

FRANCES HAUGEN:

Uh-huh (AFFIRM).

CHUCK TODD:

– right? There is no other business it really has other than selling advertising based on how much time I spend on it. Is that right?

FRANCES HAUGEN:

Yeah.

CHUCK TODD:

So this was taken away or rolled back, this would massively change the company?

FRANCES HAUGEN:

The way to think about safety on social media platforms is there's lots of very small choices where you make them and you lose .1% of profit, .2% of profit. The problem is, these industries are so sensitive to growth that when they don't grow at the level that the market expects, their stock price crashes. And so they're afraid to take even these small actions. Because they will decrease the profitability of a company –

CHUCK TODD:

I want you to react to –

FRANCES HAUGEN:

– by a little bit.

CHUCK TODD:

I want you to react to Nick Clegg. He's done a lot of writing.

FRANCES HAUGEN:

Oh, Nick Clegg.

CHUCK TODD:

And this one where it feels like he might as well have used the "shrug" emoji. “It would be easier to blame everything on algorithms. But there are deeper and more complex societal forces at play. We need to look at ourselves in the mirror and not wrap ourselves in the false comfort that we have simply been manipulated by machines all along.” This seems to be the reaction of Facebook on everything. It's just--

FRANCES HAUGEN:

Oh, yeah.

CHUCK TODD:

– not, "Hey, this isn't on us. This is society. We're just the mirror."

FRANCES HAUGEN:

Nick Clegg wrote an amazing blog post in March of 2021. So he said, "It takes two to tango.” You know, you're blaming us for our algorithms. But you chose your friends. You chose your interests. And yet in March of 2021, they had already run the same study at least four times where they took a blank account. This is an account that doesn't have any friends. It doesn't have any interests. And they followed some center right, center left issues. And then all they did was click on the content Facebook gave or follow groups Facebook suggested. And in two weeks they went from, you know, center topics, like Fox News, to white genocide just by clicking on the content. The algorithm pushed them in a more extreme direction. It's true there are many forces in society. But our information environment does have consequences.

CHUCK TODD:

Well, we compartmentalized radio. We could compartmentalize TV. You can turn that off. This has been much harder to compartmentalize.

FRANCES HAUGEN:

Well, it's also a question of when TV is on, if you tell a falsehood, other people can see it and respond. On radio, everyone has the same airwaves. When it comes to social media, you can spread lies, and they're invisible. And Facebook has resisted even minimal efforts at transparency that might allow us to begin to re-converge on a single-information environment.

CHUCK TODD:

So what should government regulation look like?

FRANCES HAUGEN:

I'm a big proponent of transparency as the first step. I think people aren't aware of how far behind we are. Social media companies for 20 years – remember, there were social media companies before Facebook – have all been very intentional except for maybe Twitter. Twitter has something called the firehose where you can see a stream of the tweets that are being posted. Most social media companies have resisted even simple efforts to bring more people to the table to ask questions and find solutions. So things like the Platform Accountability and Transparency Act, which has been recently proposed, I think is a great first step.

CHUCK TODD:

When you say transparency, should the government have to approve an algorithm?

FRANCES HAUGEN:

We are at such a basic level of understanding right now. Like, I really want to emphasize this. This is like we're back in 1965. You know, we don't have seat-belt laws yet. And we're just opening the pages of unsafe at any speed and say, "Oh, my goodness. There's all these ways we could have safer platforms." We're at that level of, like, nascent understanding. And so – but we have to have transparency so we can have enough people have conversations about how we move forward.

CHUCK TODD:

Should government be focused on user protection, consumer protection, more than trying to regulate the company?

FRANCES HAUGEN:

I think – that's a great question. Other industries are kept safe because there is something I call the "ecosystem of accountability." You know, there's lawyers that understand what a cut corner looks like. There's investors that understand how to manage for long-term returns. Remember, Facebook stock price is down, like, 70% right now. That's informed citizens like Mothers Against Drunk Driving. That's legislative aides that understand what's possible. Right now that entire ecosystem is missing, because the social media companies hid the information. And so when we talk about, "Should we be protecting users?" we are so far at the beginning that it is difficult to even put everyone at a table and say, "This is the menu of what's possible. Let's negotiate what the floor looks like."

CHUCK TODD:

I know you don't have as much insight into other tech companies. But should we assume that this opaqueness on algorithms and how things work is similar at Twitter and--

FRANCES HAUGEN:

TikTok –

CHUCK TODD:

– TikTok and at YouTube?

FRANCES HAUGEN:

100%. So one of the most important things that Mark - not Mark - Elon Musk could do to prove that he wants to have the public square is he could publish the algorithms.

CHUCK TODD:

Yeah, open source –

FRANCES HAUGEN:

Yes. Yeah, open source it. He'd have more help. It'd be cheaper for him. He'd be more profitable. But companies like TikTok have the exact same problems, if not moreso. Because TikTok is a company that is designed around being censored. You know, it comes from China. It's designed to amplify things so much that only a few pieces of content make up 80% of all of their feeds. And they manually screen those. We deserve to know what those policies are, because they're influencing what information we get to see.

CHUCK TODD:

All right. Well, I'm going to get a couple lawmakers on here and see what they have to say. Frances Haugen –

FRANCES HAUGEN:

Thank you so much.

CHUCK TODD:

– the Facebook whistleblower, thanks for coming in and sharing your perspective.

FRANCES HAUGEN:

Happy to be here.

CHUCK TODD:

When we come back, what can Congress do to regulate social media? Democratic Senator Amy Klobuchar of Minnesota and Republican Congressman Mike Gallagher of Wisconsin both have some ideas. And they're both here.

CHUCK TODD:

Though lawmakers in Washington have talked about a moment of truth to social media companies, they seem to have lacked some urgency. Each party is worried about a regulation hurting their side and benefiting the other. Joining me now is Democratic Senator Amy Klobuchar of Minnesota, who has introduced lots of legislation to attempt to regulate online political advertising, address problems in social media algorithms, and to restore competition online by reigning in big tech companies, and Republican Congressman Mike Gallagher of Wisconsin. He recently introduced a bill to ban TikTok, which he calls, "digital fentanyl," due to its parent company's ties to the Chinese government. Welcome to both of you. Senator Klobuchar, I want to start with you. And I want to start with just how powerful the social media lobby is in this town. Look, I put up that list of legislation there. A week ago, you thought you had a bill that was at least designed to help journalistic organizations, both big and small, to get properly funded by Facebook and all that, a similar law to Australia. You thought it was a done deal and it was gone in 24 hours. How powerful was this tech lobby?

SEN. AMY KLOBUCHAR:

So powerful that you literally can have a bill that got through the Judiciary Committee with strong bipartisan support, you can get promises from leaders that it's going to be a major end-of-year bill and then within 24 hours, it's gone. It's vanished. because of one company, two companies in this case; Facebook and Google - by the way, Google made $66 billion in one quarter in advertising while we are going to lose one third of the nation's newspapers by 2025. We had such strong support for this bill, but these guys just make a few calls and they just say, "Hey. You know, this is going to hurt us just like they did in Australia." The difference was in Australia, their government stood up and said, "No. We're going to do this and we're going to say, 'You've got to negotiate with these news organizations to get fair price for their content.'" And it happened. And they have a better system in place. Right now in the United States of America, these companies have basically started dominating our thought processes. And I think the work Frances has done is incredible. It is about going after these algorithms, making for them transparency. That's one of the bills we have. And it is about getting compensation for our news organizations. And then finally, it's about getting rid of archaic law, Section 230, that gives them immunity from lawsuits.

CHUCK TODD:

I want to talk about this. So Section 230, I'll put it up again here, "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This law was passed in the '90s. This law was passed when it was message boards and it was forums. I was somebody that used to go on college sports forums. And, yeah, there was some crazy stuff in there. It makes sense for that. This was pre-algorithm. This was pre-iPhone. We didn't know what was coming. Can this be amended, rather than gotten rid of?

SEN. AMY KLOBUCHAR:

Yes. You can amend it and focus on certain kinds of speech misinformation, disinformation. And all you're saying is, "We know people are going to put stuff on your alleged town square,” which has become really a communications company. Your network, other news organizations, have limits in place and standards. And our argument's going to be: If you start making money off of it, if you amplify it, that's a whole different thing. Your angry emojis and all these things you're doing–

CHUCK TODD:

If you change this news feed,

SEN. AMY KLOBUCHAR:

– is to make money.

CHUCK TODD:

your news feed versus my news feed, hasn't the company become a publisher?

SEN. AMY KLOBUCHAR:

They are a publisher. And let's just start facing the facts and stop pretending they're some little company in a garage. Maybe one day, they were. But now, they are mega companies. And this is starting to happen all over the world. We are lagging behind and it is time for 2023, let it be our resolution, that we finally pass one of these bills. We have gotten through the committee some of the first bills, since the internet began, to finally take them on. And so it's not just that we've done nothing. We have gotten bipartisan agreements.

CHUCK TODD:

I understand that.

SEN. AMY KLOBUCHAR:

We have pushed these bills onto the Senate floor.

CHUCK TODD:

But let me ask you about this issue of polarization. I mean, this seems to be what holds back a good bill in Congress is that a Democrat might think, "Hmm. Is this going to hurt our status online," and vice versa on the Republican side.

SEN. AMY KLOBUCHAR:

It's actually even more insidious. When we had a bill that said you can't self-preference your own products, Amazon, at the top all the time over small businesses - support from small businesses all over the country for this - they ran over $150 million and much more in ads all across the country. So what that said to the members - there were Red-state ads, angry guys with pickup trucks. There were Blue-state ads. They ran these ads and it said to the members, "Hey. If you start getting on a bill like this--"

CHUCK TODD:

"We're going to make your life--"

SEN. AMY KLOBUCHAR:

"--or if you support it, we're going to come." This is kids stuff, compared to what you're going to do.

CHUCK TODD:

It's almost like you were being extorted?

SEN. AMY KLOBUCHAR:

This is how it's working. And so it is only going to change - and I thought Frances's work on kids would change it; not yet with the children's privacy bill. It's only going to change if the people of this country say, "This is enough. This is corrupt. You've got to do something. Put some rules in place when it comes to social media." And they've got to be liable when you've got situations where literally deranged people are believing their stuff, and going in, and taking a hammer, and bludgeoning the husband of the speaker of the House, or hundreds of thousands of blog posts that are allowed to go through with maps of the Capitol that they use to create an insurrection. At some point, when they can't control their own platforms while they're making billions of dollars from the American people, and over, as you point out, two thirds of Americans say it's hurting our democracy, come on, Congress. Stop hiding behind this and get something done.

CHUCK TODD:

Look. Let's be realistic. The tobacco companies changed after being hit with hefty lawsuits, more than government regulation. After the lawsuits came more government regulation.

SEN. AMY KLOBUCHAR:

Right.

CHUCK TODD:

Do you think if you opened these companies up to lawsuits; and by the way, Facebook is being sued overseas for its role in Ethiopian civil war and some other places; do you think their behavior would change?

SEN. AMY KLOBUCHAR:

Yes, because then in order to continue, they have to put safety measures in place. Instead of sending out sweet little notes about all the good work they're doing, they would actually have to do something. And so that's why changing Section 230, which was developed for a whole different moment in the Internet, is an answer. The other is taking on monopolies so you can allow competitors to come into being that would have different bells and whistles, privacy and the like; regulating online political ads, which they're still escaping despite some weren't from the Federal Election Commission. This is a bill I had with Senator Lindsey Graham. There are many, many things we could do here. But we need more time, we need votes and people need to say where they are. Are they going to side with these companies or are they going to side with the people of this country?

CHUCK TODD:

Would you like to see a law that was similar to the EU's Digital Services Act, which would target online ads that basically say, "The companies cannot use online ads targeting ethnicity, religion, or sexual orientation"?

SEN. AMY KLOBUCHAR:

I would like to see major work. I'd want to look at that exact bill on ads.

CHUCK TODD:

Fair enough.

SEN. AMY KLOBUCHAR:

And this bill that we have requires disclosures, disclaimers so you know who's paying for them, which is a major problem. But in this last election, and maybe this is going to help with my colleagues, over 30% of Americans said the number one reason they voted Democratic, including a whole bunch of Independents and moderate Republicans, was democracy. And it was voter suppression, yes, but was also about this kind of misinformation and just the fomenting of the lies on both sides sometimes that has caused people to do what they've done. And I just think it's a major issue. I'm not giving up. I'm not giving up. I'm going into 2023 ready to go.

CHUCK TODD:

I was just going to say your passion comes through in a big way. Senator Amy Klobuchar, thanks for coming on, and sharing this, and happy New Year.

SEN. AMY KLOBUCHAR:

Thank you.

CHUCK TODD:

Let me bring in Congressman Mike Gallagher and I want to start with TikTok. You call it, "digital fentanyl," TikTok, which is owned by a company named ByteDance which is based in China, essentially, reporting to and/or owned by the Chinese government, however you want to look at it. But explain why you call it "digital fentanyl"?

REP. MIKE GALLAGHER:

It was FCC Commissioner Brendan Carr who originally called it “digital Fentanyl.” I think the comparison is apt for at least two reasons. One, it's highly addictive and destructive and we're seeing troubling data about the corrosive impact of constant social media use, particularly on young men and women here in America. It's also digital fentanyl in the sense that, as you allude to, it ultimately goes back to the Chinese Communist Party. TikTok is owned by ByteDance; ByteDance is effectively controlled by the CCP. So, we have to ask whether we want the CCP to control what is on the cusp of becoming the most powerful media company in America. That is very troubling. And so, I was glad to see my colleagues in the Senate pass, in unanimous fashion, a ban of TikTok on government devices. I think we should do the same in the House and expand that ban nationally.

CHUCK TODD:

Look, I want put up a TikTok statement, they did give us one and this is what they say, "TikTok's objective is to unite people through creative and entertaining content, not to divide with misinformation that polarizes people. We treat misinformation with the utmost seriousness and take a multi-prong approach to stopping it from spreading, while elevating authoritative information, investing in digital literacy, education to help get ahead of the problem at scale." But this is not about misinformation with TikTok. I want to get more of your concern. And I'm curious, are you more concerned about the Chinese government having our data? Which one might argue, they already do. Or are you more concerned about the fact that the algorithm we know nothing about and they can turn it on or turn it off to say what they want to say at any moment to billions of users around the world?

REP. MIKE GALLAGHER:

I'm concerned about a few things. I am concerned about the Chinese government effectively compiling dossiers filled with our data. But you're right to suggest that predates TikTok, right? I remember getting a letter after the OPM hack because my military records potentially would have been compromised. That gives them enormous leverage. For example, any time an America is operating in China or if there's something our intelligence community needs to do, I'm concerned about that. I'm concerned about TikTok's ability to track your location, track your keystrokes, track what websites you're visiting, even when you're not using the app. I'm concerned about the lack of transparency around the algorithm which is addicting kids. But I think what's more pernicious is the fact that, since a lot of young men and women in America increasingly turn to TikTok to get news, what if they start censoring the news, right? What if they start tweaking the algorithm to determine what the CCP deems fit to print? That's incredibly dangerous. That's as if, in 1958, we allowed the KGB and Pravda to buy The New York Times, The Chicago Tribune, and The Washington Post. That actually probably understates the threat. I think it's a multi-pronged threat we need to look at.

CHUCK TODD:

Is this something with TikTok you think - can they create an American version? Or do you think there's just no way to split this company up to protect Americans?

REP. MIKE GALLAGHER:

I think one acceptable outcome, and this would be allowed in the bill that I have, which is a bipartisan bill with my colleague Raja Krishnamoorthi. So, here you have not only a Democrat and a Republican working together, but a Bears fan and a Packers fan working together on this issue, Chuck. It would allow for a sale to an American company. That option was explored during the Trump administration. Oracle explored a version of that. Microsoft ultimately fell through. I think there's a workable solution there. What we don't want is some quasi-solution where there's a data center in Singapore, but the CCP and ByteDance effectively retains control. So, the devil is in the details. But I'm open to having that discussion with TikTok. And I really want to have that discussion with the Biden administration. I don't think this should be a partisan issue. I want to work with them. And I think the Senate vote that we were talking about earlier is evidence that this isn't a partisan issue.

CHUCK TODD:

What's your level of concern about the Russian investment into Telegram and the Saudi government's investment into Twitter?

REP. MIKE GALLAGHER:

Well, I guess my broad concern of which both of those are part is where we see authoritarian governments exploiting technology in order to exert total control over their citizens. And that's really the concern with the CCP. They seem to be perfecting this model of techno-totalitarian control. It's most prominently and perversely expressed in Xinjiang Province. But they're exporting that throughout the rest of the country. They're using it to shut down the protests that we're seeing in China right now. And ultimately, it's my belief that that's a model that will not stay in China. That's a model they're going to export around the world. Another thing we can do here, Chuck, that our social media companies could do is insist on basic reciprocity. For example, Chinese wolf warrior diplomats, they're propagandists, are all over Twitter and Facebook pushing propaganda attacking America. When at the same time, of course, Chinese citizens aren't allowed access to those apps in China. There's no level playing field. I think a simple rule is, and I actually asked Jack Dorsey, when he ran Twitter, to do this, if your government doesn't allow your citizens access to the platform, we're going to deny your government officials access to that same platform. I think that would be a useful step that would apply not only to China but also to Russia.

CHUCK TODD:

You're not somebody that wants to see a lot more government regulation. But what is the best way to regulate social media? Is it to get rid of 230 and you know what let the courts have at them - no more special protection? Is it a new law? Is it algorithm transparency? What is, in your view, acceptable regulation?

REP. MIKE GALLAGHER:

I do think more transparency around the algorithms is necessary. And I liked what Senator Klobuchar was saying on that front. My only concern with the 230 repeal is that it might accidentally increase censorship on social media. In other words, if these platforms are now liable for what people that use them say, would they not just kick people off proactively? So, I think a better framework might be, mandate data portability across platforms so you're able to bring your network to whatever platform has the best content moderation policy and best transparency that you like. Combined with something called neutrality in the stack where in contrast to Twitter or Facebook, which are private companies that can have different content regulation strategies.Amazon Web Services, for example, the infrastructure of the internet, couldn't deny someone access to their services just because they don't like their political beliefs or what they think about that or that political issue. I think that's the framework I have in mind. But I'm open to having that conversation. I listened to your interview with Senator Klobuchar closely and I'd love to go over and across the Hill and talk to her about her ideas.

CHUCK TODD:

Is this a case of regulating the companies or protecting consumers? Should the focus be more on consumer protection or more on trying to regulate the company?

REP. MIKE GALLAGHER:

For me, it's the latter. It's consumer protection. And one thing that we don't really think about is these complicated user agreements that we all just click automatically. Perhaps it's unreasonable for us to expect your average American citizen to read.

CHUCK TODD:

There's a great South Park spoof about that.

REP. MIKE GALLAGHER:

Yes, that's right. But we in Congress should do a better job of understanding that and translating that and communicating it to the American public and to our constituents in a way that they understand. So, I think there are a variety of things we can do. When it comes to our kids, the government can't raise your kids, can't protect your kids for you. I have two young daughters. It's my responsibility to raise them into healthy adults. But there are certain sensible things we can do in order to create a healthier social media ecosystem.

CHUCK TODD:

To do on those service agreements, we make credit card companies cut down the amount of paragraphs they do, I think we can make social media companies, tech companies, do the same. Congressman Mike Gallagher, go Pack go, Republican from Wisconsin, appreciate you coming on and sharing your perspective.

REP. MIKE GALLAGHER:

Thank you.

CHUCK TODD:

When we come back, social media is seen as mostly good for democracy across the globe. But not here in the United States. Here, it's seen as much more destructive. We're going to go inside those numbers next.

CHUCK TODD:

Welcome back. Data Download time. Social media's impacts are felt all around the world. But Pew Research Center's new Global Attitude Survey shows that the United States is actually an outlier in how Americans perceive social media's impact on democracy. Bottom line: Americans are a lot more skeptical. Social media's impact on democracy, a bad thing? 35% around the world say it's a bad thing, but here in the United States, 64% say it's been a bad thing. How about whether it's made us more divided? This, there's a little more continuity. 65% around the world, 79% here in America. Are we now less civil? 46% globally agree with that. 69% believe that here in the United States. And how about social media, does it make you more informed? Globally, people think it does, 73%, versus Americans at 64%. But despite our skepticism, as Americans, we're only – we’re using this more and more. Look at this. In 2012, about half of Americans were using social media. Now, essentially three quarters. And look at this by age group. Eighteen to 30, this won't surprise. I'd like to know who the 16% who don't use social media are, but 84% of folks under 30, 81% 30 to 50, and even 60% of those folks 50 plus. So we may not like it, but we're becoming more addicted to it. Up next, social media's already changed us and our politics. So what is the best path forward? Our panel is here.

CHUCK TODD:

Back now with a terrific panel. New York Times Technology Reporter Cecilia Kang, co-author of An Ugly Truth: Inside Facebook's Battle for Domination; former Homeland Security Secretary Jeh Johnson; former Republican Congressman Carlos Curbelo; and Senior Editor at Reason Magazine, Elizabeth Nolan Brown. Welcome to all of you. Cecilia, I want to start with you because you've written this book on Facebook. Kara Swisher has a sort of a take on Facebook, and even Frances Haugen. Nobody, Mark Zuckerberg didn't know what necessarily he was creating when he started creating it. And that's what it feels like with all these social media companies. They were started with, sort of, some good intentions, and they lost control of it.

CECILIA KANG:

Yeah. I mean, there are the two guiding forces: They wanted companies that would grow, and grow fast, and grow big, and scale to the point where they're global and they would have historic and lasting impact. And the second thing is we have to remember these are businesses. These are companies that are motivated by their, by profits. And the business model is built on the idea of getting eyeballs and attention to serve up to advertisers and to make money that way. If you miss those two important points, you sort of miss what's happening here.

CHUCK TODD:

But they never, what I want to get at is, they never thought, "Hey, we're going to be trafficking misinformation."

CECILIA KANG:

They never did.

CHUCK TODD:

That seemed to shock them.

CECILIA KANG:

Yeah. And I think a lot of these companies are built oftentimes by actually sort of young, male, idealistic entrepreneurs in Silicon Valley with big dreams. And they've got lots of funding. And their idealism is what really attracts the funding and attracts a lot of interest by engineers who want to work there because they do want to change the world. But they're not motivated to look around the corners for potential problems because their eye is always on growth and their eyes have always been on really growing that business model.

CHUCK TODD:

Elizabeth, would you argue, is the problem the companies, or is the problem us?

ELIZABETH NOLAN BROWN:

I think a lot of things that get blamed on social media are just human problems, and social media just makes them more visible, yes.

CHUCK TODD:

More visible? But do you accept the idea that social media may ramp it up, accelerate these issues?

ELIZABETH NOLAN BROWN:

I think that social media, I mean, things that people don't think about algorithms is that they actually help suppress a lot of the bad content. Without algorithms, you know, we would be seeing a lot more hate speech. We would be seeing a lot more offensive content. I think that they actually do a lot of good for your average person online.

CHUCK TODD:

Jeh, when you were at Homeland Security, you were just beginning to deal with this misinformation issue. How did you tackle it then? And looking back, how would you be tackling it now?

FMR. SECRETARY JEH JOHNSON:

Looking back at the 2016 election, and I should mention that, as a lawyer, I have clients--

CHUCK TODD:

Yes, you work for a firm, Paul, Weiss, that does--

FMR. SECRETARY JEH JOHNSON:

Yeah, we have clients--

CHUCK TODD:

--represent some of these social media companies--

FMR. SECRETARY JEH JOHNSON:

So--

CHUCK TODD:

--we're talking about.

FMR. SECRETARY JEH JOHNSON:

--made that disclosure. This was the Trojan Horse, 2016. 2016, I, as Secretary of Homeland Security, was very focused on potential cyber attacks on election infrastructure. And just before I left office, I declared the election infrastructure to be critical infrastructure. We were worried about, you know, ballot counting, reporting, and so forth. Turned out that was not the issue. Hasn't been the issue. Wasn't the issue in 2020. Wasn't the issue in 2022. The Trojan Horse was the extent to which the Russian government invaded our American conversation. And it's spelled out in the Mueller indictment, but this is an issue that we've really yet to get our arms around because it does implicate free speech.

CHUCK TODD:

Carlos, I feel like your national political career sort of encompassed this moment, where we went from social media, good, to social media, problematic. Is that fair?

FMR. REP. CARLOS CURBELO:

Yeah. And I think, Chuck, one of the big problems especially on the right is that social media has sown a lot of distrust because the right feels under attack by big institutions. Obviously, these big companies represent, you know, mainstream America. And, because we don't really know exactly how these companies operate, it breeds a lot of conspiracy theories. It makes people paranoid. So one of the reasons Elon Musk has become so popular on the right is because of this concept that he's unveiling, right, raising the curtain on everything Twitter did throughout all these years. So yes, it has hurt our democracy. Why? Because it has diminished trust in society.

CHUCK TODD:

Yeah, go ahead.

FMR. SECRETARY JEH JOHNSON:

Social media, in my view, accentuates the point that our greatest strength as a free and open society is also our greatest vulnerability. It's no coincidence that, at the same time as the rise of social media, more people are participating in the American political process. 1992, 105 million people --

CHUCK TODD:

It's an important point. I mean, our voter turnout in the last decade has been going up, up, up, and up. Now--

FMR. SECRETARY JEH JOHNSON:

66 percent in--

CHUCK TODD:

--one might argue--

FMR. SECRETARY JEH JOHNSON:

--2020.

CHUCK TODD:

--that voters in this country start voting when they're worried that the democracy is (UNINTEL)--

FMR. SECRETARY JEH JOHNSON:

Well, there is that too. Yes. And you had this person called Trump that got everybody's attention. Social media raises political awareness. You can mobilize a movement behind social media. Look at how effective President Zelenskyy in Ukraine has been using social media. But this is also our vulnerability too, obviously: the echo chamber, fake news.

CHUCK TODD:

Elizabeth, I want to start with you. If we're going to regulate this, should we come at it consumer first, or company first?

ELIZABETH NOLAN BROWN:

I think we definitely need to consider consumers first. I don't think that regulating it just for the sake of keeping these companies being big is a good thing. You know, we already see Facebook, Twitter, their power is diminishing. Like, new things are coming along. TikTok. Mastodon was getting thousands of users an hour recently. I think that the market will take care of unseating these dominant companies if we don't over-regulate because when there's too much regulation, only Facebook can keep up with it, and it entrenches Facebook's power.

CHUCK TODD:

Look, none of these companies want regulation. I mean, I was talking about it, I mean, it is astonishing how fast Facebook killed that bill. It was a done deal in the Defense Authorization Bill. It was done, and then it got stripped out. She didn't know how it happened.

CECILIA KANG:

Yes. I've seen dozens of bills be proposed on regulating technology companies, and none of them have passed. And I really don't think that regulation is where we're going to see accountability first. I agree with Elizabeth. I think consumers will vote with their feet. We're already seeing Facebook see that users are not using the site as much and not visiting as often. And innovation is winning, in that other sites, be they good or bad, are attracting new users, like TikTok. The other way we're going to see real accountability is probably in the legal system. We're probably going to see lawsuits. And we saw that.

CHUCK TODD:

So you would like to see 230 gone, or amended so that you saw more lawsuits?

CECILIA KANG:

Well, I think regulating speech is going to be hard in general. I think getting rid of 230 is going to be very difficult. I think Republicans and Democrats agree that it needs to be revised in some way, but for very different reasons. They come at it completely polar opposite reasons for why. But as far as 230 goes with speech, I think what you're going to see is more individual lawsuits. You saw, for example, with the lawsuits against Alex Jones by Sandy Hook families. You saw Dominion Voting Machines, the company has sued Fox News Network companies--

CHUCK TODD:

And this is not holding the tech companies, it's going after the individual that used the tech platform, correct--

CECILIA KANG:

That will probably create bottlenecks potentially in the spread of disinformation, the creation and the spread of disinformation.

CHUCK TODD:

Algorithm transparency. It's easy to say, "I don't know what that would look like." Do you know what that would look like?

FMR. REP. CARLOS CURBELO:

Well, but that's the key because this is the mystery. How's information getting boosted? How's information getting suppressed? That's what makes people paranoid, that--

CHUCK TODD:

Well, I'd like to make the choice. You and I were talking about this earlier. I want to make the choice. "Hey, I want some use of the algorithm, but I want to pick and choose when I let you decide on an AI computer."

ELIZABETH NOLAN BROWN:

I think it's great when they let you choose to see things in chronological order still. I wish all tech companies would do that. But again, I think that's something that, you know, users need to demand. It can't come from a top-down mandate.

CHUCK TODD:

Last point--

FMR. SECRETARY JEH JOHNSON:

You asked Frances, "Should government approve algorithms?" I think that comes perilously close to regulating speech. Very interesting exercise – if you put into your phone a well-known right-wing commentator and the words "great replacement," you'll get a site from the Anti-Defamation League, arguing that he should be de-platformed, so.

CHUCK TODD:

An interesting, an interesting way you can come to that. This was a terrific panel. Thank you. When we come back, a special Meet the Press minute, looking back at the first time this program looked into this emerging technology.

CHUCK TODD:

Welcome back. Before there was Facebook, Twitter, or TikTok, blogging was the earliest way for people to share content and build communities online. But in 2004, blogs were still a novelty in the world of campaign politics, at least here in Washington. In my first-ever Meet the Press appearance, Tim Russert asked me to explain how this new social media was being used by presidential candidates that cycle.

[BEGIN TAPE]

TIM RUSSERT:

And here to help us is Chuck Todd of National Journal's Hotline. What is a blog?

CHUCK TODD:

Well, blog — So the actual term itself, by the way, is short for "web log." You know, you drop the W-E, and you get the "blog." I'll just describe what Howard Dean's blog is since it's the one that has the most traction and the most attention. It's essentially like a digital bulletin board saying, "Hey, look. This is what we're up to today. This is our message today. These are some of the things we're doing today." And then it allows a section to comment about what's going on during the day. And this is where you find out who the bloggers are.

[END TAPE]

CHUCK TODD:

Yes, I used to have quite a big head of hair. That's all for today. Thanks for watching. Happy new year. We'll be back next week because if it's Sunday, even in 2023, it's Meet the Press.