Nov 30, 2020
Everyone who uses Facebook, Google, and Twitter has probably noticed the disappearance of posts and the appearance of labels, especially during the 2020 election season. In this episode, hear the highlights from six recent House and Senate hearings where executives from the social media giants and experts on social media testified about the recent changes. The incoming 117th Congress is promising to make new laws that will affect our social media experiences; these conversations are where the new laws are being conceived.
Please Support Congressional Dish – Quick Links
- Click here to contribute monthly or a lump sum via PayPal
- Click here to support Congressional Dish via Patreon (donations per episode)
- Send Zelle payments to: Donation@congressionaldish.com
- Send Venmo payments to: @Jennifer-Briney
- Send Cash App payments to: $CongressionalDish or Donation@congressionaldish.com
Use your bank's online bill pay function to mail contributions to:
Please make checks payable to Congressional Dish
Thank you for supporting truly independent media!
Recommended Episodes
CD196: The Mueller Report
CD186: National Endowment for Democracy
Articles/Documents
- Article: President Trump's latest claims about Wis. absentee ballots debunked by election officials, WTMJ-TV Milwaukee, November 24, 2020
- Article: Don't Blame Section 230 for Big Tech's Failures. Blame Big Tech. By Elliot Harmon, Electronic Frontier Foundation, November 16, 2020
- Article: Biden, the Media and CIA Labeled the Hunter Biden Emails "Russian Disinformation." There is Still No Evidence. By Glenn Greenwald, November 12, 2020
- Article: Ad Library - Spending Tracker: US 2020 Presidential Race, Facebook, November 3, 2020
- Article: What's the deal with the Hunter Biden email controversy? By Kaelyn Forde and Patricia Sabga, Al Jazeera, October 30, 2020
- Article: Congress Fails to Ask Tech CEOs the Hard Questions, by Elliot Harmon and Joe Mullin, Electronic Frontier Foundation, October 29, 2020
- Article: With the Hunter Biden Expose, Suppression is a Bigger Scandal Than The Actual Story, by Matt Taibbi, TK News, October 24, 2020
- Article: Read the FBI's letter to Sen. Ron Johnson, The Washington Post, October 20, 2020
- Article: DNI Ratcliffe: Russia disinformation not behind published emails targeting Biden; FBI reviewing, by Kevin Johnson, USA Today, October 19, 2020
- Article: Twitter changes its hacked materials policy in wake of New York Post controversy, by Natasha Lomas, TechCrunch, October 16, 2020
- Article: Smoking-gun email reveals how Hunter Biden introduced Ukrainian businessman to VP dad, by Emma-Jo Morris and Gabrielle Fonrouge, New York Post, October 14, 2020
- Article: The Decline of Organic Facebook Reach & How to Adjust to the Algorithm, by Sophia Bernazzani, HubSpot, May 3, 2020
- Article: Facebook launches searchable transparency library of all active ads, by Josh Constine, TechCrunch, March 28, 2019
- Article: MAERES Alumna Nina Jankowicz Awarded Fulbright-Clinton Fellowship to Ukraine, SFS, Center for Eurasian, Russian and East European Studies, June 21, 2016
- Article: Organic Reach on Facebook: Your Questions Answered, by Brian Boland, Facebook for Business, June 5, 2014
- Article: NSA slides explain the PRISM data-collection program, The Washington Post, October 4, 2013
Additional Resources
- General Guidelines and policies: Distribution of hacked materials policy, Twitter, October 2020
- Business Help Center: Fact-Checking on Facebook, Facebook Business
- Business Help Center: Rating Options for Fact-Checkers, Facebook Business
- Commit to transparency — sign up for the International Fact-Checking Network's code of principles, IFCN Code of Principles
- Section 230 of the Communications Decency Act, Electronic Frontier Foundation
- Mission Statement: OUR MISSION, Open Markets
- About, News Media Alliance
- Leadership, News Corp
- Clint Watts, Foreign Policy Research Institute
- About FPRI, Foreign Policy Research Institute
- Nina Jankowicz, Wiczipedia
Sound Clip Sources
Hearing: Breaking the News: Censorship, Suppression and the 2020 Election, Senate Judiciary Committee, November 17, 2020
Witnesses:
- Jack Dorsey, Twitter, Inc.
- Mark Zuckerberg, Facebook, Inc.
Transcript:
30:50 Jack Dorsey: We were called here today because of an enforcement decision we made against the New York Post, based on a policy we created in 2018 to prevent Twitter from being used to spread hacked materials. This resulted in us blocking people from sharing a New York Post article, publicly or privately. We made a quick interpretation, using no other evidence, that the materials in the article were obtained through hacking, and according to our policy, we blocked them from being spread. Upon further consideration, we admitted this action was wrong and corrected it within 24 hours. We informed the New York Post of our error and policy update and how to unlock their account by deleting the original violating tweet, which freed them to tweet the exact same content and news article again. They chose not to, instead insisting we reverse our enforcement action. We do not have a practice around retroactively overturning prior enforcements; this incident demonstrated that we needed one, and so we created one we believe is fair and appropriate.
35:13 Mark Zuckerberg: At Facebook, we took our responsibility to protect the integrity of this election very seriously. In 2016, we began to face new kinds of threats, and after years of preparation, we were ready to defend against them. We built sophisticated systems to protect against election interference that combined artificial intelligence, significant human review, and partnerships with the intelligence community, law enforcement and other tech platforms. We've taken down more than 100 networks of bad actors trying to coordinate and interfere globally. We established a network of independent fact checkers that covers more than 60 languages. We made political advertising more transparent on Facebook than anywhere else, including TV, radio and email. And we introduced new policies to combat voter suppression and misinformation. Still, the pandemic created new challenges: how to handle misinformation about COVID and voting by mail, how to prepare people for the reality that results would take time, and how to handle it if someone prematurely declared victory or refused to accept the result. So in September, we updated our policies again to reflect these realities of voting in 2020 and make sure that we were taking precautions given these unique circumstances. We worked with local election officials to remove false claims about polling conditions that might lead to voter suppression. We partnered with Reuters and the national election pool to provide reliable information about results. We attached voting information to posts by candidates on both sides and additional context to posts trying to delegitimize the outcome. We locked down new political ads in the week before the election to prevent misleading claims from spreading when they couldn't be rebutted. We strengthened our enforcement against militias and conspiracy networks like QAnon to prevent them from using our platforms to organize violence or civil unrest. Altogether, I believe this was the largest election integrity effort by any private company in recent times.
40:50 Jack Dorsey: We have transparency around our policies; we do not have transparency around how we operate content moderation, the rationale behind it, the reasoning. And as we look forward, we have more and more of our decisions, of our operations, moving to algorithms, which have a difficult time explaining why they make decisions, bringing transparency around those decisions. And that is why we believe that we should have more choice in how these algorithms are applied to our content, whether we use them at all, so we can turn them on or off and have clarity around the outcomes that they're projecting and how they affect our experience.
45:39 Mark Zuckerberg: We work with a number of independent organizations that are accredited by the Poynter Institute. And they include Reuters, the Associated Press, Agence France-Presse, USA Today, FactCheck.org, Science Feedback, PolitiFact, Check Your Fact, Lead Stories and the Dispatch in the United States.
48:54 Sen. Lindsey Graham (SC): Do both of you support change to 230? Reform of Section 230? Mark Zuckerberg: Senator, I do. Sen. Lindsey Graham (SC): Mr. Dorsey? Jack Dorsey: Yes. Sen. Lindsey Graham (SC): Thank you.
54:10 Sen. Richard Blumenthal (CT): How many times is Steve Bannon allowed to call for the murder of government officials before Facebook suspends his account? Mark Zuckerberg: Senator, as you say, the content in question did violate our policies and we took it down. Having a content violation does not automatically mean your account gets taken down. And the number of strikes varies depending on the amount and type of offense. So if people are posting terrorist content or child exploitation content, then the first time they do it, we will take down their account. For other things, it's multiple. I'd be happy to follow up afterwards. We try not to disclose these... Sorry, I didn't hear that. Sen. Richard Blumenthal (CT): Will you commit to taking down that account? Steve Bannon? Mark Zuckerberg: Senator, no, that's not what our policies would suggest that we should do in this case.
1:07:05 Jack Dorsey: What we saw and what the market told us was that people would not put up with abuse, harassment and misleading information that would cause offline harm, and they would leave our service because of it. So our intention is to create clear policy, clear enforcement that enables people to feel that they can express themselves on our service, and ultimately trust it. Sen. John Cornyn (TX): So it was a business decision. Jack Dorsey: It was a business decision.
2:56:34 Mark Zuckerberg: We do coordinate on and share signals on security related topics. So for example, if there is signal around a terrorist attack or around child exploitation imagery or around a foreign government creating an influence operation, that is an area where the companies do share signals about what they see. But I think it's important to be very clear that that is distinct from the content moderation policies that we or the other companies have, where once we share intelligence or signals between the companies, each company makes its own assessment of the right way to address and deal with that information.
3:59:10 Sen. Mazie Hirono (HI): What are both of you prepared to do regarding Donald Trump's use of your platforms after he stops being president? Will he still be deemed newsworthy? And will he still get to use your platform to spread this misinformation? Mark Zuckerberg: Senator, let me clarify my last answer. We are also having academics study the effect of all of our election measures, and they'll be publishing those results publicly. In terms of President Trump and moving forward, there are a small number of policies where we have exceptions for politicians, under the principle that people should be able to hear what their elected officials and candidates for office are saying. But by and large, the vast majority of our policies have no newsworthiness or political exception. So if the President or anyone else is spreading hate speech, or inciting violence, or posting content that delegitimizes the election or valid forms of voting, those will receive the same treatment as anyone else saying those things, and that will continue to be the case. Sen. Mazie Hirono (HI): Remains to be seen. Jack Dorsey: So we do have a policy around public interest, where for global leaders we do make exceptions: if a tweet violates our terms of service, we leave it up behind an interstitial, and people are not allowed to share it more broadly. So a lot of the sharing is disabled, with the exception of quoting it so that you can add your own conversation on top of it. So if an account suddenly is not a world leader anymore, that particular policy goes away.
4:29:35 Sen. Marsha Blackburn (TN): Do you believe it's Facebook's duty to comply with state sponsored censorship so it can keep operating, doing business and selling ads in that country? Mark Zuckerberg: Senator, in general, we try to comply with the laws in every country where we operate and do business.
Hearing: BIG TECH AND SECTION 230 IMMUNITY, Senate Commerce, Science and Transportation Committee, October 28, 2020
Witnesses:
- Jack Dorsey, Twitter, Inc.
- Sundar Pichai, Alphabet Inc.
- Mark Zuckerberg, Facebook, Inc.
Transcript:
10:10 Sen. Roger Wicker (MS): ...in policing conservative sites than its own YouTube platform for the same types of offensive and outrageous claims.
45:50 Jack Dorsey: The goal of our labeling is to provide more context, to connect the dots, so that people can have more information so they can make decisions for themselves.
46:20 Sen. Roger Wicker (MS): I have a tweet here from Mr. Ajit Pai. Mr. Ajit Pai is the chairman of the Federal Communications Commission. And he recounts some four tweets by the Iranian dictator, Ayatollah Ali Khamenei, which Twitter did not place a public label on. All four of them glorify violence. The first tweet says this, and I quote: 'The Zionist regime is a deadly cancerous growth and a detriment to the region, it will undoubtedly be uprooted and destroyed.' That's the first tweet. The second tweet: 'The only remedy until the removal of the Zionist regime is firm armed resistance,' again, left up without comment by Twitter. The third: 'the struggle to free Palestine is jihad in the way of God.' I quote that in part for the sake of time. And number four: 'we will support and assist any nation or any group anywhere who opposes and fights the Zionist regime.' I would simply point out that these tweets are still up, Mr. Dorsey. And how is it that they are acceptable to be there? I'll ask unanimous consent to enter this tweet from Ajit Pai in the record at this point; that'll be done without objection. How, Mr. Dorsey, is that acceptable based on your policies at Twitter? Jack Dorsey: We believe it's important for everyone to hear from global leaders, and we have policies around world leaders. We want to make sure that we are respecting their right to speak and to publish what they need. But if there's a violation of our terms of service, we want to label it and... Sen. Roger Wicker (MS): They're still up. Did they violate your terms of service, Mr. Dorsey? Jack Dorsey: We did not find those to violate our terms of service, because we consider them saber rattling, which is part of the speech of world leaders in concert with other countries. Speech against our own people, or a country's own citizens, we believe is different and can cause more immediate harm.
59:20 Jack Dorsey: We don't have a policy against misinformation. We have a policy against misinformation in three categories, which are manipulated media, public health, specifically COVID, and civic integrity, election interference and voter suppression.
1:39:05 Sen. Brian Schatz (HI): What we are seeing today is an attempt to bully the CEOs of private companies into carrying out a hit job on a presidential candidate, by making sure that they push out foreign and domestic misinformation meant to influence the election. To our witnesses today, you and other tech leaders need to stand up to this immoral behavior. The truth is that because some of my colleagues accuse you, your companies and your employees of being biased or liberal, you have institutionally bent over backwards and overcompensated. You've hired Republican operatives, hosted private dinners with Republican leaders, and in contravention of your Terms of Service, given special dispensation to right-wing voices, and even throttled progressive journalism. Simply put, the Republicans have been successful in this play.
1:47:15 Jack Dorsey: This one is a tough one to actually bring transparency to. Explainability in AI is a field of research, but is far out. And I think a better opportunity is giving people more choice around the algorithms they use, including to turn off the algorithms completely, which is what we're attempting to do.
2:15:00 Sen. Jerry Moran (KS): Whatever the numbers are, you indicate that they are significant. It's an enormous amount of money and an enormous amount of employee time, contract labor time, in dealing with moderation of content. These efforts are expensive. And I would highlight for my colleagues on the committee that they will not be any less expensive, perhaps less in scale, but not less in cost, for startups and small businesses. And as we develop our policies in regard to this topic, I want to make certain that entrepreneurship, startup businesses and small business are considered in what it would cost in their efforts to meet the kind of standards to operate in this sphere.
2:20:40 Sen. Ed Markey (MA): The issue is not that the companies before us today are taking too many posts down. The issue is that they're leaving too many dangerous posts up. In fact, they're amplifying harmful content so that it spreads like wildfire and torches our democracy.
3:04:00 Sen. Mike Lee (UT): Between the censorship of conservative and liberal points of view, and it's an enormous disparity. Now you have the right, I want to be very clear about this, you have every single right to set your own terms of service and to interpret them and to make decisions about violations. But given the disparate impact of who gets censored on your platforms, it seems that you're either, one, not enforcing your Terms of Service equally, or alternatively, that you're writing your standards to target conservative viewpoints.
3:15:30 Sen. Ron Johnson (WI): Okay, for both Mr. Zuckerberg and Mr. Dorsey, who censored New York Post stories, or throttled them back: did either one of you have any evidence that the New York Post story is part of Russian disinformation, or that those emails aren't authentic? Did anybody have any information whatsoever that they're not authentic or that they are Russian disinformation? Mr. Dorsey? Jack Dorsey: We don't. Sen. Ron Johnson (WI): So why would you censor it? Why did you prevent that from being disseminated on your platform that is supposed to be for the free expression of ideas, and particularly true ideas... Jack Dorsey: We believed it fell afoul of our hacked materials policy. We judged... Sen. Ron Johnson (WI): They weren't hacked. Jack Dorsey: We judged in the moment that it looked like it was hacked material. Sen. Ron Johnson (WI): You were wrong. Jack Dorsey: And we updated our policy and our enforcement within 24 hours. Sen. Ron Johnson (WI): Mr. Zuckerberg? Mark Zuckerberg: Senator, as I testified before, we relied heavily on the FBI's intelligence and alert status, both through their public testimony and private briefings. Sen. Ron Johnson (WI): Did the FBI contact you, sir, and tell you that the story was false? Mark Zuckerberg: Senator, not about that story specifically. Sen. Ron Johnson (WI): Why did you throttle it back? Mark Zuckerberg: They alerted us to be on heightened alert around a risk of hack and leak operations around a release and probe of information. And to be clear on this, we didn't censor the content. We flagged it for fact checkers to review. And pending that review, we temporarily constrained its distribution to make sure that it didn't spread wildly while it was being reviewed. But it's not up to us either to determine whether it's Russian interference, nor whether it's true. We rely on the fact checkers to do that.
3:29:30 Sen. Rick Scott (FL): It's becoming obvious that your companies are unfairly targeting conservatives. That's clearly the perception today. Facebook is actively targeting ads by conservative groups ahead of the election, either removing the ads completely or adding their own disclosure if they claim they didn't pass their fact check system.
3:32:40 Sen. Rick Scott (FL): You can't just pick and choose which viewpoints are allowed on your platform and expect to keep the immunity granted by Section 230.
News Clip: Adam Schiff on CNN, CNN, Twitter, October 16, 2020
Hearing: MISINFORMATION, CONSPIRACY THEORIES, AND 'INFODEMICS': STOPPING THE SPREAD ONLINE, Permanent Select Committee on Intelligence, October 15, 2020
Witnesses:
- Dr. Joan Donovan: Research Director at the Shorenstein Center on Media, Politics, and Public Policy at Harvard Kennedy School
- Nina Jankowicz: Disinformation Fellow at the Wilson Center
- Cindy Otis: Vice President of the Althea Group
- Melanie Smith: Head of Analysis, Graphika Inc
Transcript:
41:30 Rep. Jim Himes (CT): And I should acknowledge that we're pretty careful. We understand that we shouldn't be in the business of fighting misinformation; that's probably inconsistent with the First Amendment. So what do we do? We ask that it be outsourced to people that we otherwise are pretty critical of, like Mark Zuckerberg and Jack Dorsey. We say, you do it, which strikes me as a pretty lame way to address what may or may not be a problem.
42:00 Rep. Jim Himes (CT): Ms. Jankowicz said that misinformation is dismantling democracy. I'm skeptical of that. And that will be my question. What evidence is out there that this is dismantling democracy? I don't mean that millions of people see QAnon. I actually want to see the evidence that people are seeing this information and are, in a meaningful way, in a material way, dismantling our democracy through violence or through political organizations, because if we're going to go down that path, I need something more than eyeballs. So I need some evidence for how this is dismantling our democracy. And secondly, if you persuade me that we're dismantling our democracy, how do we get in the business of figuring out who should define what misinformation or disinformation is? Nina Jankowicz: To address your first question related to evidence of the dismantling of democracy, there's two news stories that I think point to this from the last couple of weeks alone. The first is related to the kidnapping plot against Michigan Governor Gretchen Whitmer. And the social media platforms played a huge role in allowing that group to organize. It allowed that group to... it seeded the information that led them to organize, and frankly, as a woman online who has been getting harassed a lot lately with sexualized and gendered disinformation, I am very acutely aware of how those threats that are online can transfer on to real world violence. And that, make no mistake, is meant to keep women and minorities from not only participating in the democratic process by exercising our votes, but also keeping us from public life. So that's one big example. But there was another example just recently from a Channel 4 documentary in the UK that looked at how the Trump campaign used Cambridge Analytica data to selectively target black voters with voter suppression ads during the 2016 election. Again, it's affecting people's participation. It's not just about fake news stories on the internet. In fact, a lot of the best disinformation is grounded in a kernel of truth. And in my written testimony, I go through a couple of other examples of how online action has led to real world action. And this isn't something that is just staying on the internet; it is increasingly in real life. Rep. Jim Himes (CT): I don't have a lot of time. Do you think that both examples that you offered up, the plot to kidnap the governor of Michigan and your other example, pass the 'but for' test? I mean, this country probably got into the Spanish American War over 130 years ago because of the good works of William Randolph Hearst. So how do we... we've had misinformation and yellow journalism and terrible media and voter suppression forever. And I understand that these media platforms have scale that William Randolph Hearst didn't have. But are you sure that both of those examples pass the 'but for' test, that they wouldn't have happened without the social media misinformation? Nina Jankowicz: I believe they do, because they allow the organization of these groups without any oversight, and they allow the targeting of these messages to the groups and people that are going to find them most vulnerable and are most likely to take action against them. And that's what our foreign adversaries do. And increasingly, it's what people within our own country are using to organize violence against the democratic participation of many of our fellow citizens.
Rep. Jim Himes (CT): Okay, well, I'm out of time. I would love to continue this conversation and pursue what you mean by groups being formed, quote, without oversight; that's language I'd like to better understand. But I'm out of time, and I would like to continue this conversation into, well, if this is the problem that you say it is, what do we actually do about it?
Hearing: ONLINE PLATFORMS AND MARKET POWER, PART 2: INNOVATION AND ENTREPRENEURSHIP, Committee on the Judiciary: Subcommittee on Antitrust, Commercial, and Administrative Law, July 16, 2019
Witnesses:
- Adam Cohen: Director of Economic Policy at Google
- Matt Perault: Head of Global Policy Development at Facebook
- Nate Sutton: Associate General Counsel for Competition at Amazon
- Kyle Andeer: Vice President for Corporate Law at Apple
- Timothy Wu: Julius Silver Professor of Law at Columbia Law School
- Dr. Fiona Scott Morton: Theodore Nierenberg Professor of Economics at Yale School of Management
- Stacy Mitchell: Co-Director at the Institute for Local Self-Reliance
- Maureen Ohlhausen: Partner at Baker Botts LLP
- Carl Szabo: Vice President and General Counsel at NetChoice
- Morgan Reed: Executive Director at the App Association
Transcript:
55:15 Adam Cohen: Congresswoman, we use a combination of automated tools; we can recognize copyrighted material that creators upload and instantaneously discover it and keep it from being seen on our platforms.
1:16:00 Rep. David Cicilline (RI): Do you use consumer data to favor Amazon products? Because before you answer that, analysts estimate that between 80 and 90% of sales go to the Amazon buy box. So you collect all this data about the most popular products, where they're selling. And you're saying you don't use that in any way to change an algorithm to support the sale of Amazon branded products? Nate Sutton: Our algorithms, such as the buy box, are aimed to predict what customers want to buy, and apply the same criteria whether you're a third-party seller or Amazon, because we want customers to make the right purchase, regardless of whether it's a seller or Amazon. Rep. David Cicilline (RI): But the best purchase to you is an Amazon product. Nate Sutton: No, that's not true. Rep. David Cicilline (RI): So you're telling us, you're under oath, Amazon does not use any of that data collected with respect to what is selling, where it is, on what products, to inform the decisions you make, or to change algorithms to direct people to Amazon products and prioritize Amazon and deprioritize competitors? Nate Sutton: The algorithms are optimized to predict what customers want to buy, regardless of the seller. We provide the same criteria, and with respect to popularity, that's public data on each product page. We provide the ranking of each product.
3:22:50 Dr. Fiona Scott Morton: As is detailed in the report that I submitted as my testimony, there are a number of characteristics of platforms that tend to drive them toward concentrated markets, very large economies of scale. Consumers exacerbate this with their behavioral biases: we don't scroll down to the second page, we accept defaults, we follow the framing the platform gives us instead of searching independently. And what that does is it makes it very hard for small companies to grow and for new ones to get traction against the dominant platform. And without the threat of entry from entrepreneurs and growth from existing competitors, the dominant platform doesn't have to compete as hard. If it's not competing as hard, then there are several harms that follow from that. One is higher prices for advertisers; many of these platforms are advertising supported. Then there's higher prices to consumers, who may think that they're getting a good deal by paying a price of zero. But the competitive price might well be negative; the consumers might well be able to be paid for using these platforms in a competitive market. Other harms include low quality in the form of less privacy, more advertising and more exploitative content that consumers can't avoid because, as Tim just said, there isn't anywhere else to go. And lastly, without competitive pressure, innovation is lessened. And in particular, it's channeled in the direction the dominant firm prefers, rather than being creatively spread across directions chosen by entrants. And this is what we learned from AT&T and IBM and Microsoft: when the dominant firm ceases to control innovation, there's a flowering, and it's very creative and market driven. So the solution to this problem of insufficient competition is complementary steps forward in both antitrust and regulation. Antitrust must recalibrate the balance it strikes between the risk of over-enforcement and under-enforcement. The evidence now shows we've been under-enforcing for years and consumers have been harmed.
3:22:50 Stacy Mitchell: I hope the committee will consider several policy tools as part of this investigation. In particular, we very much endorse the approach that Congress took with regard to the railroads: that if you operate essential infrastructure, you can't also compete with the businesses that rely on that infrastructure.
3:45:00 Morgan Reed: Here on the table, I have a copy of OmniPage Pro. This was software you bought if you needed to scan documents, if you wanted to turn them into something you could look at in a word processor. I've also got this great review from PC World; they loved it back in 2005. But the important fact here in this review is that it says the street price of this software in 2005 was $450. Now, right here, I've got an app from a company called Readdle that is nearly the same product, has a bunch of features that this one doesn't, and it's $6. Basically, consumers now pay less than 1% of what they used to pay for some of the same capability. And what's even better about that, even though I love the product from Readdle, there are dozens of competitors in the app space. So when you look at it from that perspective, consumers are getting a huge win. How have platforms made this radical drop in price possible? Simply put, they've provided three things: a trusted space, reduced overhead, and given my developers nearly instant access to a global marketplace with billions of customers. Before the platforms, to get your software onto a retail store shelf, companies had to spend years and thousands of dollars to get to the point where a distributor would handle their product. Then you'd agree to a cut of sales revenue, write a check for upfront marketing, agree to refund the distributor the cost of any unsold boxes, and then spend tens of thousands of dollars to buy an end cap. Digging a little bit on this, I don't know how many of you are aware that the products you see on your store shelf or in the Sunday flyer aren't there because the manager thought it was a cool product. Those products are displayed at the end of an aisle, or end cap, because the software developer or consumer goods company literally pays for the shelf space. In fact, for many retailers, the sale of floor space and flyers makes a huge chunk of their profitability for their store. And none of this takes into consideration printing boxes, manuals, CDs, dealing with credit cards if you go direct, translation services, customs authorities if you want to sell abroad. In the 1990s, it cost a million dollars to start up a software company. Now it's $100,000 in sweat equity. And thanks to these changes, the average cost for consumer software has dropped from $50 to $3. For developers, our cost to market has dropped enormously and the size of our market has expanded globally.
3:48:55 Stacy Mitchell: I've spent a lot of time interviewing and talking with independent retailers and manufacturers of all sizes. Many of them are very much afraid of speaking out publicly because they fear retaliation. But what we consistently hear is that Amazon is the biggest threat to their businesses. We just did a survey of about 550 independent retailers nationally. Amazon ranked number one in terms of being what they said was the biggest threat to their business, above rising healthcare costs, access to capital, government red tape, anything else you can name. Among those who are actually selling on the platform, only 7% reported that it was actually helping their bottom line. Amazon has a kind of godlike view of a growing share of our commerce, and it uses the data that it gathers to advantage its own business and its own business interests in lots of ways. A lot of this, as I said, comes from its ability to sort of leverage the interplay between these different business lines to maximize its advantage, whether it's promoting its own product because that's lucrative, or whether it's using the manufacturer of a product to actually squeeze a seller or vendor into giving it bigger discounts.
3:53:15 Rep. Kelly Armstrong (ND): I come from a very rural area; the closest thing to what you would consider a big box store is in Minneapolis or Denver. And so when we're talking about competition and all of this, I also think we've got to remember: at no point in time, from my house in Dickinson, North Dakota, have I had more access to more diverse and cheap consumer products. I mean, things that often would require a plane ticket or a nine hour car ride to buy can now be brought to our house. So I think when we're talking about consumers, we need to remember that side of it, too.
Hearing: EMERGING TRENDS IN ONLINE FOREIGN INFLUENCE OPERATIONS: SOCIAL MEDIA, COVID-19, AND ELECTION SECURITY, Permanent Select Committee on Intelligence, June 18, 2020
Witnesses:
- Nathaniel Gleicher: Head of Security Policy at Facebook
- Nick Pickles: Director of Global Public Policy Strategy andDevelopment at Twitter
- Richard Salgado: Director for Law Enforcement and InformationSecurity at Google
Transcript:
19:16 Nathaniel Gleicher: Facebook has made significant investments to help protect the integrity of elections. We now have more than 35,000 people working on safety and security across the company, with nearly 40 teams focused specifically on elections and election integrity. We're also partnering with federal and state governments, other tech companies, researchers and civil society groups to share information and stop malicious actors. Over the past three years, we've worked to protect more than 200 elections around the world. We've learned lessons from each of these, and we're applying these lessons to protect the 2020 election in November.
21:58 Nathaniel Gleicher: We've also been proactively hunting for bad actors trying to interfere with the important discussions about injustice and inequality happening around our nation. As part of this effort, we've removed isolated accounts seeking to impersonate activists, and two networks of accounts tied to organized hate groups that we've previously banned from our platforms.
26:05 Nick Pickles: Firstly, Twitter shouldn't determine the truthfulness of tweets. And secondly, Twitter should provide context to help people make up their own minds in cases where the substance of a tweet is disputed.
26:15 Nick Pickles: We prioritize interventions regarding misinformation based on the highest potential for harm, and we're currently focused on three main areas of content: synthetic and manipulated media, elections and civic integrity, and COVID-19.
26:30 Nick Pickles: Where content does not break our rules and warrant removal in these three areas, we may label tweets to help people come to their own views by providing additional context. These labels may link to a curated set of tweets posted by people on Twitter. These include factual statements, counterpoint opinions and perspectives, and ongoing public conversation around the issue. To date, we've applied these labels to thousands of tweets around the world across these three policy areas.
31:10 Richard Salgado: In search, ranking algorithms are an important tool in our fight against disinformation. Ranking elevates information that our algorithms determine is the most authoritative above information that may be less reliable. Similarly, our work on YouTube focuses on identifying and removing content that violates our policies and elevating authoritative content when users search for breaking news. At the same time, we find and limit the spread of borderline content that comes close but just stops short of violating our policies.
53:28 Rep. Jackie Speier (CA): Mr. Gleicher, you may or may not know that Facebook is headquartered in my congressional district. I've had many conversations with Sheryl Sandberg. And I'm still puzzled by the fact that Facebook does not consider itself a media platform. Are you still espousing that kind of position? Nathaniel Gleicher: Congresswoman, we're first and foremost a technology company. Rep. Jackie Speier (CA): You may be a technology company, but your technology company is being used as a media platform. Do you not recognize that? Nathaniel Gleicher: Congresswoman, we're a place for ideas across the spectrum. We know that there are people who use our platforms to engage, and in fact that is the goal of the platforms: to encourage and enable people to discuss the key issues of the day and to talk to family and friends.
54:30 Rep. Jackie Speier (CA): How long... or maybe I should ask this: when there was a video of Speaker Pelosi that had been tampered with - slowed down to make her look like she was drunk - YouTube took it down almost immediately. What did Facebook do, and what went into your thinking to keep it up? Nathaniel Gleicher: Congresswoman, for a piece of content like that, we work with a network of third party fact checkers, more than 60 third party fact checkers around the world. If one of them determines that a piece of content like that is false, we will down-rank it, and we will put an interstitial on it so that anyone who would look at it would first see a label over it saying that there's additional information and that it's false. That's what we did in this context. When we down-rank something like that, we see the shares of that video radically drop. Rep. Jackie Speier (CA): But you won't take it down when you know it's false. Nathaniel Gleicher: Congresswoman, you're highlighting a really difficult balance. And we've talked about this amongst ourselves quite a bit. And what I would say is, if we simply take a piece of content like this down, it doesn't go away. It will exist elsewhere on the internet. People who weren't looking for it will still find it. Rep. Jackie Speier (CA): But, you know, there will always be bad actors in the world. That doesn't mean that you don't do your level best to show the greatest deal of credibility. I mean, if YouTube took it down, I don't understand how you couldn't have taken it down, but I'll leave that where it lays.
1:40:10 Nathaniel Gleicher: Congressman, the collaboration within industry and with government is much, much better than it was in 2016. I think we have found the FBI, for example, to be forward leaning and ready to share information with us when they see it. We share information with them whenever we see indications of foreign interference targeting our election. The best case study for this was the 2018 midterms, where you saw industry, government and civil society all come together, sharing information to tackle these threats. We had a case on literally the eve of the vote, where the FBI gave us a tip about a network of accounts where they identified subtle links to Russian actors. We were able to investigate those and take action on them within a matter of hours.
1:43:10 Rep. Jim Himes (CT): I tend to be kind of a First Amendment absolutist. I really don't want Facebook telling me what's true and what's not true, mainly because most statements are some combination of both.
1:44:20 Nathaniel Gleicher: Certainly people are drawn to clickbait. They're drawn to explosive content. I mean, it is the nature of clickbait to make people want to click on it. But what we found is that if you separate it out from the particular content, people don't want a platform or experience that's just clickbait. They will click it if they see it, but they don't want it prioritized; they don't want their time to be drawn into that and all emotional frailty. And so we are trying to build an environment where that isn't the focus, where they have the conversations they want to have. But I agree with you, a core piece of this challenge is people seek out that type of content wherever it is. I should note that as we're thinking about how we prioritize this, one of the key factors is who your friends are, the pages and accounts that you follow, and the assets that you engage with. That's the most important factor in sort of what you see. And so people have direct control over that, because they are choosing the people they want to engage.
Hearing: ONLINE PLATFORMS AND MARKET POWER, PART 1: THE FREE AND DIVERSE PRESS, Committee on the Judiciary: Subcommittee on Antitrust, Commercial, and Administrative Law, June 11, 2019
Witnesses:
- David Chavern: President of the News Media Alliance
- Gene Kimmelman: President of Public Knowledge
- Sally Hubbard: Director of Enforcement Strategy at the Open Markets Institute
- Matthew Schruers: Vice President of Law and Policy at the Computer and Communications Industry Association
- David Pitofsky: General Counsel at News Corp
- Kevin Riley: Editor at the Atlanta Journal-Constitution
Transcript:
55:30 David Chavern: Platforms' and news organizations' mutual reliance would not be a problem if not for the fact that the concentration among the platforms means a small number of companies now exercise an extreme level of control over the news. And in fact, a couple of dominant firms act as regulators of the news industry. Only these regulators are not constrained by legislative or democratic oversight. The result has been to siphon revenue away from news publishers. This trend is clear if you compare the growth in Google's total advertising revenue to the decline in the news industry's ad revenue. In 2000, Google's US revenue was $2.1 billion, while the newspaper industry accounted for $48 billion in advertising revenue. In 2017, in contrast, Google's US revenue had increased over 25 times to $52.4 billion, while the newspaper industry's ad revenue had fallen 65% to $16.4 billion.
56:26 David Chavern: The effect of this revenue decline on publishers has been terrible, and they've been forced to cut back on their investments in journalism. That is a reason why newsroom employment has fallen nearly a quarter over the last decade. One question that might be asked is, if the platforms are, on balance, having such a negative impact on the news media, then why don't publishers do something about it? The answer is they cannot, at least under the existing antitrust laws. News publishers face a collective action problem. No publisher on its own can stand up to the tech giants. The risk of demotion or exclusion from the platforms is simply too great. And the antitrust laws prevent news organizations from acting collectively. So the result is that publishers are forced to accept whatever terms or restrictions are imposed on them.
1:06:20 Sally Hubbard: Facebook has repeatedly acquired rivals, including Instagram and WhatsApp. And Google's acquisitions cemented its market power throughout the ad ecosystem as it bought up the digital ad market spoke by spoke, including Applied Semantics, AdMob and DoubleClick. Together, Facebook and Google have bought 150 companies in just the last six years. Google alone has bought nearly 250 companies.
1:14:17 David Pitofsky: Unfortunately, in the news business, free riding by dominant online platforms, which aggregate and then re-serve our content, has led to the lion's share of online advertising dollars generated off the back of news going to the platforms. Many in Silicon Valley dismiss the press as old media failing to evolve in the face of online competition. But this is wrong. We're not losing business to an innovator who has found a better or more efficient way to report and investigate the news. We're losing business because the dominant platforms deploy our news content to target our audiences, to then turn around and sell that audience to the same advertisers we're trying to serve.
1:15:04 David Pitofsky: The erosion of advertising revenue undercuts our ability to invest in high quality journalism. Meanwhile, the platforms have little if any commitment to accuracy or reliability. For them, a news article is valuable if viral, not if verified.
1:16:12 David Pitofsky: News publishers have no good options to respond to these challenges. Any publisher that tried to withhold its content from a platform as part of a negotiating strategy would starve itself of reader traffic. In contrast, losing one publisher would not harm the platforms at all, since they would have ample alternative sources for news content.
1:36:56 Rep. Pramila Jayapal (WA): So Ms. Hubbard, let me start with you. You were an Assistant Attorney General for New York State's antitrust division. You've also worked as a journalist. Which online platforms would you say are most impacting the public's access to trustworthy sources of journalism, and why? Sally Hubbard: Thank you for the question, Congresswoman. I think in terms of disinformation, the platforms that are having the most impact are Facebook and YouTube. And that's because of their business models, which are to prioritize engagement, engaging content, because of human nature, that, you know, survival instinct: we tend to tune into things that make us fearful or angry. And so by prioritizing engagement, these platforms are actually prioritizing disinformation as well. It serves their profit motives to keep people on the platforms as long as possible to show them ads and collect their data. And because they don't have any competition, they're free to pursue these destructive business models without having any competitive constraint. They've also lacked regulation. Normally, corporations are not permitted to just pursue profits without regard to the consequences.
1:38:10 Rep. Pramila Jayapal (WA): The Federal Trade Commission has repeatedly declined to interfere as Facebook and Google have acquired would-be competitors. Since 2007, Google has acquired Applied Semantics, DoubleClick and AdMob. And since 2011, Facebook has acquired Instagram and WhatsApp. What do these acquisitions mean for consumers of news and information? I think sometimes antitrust and regulation are seen as something that's out there, but this has very direct impact for consumers. Can you explain what that means as these companies have acquired more and more? Sally Hubbard: Sure. So in my view, all of the acquisitions that you just mentioned were illegal under the Clayton Act, which prohibits mergers that may lessen competition. Looking back, it's clear that all of those mergers did lessen competition. And when you lessen competition, the harms to consumers are not just high prices, which are harder to see in the digital age, but loss of innovation, loss of choice, and loss of control. So when we approve anticompetitive mergers, consumers are harmed.
1:55:48 Rep. Matt Gaetz (FL): Section 230, as I understand it, and I'm happy to be corrected by others, would say that if a technology platform is a neutral public platform, they enjoy certain liability protections that newspapers don't enjoy, that News Corp doesn't enjoy with its assets. And so does it make the anticompetitive posture of technology platforms more pronounced that they have access to this special liability protection that the people you represent don't have access to? David Chavern: Oh, absolutely. There's a huge disparity. Frankly, when our content is delivered through these platforms, we get the liability and they get the money. So that's a good deal from that end. We are responsible for what we publish; we publishers can and do get sued. On the other hand, the platforms are allowed to deliver and monetize this content with a complete lack of responsibility.
Hearing: Election Interference: Ensuring Law Enforcement is Equipped to Target Those Seeking to Do Harm, Senate Judiciary Committee, June 12, 2018
Witnesses:
- Adam Hickey - Deputy Assistant Attorney General for the National Security Division at the Department of Justice
- Matthew Masterson - National Protection and Programs Directorate at the Department of Homeland Security
- Kenneth Wainstein - Partner at Davis Polk & Wardwell, LLP
- Prof. Ryan Goodman - New York University School of Law
- Nina Jankowicz - Global Fellow at the Wilson Center
Transcript:
9:00 Senator Dianne Feinstein (CA): We know that Russia orchestrated a sustained and coordinated attack that interfered in our last presidential election. And we also know that there’s a serious threat of more attacks in our future elections, including this November. As the United States Intelligence Community unanimously concluded, the Russian government’s interference in our election—and I quote—“blended covert intelligence operations, such as cyber activity, with overt efforts by the Russian government agencies, state-funded media, third-party intermediaries, and paid social-media users or trolls.” Over the course of the past year and a half, we’ve come to better understand how pernicious these attacks were. Particularly unsettling is that we were so unaware. We were unaware that Russia was sowing division through mass propaganda, cyber warfare, and working with malicious actors to tip the scales of the election. Thirteen Russian nationals and three organizations, including the Russian-backed Internet Research Agency, have now been indicted for their role in Russia’s vast conspiracy to defraud the United States.
Hearing: Facebook, Google and Twitter Executives on Russian Disinformation, Senate Judiciary Subcommittee on Crime and Terrorism, October 31, 2017
Witnesses:
- Colin Stretch - Facebook Vice President and General Counsel
- Sean Edgett - Twitter Acting General Counsel
- Richard Salgado - Google Law Enforcement & Information Security Director
- Clint Watts - Foreign Policy Research Institute, National Security Program Senior Fellow
- Michael Smith - New America, International Security Fellow
Transcript:
2:33:07 Clint Watts: Lastly, I admire those social-media companies that have begun working to fact-check news articles in the wake of last year’s elections. These efforts should continue but will be completely inadequate. Stopping false information, the artillery barrage landing on social-media users, comes only when those outlets distributing bogus stories are silenced. Silence the guns, and the barrage will end. I propose the equivalent of nutrition labels for information outlets: a rating icon for news-producing outlets displayed next to their news links in social-media feeds and search engines. The icon provides users an assessment of the news outlet’s ratio of fact versus fiction and opinion versus reporting. The rating system would be opt-in. It would not infringe on freedom of speech or freedom of the press. It should not be part of the U.S. government, and should sit separate from the social-media companies but be utilized by them. Users wanting to consume information from outlets with a poor rating wouldn’t be prohibited. If they are misled about the truth, they have only themselves to blame.
Cover Art
Design by Only Child Imaginations
Music Presented in This Episode
Intro & Exit: Tired of Being Lied To by David Ippolito (found on Music Alley by mevio)