[00:00] [music]

Kevin Garber: [00:11] Good evening, good afternoon, good morning. My name is Kevin Garber. I am the CEO of ManageFlitter.com. I hope that you're a user of our products. I know we have many users who listen to this podcast.

[00:26] It is Friday, the second of December 2016. We are nearly done with 2016. Thanks for listening to the podcast. I hope you enjoy it. I hope you find it interesting. We have got a fantastic show lined up for you today. As usual, my co-host is the ManageFlitter design lead, Kate Frappell. How are you, Kate?

Kate Frappell: [00:47] I am good, thank you. How about you?

Kevin: [00:49] We are melting in the office today. It is scorching in Sydney. Of course, as luck would have it, our aircon is out on the hottest day of the year, but so be it. I just picture that we're sitting on a beach in Hawaii.

Kate: [01:02] It is the second day of summer, technically.

Kevin: [01:05] True, technically. In Australia, it's always intrigued me. Summer starts at the beginning of December. In South Africa, summer felt like it started in October, so I could never...When people say, "It's the first day of summer," I always want to say, "It's been summer for a while." I know in Australia it's hard and fast, the first of December. That's when people consider summer to start.

Kate: [01:24] I feel it's been summer for about a month now.

Kevin: [01:27] Yes, we've had some nice weather. Coming up later in the show, we chat to Peter Cohan. I think he's been on the podcast before. He's the most interviewed guest on the podcast. I think this is appearance number three. We had a fascinating chat about the Facebook fake news phenomenon that's been discussed.

[01:45] The fake news stories on Facebook, did they impact the election? If so, is it Facebook's responsibility to do something about them? Who generates these fake news stories? We chatted a little bit about that. Peter Cohan is a lecturer in business in Massachusetts. He's a contributing writer for "Forbes". We had a fantastic chat. That's coming up a little bit later.

[02:10] We have a feature on the podcast every now and then. We try to do it weekly, but sometimes we miss it. We have a startup minute feature. If you're a startup, work for a startup, a founder of a startup, invest in a startup, we'd like to give a little bit of publicity to interesting startups.

[02:30] We've had a startup come through this week, coming up next. If you want to be a part of it, email us an audio clip, 30 seconds in length, to Podcast@itsamonkey.com. We'll play it on the show, and we'll also give you a much-valued link in the show notes. Everything's about SEO and getting links and traffic, so that will definitely help you guys, so feel free to email us. Here's this week's Startup Minute.

[02:57] [startup minute]

Natalie Goldman: [02:58] Hi Kevin, and the podcast crew. My name's Natalie Goldman. I'm the CEO of FlexCareers. It's great to be here, as I love listening to your show. I wanted to take a minute to let you know all about FlexCareers.

[03:09] We're changing the way that careers work by redefining success, rewriting talent management, and realigning workplace expectations with the workforce of today. FlexCareers is a disruptive online talent matching platform connecting talented women with progressive employers offering flexible work.

[03:26] We've engineered game-changing technology that is redefining careers by challenging convention and leading the future of talent matching and career support.

[03:34] Thanks for giving me the opportunity to let you know more about FlexCareers. I love the work that you guys do. Thanks for supporting the entrepreneurial community. Bye.

Kevin: [03:44] As always, we like to start off with a couple of news items just to keep you up to date with some of the goings-on in the tech industry. Kate, a few months ago we were hearing a lot and seeing a lot of the phenomenon of Pokémon Go. What's happening with Pokémon Go? They've come out with some statements recently.

Kate: [04:02] Since their initial release in August, their user numbers have dropped dramatically. Now they're at this stage where they're trying to get a lot of the initial adopters back onto the game. Some of the ways they're doing this are double rewards and extra points over holiday seasons, like Thanksgiving and Halloween. They're also bringing in anticipated Pokémon, such as Ditto.

Kevin: [04:29] I'm looking at the graph of daily active users and it's this real amazing spike around the beginning of August, and then it just starts plummeting right down. Why did people lose interest? I've tried it once. I'm not really a gamer. Why were they so into it? Then why didn't it hold and continue captivating its users?

Kate: [05:04] In my opinion, Pokémon Go was the first AR-type game that came out. Regardless of the fact that it was Pokémon, people were jumping on the idea of this game. I even downloaded it, played it with some friends, lasted a few months, but we quickly lost interest because we don't have an interest in Pokémon.

Kevin: [05:27] You used to see swarms of people around the city, catching Pokémon, etc. Is it because the game offered them nothing new and it was just more of the same?

Kate: [05:41] No, it's the opposite. The game actually offers something new. These gamers are used to seeing...

Kevin: [05:46] The novelty...

Kate: [05:46] Yes.

Kevin: [05:46] The novelty effect initially bumped it up.

Kate: [05:49] Yes, initially. You never had to go outside to play the game. You could sit in front of your TV with a game console. Now you actually had to get up and walk around to capture these Pokémon. Everybody jumped on the bandwagon, then they just lost interest, lost traction.

Kevin: [06:08] Usually games are pretty tough to get traction.

Kate: [06:11] They had a really great honeymoon period.

Kevin: [06:14] We're actually going to be chatting to our AR/VR experts in one of our upcoming shows. Fascinating area, AR and VR. They're still making quite a lot of money. I've seen this article that they're still making over three million dollars a day. That's a significant amount of money. Not many companies...you look at this graph, millions a day, so there's still... [laughs]

Kate: [06:44] In this article, it says they're bringing out an Apple Watch version, but there are also so many rumors on Facebook that they're bringing out a Harry Potter version of the game. You get rid of Pokémon and suddenly all the Harry Potter fans are on the app.

Kevin: [06:56] There's definitely something in it. AR is this technology that's just waiting to break through. People just want it to break through. We all want it because we can just see the incredible fusion of reality with the created reality, what that can actually do.

Kate: [07:20] Yeah, but you know what's interesting about Pokémon too, is they have this AR view. You've got the creature that appears in a real life scenario. For example, it's appearing on the chair next to you. But that view uses up way too much battery, so everyone turns it off. Then you're back to this animated type vector image, instead of being on the chair to save the battery.

Kevin: [07:45] That's why technological breakthroughs so often happen because of a coming together of various technologies. For instance, if battery technology suddenly improved, people would be experiencing a better version of AR, right?

Kate: [07:58] Exactly.

Kevin: [07:58] So it goes. Often there's a lot of dependencies that need to get sorted out. Let's see what happens with Pokémon Go. I tried it once or twice. I'm not a gamer, though.

Kate: [08:12] No, me either.

Kevin: [08:13] I get bored pretty easily with that sort of stuff.

Kate: [08:17] I'll admit, I went out with two friends when it first came out and we drove around in the suburbs to try and find Pokémon. It's fun, but there's also lots of different things implemented to stop people from using it while driving. If you're going too fast, you can't catch Pokémon, even if you're just a passenger. It was fun, but we haven't done it since. I think we've all deleted the app since then as well.

Kevin: [08:42] [laughs]

Kate: [08:43] We're just not gamers.

Kevin: [08:44] The ultimate insult, uninstall.

Kate: [08:46] The other thing is that if you're not really into Pokémon, then you don't get that thrill out of catching particular Pokémon. The only one I knew about was Pikachu and I never saw Pikachu.

Kevin: [09:01] Some people know all of them. How many are there, do you know?

Kate: [09:03] No idea. [laughs]

Kevin: [09:04] Some people know. Anyway, that's Pokémon Go. Under your game theme, Facebook are getting into the game side of things through their Messenger. They're doing everything through Messenger. I think they're backing it as just such a cool part of their offering.

Kate: [09:20] I'm starting to get to a point where I feel like they've got too much. There's way too much happening in there. There's way too many options.

Kevin: [09:28] There's bots. You can get news from bots in Messenger. You can form groups in Messenger.

Kate: [09:35] Messenger is now a gaming app, it's Snapchat, and it's a messaging app, all in one. There's so much happening.

Kevin: [09:44] Tell us about these games that they've announced.

Kate: [09:46] The games, a lot of them are the ones that have been around for a long time -- Farmville and Words with Friends. But what they've done is they've put it straight into Messenger so you can challenge your friends directly.

[09:58] You can also post your results in your news feed, apparently. Candy Crush and stuff. I see people in my Newsfeed playing that all the time, but I don't know how they're challenging people. Previously, I don't think it was in Messenger.

Kevin: [10:13] This may be a more direct way of engaging with them. I played something on Facebook Messenger a little while ago. I've only just remembered it now. They brought out chess in Messenger.

[10:25] I played with my one friend. The syntax for moving was a bit confusing, as you had to type a certain set of instructions, and both of us would struggle with the syntax every now and then, but it was quite fun. Once a day or twice a day you'd go. It was a little bit competitive; you'd tease each other if you beat them. There was a sort of a novelty and a fun factor in there.

Kate: [10:53] Particularly if you can get into a group conversation. It depends on the quality of the game.

Kevin: [11:01] Of course, gaming is massive. Not only that gaming is massive, watching people play games is massive.

Kate: [11:08] There is a whole culture around gaming.

Kevin: [11:10] Is it Twitch? It's like YouTube, but you watch people playing games. There are some famous gamers that people love to watch play games. This is a whole world that you and I aren't really...

[11:24] If you are listening and you are a massive gamer, come and talk to us about Twitch and the phenomenon of watching people play games. They are very complex, intricate games, and to see someone talented play these games is a thing.

[11:41] We are going to take a little break. When we come back, we are going to play the interview we had with Peter Cohan. He is a business lecturer and he is a contributor to Forbes Magazine.

[11:52] I talked to him about fake news on Facebook and where this comes from. Is Facebook capitalizing on this in terms of their revenue? Are they getting a chunk of revenue because people are reading fake news articles, etc.? We talk about all those things after the break.

Dave Zarate: [12:09] Hi, I am Dave Zarate. I'm the Customer Support Specialist here at ManageFlitter. ManageFlitter is a tool that helps you work faster and smarter on Twitter. With ManageFlitter you can clean up and grow your Twitter account. You also get access to useful Twitter analytics, social content scheduling, and much more. Go to ManageFlitter.com and start your free trial today.

Kevin: [12:32] You're back with the "It's a Monkey" podcast. I'm Kevin Garber. Of course, the big news story over the last few months has been the American election. One of the news pieces doing the rounds, one of the topics doing the rounds, was fake news on Facebook, and how it influenced the election.

[12:51] According to a study by the Pew Research Center, 44 percent of adults got a piece of election news from Facebook at least once. Facebook is hugely influential. Mark Zuckerberg came out with a few comments, saying that he's pretty sure that Facebook didn't sway the election.

[13:14] In my inbox, a couple of days ago, Peter Cohan, who is a lecturer in entrepreneurship at Babson College in Massachusetts, and a contributor to the Forbes website, wrote an article, "Does Facebook generate over half its ad revenue from fake news?"

[13:36] We have had Peter on the podcast before, talking about various things, so I am happy to welcome him back. Peter, thank you so much for joining us.

Peter Cohan: [13:43] Thanks for inviting me.

Kevin: [13:44] It is quite an interesting phenomenon, this whole fake news on Facebook. Firstly, what is the difference between fake news, subjective news, and why is this fake news such a problem?

Peter: [14:04] First of all, it's a very interesting question. There's probably places where the line between fake news and opinion is blurry. The "New York Times," last week or a week or two ago, did a very interesting story on a piece of fake news that was based on a photograph of a bus in Austin, Texas, I think it was.

[14:33] Essentially what happened was a person took a picture of a bus, they tweeted it out, and they said, "This is a bus that was used to take protesters to Texas, and it's paid for by Hillary Clinton. All the protestors on the bus are Hillary Clinton people who have been paid by Hillary Clinton to protest the Trump election."

[14:59] It turns out that this particular tweet was shared, say, 360,000 times, all across Facebook and everywhere else. Then it turns out that somebody at the New York Times, or somewhere, made a call and discovered that the bus was owned by a company called Tableau Software, which was using it to transport people who were attending an event in Austin, Texas.

[15:25] The claim made by this tweet was completely, 100 percent false. The corrected version of the tweet, which disavowed the original and explained that it was false, was shared 3,600 times. You can see there is a huge appetite for fake news, and not much of an appetite for correcting the fake news.

Kevin: [15:54] The interesting thing about why fake news is so compelling, which you note in your article, is the confirmation bias effect. When we see something that backs up our worldview, we like to be a part of extending it, sharing it, and communicating it.

Peter: [16:13] Exactly. This is something I've been teaching for the last couple of semesters. One of the courses I teach is "Strategic Decision Making." About a year ago I revised the course and added several classes on this concept of behavioral economics. The most popular form of behavioral economics is what you just mentioned, confirmation bias.

[16:42] Which is essentially the idea that most people emotionally reach a conclusion. Daniel Kahneman -- who wrote the book "Thinking, Fast and Slow" and won a Nobel Prize in 2002 -- posited the existence of two kinds of systems in the human brain.

[17:00] One is called "System 1." System 1 reaches a conclusion very quickly based on emotions and gut reaction and very little information. Then there is another thing called "System 2," which is what you are taught in school, which is gather a lot of data, evaluate the pros and cons, and reach a very well-informed conclusion.

[17:21] It turns out that people don't always make their decisions based on System 2. In fact, a lot of times, what they do is reach a conclusion based on System 1, and when they see information come in that may not be consistent with the System 1 conclusion they have already reached, they reject it and say, "It's not really true, it's not relevant. The only thing that is true is the information that reinforces the things that I already believe."

[17:46] You can go back to that example of the bus in Austin, Texas. There were a lot of people out there who really wanted to believe that Hillary Clinton was paying people to protest, and therefore, when they saw that tweet, they shared it with a lot of people who also wanted to see it.

Kevin: [18:03] Facebook is not a publisher of news, but they've got very granular control over the news feed. What do you feel is their responsibility to get involved in this and to offset the confirmation bias?

[18:20] Mark Zuckerberg said that they are going to try to do a better job of flagging suspect sources. Where does Facebook itself fit into all of this, factoring in freedom of speech and freedom of publishing, and the fact that all of these platforms are supposed to democratize information?

Peter: [18:37] What do I think is easy to do? In my article I had a link to an article about this. It's very easy to put a banner on top of a fake news story and say, "This is from a source that generally puts out fake news." The person reading it would know that it was probably fake news.

[19:03] When I look at the New York Times online, they have these things called, "Sponsored stories," which are advertisements somewhat dressed up to look like a real story. You know if you click on it, you're going to get an advertisement.

[19:18] It seems to me that is at a minimum what they should do. If it were up to me, I would make sure that the only information that gets out on Facebook is true. Somehow or other, I don't think that's going to happen.

[19:33] The interesting thing to me is, if Facebook wasn't making any money off of fake news, I don't think they would have a problem doing that. So it must be making a significant amount of money on fake news, but the challenge that I have been coming up against so far is figuring out how much money they're making. I'm pretty sure it's significant, otherwise they would get rid of it.

Kevin: [19:57] They make money by having people on their platform consuming different content, which trickles through to the click-throughs on the ads. It's indirect moneymaking, right?

Peter: [20:11] People spend their time on Facebook doing a lot of different things. Essentially, one of the things they do is spend time watching news or reading what they call news.

[20:28] What is interesting to me is there was a BuzzFeed news article that I looked at and mentioned in my article in Forbes. What the BuzzFeed news analysis concluded was that during the run-up to the election, the amount of fake news that was out there represented 54.2 percent of the total amount of news.

[20:57] There was more fake news than real news. There were more shares of the fake news than there were of the real news, by slightly over half. What I also did was talk to a professor at Harvard Business School, who explained to me a bit more how the advertising works, which is what you are suggesting.

[21:18] What he said was that it was not unreasonable to estimate how much revenue they make based on how much time people spend doing different things on websites. The way the advertising works, most of it is sold on the basis of cost per click. An advertisement goes up next to your news feed. If you click on it, that is how Facebook makes money.

[21:43] The general feeling is there is some proportionality between how much time people spend doing different things on Facebook, like communicating with friends using messages or photos, or conducting commerce on brand pages or reading news.

[21:59] If you knew how much time they spent on those activities, and you could figure out how much of their total time they're spending reading news, and then you figure out how much of the news they're looking at is fake, you can probably come up with a very good estimate for what Facebook's fake ad revenue is.

[22:17] That's a long-winded way of saying, "That's the method I would use to try to estimate." I have asked Facebook to tell me, and they have staunchly given me the cold shoulder.
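As a rough illustration of the method Peter describes -- apportion total ad revenue by the share of user time spent on news, then by the share of that news which is fake -- a minimal sketch of the arithmetic follows. The revenue and time-share inputs are hypothetical placeholders, not figures from Facebook or from this conversation; only the 54.2 percent share comes from the BuzzFeed analysis cited above.

    # Back-of-envelope sketch of the estimation method described above.
    # Assumes ad revenue is roughly proportional to time spent per activity.
    def estimate_fake_news_ad_revenue(total_ad_revenue,
                                      share_of_time_on_news,
                                      share_of_news_that_is_fake):
        return total_ad_revenue * share_of_time_on_news * share_of_news_that_is_fake

    # Hypothetical inputs: $6.8B of ad revenue (placeholder), 30% of user time
    # spent on news (placeholder), 54.2% of that news being fake (BuzzFeed).
    print(estimate_fake_news_ad_revenue(6.8e9, 0.30, 0.542))  # roughly $1.1 billion

The unknowns, as Peter notes, are exactly the time-share and revenue inputs that Facebook has not disclosed.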

Kevin: [22:28] No surprise about that. What I find so fascinating about the economic angle, Peter, is I did a bit of research into why people are writing these fake news stories, what's going on. What's their motivation? What is the economics on that side?

[22:43] It is actually quite simple. A large proportion of these fake news stories come from one town in Macedonia, where these high school kids have worked out that Trump stories get shared the most on Facebook. They create these fake news stories, often scraped from conservative alarmist-type sites in the US.

[23:10] They create their sites, they publish it on Facebook, the things get shared, people click through to their sites, they've got an AdSense account, they get traffic on their sites, and money trickles through to them. Being students in Macedonia, a relatively small amount of money goes a long way. One kid said, "It pays for my musical gear," and other things like that.

[23:31] It's quite remarkable that we live in a world where hacks like that can have such large trickle-through effects, right through to the election of, arguably, the most powerful person in the world.

Peter: [23:43] Yes. In some respects I respect their entrepreneurial skill. On the other hand, I feel as though there is something nefarious and evil about it. Maybe that's just reflecting my own political views. In any case, I definitely think that it is very powerful.

[24:08] The real power, from a political standpoint -- this is stepping away from the economic standpoint for the moment -- the real power is which candidate can create that emotional connection with enough voters. The emotional connection they create gives them a hunger for news that reinforces what they have been made to believe because they have this emotional connection with one of the candidates.

[24:35] The interesting thing is -- you mentioned the Macedonian thing -- I don't know if you mentioned this, but they tried to come up with news that would be related to Hillary Clinton. It just didn't make them much money, so they tried Donald Trump. That was the one that made the money.

[24:52] Essentially they were not being loyal to any candidate. They were being loyal to their own desire to make money. It was all greed. "Which one would make the most money?" It was almost as if the amount of money they were making on fake news was a very useful metric for how much emotional connection there was between both candidates and their voters.

Kevin: [25:15] Not to go down the whole political rabbit hole, but if I was in the Democrat party, there are some really interesting findings among all of this, right back to your point about emotional connection, etc. There is some interesting A/B testing that has been done on their behalf.

Peter: [25:41] Part of me was looking at reports -- the New York Times' stuff said Hillary Clinton had about an 84 percent chance of winning on the night of the election. I had this gut fear, which I got from talking to people in the UK about the Brexit vote, that there was a much stronger emotional connection between Trump and his voters than there was between Clinton and her voters.

[26:07] Essentially, the predominance of the fake news is a reflection of the strength of the candidate in creating an emotional bond with the voters. It's a reflection of the fact that Hillary Clinton did not have passionate voters who really were emotionally connected to her. Barack Obama did a much better job of creating emotional connections with enough voters so that he won, twice.

[26:38] It is a measure of her relative lack of skill as a retail politician. I do not think there is any remedy for that buried in the statistics you would get from analyzing fake news, except that it would show you something that was obvious to somebody observing the two candidates: that one of them was better at creating an emotional connection than the other one.

[27:09] The next election in America will probably come down to which candidate is able to create the stronger emotional bond with enough voters to get them to win.

Kevin: [27:20] Interesting. Creating emotional connections with your voters and fake news, these are not new phenomena. The only thing new is that these platforms allow you to amplify so much in such a remarkably concentrated manner.

Peter: [27:39] I've long wondered why people, in my opinion, waste so much time on Facebook. I always find it appalling how much productivity is lost because of people wasting time on Facebook. I'm not on Facebook, so I do not have that emotional attachment.

[27:57] One of the things I mentioned in my article, which I wrote about several years ago, was the idea that when people who spend time on Facebook have something they put on Facebook "liked," or get some kind of a response, they get a little shot of dopamine in their brain.

[28:14] In a way, it is a dopamine delivery system, a dopamine stimulus system. Dopamine is a powerful chemical that makes people feel good. People are hungry for that dopamine injection. Somehow or other the initial dose of dopamine that works for you is not enough over time. You need more and more. You are hungry for it.

[28:43] Being able to keep getting these blasts of dopamine by seeing more fake news that talks about how great Donald Trump is and how terrible Hillary Clinton is, is really highly in demand by these people.

Kevin: [28:56] Some of these domains that the Macedonian kids set up, donaldtrump.co, trumpvision365.com, usdailypolitics.com.

Peter: [29:10] If people who were looking for fake news saw a little label on top that said, "This is fake news," they wouldn't care. They'd still love to share it with other people because it was saying what they want to hear.

Kevin: [29:27] Every now and then I bump into people who are into quite remarkable conspiracy theories, and they talk with so much passion about these conspiracy theories -- of course they don't label them conspiracy theories. I always say to them, "It's just because your conspiracy theories are so colorful, and the truth is usually quite boring."

Peter: [29:46] People really like good stories. That's another thing that comes out of this Kahneman book, "Thinking, Fast and Slow." Right at the beginning of the book, he talks about how important it is to convey information in the form of good stories. This is another mystery I don't think I have figured out yet.

[30:06] People do love good stories, so conspiracy theory stories are often compelling stories, and they do not want to let go of them.

Kevin: [30:13] The truth is pretty boring. Facebook and fake news: do you think it will be easy to draw the line between subjective opinion pieces and fake news? Maybe removing just the very clearly fake news stories, the ones that say the Pope has endorsed Donald Trump, would remove about 90 percent of it, and just turning the dial down slightly would increase the signal-to-noise ratio significantly?

Peter: [30:52] Part of me is thinking that what would happen is that people would look at it...if you labeled the story about the Pope endorsing Donald Trump as fake, people would assume that the label is fake. But it's possible they could just stop publishing those stories altogether.

[31:13] If they just said, "This is fake, we're not publishing it," there's a possibility that people would go elsewhere where it was published, or there's a possibility that people would spend more time on real news or spend more time doing other things.

[31:29] Maybe that's the strongest thing that's keeping people going to Facebook, so maybe they'd go to some other social network. It's an interesting thing to consider. Certainly, my instinct is telling me that Facebook can't really justify publishing so much fake news on moral or ethical grounds.

[31:51] It hides behind this fakey sounding, "We're not expressing political opinions here. We're just a platform." This neutral idea which I find...I just don't buy it. There's lines that are drawn in society and I think they've really crossed the line.

[32:13] The real reason that they are trying to maintain this patina of being an objective platform is because of the amount of money that they would give up if they got rid of fake news. But I don't know how much money it is.

Kevin: [32:24] Feeding into another conspiracy theory. Peter Thiel, who's a board member of Facebook and one of the first external investors, was a very outspoken supporter of Trump as well.

Peter: [32:37] Yes.

Kevin: [32:37] A Silicon Valley investor and one of the few Silicon Valley people to really be such a vocal supporter of Trump as well.

Peter: [32:44] Exactly. That's an interesting possibility. Who knows whether there are some deals between Facebook and Breitbart and all these other fake news providers that help make a lot of money for Facebook during the election that nobody wants to talk about or no one can disclose. The beauty of a conspiracy theory is that if nobody gives you any information, then there's nobody to tell you that it isn't true.

Kevin: [33:10] You can say whatever you want, right?

Peter: [33:11] Exactly. When I first wrote this story about "Does Half of Facebook's Revenue Come from Fake News," I was thinking to myself, "I'm consciously writing a story that approaches the border of being fake news myself." I'm using a fake newsy kind of a headline to make a point about how Facebook is making money out of fake news.

Kevin: [33:39] Sort of beat them at their own game, right?

Peter: [33:42] I didn't really do that very much because I didn't want to really cross the line in an obnoxious way. But my point is that I don't really have any information about how much money Facebook is making on fake news, but at least I was able to get some sort of a methodology, which is more than I had when I started working on it.

[34:00] I still have the question. I still think it's an important question and I'm going to keep trying to get a number. Who knows, maybe in the not too distant future, I'll have more information on that.

[34:13] But you're right. If there's a conspiracy theory and there's elements there, and you don't have enough information, then you can...You want to fill in the blanks so that story's more interesting with some fake information that makes the story better. Maybe Peter Thiel was behind it all, maybe he had nothing to do with it. Who knows?

Kevin: [34:33] Can I ask if you are an investor in Facebook? Do you own Facebook shares? Am I allowed to ask that?

Peter: [34:38] I am not an investor in Facebook shares. I have been waiting for the stock price to go down so much that I just could not resist buying shares of it anymore, but I am not an investor. At the same time, I'm thinking this is one of the few companies that is so big and growing so fast.

[34:58] To me the best kind of company to invest in is one that is both big and growing over 20 percent a year. Facebook is pretty incredible, but the valuation seems high to me.

Kevin: [35:12] They're running out of humans on the planet to get as users, right?

Peter: [35:16] Exactly. I don't know what they're going to do to increase the amount of time people spend on it because they can't get any more users. Where are they going to get their next growth from? It's a good point.

Kevin: [35:24] Probably Oculus and VR. Peter, I really appreciate your time. It's a fascinating topic. Peter Cohan is a lecturer at Babson College in Massachusetts in business strategy and entrepreneurship. He's also a contributor to Forbes magazine and wrote a fantastic article, "Does Facebook Generate Over Half Its Ad Revenue from Fake News?"

[35:47] We'll put links up in the show notes, as well as to his Twitter account. Peter, thanks very much for joining us.

Peter: [35:53] Thanks, have a great day.

Kevin: [35:54] Thanks, Peter.

[35:55] [dog barking]

Announcer: [35:56] The It's a Monkey Podcast is brought to you by CheckDog. Use CheckDog to easily review and monitor your website for spelling errors, broken links and broken images all with the push of one button.

[36:11] CheckDog can also automatically monitor your website and notify you of newly-introduced spelling errors. Go to checkdog.com/podcast to receive 50 percent off your first month subscription. Checkdog.com. Helping the world's leading websites keep their content error-free.

[36:29] [dog barking]

Kevin: [36:33] Kate, Facebook, elections, news stories, we just can't get away from it. But what I find so fascinating is there are these teenagers in Macedonia that just worked out how to hustle some web traffic and get some pocket money, and it trickles right through to the election of the US president.

Kate: [36:57] I think back to when I was 16, and there's no way I could do half of the stuff they're doing, building websites and doing all the Facebook ads and clickbait. I wouldn't have a clue.

Kevin: [37:07] I don't think there's a conspiracy behind it. I think it's just literally opportunistic youngsters. Facebook's in a relatively tricky position, balancing the freedom of speech and they're not publishers. Facebook's become so powerful, it's almost like a government. Anything goes wrong, you...It's Facebook's fault.

Kate: [37:36] Yeah, and because Facebook have dipped their fingers in so many different things, the definition of what Facebook is changes. So many people are claiming that it's a media platform now, which in many respects, it is. But initially it was to connect with friends.

Kevin: [37:50] It's going to be interesting to see how they deal with this. As I mentioned in the talk with Peter, there's going to be some huge learning for the Democrat party out of this, and why all those people loved sharing those fake Donald Trump articles. It's funny reading in the articles that the teens say they tried to do fake Hillary Clinton articles and no one would share them. [laughs]

Kate: [38:12] No, Trump's good news. Even if people don't agree with it, they find it entertaining.

Kevin: [38:18] Do you ever bump up against fake news articles in your Facebook feed?

Kate: [38:22] I don't know about fake. Clickbait, definitely. I've had articles; I recently had one that really infuriated me. It was "50 Places Women Shouldn't Travel on Their Own." I opened it, which I shouldn't have, and it was ridiculous. They were listing off Central Park in New York as being one of the most dangerous places for a woman to travel on her own. It was just a piece of rubbish.

Kevin: [38:48] They're just trying to get traffic.

Kate: [38:49] Yeah, and they're just trying to get clicks. That article had so many comments and likes and mostly people just agreeing with it, but all this engagement for a rubbish article.

Kevin: [38:59] The problem with all of this is it's all playing on human nature.

Kate: [39:04] I have a question. In regards to everybody getting on the bandwagon saying that it's Facebook's fault, why are no other social channels getting the same criticism?

Kevin: [39:16] Twitter's getting a lot of criticism for the [inaudible] and abuse and harassment. They really are. They've recently rolled out a whole bunch of new mute features on Twitter to even identify certain keywords. If people tweet something with a keyword, you can filter that out. You can block certain conversations.

[39:35] They're not getting so much criticism around the fake news side of things. Although the example that Peter did cite, the bus story, was actually a Twitter one, where the fake version got all the shares and the correction didn't get many shares. That was Twitter.

[39:53] I think it's just a virtue of the fact that Facebook is the dominant player by far. Twitter's at what, 400 million active monthly. Facebook's at nearly two billion. It's just so much more significant.

[40:08] Journalism is undergoing a lot of navel-gazing at the moment. I follow a lot of journalists on Twitter and they're really reflecting on where their place is in this clickbait-type world that we live in.

Kate: [40:23] Definitely. The other thing is that Facebook is getting a lot of criticism over, and support for, adding a feature that stops these articles from coming through on your Facebook news feed. I agree with Twitter's approach of muting or blocking things and flagging them, rather than relying on a platform like Facebook to make the decisions for you.

Kevin: [40:46] I agree. One of the options they did say is allowing people to flag it easier and crowdsource. All of these platforms are up against human nature. You look at email, email used to have a massive problem with spam. It's only the last few years that the spam filters have gotten good enough to not have these false positives.

[41:13] You're up against human nature the whole time, and it's a complicated force. We always like to look for a scapegoat. If people didn't want Trump to get in, it's easy to point fingers.

Kate: [41:25] Some of the articles I've read, I feel like they're looking for someone to point the finger at because they can't believe that the human race voted Trump in, because they don't want to take the blame. They want to say, "Oh no, it's Facebook's fault because they gave us misinformation."

Kevin: [41:42] How many people do you think voted for Trump because they saw an article that the Pope endorsed Trump? [laughs] It would be really interesting.

Kate: [41:49] I really doubt it. I think we all have a...

[41:51] [crosstalk]

Kevin: [41:51] It'd be really interesting to know...

Kate: [41:52] an individual responsibility to read between the lines and say, "Do I believe this or not? Is this rubbish, is this true?"

Kevin: [41:59] We do. We have to assume that people have it...If they're voters, they...In fact, they don't even need a reason to justify who they vote for as well.

Kate: [42:13] No. I think a lot of people are surprised because it's a silent vote. So many people didn't verbally or vocally announce their support for Trump, but voted for him on the sly.

Kevin: [42:24] Time will tell. The Peter Thiel angle's quite interesting as well. What was so interesting about that story was he's been so well known to pick winners. When he came up and he supported Trump, Silicon Valley went crazy.

[42:46] There were calls for him to be...All the companies where he was on the board came under real criticism. There were these debates about whether...David Heinemeier Hansson, he's one of the founders of Basecamp. I think it's called Base, now?

Kate: [43:03] Base.

Kevin: [43:03] 37signals, Basecamp. You guys call...

Kate: [43:06] It's still Basecamp.

Kevin: [43:06] Is it still Basecamp?

Kate: [43:08] It's still Basecamp.

Kevin: [43:07] He had an exchange with one of the founders of Y Combinator, which is an accelerator. Peter Thiel is involved in that somehow as well. David Heinemeier Hansson was saying, "You guys need to get rid of him," type thing.

[43:29] The Y Combinator guys said, "Would you fire a staff member if they supported Trump?" Because literally, that's just what's going on. That's the only thing that's going on. David said, "Yes." It opened up...

Kate: [43:43] That's blurring lines because...

Kevin: [43:43] It's blurring the lines and it's...

Kate: [43:44] he's entitled to an opinion.

Kevin: [43:45] Of course, and you're entitled too if he...At the end of the day, Trump was a candidate of a major party in the US.

Kate: [43:52] Yes.

Kevin: [43:54] How can you ask someone to step down from the board for supporting a major...It's not like you're supporting some fringe group. It opened up all this discussion when Peter Thiel came out in support, and a lot of Silicon Valley people were really disheartened. But then Trump won, and then people looked at Thiel and said, "Wow. Again, this guy just picked the winner."

[44:20] He was involved in all these companies that have just...Facebook, he was the guy that wrote a $500,000 check, one of the first external investors in Facebook. That $500,000 turned into billions. He saw something.

[44:37] Again getting conspiratorial, Peter Thiel is the founder of a company called Palantir in Silicon Valley. Palantir are a big data company. They work a lot with governments to even sometimes try to pick up when epidemics are happening or security issues. It's a very hush hush company that's got these big government contracts.

[45:02] People are wondering if through this data that he had access to, Peter Thiel could see that, "This guy is going to win. I'll back this horse," type thing.

Kate: [45:12] Maybe. I recently read, too, that every presidential election, Twitter has picked the winner based on the amount of engagement and hashtags and stuff of the different candidates.

Kevin: [45:26] Interesting. The next election, which is going to be in four years' time, is going to be super interesting and the way people use social. Obama was the Internet election, YouTube and things like that. This has been a social...

Kate: [45:40] Social, yeah. I was actually looking at Trump's Twitter the other day. He has a very celebrity style Twitter account.

Kevin: [45:48] Very much.

Kate: [45:49] I was looking at other ones like for the Queen, or even Obama or presidents, prime ministers of other countries, very formal, very...

Kevin: [46:00] Very measured.

Kate: [46:01] curated tweets.

Kevin: [46:02] Yeah, very curated.

Kate: [46:03] His are just like any old person just sitting in a car, tweeting their thoughts out.

Kevin: [46:09] The funny one was when he was talking about who to put on his executive team and he said something like, "I'm still considering. I haven't announced anything. Only I know who the finalists are."

[46:20] [laughter]

Kevin: [46:20] [inaudible] said, "You're not in 'The Apprentice' anymore." It's stirred things up, shaken the tree. We're talking about all these issues, which in a way is a good thing about it. Let's see what happens over the four years; let's hope it works out OK.

[46:46] Anyway, that's been Episode 70 of It's a Monkey Podcast. Please tweet us @MonkeyPodcast. Email us at podcast@itsamonkey.com. We'd love to hear from you. Have a look at the show notes at itsamonkey.com. We put up links to the guests. We put up links to the articles that we speak about.

[47:05] If you're interested in the startup minute, email us. Thirty seconds, we'll give valuable free publicity for your startup. We really hope that you enjoyed the show. Thanks from myself and Kate.

[47:18] [music]