Maciej Ceglowski, Founder, Pinboard, on April 18, 2017, at the Emerging Technologies for the Enterprise conference in Philadelphia:
Build a Better Monster: Morality, Machine Learning, and Mass Surveillance
These, then, are the twin pillars of the online economy: We have an apparatus for harvesting tremendous quantities of data from people, and a set of effective but opaque learning algorithms we train on this data. The algorithms learn to show people the things they are most likely to ‘engage’ with—click, share, view, and react to. We make them very good at provoking these reactions from people. This is our sixty billion dollar industry.
So what happens when these tools for maximizing clicks and engagement creep into the political sphere?
This is a delicate question! If you concede that they work just as well for politics as for commerce, you’re inviting government oversight. If you claim they don’t work well at all, you’re telling advertisers they’re wasting their money.
Facebook and Google have tied themselves into pretzels over this. The idea that these mechanisms of persuasion could be politically useful, and especially that they might be more useful to one side than the other, violates cherished beliefs about the “apolitical” tech industry.
Whatever bright line we imagine separating commerce from politics is not something the software that runs these sites can see. All the algorithms know is what they measure, and the measurements are the same in advertising as in politics: engagement, time on site, who shared what, who clicked what, and who is likely to come back for more.
The persuasion works, and it works the same way in politics as it does in commerce—by getting a rise out of people.
But political sales techniques that maximize “engagement” have troubling implications in a democracy.
One problem is that any system trying to maximize engagement will try to push users towards the fringes. You can prove this to yourself by opening YouTube in an incognito browser (so that you start with a blank slate), and clicking recommended links on any video with political content. When I tried this experiment last night, within five clicks I went from a news item about demonstrators clashing in Berkeley to a conspiracy site claiming Trump was planning WWIII with North Korea, and another exposing FEMA’s plans for genocide.
This pull to the fringes doesn’t happen if you click on a cute animal story. In that case, you just get more cute animals (an experiment I also recommend trying). But the algorithms have learned that users interested in politics respond more if they’re provoked more, so they provoke. Nobody programmed the behavior into the algorithm; it made a correct observation about human nature and acted on it.
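That last point—that nobody programs the drift toward provocation, it simply emerges from optimizing for engagement—can be made concrete with a toy simulation. The sketch below is entirely hypothetical and not from the talk: it models a greedy recommender choosing among five items of increasing "provocation," where the simulated user's click probability happens to rise with provocation. The recommender is only told to maximize clicks, yet it converges on the most extreme item.

```python
import random

random.seed(42)

# Hypothetical setup: five candidate items on a provocation scale,
# and a simulated user whose click probability equals that scale.
# (This user model is an assumption for illustration, not a claim
# about real audiences.)
provocation = [0.1, 0.3, 0.5, 0.7, 0.9]

def user_clicks(item):
    """Simulated user: more provocative content draws more clicks."""
    return random.random() < provocation[item]

def run_recommender(rounds=5000, epsilon=0.1):
    """Epsilon-greedy recommender that only maximizes click rate."""
    clicks = [0] * len(provocation)
    shows = [0] * len(provocation)
    for _ in range(rounds):
        if random.random() < epsilon:
            # Occasionally explore a random item.
            item = random.randrange(len(provocation))
        else:
            # Otherwise exploit the best observed click rate
            # (unseen items get an optimistic estimate of 1.0).
            item = max(range(len(provocation)),
                       key=lambda i: clicks[i] / shows[i] if shows[i] else 1.0)
        shows[item] += 1
        if user_clicks(item):
            clicks[item] += 1
    return shows

shows = run_recommender()
# The code above contains no notion of "extreme" or "fringe" content;
# it simply learned which item gets the most engagement and shows it
# almost exclusively.
print(shows)
```

Nothing in the objective mentions provocation; the drift falls out of the feedback loop, which is the emergent behavior the passage above describes.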