Last night 60 Minutes scored an interview with Frances Haugen, the Facebook whistleblower behind the document leak that led to last month's "bombshell" Facebook Files investigation. She shared internal research that confirms what we've known all along: feeding users polarizing content works great at engaging people and converting them into repeat visitors, and the revenue from those engaged users is intoxicating.
For a humorous TL;DR, check out this 2018 clip from The Daily Show.
Engagement is a metric used by social networks to measure how often someone uses your app or visits your website. Each service counts an engaged user differently (new user or old, daily visitor or monthly), but it all boils down to repeat visits. An engaged user is someone who comes back, repeatedly.
If your service is ad-supported, repeat visits generate cumulative ad impressions and revenue. If you track your users and personalize your ads, the more engaged a user is, the higher their value to advertisers. It’s the old “eyeballs” metric of Web 1.0 but with higher definition. In the mobile app world, it’s called ARPU, Average Revenue Per User.
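The arithmetic behind that claim is simple enough to sketch. The function below is a minimal, hypothetical illustration (the per-impression rate and visit counts are invented numbers, not Facebook's): an engaged daily visitor racks up many times the ad revenue of a casual monthly one, which is exactly what ARPU captures.

```python
# Hypothetical illustration of why engagement drives ad revenue.
# All rates and visit counts here are invented for the example.

def arpu(total_revenue: float, user_count: int) -> float:
    """Average Revenue Per User: total revenue divided by active users."""
    return total_revenue / user_count

IMPRESSIONS_PER_VISIT = 12       # assumed ads seen per visit
REVENUE_PER_IMPRESSION = 0.002   # assumed dollars per impression

def monthly_revenue(visits_per_month: int) -> float:
    """Cumulative ad revenue a single user generates in a month."""
    return visits_per_month * IMPRESSIONS_PER_VISIT * REVENUE_PER_IMPRESSION

casual = monthly_revenue(visits_per_month=2)    # drops in twice a month
engaged = monthly_revenue(visits_per_month=60)  # visits twice a day

print(f"casual user:  ${casual:.3f}/month")
print(f"engaged user: ${engaged:.3f}/month")
```

Under these made-up numbers the engaged user is worth thirty times the casual one, before even accounting for the higher ad prices that personalization commands.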
If your company's "true north" metric is engagement, what happens if you optimize for that and run a business that, above all else, keeps your users coming back and staying longer? If you discover that inflammatory content is the nectar that keeps users coming back, aren't you then effectively measuring a post's ability to provoke a reaction? This is what I call Enragement Metrics.
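To make the mechanism concrete, here is a toy feed ranker. It is purely hypothetical (the posts, counts, and weights are all invented, and this is not Facebook's actual algorithm), but it shows the trap: a score that counts every reaction as engagement cannot tell outrage from delight, so the most provocative post wins the feed.

```python
# Toy illustration of an "enragement metric": a ranking score that is
# blind to the *kind* of reaction it rewards. Entirely hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    angry_reactions: int
    shares: int
    comments: int

def engagement_score(post: Post) -> int:
    # Every reaction counts the same toward engagement, whether it was
    # joy or outrage; shares and comments are weighted higher because
    # they pull more users back into the feed.
    return (post.likes + post.angry_reactions
            + 2 * post.shares + 2 * post.comments)

posts = [
    Post("Cute puppy photo", likes=120, angry_reactions=1,
         shares=10, comments=15),
    Post("Inflammatory hot take", likes=40, angry_reactions=300,
         shares=90, comments=200),
]

feed = sorted(posts, key=engagement_score, reverse=True)
print(feed[0].title)  # the provocative post ranks first
```

Nobody has to *intend* this outcome; optimizing the score is enough for the ranking to drift toward whatever provokes the strongest response.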
Add the quarterly pressure on a trillion-dollar public company to meet and exceed revenue targets, and corporate incentives can get distorted. Responsibility is foggy in a large company of distributed teams with a shared ethos of "move fast and break things."
The pursuit of engagement and the momentum of a market that rewards it created a Faustian bargain that distracted the leadership at Facebook from the impact it was having not only on its users but, as a source of traffic and revenue for its publishing partners, on the entire media ecosystem.
Haugen will testify before Congress, where she hopes lawmakers will regulate Facebook because, in her view, the company is unable to regulate itself.
The tech and media world will be watching. As with newspapers, radio, and television before it, a touch of regulation can build trust, improve a technology, and balance the pursuit of profit with the public good. But stumbling, uninformed regulation will either hobble innovation or, in the worst case, favor those with deep pockets for lobbyists who will lock in their clients' dominance.
When the Haugen testimony picks up on Tuesday and Congress hauls in someone from Facebook to explain themselves, I hope there is substantive discussion of a way forward and not the brow-beating grandstanding we so often see on Capitol Hill. I optimistically believe that no one at Facebook set out to poison the public well, but that runaway algorithms and market forces drove them there.
Just as the publication of Silent Spring helped lead to the establishment of the Environmental Protection Agency in 1970, I hope these hearings on the adverse effects of social networks will lead to intelligent discussion of the role these products, and the algorithms that power them, play in our society.
Environmental and safety regulations give businesses a framework against which to justify expenditures that take away from profits. We need an EPA-like independent organization for social networks and machine learning algorithms: one that regulates the industry and creates best practices and guidelines for what they can and cannot do.
Social networks and machine learning algorithms are powerful tools that can stimulate, motivate, and transform society. As with all new technology, they can be put to good use or bad. It's up to all of us, working together, to understand their power and harness it for good.