Facebook is building a platform for bots to combat trolls and abuse

The scandals surrounding Cambridge Analytica in 2018 prompted Facebook to take action. Shortly after the abuse on the social network came to light, founder Mark Zuckerberg promised to tackle the problems on his platform, and several measures have since been launched. Facebook researchers are now presenting a new pilot project, which the company describes in a paper on Facebook Research.

Specifically, the company wants to build a platform for bots that simulate the behavior of trolls and scammers, with the ultimate goal of detecting and monitoring unwanted profiles more effectively. This is done with a “Web Enabled Simulation” (WES): a kind of shadow Facebook on which nonexistent users can like, share and send friend requests. The simulation runs in a separate environment on the site, shielded from ordinary users.
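
To make the idea concrete, the sketch below shows what such an isolated simulation loop could look like. It is a minimal, hypothetical illustration in Python, not Facebook’s code: bot users perform likes and friend requests on a sandboxed social graph, never on real accounts.

```python
# Minimal, hypothetical sketch of a WES-style simulation loop; all names are
# invented for illustration and this is not Facebook's actual code.
import random
from dataclasses import dataclass, field

@dataclass
class SandboxGraph:
    """An isolated social graph that only simulated users can touch."""
    friendships: set = field(default_factory=set)
    likes: list = field(default_factory=list)

    def send_friend_request(self, src: str, dst: str) -> None:
        self.friendships.add((src, dst))

    def like(self, user: str, post_id: int) -> None:
        self.likes.append((user, post_id))

@dataclass
class BotUser:
    name: str

    def step(self, graph: SandboxGraph, others: list["BotUser"]) -> None:
        # Each tick the bot either likes a random post or befriends a peer;
        # every action lands in the sandbox, never on real accounts.
        if random.random() < 0.5:
            graph.like(self.name, post_id=random.randrange(100))
        else:
            graph.send_friend_request(self.name, random.choice(others).name)

graph = SandboxGraph()
bots = [BotUser(f"bot_{i}") for i in range(10)]
for _ in range(50):  # simulation ticks
    for bot in bots:
        bot.step(graph, [b for b in bots if b is not bot])
print(f"{len(graph.likes)} likes, {len(graph.friendships)} friend requests")
```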

Simulation of user behavior
With a small-scale, shielded simulation, Facebook creates bots that display unwanted behavior. A “scammer” bot, for example, can be trained to connect to “target” bots. Other bots can be trained to try to violate the targets’ privacy, or to search for “wrong” content that breaks the platform’s rules.
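
Training such a scammer bot can be pictured as a simple reinforcement-learning problem. The hypothetical sketch below frames it as an epsilon-greedy bandit: the bot tries different targets, receives a reward when a friend request is accepted, and gradually learns which targets are most receptive. The accept rates are invented for illustration; in a real WES the rewards would come from simulated target behavior.

```python
# Hypothetical sketch: a "scammer" bot learning which "target" bots accept
# friend requests, modeled as an epsilon-greedy bandit. Accept rates are
# made up; a real WES would derive rewards from simulated target behavior.
import random

accept_rate = {"target_a": 0.1, "target_b": 0.6, "target_c": 0.3}
value = {t: 0.0 for t in accept_rate}   # estimated success rate per target
count = {t: 0 for t in accept_rate}

for _ in range(500):
    if random.random() < 0.1:                    # explore a random target
        choice = random.choice(list(accept_rate))
    else:                                        # exploit the best estimate
        choice = max(value, key=value.get)
    reward = 1.0 if random.random() < accept_rate[choice] else 0.0
    count[choice] += 1
    value[choice] += (reward - value[choice]) / count[choice]  # running mean

print(max(value, key=value.get))  # the bot converges on "target_b"
```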

Although Facebook has been using simulations in its research for some time, this approach is new. Previous simulations ran on a mockup of Facebook; with the “Web Enabled Simulation”, the researchers work in an environment that is much closer to the platform itself. Some bots, for example, would be allowed to scan Facebook itself and learn from real users. They are, however, never given access to information that the privacy rules place off limits.
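
That privacy constraint can be pictured as a filter sitting between real user data and the bots. The following is a hypothetical sketch, with invented field names, of how such a filter might expose only the signals a bot is permitted to observe.

```python
# Hypothetical sketch of the privacy constraint: a filter strips every field
# that the rules place off limits before a bot can observe a user record.
# Field names are invented for illustration.
ALLOWED_FIELDS = {"public_posts", "page_likes"}   # assumed public signals

def visible_to_bot(user_record: dict) -> dict:
    """Return only the fields a simulation bot is permitted to observe."""
    return {k: v for k, v in user_record.items() if k in ALLOWED_FIELDS}

record = {
    "public_posts": ["hello world"],
    "private_messages": ["secret"],      # never exposed to the simulation
    "email": "user@example.com",
}
print(visible_to_bot(record))            # {'public_posts': ['hello world']}
```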

Debugging
Facebook’s new simulation would also allow the company to find bugs in the site faster. For example, WES bots are being built that try to collect data from other users. If these bots suddenly manage to harvest more data after a system update, that can expose an error in the code. By watching for such deviations, Facebook can fix a bug before real users get the chance to exploit it.
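
In essence this is regression detection on bot behavior. The hypothetical sketch below compares how much data the scraper bots collect on the current build against a candidate build, and raises an alert when the amount jumps; the run_scraper_bots function and all numbers are invented for illustration.

```python
# Hypothetical sketch of regression detection on bot behavior: compare how
# much data scraper bots collect before and after an update and flag a jump.
# run_scraper_bots and all numbers are invented for illustration.
import random
from statistics import mean

def run_scraper_bots(build: str, runs: int = 20) -> list:
    """Pretend to run the scraper-bot simulation against a given build and
    return the number of records each run managed to collect."""
    leak = 5 if build == "candidate" else 0   # simulate a leaky update
    return [random.randint(10, 20) + leak for _ in range(runs)]

baseline = mean(run_scraper_bots("production"))
candidate = mean(run_scraper_bots("candidate"))

# A sudden rise in collected data on the new build may expose a bug that
# should be fixed before real users can exploit it.
if candidate > baseline * 1.2:
    print(f"ALERT: data collected rose from {baseline:.1f} to {candidate:.1f}")
```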

The company further emphasizes that interaction between the bots and real users is not intended: “Bots must be sufficiently isolated from real users to ensure that the simulation, while running on real platform code, does not lead to unexpected interactions between bots and real users.”