Facebook is a public platform prone to exploitation of all kinds. To counter this, the company is working on a way to create a ‘shadow Facebook’ to outwit scammers.
Facebook Fooling Scammers: Here’s How It Will Work
The social media giant is apparently looking to build a toned-down fake platform populated only by bots. This platform will be used to run simulations and uncover hidden bugs in the “real” Facebook.
The bots will interact with each other the same way real users do on the real app: they can like, comment, share, and send friend requests. They can also take a darker turn and harass, abuse, or scam other bots.
According to The Verge, Facebook wants to stop people from abusing its systems. Company researchers have released a paper on a “Web-Enabled Simulation” (WES) for testing the platform.
WES World (WW)
Facebook has decided to call this system WW, which stands for WES World. A Web-Enabled Simulation (WES) is a simulation of the behaviour of a community of users on a software platform.
It uses a web-enabled software platform to simulate real-user interactions and social behaviour on the real platform infrastructure. For example, a “scammer” bot might be trained to connect with “target” bots that exhibit behaviours similar to real-life Facebook scam victims. Other bots might invade fake users’ privacy or seek out “bad” content that breaks Facebook’s rules.
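Conceptually, this kind of simulation can be pictured as a small multi-agent loop in which bot “users” take actions against an isolated, bots-only copy of the platform. The sketch below is a hypothetical illustration of that idea only; the class names, actions, and probabilities are invented here and are not taken from Facebook’s actual WW system.

```python
import random
from dataclasses import dataclass, field

# Hypothetical sketch of a WES-style bot simulation.
# None of these names come from Facebook's WW system.

@dataclass
class Bot:
    name: str
    gullibility: float = 0.0          # chance a "target" bot falls for a scam
    friends: set = field(default_factory=set)
    scammed: bool = False

class ShadowPlatform:
    """A toy, isolated 'shadow' platform populated only by bots."""

    def __init__(self):
        self.bots = {}

    def add_bot(self, bot: Bot):
        self.bots[bot.name] = bot

    def send_friend_request(self, sender: str, receiver: str) -> None:
        # In this toy model every friend request is accepted.
        self.bots[sender].friends.add(receiver)
        self.bots[receiver].friends.add(sender)

    def send_scam_message(self, scammer: str, target: str) -> bool:
        # The scam only "lands" with probability equal to the target
        # bot's gullibility, mimicking real-life victim behaviour.
        victim = self.bots[target]
        if random.random() < victim.gullibility:
            victim.scammed = True
            return True
        return False

def run_simulation(steps: int = 100) -> int:
    platform = ShadowPlatform()
    platform.add_bot(Bot("scammer_bot"))

    # Target bots trained to behave like likely scam victims.
    for i in range(10):
        platform.add_bot(Bot(f"target_{i}", gullibility=random.uniform(0.1, 0.5)))

    for _ in range(steps):
        target = random.choice([name for name in platform.bots if name != "scammer_bot"])
        platform.send_friend_request("scammer_bot", target)
        platform.send_scam_message("scammer_bot", target)

    # How many target bots ended up scammed in this run.
    return sum(bot.scammed for bot in platform.bots.values())

if __name__ == "__main__":
    print("Bots scammed:", run_simulation())
```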
This is not Facebook’s first run at this: it is expanding on an earlier automated testing tool called Sapienz. However, WW is distinct because it turns lots of bots loose on something close to the actual platform.
This move could help Facebook detect bugs. Researchers can build WES users whose sole goal is stealing information from other bots, for example, and set them loose on the system.
If they suddenly find ways to access more data after an update, that could indicate a vulnerability for human scammers to exploit, and no real users would have been affected.
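One way to picture that check is as a regression test: run the same “information-stealing” bot against an old and a new build of the shadow platform, and flag any data that only became readable after the update. The following self-contained Python sketch is an assumption-laden toy version of that idea; every class, field, and policy here is invented for illustration.

```python
# Toy sketch: compare what a scraper bot can read before and after an update.
# All names and the privacy model are hypothetical, not Facebook's code.

class ShadowBuild:
    """A fake platform build with a simple per-field privacy policy."""

    def __init__(self, private_fields: set):
        self.profiles = {
            "target_bot": {"name": "Target", "email": "t@example.com", "phone": "555-0100"},
        }
        self.private_fields = private_fields

    def read_field(self, profile: str, field_name: str, requester: str) -> str:
        # Strangers may only read fields that are not marked private.
        if field_name in self.private_fields:
            raise PermissionError(field_name)
        return self.profiles[profile][field_name]

def fields_readable_by_scraper(build: ShadowBuild) -> set:
    """Simulate a scraper bot that tries to read every field of every profile."""
    leaked = set()
    for profile, fields in build.profiles.items():
        for field_name in fields:
            try:
                build.read_field(profile, field_name, requester="scraper_bot")
                leaked.add(field_name)
            except PermissionError:
                pass
    return leaked

# The "old" build protects email and phone; the "new" build accidentally
# drops the phone number from the private list (the injected bug).
old_build = ShadowBuild(private_fields={"email", "phone"})
new_build = ShadowBuild(private_fields={"email"})

newly_exposed = fields_readable_by_scraper(new_build) - fields_readable_by_scraper(old_build)
print("Possible privacy regression, newly exposed fields:", newly_exposed)
# -> {'phone'}
```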
Future Plans
In the future, the company could use this technology to build bots that perform a specific type of function and gather intelligence from them. For example, Facebook could deploy bots designed to uncover bugs that hackers might exploit to obtain sensitive information about users.
However, the researchers caution that the bots must be kept suitably isolated from real users, to ensure that the simulation does not lead to unexpected interactions between bots and real people.