Spark

Lying to Facebook could help protect your data

Information security expert Chad Loder suggests employing what he calls "active deception": flooding these services with fake data to make it impossible for them to tell which data is genuine.
Dozens of cardboard cutouts of Facebook CEO Mark Zuckerberg are seen during an Avaaz.org protest outside the U.S. Capitol in Washington, U.S., April 10, 2018. (REUTERS/Aaron P. Bernstein)

This week's testimony from Facebook CEO Mark Zuckerberg before the US Congress likely didn't allay many concerns about online privacy. Questions about Facebook's plans to limit third-party access to personal data, and about whether or how the American government will regulate the company, went mostly unanswered.

While some people have decided to delete their accounts, others are looking for ways to protect their data themselves, if Facebook is unable or unwilling to do so.

Chad Loder is the co-founder and CEO of Habitu8, a firm that trains companies and their employees in information security. According to Loder, we have tended to see the collection of our personal data as a nuisance, something like pop-up ads. Instead, he said, it should be treated as an active threat to our security and privacy.

Chad Loder is the co-founder and CEO of Habitu8. (Habitu8)
"You can't allow active threats to continue to attack you with impunity," Loder said. "You have to try to disrupt their activities and make them pay a penalty for what they're doing."

Loder's suggestion is to "actively deceive" these sites: to flood them with incorrect data, making it much more difficult to tell what information is real and what is fake. One example is the browser extension AdNauseam, which simulates clicking every ad that appears on a page, many times over. Since advertisers often pay per click, this imposes a cost on anyone trying to fill the page with ads, and it confuses any attempt to build a profile of you.
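The underlying mechanism is simple to sketch: if every ad is "clicked," click data no longer distinguishes genuine interest from noise. Below is a minimal, illustrative Python sketch of that idea; the function name and inputs are hypothetical, and a real tool like AdNauseam runs inside the browser rather than as a standalone script.

```python
import random

def simulate_ad_clicks(ad_urls, clicks_per_ad=3, seed=None):
    """Toy sketch of 'active deception': emit fake click events for
    every ad on a page so real interest signals are drowned in noise.
    ad_urls and clicks_per_ad are illustrative inputs, not a real API."""
    rng = random.Random(seed)
    events = []
    for url in ad_urls:
        for _ in range(clicks_per_ad):
            events.append({
                "url": url,
                # randomized dwell time so the clicks don't look scripted
                "dwell_ms": rng.randint(200, 5000),
            })
    rng.shuffle(events)  # ordering carries no hint of genuine interest
    return events

clicks = simulate_ad_clicks(
    ["https://ads.example/a", "https://ads.example/b"], clicks_per_ad=3, seed=1
)
print(len(clicks))  # every ad "clicked" several times: 6 events
```

Because every ad receives the same treatment, a profiler looking at the click stream learns nothing about which ads, if any, the user actually cared about.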

"The idea is to put pressure on these sites to have a less-dangerous approach and a less reckless approach to collecting and using our data than they currently do," Loder said. "And ultimately it's resulting in changes to behaviour and adoption of self regulation. And I think that's what we need to do."