Facebook made its own Android app crash on purpose to test user loyalty: report
Facebook reportedly once tested 'how addicted Android phone users are' to its app
Latest
- Facebook representative tells CBC News that the company 'does not have a statement' at this time
Remember how mad the internet got when news broke that Facebook had covertly turned 700,000 of its users into unwilling test subjects for an experiment on "emotional contagion"?
Facebook doesn't.
Either that, or, perhaps more likely, the company hadn't considered that people would ever find out they'd been messed with in the name of data again.
A report published this week by San Francisco-based tech publication The Information has industry journalists and casual Facebook users alike shaking their heads over what the social network is now alleged to have done in an effort to glean information: Make its own Android app crash, on purpose, to see how people would react.
"Facebook has tested the loyalty and patience of Android users by secretly introducing artificial errors that would automatically crash the app for hours at a time" writes veteran tech reporter Amir Efrati for The Information, referring to Facebook's app in the Google Play store.
"The purpose of the test, which happened several years ago, was to see at what threshold would a person ditch the Facebook app altogether," he continues. "The company wasn't able to reach the threshold. … Even if the native app continued to not work, the users would open Facebook on their phone's mobile browser."
But not before letting everyone know how annoyed they were, judging by the volume of tweets about Facebook Android app crashes in recent years (though it is not known whether those crashes were the result of any experiments).
Another @facebook app crash? Facebook app for #android is a total #disaster
—@MxmUkr
I'd appreciate it if the Android Facebook app didn't crash every five seconds like
—@DawnVibration
#android @facebook app started to crash randomly since the last update. Why the heck is that???? #Facebook
—@waseemwsm
A representative from Facebook Canada responded to CBC News by email Wednesday morning after being asked about the report.
"At this time we do not have a statement to provide," wrote the PR rep. "However if that changes we will let you know."
The question of why Facebook would intentionally "sabotage" its own app, as some have put it, just to test user loyalty can be answered by looking at the company's history with Google.
Facebook's rocky relationship with the search giant in fact makes up most of The Information's now-viral report.
Specifically, it focuses on what Facebook is doing to prepare for a potential conflict with Google that would see its apps booted from the Android app store.
Finding out "how addicted Android phone users are to Facebook apps" was the company's way of testing its own independence from the Android operating system, which was being used by 1.4 billion people worldwide as of September.
The contentious app crash experiment took place "several years ago," according to Efrati, who spoke with "people who have been involved in different parts" of Facebook's Android strategy for his report.
One of those people also told him it was a "one-time experiment." The number of users subjected to intentional app crashing is not known at this point.
Still, as news circulates that Facebook may have once again experimented on its own users, shock, anger and disbelief are percolating among those following the story.
Not surprised Facebook kept this test secret. This is disturbing. We are all operant-conditioned pigeons. https://t.co/MTnZeb6TQu
—@jeffbercovici
Next up: Facebook randomly separates users into "guards" and "prisoners". https://t.co/BsknweYHfn via @verge
—@ethanschoonover
Facebook has a profound lack of respect for its users. https://t.co/RTo8YDcthq
—@tylermenezes
"You can't on one hand position your company as the place to declare your safety in the wake of terrorist attacks at the same time you're selectively disabling access to your own service," wrote The Verge's Casey Newton of the Facebook app crash report.
"Perhaps this experiment was controversial inside Facebook itself — it would help explain why it was apparently a one-off test. Maybe the company's ethical standards have evolved over time," he continued. "What's troublesome is that we simply don't know, because Facebook itself won't say."
The current version of Facebook's data use policy says that the company collects information "from or about the computers, phones, or other devices where you install or access our Services, depending on the permissions you've granted."
"We are able to deliver our Services, personalize content, and make suggestions for you by using this information to understand how you use and interact with our Services and the people or things you're connected to and interested in on and off our Services," the policy reads. "We conduct surveys and research, test features in development, and analyze the information we have to evaluate and improve products and services, develop new products or features, and conduct audits and troubleshooting activities."