This article is part of the On Tech newsletter. You can sign up here to receive it on weekdays.
There has been a backlash against WhatsApp in recent days after the app notified users of a revised privacy policy. Let me try to clarify what happened.
Some people believe that the messaging app is now going to force those who use it to give their personal information to Facebook, which owns WhatsApp.
That is not completely right.
WhatsApp’s policy changes are cosmetic; they do not give Facebook more data. The bottom line is that Facebook already collects a lot of information from WhatsApp users.
The confusion was the result of botched communication from Facebook, distrust of the company and America’s weak privacy laws.
Here’s what has changed and what hasn’t changed with WhatsApp:
Facebook bought WhatsApp in 2014, and since 2016 almost everyone who uses the messaging app has shared information about their activities with Facebook, usually without knowing it.
Facebook knows the phone numbers being used, how often the app is opened, the resolution of the device’s screen, the approximate location inferred from the internet connection and much more, as my colleague Kashmir Hill explained five years ago.
Facebook uses this information to make sure WhatsApp works properly and to help, say, a shoe company show you an ad on Facebook.
Facebook cannot see the content of texts or phone calls because WhatsApp communication is encrypted. Facebook also says it doesn’t keep a record of who people contact on WhatsApp, and WhatsApp contacts are not shared with Facebook. (This article on cables is also useful.)
WhatsApp has a lot going for it. It’s easy to use, and communication in the app is secure. But yes, WhatsApp is owned by Facebook, a company many people don’t trust.
There are alternatives, including Signal and Telegram, both of which have seen a surge of new users recently. The Electronic Frontier Foundation, a digital privacy group, says Signal and WhatsApp are good choices for most people. The Wall Street Journal also went through the pros and cons of several popular messaging apps.
The reason WhatsApp recently notified users of revised privacy terms is that Facebook is trying to turn WhatsApp into a place to chat with an airline about a missed flight, shop for handbags and pay for things.
WhatsApp’s policies were changed to reflect the possibility of commercial transactions in which activity crosses between Facebook’s apps. For example, a handbag you browse in WhatsApp might later show up in an ad in your Instagram app.
I also want to address the deeper reasons for the misunderstanding.
First, this is a hangover from Facebook’s history of being careless with our personal information and reckless about how it is used by the company and its partners. It’s no wonder people assumed Facebook had changed WhatsApp’s policies in harmful ways.
Second, people have learned that privacy policies are confusing and that we have little power to make companies collect less of our data.
“This is the problem with the nature of privacy law in the United States,” Kashmir Hill said. “As long as they tell you what they’re doing in a policy you are unlikely to read, they can do what they want.”
That means digital services like WhatsApp offer us an unappealing choice: Either give up control of what happens to our personal information, or don’t use the service. That’s it.
Clearing up more WhatsApp confusion
Another false belief about WhatsApp — and this one, too, is WhatsApp’s fault, not yours — is that the app is now removing an option that let users refuse to share their WhatsApp data with Facebook.
Not entirely correct.
Under that option, Facebook would still collect data from WhatsApp users, as I explained above, but the company wouldn’t use the data to “improve its ads and product experiences,” such as friend suggestions.
However, that option was available in WhatsApp for only 30 days back in 2016. That was a lifetime ago in digital years, and about four million Facebook data scandals ago.
For everyone who started using WhatsApp after 2016 — and that’s a lot of people — Facebook has been collecting plenty of information with no option to opt out.
“A lot of people didn’t know that until now,” said Gennie Gebhart of the Electronic Frontier Foundation. And she said it’s not our fault.
Understanding what happens to our digital data seems to require advanced training in computer science plus a law degree. And Facebook, a company with piles of cash and a stock market value of more than $700 billion, couldn’t or wouldn’t explain what was happening in a way people could grasp.
Before we go …
More digital fallout from the Capitol mob: YouTube blocked President Trump’s account from posting new videos for at least the next seven days, my colleague Daisuke Wakabayashi wrote. Like Facebook and Twitter, YouTube cited the risk that false or inflammatory claims in Mr. Trump’s videos could incite violence around the presidential transition.
Even more digital fallout from the Capitol mob: Gizmodo has mapped hundreds of users of the social network Parler in the crowd that stormed the Capitol last week. This was possible because of Parler’s lax security, which allowed researchers to download data that included records of people’s posts and GPS coordinates.
Some people make good money online. Many don’t: That’s true on YouTube and Instagram — and on OnlyFans, the website where people can charge others for access to sexually explicit images. My colleague Gillian Friedman spoke with women about their experiences as OnlyFans creators.
A big trend in TikTok videos over the past few weeks has been people singing and remixing sea shanties — yup, those old seafaring work songs. This sea shanty video is delightful, and so is this electronica edition.
We want to hear from you. Tell us what you think of this newsletter and what else you would like us to explore. You can reach us at firstname.lastname@example.org.
If you don’t already get this newsletter in your inbox, please sign up here.