How Social Media Apps Encode Users’ Privacy, Not Always for Good

Although there are many philosophical and sociological definitions of the concept of “privacy” as an ethical value in technology, what ultimately affects people’s daily lives is how privacy is designed into the products they use. Social media apps such as Facebook, Twitter, WhatsApp, and Telegram are prime examples of this connection. Even among apps that serve the same function, differences in design create environments with different types of privacy for users—and thus different ethical implications. The choices app developers make are not value-neutral, and they demonstrate how difficult it is to truly contemplate privacy in the abstract.

To understand how different ethical conceptions of privacy can be embodied in technological design, consider two of the largest international text messaging services, WhatsApp and Telegram. Specifically, WhatsApp places greater emphasis on the privacy of those receiving a message, while Telegram is more inclined to respect the privacy of the message sender. Since users act as both receivers and senders in a social networking application, they need to be aware of these implications to protect their own privacy. And at a broader level, policymakers need to be aware of how the design of the technological experience—not only the content—can potentially harm society by enabling crimes including cyberbullying and harassment, extortion, and fraud.

Discussing these two apps reveals what is missing when policymakers consider only the larger social media platforms, such as Twitter and Facebook. On those platforms, the most common privacy concern is that users’ private information, gathered on the apps’ central servers, will be offered to third parties for financial gain, security or surveillance, search, or data mining, all without users’ awareness and permission. Another privacy issue centers on websites that install cookies on users’ devices, enabling the sites to determine their interests, geographical location, and medical and health information, as well as their behavioral and cognitive patterns, for use in marketing and other commercial purposes. The “magic” of social media and free content has prompted people to use these services willingly, even after they have become aware that their private information is being recorded as a cost of these services.

These social media apps, then, display what Don Ihde, a contemporary American philosopher of science and technology, calls the “inclination” of technology. Although Ihde argues against technological determinism, he emphasizes that a technology displays an inclination when it guides willing users toward certain behaviors. The design of such technologies provides a robust framework favoring certain actions over others. This aspect of technology in no way means that the user is obliged to behave in particular ways; rather, it describes how a design stimulates an inclination toward a specific action or behavior, which the user may choose not to follow.

This inclination of social media technology design can significantly affect users’ privacy and well-being, in ways more complex than simply taking data and using the information for other purposes. This can be seen in the differing designs of WhatsApp and Telegram, which together dominate the market for social networking and instant messaging technologies. WhatsApp Messenger is now the world’s top instant messaging platform, with over 2 billion monthly active users. When a user receives a message or call from an unknown number in WhatsApp, the caller’s phone number and part of their name are displayed. If a user joins a private group, the phone number and part of the name of each member are visible. In this way, users’ privacy is less likely to be violated, because part of each sender’s identity is public.

Telegram, which has grown remarkably fast during the past six years, to over 500 million monthly active users, has taken a different approach to privacy. With only a username, users of Telegram can easily exchange messages or make calls. Unlike WhatsApp, Telegram does not reveal the phone number of the message sender when a new conversation is started between two users who don’t know each other.

Although this feature appeals to people interested in remaining anonymous on social media platforms, many users do not welcome anonymous communications and prefer to communicate only with people they know. Telegram users can modify some privacy settings to limit calls and messages to contacts already listed in their device’s phone book. But these settings are not readily accessible, making them less likely to be used. In fact, most Telegram users keep the default settings, which allow every user to make calls or send messages.

Accordingly, the above discussion may be rephrased in Ihde’s terms of inclination. By identifying users through usernames rather than phone numbers, and by setting defaults that allow unlimited and even anonymous communication between all users, Telegram stimulates an inclination among its users, willingly or not, to respect other users’ privacy less than WhatsApp Messenger’s comparable settings do.

What this means in practical terms is that if you choose WhatsApp over Telegram, the probability of receiving a message with unwanted or criminal content decreases. Since WhatsApp shows the sender’s phone number, citizens can file complaints with law enforcement so that cyber police can identify the owner of the number. In this way, using WhatsApp can trigger consequences that serve as a deterrent against some privacy violations.

Of course, it does not follow that using Telegram necessarily exposes you to cyber threats. If you are aware of the app’s design, you may be able to protect your privacy by modifying its privacy settings. But fighting the inclination of a technological design puts the burden on the user.

In our country, Iran, WhatsApp and Telegram, like most other social media companies, have no agreement with the government to disclose the names of users who have committed crimes. As a result, Iranian courts see many cases of fraud in which Telegram users have promised to deliver precious goods such as gold, Bitcoin, or even premium versions of famous international games, taken people’s money, and delivered nothing. In these cases, cyber police cannot find the suspects because they have nothing more than a user ID.

Another, more disturbing example of how hidden privacy inclinations can affect people’s lives is seen in the many cases of extortion carried out on the Telegram app. In Iran and other countries in the region, extortion using personal photos or videos is a crime, but it is accelerated and simplified by untraceable Telegram IDs. Frequently, such extortionists go unpunished because they cannot be connected to their Telegram ID.

As we noted earlier, the designers of technological products have different perspectives on the ethical concerns of cyberspace, particularly privacy, and they design these perspectives into applications. These decisions in turn influence the actions of users on the apps. By giving priority to the privacy of the message receiver, WhatsApp creates a different social environment than Telegram, which gives greater priority to that of the message sender. Understanding how these intentional differences in design function in real life should be a part of discussions by both users and policymakers who are trying to ensure that social media respect privacy.

Despite the technology’s potential to do harm, very few foundational decisions about privacy are visible to users. Users must be very conscious about how they select and configure these two competing messaging apps. Because many users live their lives on social media, they become accustomed to virtual friends, relatives, and features designed to keep them engaged, such as collecting “likes” from followers. As a result, switching from one application to another is difficult and costly—in terms of time, friendships, and social connections.

Before adopting such an app for daily, continuous use, responsible users need to study not only its explicit privacy policies, terms of service, and service description; they also need to understand the inclinations of the technology’s design. This seems like a very high expectation to have of a user, especially a new one or perhaps even a child. What’s more, putting the onus of protecting privacy on the individual user—even though privacy is embedded in the technology and subject to particular risks arising from contracts between individual governments and service providers to reveal users’ identities—seems unreasonable. Even when these applications allow users to modify their settings, it is a lot to ask.

There is a clear need for policymakers to increase public awareness, critical thinking, and media literacy in the use of social technologies. Citizens need to consider more than basic privacy, and government, in its regulatory role, could require app stores and developers to inform users clearly and in detail about the consequences of each feature, setting, and permission choice for their privacy.

Widespread use of social media platforms has consequences for society—some irreparable and irreversible. In his valuable journal article “Ten Paradoxes of Technology,” the philosopher Andrew Feenberg identifies an enduring paradox: simplification complicates. This is particularly true for social media apps. If governments do not devote enough time and money to studying the economic, social, and political aspects of widely used transnational applications (such as social media platforms), nations worldwide will later have to deal with complex social problems, such as cybercrime. Though the matter of intentionality in the design of privacy in such apps may seem like a small thing, for both individual users and society in general it makes a tremendous difference.

Recommended Reading

Feenberg, Andrew. (2010). Ten Paradoxes of Technology. Techné. 14(1). pp. 3-15.

Moor, J. H. (1990). The Ethics of Privacy Protection. Library Trends. 39(1-2). pp. 69-82.

Ihde, Don. (1990). Technology and the Lifeworld: From Garden to Earth. Bloomington: Indiana University Press.

