Have data transparency and openness lost to secrecy and opaqueness?
In the midst of controversial data collection and questionable data sharing practices, waves of public outrage, congressional hearings and heightened regulatory scrutiny, Facebook CEO Mark Zuckerberg laid out “A Privacy-Focused Vision for Social Networking” in a blog post, describing his vision and principles for building a privacy-focused messaging and social networking platform. This represents a fundamental change for the social media company.
Under this new vision, Facebook’s stated principles include providing users with private interactions and spaces as a foundation, end-to-end encryption for private communications, and reduced retention of messages and stories, making data more transient and ephemeral. In short, Facebook will become a closed platform by default, rather than the current open platform with a public interaction model.
First and foremost, the “privacy” Mr. Zuckerberg refers to needs clarification. Privacy in this context only means that Facebook users’ personal data will be private in viewing — visible to their circle of friends and not to the general public. It doesn’t mean Facebook will change the company’s data collection and sharing practices to ensure more privacy and security for its users. Facebook has repeatedly stated in the past that sharing data with advertisers and other partners is simply part of its business model, and that the public should not expect this to change.
But this vision of a close-knit virtual and “private” community already exists in reality. For instance, the popular Chinese social media app WeChat, with 1.1 billion active users, does exactly that. WeChat posts, messages, discussions, photos and videos shared on Moments (which literally means “Circle of Friends” in Chinese) are visible only to a user’s first-degree connections. Unlike Facebook, the whole WeChat platform is inherently walled in, built around private user interactions and spaces.
Consider one interaction scenario on a closed platform: if two of a user’s first-degree connections are not themselves friends (not connected directly), they cannot see each other’s information. A popular joke holds that corrupt Chinese officials can simultaneously and safely post messages to their wives and mistresses in their “Circle of Friends” — as if WeChat’s private and stealthy mode were designed specifically for them.
WeChat chatrooms are semi-private, since users join by invitation or via a verified QR code. Although messages can be seen by all participants, a chatroom is still a closed community with an upper limit of 500 users, and it is often subject to recurring censorship and blockage of the free flow of information. Tencent, the company that owns WeChat, sometimes controls messaging and viewership in monitored chatrooms so that sensitive information can’t be viewed by users in China.
WeChat also offers a range of services and functionalities through the integration of FinTech and social tools, e.g. WeChat Pay, Mobile Top Up, Wealth, Utilities, Charity and Public Services. Similarly, Mr. Zuckerberg made it clear that interoperability is part of Facebook’s new vision. And just like WeChat, once Facebook is integrated with other services and combined with other messaging platforms, it could learn more about user interactions, know users’ habits and private lives, and ultimately harvest even more user data.
The biggest issue with WeChat, however, aside from censorship, is that it is easily and perpetually mired in disinformation, rampant rumors and fake news — precisely because it is a closed platform.
Mr. Zuckerberg’s privacy-focused vision for social networking sounds remarkably like the insular virtual community that already exists on WeChat. Online tribes form and congregate more easily and secretively when everyone retreats to the comfort of tightly-knit communities, which often serve as echo chambers, leading to segregation, partisanship and division. Is this the kind of future for Facebook worth wanting?
More startling still, Mark Zuckerberg’s long “vision” announcement makes no mention of how Facebook will fight disinformation and fake news, which, as we know, has been the predominant issue facing the company. On a closed platform, fake news and disinformation meet less resistance and could spread like wildfire, without checks and balances.
By touting a narrowly defined “privacy” without addressing the real issue of disinformation on the Facebook platform, has Mr. Zuckerberg chosen what’s convenient over what’s right but hard to do? Moreover, does it mean Facebook has no real interest in, or capability of, tackling the widespread problem of fake news? Or has Facebook simply concluded that the war against fake news is lost? And finally, does making everything private by default mean that in the future we won’t even know the extent of the fake news problem, since it will be less visible to the public?
Mr. Zuckerberg’s vision also raises the question of whether the company is taking a feel-good approach to sidestep tough regulatory oversight of user data and privacy controls on an open platform. Converting an open platform into a closed one doesn’t make these problems go away; it will surely make them harder to detect, and hence make the whole system more opaque and secretive.
As we strive for a more transparent and open society in real life, why shouldn’t we demand the same principles in our future online world? The deepest concern is whether transparency and openness have lost out to secrecy and opaqueness in the world of social networking.
Perhaps the fundamental and philosophical question is this: is Facebook more of a public square, where people meet and share common beliefs and differences, or a living room, where only close friends and like-minded people are invited and welcome? If there is an ideological battle between transparency and secrecy, and between openness and opaqueness, in building a social networking platform for a better future, I wonder whether Facebook has already lost that battle.