If we were given the capacity to track and feel one another’s emotions, would we behave better?
Netflix’s Black Mirror explores whether such transparency can breed empathy and respect in its “Black Museum” episode, in which a museum owner guides a young woman through a collection of technological artifacts aimed at perfecting human consciousness. He ultimately reveals how technology merely amplifies the bigotry, hatred and greed buried in the human psyche.
While this may be an axiom of most dystopian dramas, the episode reveals our own muddled relationship with technology, communication and human behavior.
China’s effort to hack human consciousness through its “social credit system” is a case in point. For the past few years, the Chinese government has been developing a system of social credits to reward citizens for virtuous behavior. The idea is that if citizens behave unethically or dishonestly in their day-to-day lives, they will lose access to everything from government benefits to public transportation. In the government’s eyes, direct consequences lend themselves to a more transparent society.
This system is informed by Chinese President Xi Jinping’s warning about declining trust in the state, which could lead the country into a Tacitus Trap, likely named for the Roman historian who posited that once people lose trust in the state, they will always interpret its actions or messages as untrue and evil.
Xi had plenty of cause for concern. Massive socioeconomic transformation since the late 1970s, including growing inequality, corruption and compromised product and food safety, has led to distrust across all sectors and social classes in China. In 2015, the Pew Global Attitudes Survey showed that 84 percent of Chinese respondents saw corrupt officials as a big problem.
In November 2016, the Shanghai municipal government introduced its first test of the social credit system with the Honest Shanghai app. The app drew upon up to 3,000 sources of personal information, collected from about 100 government entities, to calculate an individual’s public credit score in three categories: very good, good or bad.
A “very good” score could get the user fast-tracked for a loan or access to perks such as discounted plane or train tickets. But a bad score meant that one would be unlikely to qualify for a loan. Worse yet, a poor score could lead to blacklisting, meaning a loss of access to public transport, job opportunities and other services.
Dozens of similar programs are being developed countrywide: Once China completes its national social credit system in 2020, citizens will live in a reality where “the cloud is computing everything people do,” as Premier Li Keqiang has noted. The Chinese government has pitched the big-data–driven social credit system as a way to do everything from exposing corrupt officials and identifying business fraud to promoting public safety and health and managing traffic.
While there has been no official measure of new opportunities the system affords citizens, metrics are available on whom this new level of surveillance has shut out. China’s Chief Justice Zhou Qiang reported 8.42 million instances where “discredited” people were blocked from buying airline tickets. Individuals with “bad” social credit scores were also banned from buying train tickets 3.27 million times.
The social credit system not only overreaches, it may be ineffective: Rampant data manipulation and a lack of personal data protection make true transparency almost impossible. China’s stark digital divide keeps 45 percent of the population under the radar, even though the country is the world’s largest market of internet and mobile users. Tight political control and pervasive surveillance, as well as corporate data breaches and misuse, may deter citizens from sharing data or spur them to game the system.
Interestingly, China is not the only country with a trust deficit it hopes big data can help resolve. A 2017 Media Matters survey found only 24 percent of Americans trust media to be “moral” and accurate. A Pew survey found only 18 percent of Americans trust the government to do what’s right “most of the time.”
The FCC’s successful revocation of net neutrality rules, which were designed to ensure that customers know what they’re buying and that startups compete on an even playing field, is the latest example of the erosion of trust. FCC Chairman Ajit Pai argued the government established the rules needlessly when “there was no evidence of the dysfunction that regulatory proponents feared.” He implied net neutrality rules were based largely on paranoia and fear that “the internet would devolve into a digital dystopia of fast lanes and slow lanes.”
If Pai sees no urgency in the dangers of resource consolidation and the mass collection of data, the people of China and the United States, who have long since lost trust in their governments and media, certainly do.
Moreover, if media access and providers continue to consolidate, the U.S. corporate sector will fall into a Tacitus Trap akin to China’s: People will deepen their distrust of the information they access daily.
In “Nosedive,” another landmark Black Mirror episode, Lacie, a woman living in an alternate reality where people use smartphones to rate and be rated, smiles hard to boost her rating and gain access to services and opportunities, with ominous results. Show creator Charlie Brooker famously said: “I promise you, we didn’t sell the idea to the Chinese government!” When it comes to data accumulation and media consolidation, Brooker might want to advise the U.S. not to take its lead from China.