Experts are evenly split on whether the coming decade will see a reduction in false and misleading narratives online. Those forecasting improvement place their hopes in technological fixes and in societal solutions. Others think the dark side of human nature is aided more than stifled by technology.
In late 2016, Oxford Dictionaries selected “post-truth” as the word of the year, defining it as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.”
The 2016 Brexit vote in the United Kingdom and the tumultuous U.S. presidential election highlighted how the digital age has affected news and cultural narratives. New information platforms feed the ancient instinct people have to find information that syncs with their perspectives: A 2016 study that analyzed 376 million Facebook users’ interactions with over 900 news outlets found that people tend to seek information that aligns with their views.
This makes many people vulnerable to accepting and acting on misinformation. For instance, after fake news stories in June 2017 reported that Ethereum founder Vitalik Buterin had died in a car crash, the cryptocurrency’s market value was reported to have dropped by $4 billion.
Misinformation is not like a plumbing problem you fix. It is a social condition, like crime, that you must constantly monitor and adjust to.
When BBC Future Now interviewed a panel of 50 experts in early 2017 about the “grand challenges we face in the 21st century,” many named the breakdown of trusted information sources. “The major new challenge in reporting news is the new shape of truth,” said Kevin Kelly, co-founder of Wired magazine. “Truth is no longer dictated by authorities, but is networked by peers. For every fact there is a counterfact and all these counterfacts and facts look identical online, which is confusing to most people.”
Americans worry about that: A Pew Research Center study conducted just after the 2016 election found 64% of adults believe fake news stories cause a great deal of confusion and 23% said they had shared fabricated political stories themselves – sometimes by mistake and sometimes intentionally.
The question arises, then: What will happen to the online information environment in the coming decade? In summer 2017, Pew Research Center and Elon University’s Imagining the Internet Center conducted a large canvassing of technologists, scholars, practitioners, strategic thinkers and others, asking them to react to this framing of the issue:
The rise of “fake news” and the proliferation of doctored narratives that are spread by humans and bots online are challenging publishers and platforms. Those trying to stop the spread of false information are working to design technical and human systems that can weed it out and minimize the ways in which bots and other schemes spread lies and misinformation.
The question: In the next 10 years, will trusted methods emerge to block false narratives and allow the most accurate information to prevail in the overall information ecosystem? Or will the quality and veracity of information online deteriorate due to the spread of unreliable, sometimes even dangerous, socially destabilizing ideas?
Respondents were then asked to choose one of the following answer options:
The information environment will improve – In the next 10 years, on balance, the information environment will be IMPROVED by changes that reduce the spread of lies and other misinformation online.
The information environment will NOT improve – In the next 10 years, on balance, the information environment will NOT BE improved by changes designed to reduce the spread of lies and other misinformation online.
Some 1,116 responded to this nonscientific canvassing: 51% chose the option that the information environment will not improve, and 49% said the information environment will improve. (See “About this canvassing of experts” for details about this sample.) Participants were next asked to explain their answers. This report concentrates on these follow-up responses.
Their reasoning revealed a wide range of opinions about the nature of these threats and the most likely solutions required to resolve them. But the overarching and competing themes were clear: Those who do not think things will improve felt that humans mostly shape technology advances to their own, not-fully-noble purposes and that bad actors with bad motives will thwart the best efforts of technology innovators to remedy today’s problems.
And those who are most hopeful believed that technological fixes can be implemented to bring out the better angels guiding human nature.
More specifically, the 51% of these experts who expect things will not improve generally cited two reasons:
The fake news ecosystem preys on some of our deepest human instincts: Respondents said humans’ primal quest for success and power – their “survival” instinct – will continue to degrade the online information environment in the next decade. They predicted that manipulative actors will use new digital tools to take advantage of humans’ inbred preference for comfort and convenience and their craving for the answers they find in reinforcing echo chambers.
Our brains are not wired to contend with the pace of technological change: These respondents said the rising speed, reach and efficiencies of the internet and emerging online applications will magnify these human tendencies and that technology-based solutions will not be able to overcome them. They predicted a future information landscape in which fake information crowds out reliable information. Some even foresaw a world in which widespread information scams and mass manipulation cause broad swaths of the public to simply give up on being informed participants in civic life.
The 49% of these experts who expect things to improve generally inverted that reasoning:
Technology can help fix these problems: These more hopeful experts said the rising speed, reach and efficiencies of the internet, apps and platforms can be harnessed to rein in fake news and misinformation campaigns. Some predicted better methods will arise to create and promote trusted, fact-based news sources.
It is also human nature to come together and fix problems: The hopeful experts in this canvassing took the view that people have always adapted to change and that this current wave of challenges will also be overcome. They noted that misinformation and bad actors have always existed but have eventually been marginalized by smart people and processes. They expect well-meaning actors will work together to find ways to enhance the information environment. They also believe better information literacy among citizens will enable people to judge the veracity of the content they encounter and eventually raise the tone of discourse.
The majority of participants in this canvassing wrote detailed elaborations on their views. Some chose to have their names connected to their answers; others opted to respond anonymously. These findings do not represent all possible points of view, but they do reveal a wide range of striking observations.
Respondents collectively articulated several major themes tied to those insights, which are explained in the sections following the graphic below. Several longer additional sets of responses tied to these themes follow that summary.
The following section presents an overview of the themes found among the written responses, including a small selection of representative quotes supporting each point. Some comments are lightly edited for style or length.