‘Understand What You Believe’: Fmr. FBI Agent Unpacks Information Threats
In the past few years, social media has transformed from a communications gold mine to a minefield of disinformation campaigns.
CPX 360 – New Orleans, La. – Social media, initially built to improve global communications, has become a weapon in the hands of cybercriminals launching disinformation campaigns.
CEO Gil Shwed used the world’s growing interconnectedness as the topic to begin his morning keynote at the Check Point CPX 360 conference, and he used the Olympics to illustrate just how connected we have become. The 1996 Atlanta Olympics generated 171 hours of coverage, he noted. Two decades later, the 2016 summer games in Rio de Janeiro generated 6,755 hours.
“Everything has to do with technology,” Shwed said. Ticket reservations, hotel bookings, and viewings of Olympic events are all done online. “Every transaction around an event like the Olympics has to be built using the Internet and using the connected world.” Still, the benefit of using the Internet to make the Olympics a more global event is countered by potential threats.
“Protecting all of our systems is a problem that’s becoming bigger and bigger,” he added.
This sentiment extended into a keynote talk by Clint Watts, former special agent for the FBI and distinguished research fellow at the Foreign Policy Research Institute. He spoke to the rapid growth of social media and its dual role as a boon to communications and an evolving attack vector.
“The information landscape has changed,” said Watts. People championed social media in 2011, when it was used to anonymously communicate during the Arab Spring. By 2016, only a few years later, those same platforms had entered a new era: “the rise of the trolls,” as he put it.
Today’s social networks, used by billions of people for legitimate communications, double as a space for the Internet’s hecklers to sow mistrust and advance false narratives. “They have a very specific mission,” Watts said. Individual trolls tend to tire over time and eventually leave their targets alone. Now, however, we see a rise in armies of trolls working to discredit people and hacking to power influence.
“In just 10 years, we went from social media and the Internet bringing us all together, to tearing us all apart,” he said. Trolls pose a new threat to consumers and businesses alike. Fake news stories about corporations can cause stock prices to rise or fall, depending on the angle.
There are two components to building what Watts calls a “preference bubble,” or the way in which users see social media content. One is an algorithm; the other is the reader. Each individual’s social feed is uniquely tailored to them, he explained, and nobody else sees the world in that way. This makes it easier to spread false narratives: because no two feeds are alike, people have no shared frame of reference for judging whether the content they see is credible.
Why do people believe everything they see? Watts shared a few tactics that attackers commonly use to manipulate victims — for example, provoking emotions such as anger and fear.
“If I can scare you as an audience, you’re more likely to believe what comes next,” he said. “I can draw in followers by scaring people with news.”
The Current and Future State of Disinformation
Disinformation campaigns rely on three biases, Watts continued. The first is confirmation bias: The more you like and follow topics online, the more of them you see. The second, implicit bias, describes the unconscious attitudes we hold toward other people. As Watts described it, most humans prefer information from people who look and talk like them. This bias motivates attackers to create fake profiles that resemble the people they are working to discredit.
The third bias is status quo bias, an emotional preference to keep things the way they are. This means you may start shaping what you say, and what you share, because you want to stay within your tribe. You only post things your followers agree with, maintaining a status quo.
Watts then described three dynamic changes shaping the information landscape, starting with “clickbait populism,” or the promotion of popular content and opinions. The more a person plays to the crowd’s preferences, the more they will be promoted. Social media nationalism is another; this is the collective adherence to a digital identity drawn by hashtags and avatars. Then there is the death of expertise, or the belief that anyone on the Internet is just as smart as anyone else, regardless of their experience, training, education, or specialty, he explained.
Over time, the objectives, methods, and actors driving social media manipulation will change. Activists and extremist groups will have fewer resources; lobbyists and the very wealthy will have more. Attackers will need more sophistication to incite fear and provoke conflict, to create alternative information outlets, or to distort reality via pseudoscience and revised histories.
“What if there are whole communities of bots that talk to each other, and you are looking at one person, but it’s really nine people talking?” he added. Threats on the horizon include trolling-as-a-service, or disinformation for hire, as well as social media influencers-as-a-weapon, pseudoscience firms, alternative universities, and cross-platform computational propaganda.
How can businesses respond to all this? Watts advised implementing a social media usage policy, security training for employees, and an insider threat program that goes beyond data loss. Organizations should develop and rehearse playbooks to protect their brand and reputation, especially when a disinformation attack is targeting them in real time.
“Speed is essential,” he emphasized. “When I see hacking to power influence, the immediate response is to have a meeting. But the longer it goes on, the worse it’s going to get.”
Watts encouraged citizens to understand why they believe the things they trust online. “Listen more than you speak, read more than you write, watch more than you film,” he said. We’re most likely to believe the things we see first, those we see most, those that come from sources we trust, and those that don’t come accompanied by a rebuttal.
Kelly Sheridan is the Staff Editor at Dark Reading, where she focuses on cybersecurity news and analysis. She is a business technology journalist who previously reported for InformationWeek, where she covered Microsoft, and Insurance & Technology, where she covered financial …