The whole demoRESILdigital team offers its sincere congratulations to Lena Clever on completing and successfully defending her doctoral thesis!
Titled “Tackling the Dark Side of Social Media by Automated Content Analysis”, the thesis explores how Dark Participation actors and processes can be analyzed and detected using automated methods. Another important part of her thesis focuses on how automated methods can help foster resilience against Dark Participation by educating people about its dangers: by making research more easily accessible and shareable, by developing interactive games, or by offering support through automated help desks and content moderation, as fact-checking organizations such as mimikama already do.
Behind blue skies: A multimodal automated content analysis of Islamic extremist propaganda on Instagram
Lena Clever, Tim Schatto-Eckrodt, Lena Frischlich and colleagues conducted a case study on the German group Generation Islam to examine how they use Instagram to spread Islamic extremist content.
The researchers examined 1,187 posts collected between January 2016 and December 2018.
They examined affect in the hashtag networks through which users may come across propagandistic content, used deep learning to assess the emotional valence of the visuals, and employed automated linguistic analysis to describe the collective action cues contained in the texts.
In general, it has to be noted that Islamic extremist propaganda spread via social media is more common than one might think. The team extracted the number of public posts using each hashtag from Instagram, and the hashtags were sorted into eight categories using inductive coding. Next, an image analysis was conducted with Imavis, and SentiBank was deployed to detect distinct emotions. For the linguistic analysis, the LIWC dictionary was used.
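The counting and category-aggregation step can be illustrated with a minimal sketch. The posts, hashtags, and category mapping below are invented for illustration only; the study's actual data and eight-category codebook are not reproduced here:

```python
from collections import Counter

# Hypothetical example: a few posts' hashtags and a hand-coded category
# mapping, standing in for the inductively derived categories.
posts = [
    ["#islam", "#quran", "#love"],
    ["#islam", "#ummah", "#news"],
    ["#quran", "#peace"],
]
category_of = {  # illustrative only, not the study's actual codebook
    "#islam": "religion", "#quran": "religion", "#ummah": "religion",
    "#love": "emotion", "#peace": "emotion", "#news": "media",
}

# Count how often each hashtag occurs across all posts ...
hashtag_counts = Counter(tag for post in posts for tag in post)

# ... then aggregate those counts per category.
category_counts = Counter()
for tag, n in hashtag_counts.items():
    category_counts[category_of[tag]] += n

print(hashtag_counts.most_common(2))  # most frequent hashtags
print(category_counts)                # hashtag occurrences per category
```

The same two-step pattern (count raw items, then roll counts up into coded categories) applies regardless of how many posts or categories are involved.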
The team found that the 15 most frequent hashtags fell into the religion category, but a combination of religious and non-religious hashtags was used most commonly. The hashtag networks relied on positive affect.
Imavis classified most pictures as negative, and the analysis with SentiBank confirmed these findings (anger-, fear-, and defense-related images dominated). The textual elements anchored the social identities of users attracted by the religious hashtags and mostly conveyed a positive view of the future for those who followed “the right path”.
Timo K. Koch, Lena Frischlich and Eva Lermer’s experimental study found that warning labels reduced the perceived credibility of a fake news post exaggerating the consequences of climate change. Warning labels also lowered the (self-reported) likelihood to amplify fake news. Removing social endorsement cues like views, likes or shares did not have an effect.
The study featured a mock Facebook feed in which a fake news post was embedded, claiming that the probability of a “White Christmas” in Central Europe had dropped below 5% because of global warming. Participants saw the post either with or without warning labels, and either with or without social engagement cues such as shares or views. They then answered questions about whether they believed the message of the post and how credible they found it, and rated their self-estimated likelihood of engaging with the post. Their analytical thinking was also measured, because the heuristic-systematic model of information processing (HSM) served as the theoretical framework for the experiment. The HSM distinguishes two modes of processing: systematic processing, in which information is evaluated carefully and with the intention to thoroughly understand it, and heuristic processing, in which easily comprehended cues serve as shortcuts for opinion formation. Previous research has found that more systematic elaboration is associated with lower susceptibility to fake news.
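The underlying 2×2 between-subjects logic (warning label yes/no × engagement cues yes/no) can be sketched as follows. All ratings are invented for illustration and are not the study's data:

```python
from statistics import mean

# Hypothetical credibility ratings (1-7 scale) for the four cells of the
# 2x2 design: warning label (yes/no) x social endorsement cues (yes/no).
ratings = {
    ("warning", "cues"):       [2, 3, 2, 3],
    ("warning", "no_cues"):    [3, 2, 3, 2],
    ("no_warning", "cues"):    [5, 6, 5, 6],
    ("no_warning", "no_cues"): [6, 5, 6, 5],
}

def marginal_mean(level):
    """Mean rating across all cells that include one factor level."""
    vals = [r for cond, rs in ratings.items() if level in cond for r in rs]
    return mean(vals)

# Main effect of the warning label: credibility is lower with a warning.
print(marginal_mean("warning"), marginal_mean("no_warning"))
```

Comparing marginal means per factor is the descriptive core of a factorial design; the published study would of course test such differences inferentially rather than just inspecting means.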
The danger that terrorism and extremism pose to our societies is well known by now. Preventing people from becoming radicalized is one important goal of extremism prevention.
The chapter, now published in the handbook on peace psychology (available here), provides an overview of extremism prevention. Based on a processual understanding of radicalization, the authors describe terrorism and extremism as one possible, but not necessary, outcome of a radicalization process.
The interplay of risk and protective factors can result in the occurrence of extremism or terrorism. These factors can be located at different levels (individual, extremist groups, society). Extremism prevention starts at different stages of the radicalization process: Depending on the stage, it attempts to prevent radical worldviews (universal prevention), to avert a turn to extremism and violence (selective prevention), to prevent (renewed) use of violence (indicated prevention), or to promote a turn away from extremist groups or radical worldviews (distancing).
Frischlich and Bögelein present examples of selected projects, especially from Germany, and illustrate the chapter with the stories of people who have left right-wing extremism and Islamist extremism behind. Finally, they evaluate extremism prevention.
Lena Frischlich contributes Introduction to Special Issue of Digital Journalism
In the introduction to Contesting the Mainstream: Understanding Alternative News Media, Dr. Lena Frischlich discusses the influence alternative news media have on news diversity and elaborates on key concepts that are further discussed in the articles published in the special issue. She argues that it is increasingly important to raise awareness of the normative positions in alternative media research so that the role of these media in society can be better understood. As the issue presents research from Europe, Asia, North and Latin America, and the Middle East, the differences in how these radical actors are discussed and studied make clear that it is essential to think further about the normative purpose of alternative media and how it guides our understanding of their role in differently structured societies and political systems.
On Friday, October 21, the closing event of the research projects supported by the Digital Society research program, funded by the Ministry of Culture and Science of the German State of North Rhine-Westphalia, took place in Berlin. Attendees included representatives from politics, science and the media, who discussed the potential and weaknesses of democracy in Germany.
Policy advisor Martin Fuchs delivered the keynote; afterwards, the achievements of the three research groups, dealing with digital change in political parties, dark participation on the internet, and echo chambers, were presented and discussed.
Together with other experts on Artificial Intelligence (AI), such as neuroscientist Dr. Christian Klaes and scientist Lukas Brand, Lena Clever spent a week in October with high school students at the Lernferien NRW workshop to explore what AI is, how it works, and how it can affect our daily lives. She gave a presentation on social bots and tried to answer the question of whether they are on their way to world domination.
As in previous events, all participants had great fun during a week packed with talks, debates and fun activities like an escape game.
Anja Schmidt-Kleinert and Lena Frischlich Contribute Chapter to Handbook on Terrorism Research
The chapter is titled “Ethical challenges in terrorism research” and deals with the high risks that not only researchers but also those being researched take in contributing to the research process. The authors also consider the various challenges that such an interdisciplinary topic presents: not only political interests but also those of security authorities influence the research. Finally, they present the questions that scientists have to answer for themselves in order to take a scientific and ethical standpoint.
Alternative Counter-News Use and Fake News Recall During the COVID-19 Crisis
In this study, a random-quota survey (N = 967) validated the claim that people who already hold an ideologically biased worldview are particularly likely to use alternative counter-news. These users recalled having encountered more fake news than people who do not consume alternative media, and counter-news use was also found to mediate the relationship between counter-hegemonic attitudes and fake news recall. While not all counter-news is automatically fake news, these publications do attract a specific audience and further “pollute” the information ecosystem.
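The reported mediation logic (an indirect effect running from attitudes through counter-news use to fake news recall) can be sketched in a deliberately simplified form. All numbers are invented, and the b-path here does not control for the predictor as a full mediation model would:

```python
def slope(x, y):
    """Ordinary least-squares slope of y regressed on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# Invented data: X = counter-hegemonic attitudes, M = counter-news use,
# Y = fake news recall (not the survey's actual data).
X = [1, 2, 3, 4, 5, 6]
M = [1.2, 1.9, 3.1, 4.2, 4.8, 6.1]
Y = [0.9, 2.1, 2.8, 4.1, 5.2, 5.9]

a_path = slope(X, M)        # attitudes -> counter-news use
b_path = slope(M, Y)        # counter-news use -> fake news recall
indirect = a_path * b_path  # mediated (indirect) effect in this sketch

print(round(indirect, 2))
```

A positive product of the a- and b-paths is what "mediation" describes here: the predictor's association with the outcome runs, at least in part, through the mediator.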