[Figure: donut plots showing the fraction of untrustful accounts in six topics]

Fake News and Untrustful Social Accounts

Online social media (OSM) allows people to share news, create content, and start trends, potentially influencing large sectors of the population. Unfortunately, some organizations and individuals take advantage of OSM to gain influence, damage competitors' reputations, or spread political propaganda by financing misinformation campaigns. In many cases, these campaigns are carried out by armies of untrustful online accounts, which can be operated by humans, computer programs, or a combination of the two.
In recent years, several approaches for identifying untrustful accounts and misinformation campaigns have been proposed. Nevertheless, misinformation is still reported to proliferate on the Web.


In this study, we investigate novel approaches for estimating the level of manipulation within online discussions based on the detection of untrustful accounts. The proposed approaches are evaluated using multiple publicly available datasets collected from Twitter. The data collected for this study includes manually labeled trustful and untrustful accounts, the latter recruited through crowdturfing platforms.
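As a rough illustration of the idea behind topic-level manipulation estimates (and the donut plots above), the authenticity of a discussion topic can be summarized as the fraction of trustful accounts among its participants. The sketch below is a minimal, hypothetical example; the function name and the binary "trustful"/"untrustful" labels are assumptions, standing in for labels produced by manual annotation or a classifier as in the study.

```python
# Hypothetical sketch: summarizing a topic's authenticity as the
# fraction of trustful accounts among the accounts discussing it.
# The labels are assumed to come from manual annotation or a classifier.

def topic_authenticity(account_labels):
    """Return the fraction of accounts labeled 'trustful' in a topic."""
    if not account_labels:
        return 0.0
    trustful = sum(1 for label in account_labels if label == "trustful")
    return trustful / len(account_labels)

# Example: a topic discussed by 8 trustful and 2 untrustful accounts
labels = ["trustful"] * 8 + ["untrustful"] * 2
print(topic_authenticity(labels))  # 0.8
```

A low authenticity score for a topic would then suggest that its discussion is dominated by untrustful accounts, flagging it for closer inspection.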

Source code: GitHub 

Early Detection Alert and Response to eThreats (eDare): Project


References and Links to Papers

Aviad Elyashar, Jorge Augusto Bendahan, Amparo Maria Sanmateu, Rami Puzis, "Measurement of Online Discussion Authenticity within Online Social Media", ASONAM 2017, Sydney, Australia. [google]

Aviad Elyashar, Jorge Augusto Bendahan, Rami Puzis, "Is the News Deceptive? Fake News Detection using Topic Authenticity", SOTICS 2017. [google]

Luiza Nacshon, Rami Puzis, Amparo Sanmateu, Chanan Glezer, "DiffTrack: Pinpointing Key-Posts of Influence Within Discussion Topics in Social Media", ICSMM 2017. [google]

Aviad Elyashar, Jorge A. Bendahan, Rami Puzis, "Detecting Clickbait in Online Social Media: You Won't Believe How We Did It", arXiv preprint arXiv:1710.06699, 2017.
