
The Technical Challenge of Hate Speech, Incitement and Extremism in Social Media

When:
August 18, 2016, 7:00 pm – 9:00 pm (US Eastern Time)
Where:
MIT Room E51-325, Cambridge, MA, USA

Computer Society and GBC/ACM

Dr. Andre Oboler

This talk is being sponsored by the IEEE Computer Society Distinguished Visitor program.

The primary challenge is working out how to identify incitement and hate speech given: (a) the volume of content created in social media; (b) the use of videos, images, coded language, local references, etc.; (c) the changing nature of the expression over time; and (d) limitations that prevent governments from demanding access to non-public data.

Further, without knowing what the public is reporting to the social media platforms, how can governments judge whether those platforms are responding adequately? This has come up in cases like the murder of Lee Rigby (the Telegraph reports: “Facebook ‘could have prevented Lee Rigby murder'”; Sky News: “Facebook Failed To Flag Up Rigby Killer’s Message”). It has also been a hot topic in the US Congress, e.g. ABC News reports, “Officials: Facebook, Twitter Not Reporting ISIS Messages”. The latest is from Israel, where Internal Security Minister Gilad Erdan said Facebook has blood on its hands for not preventing recent killings. He is quoted by Al-Monitor as saying, “[The Facebook posts] should have been monitored in time, and [the murder] should have been averted. Facebook has all the tools to do this. It is the only entity that, at this stage, can monitor such a tremendous quantity of materials. It does it all the time for marketing purposes. The time has come for Facebook to do the same thing to save lives.”

The approach my organization uses relies on crowdsourcing, artificial intelligence and cloud computing. It enables content to be evaluated by people, and then quality-controls the crowd's responses through AI. It allows empirical results to be gathered, such as those reflected in this report we produced for the Israeli Government on antisemitism in social media: http://mfa.gov.il/MFA/ForeignPolicy/AntiSemitism/Pages/Measuring-the-Hate-Antisemitism-in-Social-Media.aspx
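The abstract describes this architecture only at a high level, and the talk will go into the details. Purely as an illustration of the general idea, the minimal Python sketch below shows one way crowd reports could be combined while an automated quality-control layer re-weights reviewers by their agreement with the consensus. The function name, the weighting rule and the sample data are invented for this example; they are not the system the Online Hate Prevention Institute actually runs.

# Hypothetical sketch: crowd classification with an automated quality-control
# layer. Illustrative only; not OHPI's actual system.
from collections import defaultdict

def consensus_with_reviewer_weighting(reports, rounds=3):
    """Estimate whether each item is hate speech from crowd reports.

    reports: list of (reviewer_id, item_id, label), label 1 = hate, 0 = not.
    Returns (item_scores, reviewer_weights).
    """
    # Start with every reviewer equally trusted.
    weights = {reviewer: 1.0 for reviewer, _, _ in reports}
    item_scores = {}

    for _ in range(rounds):
        # 1. Weighted vote per item.
        totals, mass = defaultdict(float), defaultdict(float)
        for reviewer, item, label in reports:
            totals[item] += weights[reviewer] * label
            mass[item] += weights[reviewer]
        item_scores = {item: totals[item] / mass[item] for item in totals}

        # 2. Re-weight reviewers by agreement with the current consensus
        #    (a crude stand-in for the AI quality-control layer).
        agreement = defaultdict(list)
        for reviewer, item, label in reports:
            consensus = 1 if item_scores[item] >= 0.5 else 0
            agreement[reviewer].append(1.0 if label == consensus else 0.0)
        weights = {r: max(0.1, sum(a) / len(a)) for r, a in agreement.items()}

    return item_scores, weights

if __name__ == "__main__":
    sample = [
        ("alice", "post-1", 1), ("bob", "post-1", 1), ("carol", "post-1", 0),
        ("alice", "post-2", 0), ("bob", "post-2", 0), ("carol", "post-2", 1),
    ]
    scores, trust = consensus_with_reviewer_weighting(sample)
    print(scores)  # post-1 scored high, post-2 low
    print(trust)   # carol down-weighted for disagreeing with the consensus

The design point the sketch tries to capture is that the crowd does the evaluation while the automated layer measures and corrects for reviewer reliability, which is one way to keep quality up at the volumes social media generates.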


Dr. Andre Oboler is CEO of the Online Hate Prevention Institute, an Australian charity combating racism, bigotry and extremism in social media. He also serves as an expert on the Australian Government’s Delegation to the International Holocaust Remembrance Alliance, as co-chair of the Working Group on Antisemitism on the Internet and in the Media for the Global Forum to Combat Antisemitism, and as a Vice Chair of the IEEE Computer Society’s Member and Geographic Activities Board. Dr. Oboler holds a PhD in Computer Science from Lancaster University (UK) and a Juris Doctor from Monash University (Australia), and completed a postdoctoral fellowship in Political Science at Bar-Ilan University (Israel). His research interests include empirical software engineering, process improvement, hate speech in social media and the social implications of technology. Web: Online Hate Prevention Institute www.ohpi.org.au; personal website www.oboler.com.

This joint meeting of the Boston Chapter of the IEEE Computer Society and GBC/ACM will be held in MIT Room E51-325. E51 is the Tang Center on the corner of Wadsworth and Amherst Sts and Memorial Dr.; it’s mostly used by the Sloan School. You can see it on this map of the MIT campus. Room 325 is on the 3rd floor.

Up-to-date information about this and other talks is available online at http://ewh.ieee.org/r1/boston/computer/. You can sign up to receive updated status information about this talk and informational emails about future talks at http://mailman.mit.edu/mailman/listinfo/ieee-cs, our self-administered mailing list.

To assist us in planning this meeting, please pre-register at http://www.ieeeboston.org/Register/.