
Artificial Intelligence Can Clean Up Harmful Facebook Content


Google, YouTube and other such companies have already introduced systems that remove harmful content from their platforms. Recent reports indicate that Facebook is introducing a similar kind of filtering process, under which harmful content will be filtered off the page. Facebook has ramped up its use of artificial intelligence, such as image matching and language understanding, to identify and remove content quickly, Monika Bickert, Facebook's director of global policy management, and Brian Fishman, counter-terrorism policy manager, wrote in a blog post. Many extremist groups use Facebook as a media channel, and the company wants to bring an end to this; the new system marks the beginning of that effort. Although other methods have been tried before, the company is betting that Artificial Intelligence and Machine Learning will prove more successful.

Facebook uses artificial intelligence for image matching, which allows the company to check whether a photo or video being uploaded matches a known photo or video from groups it has designated as terrorist, such as Islamic State, Al Qaeda and their affiliates, the company said in the blog post. Another mechanism makes this matching more effective across the industry: YouTube, Facebook, Twitter and Microsoft last year created a common database of digital fingerprints automatically assigned to videos or photos of militant content, to help each other identify the same content on their platforms.
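The fingerprint-matching idea described above can be sketched in a few lines. This is a simplified illustration, not Facebook's actual system: production systems use perceptual hashes (such as Microsoft's PhotoDNA) that still match after a file is re-encoded or resized, whereas the cryptographic hash used here only catches exact byte-for-byte copies. The database contents and function names are hypothetical.

```python
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    """Return a hex digest serving as the media item's fingerprint.

    A real system would use a perceptual hash that survives re-encoding;
    SHA-256 here only matches exact copies, for illustration.
    """
    return hashlib.sha256(media_bytes).hexdigest()

# Hypothetical shared database of fingerprints of known flagged content,
# standing in for the cross-company database described in the article.
shared_database = {fingerprint(b"known-flagged-video")}

def is_known_flagged(upload: bytes) -> bool:
    """Check an upload's fingerprint against the shared database."""
    return fingerprint(upload) in shared_database
```

With a shared database like this, content flagged on one platform can be recognized on another without re-sharing the media itself, since only the fingerprints are exchanged.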