Since its inception, Facebook has devoted resources to keeping its community civil, a task that has proven difficult. The company is now introducing a new set of tools for Groups that lets admins moderate how members communicate.
The AI-powered tool Facebook developed gives admins a clear overview of the content members post and of the members themselves, helping admins contain conflict before it escalates within their community.
According to Facebook, the tool, dubbed Conflict Alerts, can identify hostile posts and unhealthy conversations in comment threads. It resembles Keyword Alerts, an earlier Facebook feature that notifies group admins when members post certain words or phrases that violate the group's rules.
The social media company is still testing Conflict Alerts. Like Keyword Alerts before it, the feature relies on machine learning, and it is expected to spot trouble as it brews, giving admins a head start in containing it.
It is worth noting that the tool is designed to alert group admins as soon as it spots a conflict, at which point admins can delete the offending post or remove the user from the group. Admins can also limit how members comment and slow down commenting on a post that is likely to spark conflict.
Facebook has not clarified exactly how the new tool, which detects hostile posts as well as the users who share them in comment threads, is expected to work in practice. As noted, the company is still testing it.
Still, a Facebook spokesperson hinted that Conflict Alerts uses machine learning with "multiple signals such as reply time and comment volume to determine if the engagement between users has or might lead to negative interactions."
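Facebook has not published any model details, but a toy heuristic combining the two signals the spokesperson named (reply time and comment volume) might look like the following sketch. The function names, weights, and thresholds are invented for illustration; they are not Facebook's actual implementation.

```python
# Toy conflict-detection heuristic: combines average reply time and
# comment volume into a single score. All thresholds are illustrative
# assumptions, not disclosed Facebook parameters.

def conflict_score(avg_reply_seconds: float, comments_per_minute: float) -> float:
    """Return a score in [0, 1]; higher suggests a more heated thread."""
    # Very fast replies suggest a rapid, heated back-and-forth.
    reply_signal = min(1.0, 60.0 / max(avg_reply_seconds, 1.0))
    # A burst of comments suggests a pile-on forming.
    volume_signal = min(1.0, comments_per_minute / 10.0)
    return 0.5 * reply_signal + 0.5 * volume_signal

def should_alert_admin(avg_reply_seconds: float,
                       comments_per_minute: float,
                       threshold: float = 0.7) -> bool:
    """Fire an admin alert when the combined score crosses the threshold."""
    return conflict_score(avg_reply_seconds, comments_per_minute) >= threshold
```

For example, replies arriving every 5 seconds at 20 comments per minute would trip the alert, while an hour between replies would not.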
Conflict Alerts is similar to the AI technology Facebook has long used to detect hate speech. Like that technology, it can overreact and is prone to mistakes, flagging comments that contain local slang, irony, and humour. To head off conflict, the tool is expected to detect when users resort to words or phrases such as "stupid fool" or "idiots."
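The keyword side of such detection can be sketched as a simple phrase matcher. The phrase list and function name below are illustrative assumptions, and, as the slang-and-irony mistakes above suggest, a real system would need far more context than literal string matching.

```python
import re

# Illustrative phrase list; a production moderation system would use a
# much larger, context-aware model rather than literal matching.
FLAGGED_PHRASES = ["stupid fool", "idiot"]

def flag_comment(text: str) -> bool:
    """Return True if the comment contains a flagged phrase (word-bounded).

    Note the obvious limitation: this cannot tell sarcasm or friendly
    banter from genuine hostility, which is exactly the false-positive
    problem described above.
    """
    lowered = text.lower()
    return any(re.search(r"\b" + re.escape(phrase) + r"\b", lowered)
               for phrase in FLAGGED_PHRASES)
```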
Facebook also introduced another tool for group admins: a new admin dashboard that gives a clear overview of reported comments, posts, and members. A Facebook spokesperson said the dashboard also summarizes new members joining a group, as well as "each group member's activity in the group, such as the number of times they have posted and commented, or when they've had posts removed or been muted in the group."
The announcement, made in a Facebook blog post, also covers Admin Assist, which moderates users' comments. Admins can restrict who is allowed to comment, especially new members, who are more likely to share spam and unwanted promotional links. Note that Conflict Alerts is part of Admin Assist.