Facebook Indonesia tightens up ad rules ahead of elections

Facebook Indonesia is placing a temporary ban on electoral ads purchased from outside Indonesia ahead of the election in April. The social media giant's public policy director for global elections, Katie Harbath, and its head of public policy for Indonesia, Ruben Hattari, said the company has learned from global elections over the past two years how to create a "robust approach" to safeguarding election integrity on the platform - without compromising people's voice in the political process.

The restriction took effect on 4 March and will apply to any ad coming from an advertiser based outside of the country, if it "references politicians or political parties or attempts to encourage or suppress voting." Facebook is using a mix of automated and human review to identify foreign electoral ads that should no longer be running on its platform.

Meanwhile, Facebook has taken steps to provide more information about any ad a page is running in the page’s “Info and Ads” section, including electoral ads. By surfacing details such as when a page was created and its previous names, the section aims to give users a better understanding of the page’s original purpose. Users can also report an ad by tapping the three dots in its top right corner and selecting “Report Ad.”

"We’re also rolling out a set of global political advertising tools by the end of June. With each election, we’re learning which tools are most useful, so we can bring them to more countries faster," said the statement.

Educating viewers

To ensure users receive accurate information about the upcoming election and can make informed choices, Facebook will reduce its distribution of "clickbait or sensational material" that undermines the authenticity of the platform but may not be breaking its rules. Users can tap on "About this article" to get more context on the article and the publisher. Any content that violates its community standards will be removed.

Facebook is also educating locals to fight false news and misinformation. In partnership with YCAB Foundation and Do Something Indonesia, Facebook is conducting the Think Before You Share school programme to equip more than 30,000 students, teachers and parents in seven provinces with critical thinking skills for the online world. In addition, the social platform is training local government officials and community members in key digital skills through its Laju Digital programme in East Indonesia.

Keeping bad behaviour at bay

According to Facebook, the platform has removed thousands of pages, groups and accounts that engaged in coordinated inauthentic behaviour across its platforms. Such behaviour includes misinformation, misrepresentation, foreign interference, phishing, harassment and violent threats, which tend to intensify during elections.

The statement added, "We’re also setting up an operations centre in Singapore focused on election integrity ahead of key elections across Asia Pacific. The centre will be staffed by experts from Facebook, Instagram and WhatsApp, who will work cross-functionally with our threat intelligence, data science, engineering, research, community operations, legal and other teams. This centre will work with our Menlo Park headquarters and in-country experts to serve as another line of defense against false news and misinformation, hate speech, voter suppression and election interference."

Facebook's teams will also be working closely with lawmakers, election commissions, fact-checkers, researchers, academics and civil society groups to "better integrate efforts on important issues related to election integrity." While these efforts are global, Facebook said that it is customising safety and security efforts to individual countries based on research and threat assessments that begin many months before ballots are cast. Facebook currently has more than 30,000 people working on safety and security across the company, three times as many as it had in 2017.

"We’re also providing safety and security guidance to help protect candidates and party Pages from hacking and impersonation. Like we do for all elections, we will be proactively monitoring for impersonation or hacked accounts," said the statement.