YouTube is pressing pause on its business relationship with content creator Logan Paul (pictured), who recently landed in trouble after uploading a graphic video showing the body of a suicide victim.
“In light of recent events, we have decided to remove Logan Paul’s channels from Google Preferred. Additionally, we will not feature Logan in season 4 of ‘Foursome’ and his new Originals are on hold,” a Google spokesperson said in a statement to Marketing. This comes shortly after the Google-owned video platform issued an open letter to its community via Twitter, addressing frustrations over the platform’s “lack of communication”.
In the open letter, YouTube condemned Paul’s actions and said it was looking into “further consequences”; however, it did not reveal at the time what those consequences would be.
An open letter to our community:
Many of you have been frustrated with our lack of communication recently. You’re right to be. You deserve to know what's going on.
— YouTube (@YouTube) January 9, 2018
YouTube’s actions followed multiple calls to remove Paul from the platform entirely over the controversial video, in which he was deemed disrespectful to Japanese culture and reacted inappropriately on coming across a dead body in Japan’s Aokigahara forest (famously dubbed the “Suicide Forest”).
While Paul has apologised for his actions, the spotlight has also fallen on YouTube, as the video surfaced on the platform’s trending page before being taken down for violating community guidelines. Meanwhile in the marketing landscape, the move also prompted conversations about brand safety and video monetisation. There is currently also a petition calling for YouTube to ban Paul completely from its platform.
This is not the first time YouTube has pulled one of its creators from Google Preferred, which gives brands access to the top 5% of content on YouTube, reaching audiences aged 18 to 34. Last year, the video platform pulled one of its top YouTubers, Felix Kjellberg, known as PewDiePie, from the service for making anti-Semitic jokes.
Lee Nugent, regional director APAC, Text100, said that YouTube was right to remove Paul, given he "clearly violated" YouTube’s community guidelines. However, he felt the move took too long and that YouTube’s silence in the interim was unhelpful.
"While I absolutely would not advocate that YouTube ‘control’ what users and influencers say or do on its platform, there clearly needs to be guidelines and standards that must be adhered too and consequences when they aren’t," Nugent added. He said:
A challenge marketers face today is the presence of a small, but growing number of social 'influencers' who appear to be hungry for instant fame and all that it brings.
Nugent explained that when these influencers, focused on instant fame, step into territory that would typically be occupied by trained, competent, ethically-responsible journalists, situations such as these arise.
He added that various parties need to take responsibility when such situations crop up – be it platform owners, content generators, commentators and readers or viewers.
“When something as morally reprehensible and clearly wrong takes place, then the arbiter, in this case YouTube, has to act swiftly. But the primary responsibility falls on the shoulders of a morally-detached and very (I’m being over-generous here) naïve young man whose actions were unquestionably despicable," Nugent added.
Stanley Clement, managing director of Society, added that the move made by YouTube was the right one, “but Logan Paul is not the only one to share the blame in this matter”. He noted that YouTubers make a living from creating content.
Given this economic opportunity by and for YouTube, it is a social and moral responsibility of the platform to be guardians and ensure a tighter review process.
“His content (like many others), is built on sarcasm, finding opportunities to make fun of others and seeking for stuff that would scandalise. The line gets quickly blurred as they keep generating content,” he said.
While YouTube need not exert total control over content, the platform shares the responsibility of ensuring sensitivities are managed and cultural nuances respected, given the prevailing concerns around brand safety.