YouTube removed 58 million videos last quarter for violating its policies


It also removed nearly 1.7 million channels and over 224 million comments.

Between July and September, the company took down 7.8 million videos, nearly 1.7 million channels, and over 224 million comments, and YouTube noted that machine learning continues to play a major part in this effort.

"We have always used a mix of human reviewers and technology to deal with violative content on the platform, and in 2017 we began implementing more advanced machine learning technologies to flag content for review by our teams," the company stated. "This mixture of smart detection technologies and highly-trained human reviewers has allowed us to consistently enforce our policies with increasing speed."

Of the more than 7.8 million videos taken down for violating YouTube's community guidelines, 81 percent were detected by the company's automated systems. And the vast majority of those videos -- 74.5 percent -- hadn't received a single view before being flagged. Nearly three-quarters of the removed videos were spam, while videos violating child safety rules and adult content rules each accounted for about 10 percent of what was removed. Only 0.4 percent of removed videos contained content that promoted violence or violent extremism.


As for entire channels, they are removed after they accrue three strikes for breaking community guidelines, if they commit severe abuse, or if they are found to be "wholly dedicated" to violating YouTube's guidelines. Nearly 80 percent of the 1.7 million removed channels were terminated for promoting spam, over 12 percent were removed for hosting adult content, and 4.5 percent were removed for violating child safety rules. And since all of a channel's videos are taken down when it is terminated, channel terminations accounted for an additional 50.2 million videos removed in the past quarter.



The more than 224 million comments removed by YouTube included those that violated the platform's community guidelines as well as comments YouTube flagged as "likely spam" that were not restored by the creators on whose channels they appeared. YouTube's automated systems caught 99.5 percent of the comments that were removed.

YouTube has had problems in the past with inappropriate and disturbing children's videos as well as extremist content. To address the issue, Google committed last year to boosting its machine learning efforts and adding more people to YouTube's Trusted Flagger program. The European Union has continued to push companies such as Google, Facebook, and Twitter to remove extremist content within an hour.
