Updated: YouTube, under fire since inception for building a business on other people's copyrights and in recent years for its vacillating policies on irredeemable content, recently decided it no longer wants to host instructional hacking videos.
The written policy first appears in an April 5, 2019 snapshot in the Internet Archive's Wayback Machine. It forbids: "Instructional hacking and phishing: Showing users how to bypass secure computer systems or steal user credentials and personal data."
Lack of clarity about the permissibility of cybersecurity-related content has been an issue for years. In the past, hacking videos could be removed if enough viewers submitted reports objecting to them or if moderators found the videos violated other articulated policies.
Now that there's a written rule, there's renewed concern about how the policy is being applied.
Kody Kinzie, a security researcher and educator who posts hacking videos to YouTube's Null Byte channel, on Tuesday said a video created for the US July 4th holiday to demonstrate launching fireworks over Wi-Fi couldn't be uploaded because of the rule.
"I'm worried for everyone that teaches about infosec and tries to fill in the gaps for people who are learning," he said via Twitter. "It is hard, often boring, and expensive to learn cybersecurity."
In an email to The Register, Kinzie clarified that YouTube had problems with three previous videos, all involving Wi-Fi hacking, which were flagged and are either under review or have already been appealed and restored. One of them received a strike on Tuesday, which disabled uploading for the account and prevented the fireworks video from going up.
The Register asked Google's YouTube for comment but we've not heard back.
Security professionals find the policy questionable. "Very simply, hacking is not a derogatory term and shouldn’t be used in a policy about what content is acceptable," said Tim Erlin, VP of product management and strategy at cybersecurity biz Tripwire, in an email to The Register.
"Google’s intention here might be laudable, but the result is likely to stifle valuable information sharing in the information security community."
Erlin said that while it may be reasonable to block content that shows actual illegal activities, like breaking into a specific organization's systems, instructional videos play an important role in cybersecurity education.
"In cybersecurity, we improve our defenses by understanding how attacks actually work," said Erlin. "Theoretical explanations are often not the most effective tools, and forcing content creators onto platforms restricted in distribution, like a paid training course, simply creates roadblocks to the industry. Sharing real world examples brings more people to the industry, rather than creating more criminals."
Tyler Reguly, manager of security R&D at Tripwire, said censorship has been a concern among YouTube video makers for some time. In an email to The Register, he expressed sympathy for the challenge YouTube faces as a business.
"If YouTube wants advertisers to pay, they need to be aware of the content they are allowing," he said. "We tend to forget that these websites exist to make money, not for the betterment of society."
But he noted that YouTube's policies aren't easy to interpret and there may be reasons Kinzie's video got flagged, such as the fact that it deals with fireworks.
"The YouTube system, based on reports that I’ve seen in the past, is quite arbitrary and difficult to understand. Even as a YouTuber working directly with the company, nothing is as straightforward as it seems," he said.
Dale Ruane, a hacker and penetration tester who runs a YouTube channel called DemmSec, told The Register via email that he believes this policy has always existed in some form. "But recently I've personally noticed a lot more people having issues where videos are being taken down," he said.
While he said he hasn't seen Kinzie's video and can't say for certain why it was removed, the video removals he's dealt with have tended to involve the metadata provided when uploading videos to the site.
"It seems adding video tags or titles which could be interpreted as malicious results in your video being 'dinged,'" he said. "For example, I made a video about a tool which basically provided instructions of how to phish a Facebook user. That video was taken down by YouTube after a couple of weeks."
Similarly, he said, if he were to attempt to make a Wi-Fi penetration test video more easily discoverable via search, by adding a tag like "hack neighbors Wi-Fi," it would be demonetized (denied ad revenue) or taken down.
Ruane said he somewhat agrees with that policy but notes that YouTube's recent algorithm changes mean videos have to be "click-baity" to appear in the Suggested Videos list.
"I've had around 5-10 videos removed in total and they all tend to follow this trend where I have included metadata with the goal of making the video more 'clickable,'" he said. "However when I post videos now, I ensure that I don't include company names and always include a disclaimer in the video and metadata. This hasn't stopped some videos being automatically demonetized but most of them get reinstated after I appeal the decision."
Evidently, YouTube doesn't want to get rid of hacking videos altogether. Ruane said that in phone conversations with YouTube advisors – charged with helping video creators grow their audiences – no one has expressed reservations about the hacking videos he discussed.
"I think the way in which this policy is written is far too broad," said Ruane, allowing that it would be better if it were more narrowly tailored to forbid showing people how to compromise specific systems, like the Facebook phishing video he made. "I also find the policy extremely hypocritical from a company (Google) that has a history of embracing 'hacker' culture and claims to have the goal of organizing the world's information." ®
After this story was filed, a YouTube spokesperson replied with some talking points on background. Per the company’s request, we won’t repeat them. If we receive attributable comments, we’ll pass them along.