The shooting in New Zealand, which left 49 people dead, shocked the world.
During major news events, YouTube's safety team uses a system similar to its Content ID copyright tool to search for re-uploads of the original video by matching metadata and images. A re-upload is removed automatically if it has not been edited. If it has been edited, the tool flags it for a human moderator, who evaluates whether the video violates company policy (a rough sketch of this matching-and-triage idea follows below).
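The article does not describe YouTube's internal implementation, but the general idea of matching re-uploads against a fingerprint of the original and routing edited copies to human review can be sketched roughly as follows. The frame-hashing scheme, the similarity thresholds, and the triage labels here are illustrative assumptions, not YouTube's actual system.

```python
# Illustrative sketch only: a toy perceptual-hash matcher for re-uploaded video.
# The hashes, thresholds, and triage labels are assumptions for this example,
# not YouTube's real pipeline.

from dataclasses import dataclass
from typing import List


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")


@dataclass
class Upload:
    video_id: str
    frame_hashes: List[int]  # one 64-bit perceptual hash per sampled frame


def match_score(upload: Upload, original_hashes: List[int],
                max_bit_diff: int = 6) -> float:
    """Fraction of the upload's frames that closely match frames of the original."""
    if not upload.frame_hashes:
        return 0.0
    matched = sum(
        1 for h in upload.frame_hashes
        if any(hamming_distance(h, o) <= max_bit_diff for o in original_hashes)
    )
    return matched / len(upload.frame_hashes)


def triage(upload: Upload, original_hashes: List[int]) -> str:
    """Remove near-exact copies; flag partially matching (edited) copies for review."""
    score = match_score(upload, original_hashes)
    if score > 0.95:        # essentially an unedited re-upload
        return "remove"
    if score > 0.30:        # partial match: likely edited, send to a moderator
        return "flag_for_review"
    return "allow"


if __name__ == "__main__":
    original = [0xF0F0AAAA12345678, 0x0F0F555587654321]
    reupload = Upload("abc123", [0xF0F0AAAA12345679, 0x0F0F555587654320])
    print(triage(reupload, original))  # -> "remove"
```

The point of the sketch is the triage split the article describes: unedited copies can be removed automatically, while edited copies fall below the exact-match threshold and are handed to a human reviewer instead.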
YouTube's system can remove terrorist and child sexual abuse content immediately. But because footage of the New Zealand shooting may have news value, the system cannot remove it automatically. Under YouTube's rules, footage that is 'shocking or disgusting to audiences' is banned, but if it is used for news reporting it is only age-restricted to protect younger viewers.
Local residents lay flowers outside the mosque to mourn the victims of the shooting in New Zealand. Photo: Getty Images.
In addition, YouTube did not build Content ID to handle breaking news events. Registering content with Content ID takes a few minutes, or sometimes a few hours. For copyrighted content, that delay is not a major problem: a copy that spreads before it is taken down does little harm to the community. With violent videos like the New Zealand shooting, the opposite is true.
Another important factor is that the New Zealand shooting was streamed live, with the content changing constantly, which made video moderation more difficult than ever.
Currently, YouTube is still trying to distinguish news reports from videos that violate its rules by matching every upload that appears to share metadata and imagery with the gunman's original livestream. But what YouTube has been able to do so far is not enough to solve the problem.