

THIRUVANANTHAPURAM: The next time you are about to post seemingly innocent videos and pictures of your kids on the internet, think twice. It is not just that paedophiles could misuse them; such videos also hinder the law enforcement agencies that work overtime to detect abuse material. Since abuse videos are detected by artificial intelligence-powered bots, videos of kids posted without any sinister intention are also read by the system as child-abuse videos, making it difficult for the agencies to segregate them.
For the Kerala police, data on child abuse material is provided by Interpol and by the Internet Crime Against Children-Child Online Protective Services (ICACCOPS), developed by the US Department of Justice. Interpol prepares the data pertaining to particular states using its AI-powered data-mining bots, and the data is then distributed among the respective regions. However, the agencies face the difficult task of separating the wheat from the chaff, owing to the large number of videos that actually have nothing to do with child abuse.
The Interpol data is provided to the state police from Delhi once or twice a month. Each report usually mentions 100 to 150 cases of abusive material available on torrent sites as well as peer-to-peer networks. Similarly, ICACCOPS, a portal that can be accessed at any time, also provides enormous amounts of data on child abuse material pertaining to the state. However, since Interpol's system and ICACCOPS both work on artificial intelligence, their reports include even innocuous videos, such as parents cuddling their kids or bathing them. This, according to Cyberdome officials, makes their job difficult.
“Suppose Interpol’s report mentions 150 videos; only 20-30 will be abusive. The rest will be normal videos, such as children being bathed or cuddled. But they are also picked up by the bots since they are programmed that way,” said a Cyberdome source. A Cyberdome official said the bots cull videos on the basis of the degree of skin exposure, skin colour and the like. “At Cyberdome, we have to go through the grind to identify real abuse videos. The presence of so many normal videos makes our job tough,” said the officer.
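The false-positive problem the official describes follows directly from that kind of heuristic. The actual Interpol and ICACCOPS classifiers are not public, so the sketch below is purely illustrative: a toy rule that flags a frame once skin-toned pixels cross a threshold (every function name and threshold here is a hypothetical stand-in). It shows why a harmless bathing video trips the same rule as genuinely abusive material.

```python
# Illustrative sketch only: a toy skin-exposure heuristic in the spirit of
# what the Cyberdome source describes. The real systems are not public;
# the RGB rule and the 30% threshold below are assumptions for illustration.

def skin_pixel_ratio(frame):
    """Return the fraction of pixels whose RGB values fall in a crude skin-tone range."""
    def looks_like_skin(r, g, b):
        # A very rough RGB rule commonly seen in toy skin detectors.
        return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

    skin = sum(1 for (r, g, b) in frame if looks_like_skin(r, g, b))
    return skin / len(frame)

def flag_frame(frame, threshold=0.3):
    """Flag a frame when skin-toned pixels exceed the threshold."""
    return skin_pixel_ratio(frame) >= threshold

# A frame from a harmless bath video is dominated by skin tones, so the
# heuristic flags it; an ordinary street scene with little exposed skin is not.
bath_frame = [(210, 160, 130)] * 70 + [(30, 60, 200)] * 30    # 70% skin-toned
street_frame = [(210, 160, 130)] * 10 + [(90, 90, 90)] * 90   # 10% skin-toned
```

Because the rule sees only pixel statistics, not intent or context, the flagged set inevitably mixes innocuous family footage with abusive material, which is the manual "grind" the officer refers to.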