To Eradicate Child Sexual Abuse Material, Discord Releases Two New Safety Features

JAKARTA - Discord has introduced new features to protect its underage users, rolled out through its Teen Safety Assist program.

Teen Safety Assist is one of Discord's efforts to eradicate child sexual abuse material (CSAM) on its platform. It is also intended to protect teens from adult predators.

The Teen Safety Assist program consists of two features: automatic safety alerts and content filters. Both are enabled by default on the platform.

When a teenager receives a message from an unrecognized user, a warning appears automatically, particularly when Discord detects potential danger in a first-time direct message.

Teenagers who receive messages from strangers are presented with three options: replying to the message, blocking the account, or viewing related safety tips.

Meanwhile, sensitive images sent to a teenager are automatically blurred by the content filter. In addition to being blurred, the sensitive image will also be deleted automatically.

The feature can be turned off in Settings, and adult users can enable it if they wish. In other words, the filter is available to all users of the platform.

Discord began focusing on the CSAM issue after NBC News uncovered 165 prosecutions involving CSAM distributed through the platform. In some of those cases, adults had coerced teenagers into sending sexual images.

As the problem grew more serious, Discord began banning teen dating servers and prohibiting the sharing of CSAM generated with Artificial Intelligence (AI). Discord also introduced a parents-only dashboard that lets parents monitor their children's activity on Discord.