
Social media sites should find new ways to verify children's ages - report

Facebook, Snapchat and Instagram gave statements on their security to the Oireachtas Committee on Children and Youth Affairs.

A REPORT COMPILED by TDs and Senators has recommended that social media sites like Facebook and Snapchat should work harder to verify young people’s ages.

The Joint Committee on Children and Youth Affairs released a report today that lists a range of measures that should be taken to protect children from being taken advantage of, bullied or harassed online.

In its report, the committee members outlined concerns about the methods used by social media platforms to verify people’s ages.

“The Joint Committee also had concerns about this issue from its interaction with the social media platforms Facebook, Instagram and Snapchat and how they verify that their users are of the age where they are able to set up an account.”

Therefore, the Joint Committee considers that a robust system of age verification is necessary to ensure that children and young people of an appropriate age are using these platforms.

In its submission to the committee, expert Professor Barry O’Sullivan said that “self-verification does not work”.

A child simply saying he or she is 13 or 16 is not adequate. One does not enter bars on the basis that one says one is 22 if one is not. Ireland could lead in the area of online age verification.
We believe that robust age verification online is one of the most critical requirements to deliver on child and youth security in cyber contexts.

Among its other suggestions, the Oireachtas committee said that a Digital Safety Commissioner should be created who would place a specific emphasis on children’s safety online.

It recommended that the proposed commissioner’s office could work with social media companies to find a “more accurate system” for verifying children’s ages.


Facebook, Snapchat and Instagram

Speaking to the committee on 6 December 2017, representatives of Facebook explained how the company decides what counts as inappropriate content on its site, and how quickly it can remove it.

“We prioritise the most serious issues first. Many of the reports related to suicide, credible threats, child safety or bullying are reviewed ahead of any other topics.

“The vast majority of reports are reviewed within 24 hours and evaluated against our community standards.”

It said that the team that deals with tens of millions of reports is 7,500 people strong, and that “several hundred” of that number are located in its Dublin headquarters.

If reported content is found to be against our community standards, it is immediately removed.

“We also close the loop with the person who reported the content to let him or her know what action we have taken.

People who engage in abusive behaviour on Facebook face varying consequences, ranging from a warning to losing their accounts permanently.
In the most severe cases, for example, where child exploitation is involved, such people can be referred to law enforcement.

Representatives of Snapchat and Instagram were also asked to attend the Oireachtas committee – they declined, choosing instead to submit statements.


In its submission, Snapchat said that pornography on its Stories, nudity in a sexual context, and nudity of people under the age of 18 are prohibited under its rules.

Snapchat also said it does not tolerate content that “threatens to harm a person, group of people or their property”, harassment, bullying, or any other action that “makes people feel bad”.

Equally, if a Snapchatter blocks another user, it is not okay to contact them from another account.

It also said that Snap, which owns Snapchat, doesn’t market the app to children.

Snap makes no effort to market Snapchat to children or to make the user experience appealing to children. Snapchat is not available in the Kids or Family sections of any app store.
Snapchat is rated 12+ in the Apple app store and rated Teen in the Google Play store, putting parents on notice that Snapchat is not designed for children. These ratings reflect Snapchat’s content, which is designed for teens and adults, and not children under 13.

It said that it has developed ways for users to report Snaps, Stories and other users to Snapchat, adding that users can also be blocked or deleted.

Instagram made a similar submission. Because the company is owned by Facebook, it shares the team of content reviewers mentioned earlier.

It said that it removes “content that contains credible threats or hate speech; content that targets private individuals to degrade or shame them; personal information meant to blackmail or harass someone, and repeated unwanted messages”.

It said that it has a team of people who monitor Instagram’s comment section.

Comments can also be blocked from certain people or groups of people, whether your account is public or private. You are able to ‘swipe’ to delete comments, or to easily block and report abusive content.
We have also introduced tools that allow people to turn comments off altogether, should they wish to let an image stand on its own.

What else did the report say?

In its other recommendations, the committee said that new laws were needed to protect children against the specific threat of abuse, harassment, bullying, and stalking online.

It also recommended that schools make classes on cyber security available as part of the primary and secondary school curriculum. It said that classes should also be set up for parents to learn about the dangers of children being unsupervised online.

Read: TD warns mobile phones give children access to ‘unlimited pornography of every type’

Read: Social workers have to develop skills to deal with allegations of historical abuse ‘on the hoof’


Author: Gráinne Ní Aodha