
Social media sites should find new ways to verify children's ages - report

Facebook, Snapchat and Instagram gave statements on their security to the Oireachtas Committee on Children and Youth Affairs.

A REPORT COMPILED by TDs and Senators has recommended that social media sites like Facebook and Snapchat should work harder to verify young people’s ages.

The Joint Committee on Children and Youth Affairs released a report today that lists a range of measures that should be taken to protect children from being taken advantage of, bullied or harassed online.

In its report, the committee members outlined concerns about the methods used by social media platforms to verify people’s ages.

“The Joint Committee also had concerns about this issue from its interaction with the social media platforms Facebook, Instagram and Snapchat and how they verify that their users are of the age where they are able to set up an account.”

Therefore, the Joint Committee considers that a robust system of age verification is necessary to ensure that children and young people of an appropriate age are using these platforms.

In its submission to the committee, expert Professor Barry O’Sullivan said that “self-verification does not work”.

A child simply saying he or she is 13 or 16 is not adequate. One does not enter bars on the basis that one says one is 22 if one is not. Ireland could lead in the area of online age verification.
We believe that robust age verification online is one of the most critical requirements to deliver on child and youth security in cyber contexts.

Among its other suggestions, the Oireachtas committee said that a Digital Safety Commissioner should be created who would place a specific emphasis on children’s safety online.

It recommended that the proposed commissioner’s office could work with social media companies to find a “more accurate system” for children to verify their ages.


FB, Snaps, and Insta

Speaking to the committee on 6 December 2017, representatives of Facebook explained how it determines what counts as inappropriate content on its site, and how quickly it can remove it.

“We prioritise the most serious issues first. Many of the reports related to suicide, credible threats, child safety or bullying are reviewed ahead of any other topics.

“The vast majority of reports are reviewed within 24 hours and evaluated against our community standards.”

It said that the team that deals with tens of millions of reports is 7,500-people strong, and that “several hundred” of that number are located in its Dublin headquarters.

If reported content is found to be against our community standards, it is immediately removed.

“We also close the loop with the person who reported the content to let him or her know what action we have taken.

People who engage in abusive behaviour on Facebook face varying consequences, ranging from a warning to losing their accounts permanently.
In the most severe cases, for example, where child exploitation is involved, such people can be referred to law enforcement.

Representatives of Snapchat and Instagram were also asked to attend the Oireachtas committee – they declined, choosing instead to submit statements.


In its submission, Snapchat said that pornography on its Stories, nudity in a sexual context, and nudity of people under the age of 18 are prohibited under its rules.

Snapchat also said it does not tolerate content that “threatens to harm a person, group of people or their property”, nor harassment, bullying, or other actions that “makes people feel bad”.

Equally, if a Snapchatter blocks another user, it is not okay to contact them from another account.

It also said that Snap, which owns Snapchat, doesn’t market the app to children.

Snap makes no effort to market Snapchat to children or to make the user experience appealing to children. Snapchat is not available in the Kids or Family sections of any app store.
Snapchat is rated 12+ in the Apple app store and rated Teen in the Google Play store, putting parents on notice that Snapchat is not designed for children. These ratings reflect Snapchat’s content, which is designed for teens and adults, and not children under 13.

It said that it has developed ways for users to report Snaps, Stories and accounts to Snapchat, adding that users can be blocked or deleted too.

Instagram made a similar submission. Because the company is owned by Facebook, it shares the team of reviewers mentioned earlier.

It said that it removes “content that contains credible threats or hate speech; content that targets private individuals to degrade or shame them; personal information meant to blackmail or harass someone, and repeated unwanted messages”.

It said that it has a team of people who monitor Instagram’s comment section.

Comments can also be blocked from certain people or groups of people, whether your account is public or private. You are able to ‘swipe’ to delete comments, or to easily block and report abusive content.
We have also introduced tools that allow people to turn comments off altogether, should they wish to let an image stand on its own.

What else did the report say?

In its other recommendations, the committee said that new laws were needed to protect children against the specific threat of abuse, harassment, bullying, and stalking online.

It also recommended that schools make classes on cyber security available as part of the primary and secondary school curriculum. It said that classes should also be set up for parents to learn about the dangers of children being unsupervised online.


