How Facebook is Protecting Minors in 2022
Facebook is continuing to expand its safety and integrity policies. The platform uses machine learning to monitor for questionable behavior, and a new feature educates underage users and intervenes before they respond to suspicious messages. This is a significant step forward for the social network. Let’s take a closer look at the changes Facebook has made to its platform. Read on to find out how Facebook is protecting minors in 2022.
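To make that flag-and-warn flow concrete, here is a minimal sketch of how such a system might work. This is not Facebook’s implementation; every name here (`suspicion_score`, `SUSPICION_THRESHOLD`, the keyword heuristic) is a hypothetical stand-in for the machine-learning pipeline the company describes.

```python
# A minimal, hypothetical sketch of a flag-and-warn flow: score an incoming
# message, and show a safety notice to a minor before delivering it.
# suspicion_score() is a toy stand-in for a trained ML model.

from dataclasses import dataclass

@dataclass
class Account:
    user_id: int
    age: int

@dataclass
class Message:
    sender: Account
    recipient: Account
    text: str

SUSPICION_THRESHOLD = 0.8  # assumed cutoff; a real system would tune this

def suspicion_score(msg: Message) -> float:
    """Toy heuristic standing in for a model that scores risky interactions."""
    score = 0.0
    if msg.sender.age >= 18 and msg.recipient.age < 18:
        score += 0.5  # an adult messaging a minor is a strong signal
    if any(phrase in msg.text.lower() for phrase in ("secret", "don't tell")):
        score += 0.4  # crude content signal; a real model learns such patterns
    return min(score, 1.0)

def deliver(msg: Message) -> None:
    """Educate and warn the underage recipient before they can respond."""
    if msg.recipient.age < 18 and suspicion_score(msg) >= SUSPICION_THRESHOLD:
        print(f"[safety notice for user {msg.recipient.user_id}] "
              "This account may be suspicious. Think before you reply.")
    print(f"delivered: user {msg.sender.user_id} -> user {msg.recipient.user_id}")

if __name__ == "__main__":
    adult = Account(user_id=1, age=34)
    teen = Account(user_id=2, age=14)
    deliver(Message(sender=adult, recipient=teen, text="Keep this a secret."))
```

The key design point the feature description implies is that the warning fires *before* delivery is completed, so the minor sees the notice before they can reply.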
Facebook’s platform safety and integrity rules
Facebook’s Oversight Board has released recommendations for improving the platform’s safety and integrity rules. These include publishing the list of dangerous individuals and organizations, making the policy easy to find and understand, and closing the “information gap” between the public policy and Facebook’s internal rules. Facebook has a long history of struggling with content moderation, and this new policy will hopefully help. The dangerous-organizations policy is twelve thousand words long, contains dozens of subcategories and technical definitions, and is buried in an internal software system accessible only to content moderators and select employees. The recommendations largely mirror the Santa Clara Principles, a civil society charter that sets out minimum standards for content moderation.
Facebook has also been criticized for a lack of transparency about what information it collects from users. The company at one point hired contractors to transcribe users’ audio chats in order to check the accuracy of its automated transcriptions. Its revenue model, which is built on monetizing user data, has drawn fire as well; some individuals and employers have even used Facebook data for their own purposes.
Its move into the metaverse
If you’ve been following the latest developments in social media, you’ve probably noticed that Facebook is making a big push into the metaverse. The term combines “meta” and “universe” and comes from Neal Stephenson’s science fiction novel Snow Crash, which describes a shared virtual world where users are free to create their own content. Facebook is pursuing a similar vision, but with different technologies.
Facebook has owned Oculus for years, and it has also expanded into smart glasses, entering that market around the same time Snapchat was releasing its Spectacles. Like the internet, the metaverse is envisioned as a network with no central host, meaning everyone can share the same space. The key difference in Facebook’s approach is that the company is positioning itself as a facilitator of the metaverse rather than its host.
Its efforts to address racial injustice
Facebook made strides in February toward making its advertising policies less discriminatory, but its efforts were not enough. Federal law makes it illegal to promote housing, employment, or credit offers that discriminate against people of color. The closure of an investigation into Facebook’s advertising policies relieved some of the pressure, but the company still has far to go.
To counter the growing volume of hateful and racist posts, the company has pledged $10 million to nonprofit groups dedicated to racial justice. Facebook nonetheless faced a backlash over its handling of President Donald Trump’s post declaring, “when the looting starts, the shooting starts,” which Twitter flagged for glorifying violence while Facebook left it untouched. For the time being, Facebook is working with employees and civil rights advisers to find better ways to combat hate speech and racial inequality.
Its new feature to protect minors
The Senate recently passed a bill requiring online platforms to protect minors. The bill would require these platforms to make the strongest safeguards the default setting and would prohibit services from encouraging minors to disable them. It would also require covered platforms to issue annual reports based on an independent third-party audit, and it would allow researchers, vetted through the National Telecommunications and Information Administration, to access certain platform data. A rough sketch of what the default-safeguards requirement might look like in practice follows below.
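As an illustration of the “strongest safeguards by default” requirement, the sketch below shows what default account settings keyed to age might look like. The setting names and the age threshold are invented for this example and are not drawn from the bill’s text.

```python
# Hypothetical illustration of "strongest safeguards on by default" for minors.
# Setting names and the under-18 threshold are assumptions, not the bill's terms.

def account_defaults(age: int) -> dict:
    """Return default settings; minors' accounts start with every safeguard on."""
    settings = {
        "private_account": False,
        "dms_from_strangers": True,
        "personalized_ads": True,
        "location_sharing": True,
    }
    if age < 18:  # assumed age threshold for "minor"
        settings.update(
            private_account=True,      # profile not publicly discoverable
            dms_from_strangers=False,  # only existing connections can message
            personalized_ads=False,    # no behavioral ad targeting
            location_sharing=False,
        )
    return settings

print(account_defaults(15))  # every safeguard enabled by default
```

Under the bill, a minor could later relax these settings, but the service could not nudge them to do so.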
While tech companies say they already comply with many federal rules, several have introduced new rules or features to keep their younger users safe; Instagram and TikTok, for example, have both rolled out such protections. This pressure is likely to grow as more states consider legislation limiting these companies’ activities. The Future of Tech Commission, an independent bipartisan group, has recommended banning the collection of personal information from users younger than 16 and regulating behavioral advertising to protect children.