Ofcom Publishes First Set of Draft Codes of Practice Under the Online Safety Act

Ofcom, the UK’s online safety regulator, has released its first set of draft Codes of Practice under the Online Safety Act. The draft codes focus primarily on guidelines for user-to-user (U2U) services, with a strong emphasis on protecting children from illegal content. Recommendations include not presenting children with suggested friends, keeping children’s connection lists private, and prohibiting accounts outside a child’s connection list from sending them direct messages. The Online Safety Act is designed to safeguard users from the risks associated with illegal content, including child sexual abuse material, terrorism, and fraud. Ofcom’s approach eschews a one-size-fits-all solution, instead tailoring risk mitigations to the size and risk profile of individual services.

The final guidance is expected in late 2024, following a consultation period, with compliance expected within three months thereafter. Ofcom acknowledges the complexities of regulating online safety and the need to balance user safety with freedom of expression. Controversial issues, such as the implications for end-to-end encryption, are sidestepped in the draft codes. The regulator will determine which services fall under its supervision based on the size of their user base and the risks they pose, enforce the rules through information notices, and may impose sanctions on non-compliant services.

Purpose and Scope of the Online Safety Act

Introduction to the Online Safety Act

The Online Safety Act is a piece of UK legislation, which received Royal Assent in October 2023, aimed at protecting users from harmful and illegal content online. With the increasing prevalence of digital platforms and services, there has been a growing need for guidelines and regulations that ensure online safety for all individuals, particularly vulnerable groups such as children. The act was introduced to address these concerns and to provide a comprehensive framework for online safety.

Aims of the Online Safety Act

The main objectives of the Online Safety Act are to safeguard users from the risks posed by illegal content, protect children from harmful material, and create a safer online environment. The act focuses on identifying and addressing various forms of illegal content, including child sexual abuse material, terrorism-related content, and fraudulent activities. By setting standards for digital services, the act aims to mitigate the potential harm that users may face while utilizing online platforms.

Scope of the legislation

The Online Safety Act applies to a wide range of digital services operating in the United Kingdom. This includes social media platforms, messaging services, video-sharing platforms, and search engines, among others. The legislation recognizes the diverse nature of these services and takes a tailored approach to risk mitigation based on their size and risk profile. By encompassing a broad spectrum of digital services, the Online Safety Act aims to ensure comprehensive protection for users across various online platforms.

Ofcom’s Role as the Online Safety Regulator

Overview of Ofcom’s responsibilities

Ofcom, the UK’s communications regulator, has been entrusted with the role of the Online Safety Regulator. As the regulator, Ofcom is responsible for overseeing and enforcing the provisions outlined in the Online Safety Act. This includes setting and implementing rules for digital services, monitoring compliance, and taking appropriate enforcement measures when necessary. Ofcom’s expertise in the communications sector positions it well to fulfill this crucial role in ensuring online safety for all.

Ofcom’s authority and powers

As the Online Safety Regulator, Ofcom has been granted significant authority and powers to enforce compliance with the Online Safety Act. This includes the ability to issue information notices, which require services to provide the information Ofcom needs to assess whether they are meeting their duties. Ofcom can also impose sanctions on services that fail to meet the required standards, ranging from financial penalties to restricting access to non-compliant services. These powers enable Ofcom to regulate digital services effectively and ensure adherence to the online safety guidelines.

Importance of Ofcom’s role in online safety

Ofcom’s role as the Online Safety Regulator is crucial in facilitating a safer online environment for users. By introducing and enforcing regulations, Ofcom provides a clear framework for digital service providers to follow, ensuring that they prioritize user safety. Ofcom’s oversight and monitoring help identify potential risks and instances of non-compliance, allowing swift action to be taken. As an independent regulator, Ofcom brings transparency and accountability to the process, instilling confidence in users and service providers alike.

Draft Codes of Practice

First set of draft codes published

Ofcom has recently released the first set of draft codes of practice under the Online Safety Act. These draft codes provide detailed guidance to digital service providers on how to respond to and mitigate the risks associated with illegal content. The publication of these codes marks an important milestone in the implementation of the legislation and sets the stage for clearer guidelines for online safety.

Focus on user-to-user (U2U) services

The draft codes of practice primarily focus on user-to-user (U2U) services, recognizing their significant role in facilitating the dissemination of content among individuals. Given the potential risks and harmful content that may be shared through these services, it is essential to establish guidelines to safeguard users, particularly children. The codes of practice outline specific measures and recommendations for U2U services to adopt, ensuring that they effectively address the challenges posed by illegal content.

Emphasis on protecting children

One of the key priorities of the draft codes of practice is the protection of children from harmful content online. Recognizing the vulnerabilities of children and the potential long-term impact of exposure to illegal material, the codes emphasize the need for stringent measures to safeguard their online experiences. The draft codes propose various mitigation measures, such as avoiding the presentation of suggested friends to children, restricting the visibility of children’s connection lists, and prohibiting direct messages from non-connections. These measures aim to create a safer online environment for children, mitigating the risks they may encounter while using U2U services.

Mitigation Measures for U2U Services

Avoiding presentation of suggested friends to children

One of the proposed mitigation measures for U2U services is avoiding the presentation of suggested friends to children. This is based on the recognition that the suggestions made by these platforms may not always be appropriate or safe for children. By removing this feature or tailoring it to exclude potentially risky connections, U2U services can effectively reduce the likelihood of children being exposed to harmful individuals or content.

Restricted visibility of children’s connection lists

To ensure the privacy and safety of children, the draft codes recommend that the visibility of children’s connection lists be restricted. This means that individuals outside of a child’s approved connections should not be able to view their list of connections. By implementing this measure, U2U services can minimize the risk of unauthorized access to children’s information and protect them from potential harm.

Prohibiting direct messages from non-connections

Another important mitigation measure suggested by the draft codes is the prohibition of direct messages from non-connections. This prevents individuals who are not approved connections from initiating direct communication with children. By limiting interactions to only approved connections, U2U services can reduce the risk of unauthorized contact and minimize the potential for harm.
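Taken together, these three measures translate fairly directly into access-control checks inside a U2U service. The TypeScript sketch below is a minimal illustration of that idea, assuming a simple User record with an isChild flag and a set of approved connections; the type and function names are hypothetical and are not taken from the draft codes, which describe outcomes rather than implementations.

```typescript
// Hypothetical types and names for illustration only.

interface User {
  id: string;
  isChild: boolean;               // e.g. determined by age assurance
  connections: Set<string>;       // ids of approved connections
}

// Mitigation 1: do not generate friend suggestions for child accounts.
function suggestFriends(viewer: User, candidates: User[]): User[] {
  if (viewer.isChild) return [];
  return candidates.filter(c => c.id !== viewer.id && !viewer.connections.has(c.id));
}

// Mitigation 2: restrict visibility of a child's connection list to the
// child themselves and their approved connections.
function canViewConnectionList(target: User, requester: User): boolean {
  if (!target.isChild) return true;
  return requester.id === target.id || target.connections.has(requester.id);
}

// Mitigation 3: block direct messages to children from non-connections.
function canSendDirectMessage(sender: User, recipient: User): boolean {
  if (!recipient.isChild) return true;
  return recipient.connections.has(sender.id);
}

// Example usage
const child: User = { id: "c1", isChild: true, connections: new Set(["a1"]) };
const stranger: User = { id: "s1", isChild: false, connections: new Set() };

console.log(suggestFriends(child, [stranger]));       // []
console.log(canViewConnectionList(child, stranger));  // false
console.log(canSendDirectMessage(stranger, child));   // false
```

In practice such checks would sit behind a service’s age-assurance and risk-assessment processes, but they show how the recommendations map onto concrete service behaviour.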

Risks Posed by Illegal Content

Child sexual abuse material

One of the most significant risks associated with illegal content online is child sexual abuse material. This form of content poses a direct threat to the safety and well-being of children, and its dissemination must be effectively mitigated. The Online Safety Act and the accompanying draft codes of practice prioritize the identification and removal of such material, placing a strong emphasis on protecting children from this heinous crime.

Terrorism-related content

Terrorism-related content is another risk that the Online Safety Act intends to address. The act aims to prevent the spread of extremist material that promotes violence and radicalization. By imposing obligations on digital service providers to proactively identify and remove such content, the legislation seeks to contribute to national security efforts and mitigate the potential harm associated with terrorist propaganda.

Fraud and scams

Online platforms can also be breeding grounds for fraudulent activities and scams. The Online Safety Act seeks to address this risk by requiring digital service providers to take appropriate measures in identifying and minimizing fraudulent content and activities. By protecting users from scams and fraudulent schemes, the legislation aims to foster trust and confidence in online interactions.

Tailored Approach to Risk Mitigation

Adoption of risk-based approach

Recognizing the diverse nature of digital services, the draft codes of practice propose a risk-based approach to mitigation. The measures and guidelines vary with the size of the service and its risk profile: larger services with a wider user base and greater potential for harm may be subject to more stringent requirements, while smaller services with limited reach can adopt proportionate safeguards. This tailored approach keeps the regulations effective while avoiding a one-size-fits-all model that could hinder innovation or disproportionately burden smaller services.

Different mitigations based on service size and risk profile

The draft codes of practice outline specific mitigations that digital services should implement based on their size and risk profile. For instance, larger services with a substantial user base may be required to invest in advanced content moderation technologies, implement robust reporting mechanisms, and establish dedicated teams to handle user complaints and concerns. On the other hand, smaller services with a limited user base may be expected to implement proportionate measures such as clear reporting mechanisms and user education initiatives. This tailored approach ensures that risk mitigation measures are commensurate with the potential risks associated with each service.
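To make the idea concrete, the sketch below shows one way a provider might map its size and risk profile onto a set of required measures. It is purely illustrative: the threshold, tier logic, and mitigation lists are assumptions for the example, not figures or requirements taken from the draft codes.

```typescript
// Hypothetical tiering logic; all values below are assumptions.

type RiskProfile = "low" | "medium" | "high";

interface ServiceProfile {
  name: string;
  monthlyActiveUsers: number;
  riskProfile: RiskProfile;       // outcome of the service's own risk assessment
}

const baselineMitigations = [
  "named person accountable for safety",
  "clear reporting and complaints mechanism",
  "terms of service covering illegal content",
];

const enhancedMitigations = [
  "automated detection of known illegal material",
  "dedicated content moderation team",
  "default safety settings for child accounts",
];

// Larger or riskier services attract the enhanced measures on top of the baseline.
function requiredMitigations(service: ServiceProfile): string[] {
  const isLarge = service.monthlyActiveUsers > 7_000_000;   // assumed threshold
  const isHighRisk = service.riskProfile !== "low";
  return isLarge || isHighRisk
    ? [...baselineMitigations, ...enhancedMitigations]
    : baselineMitigations;
}

console.log(requiredMitigations({
  name: "example-forum",
  monthlyActiveUsers: 50_000,
  riskProfile: "low",
}));  // baseline measures only
```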

Flexible guidelines for different types of services

One of the key aspects of the draft codes of practice is their flexibility in accommodating different types of digital services. The codes recognize that the risks and challenges faced by social media platforms may differ from those faced by search engines or messaging services. As a result, the guidelines provided in the codes allow for adaptability and customization based on the unique characteristics and functionalities of each service. This ensures that the regulations remain relevant and effective across a wide range of digital platforms.

Consultation and Final Guidance

Draft codes subject to consultation

The draft codes of practice released by Ofcom are subject to a comprehensive consultation process. This allows stakeholders, including digital service providers, experts, and the general public, to provide feedback and input to refine the guidelines. The consultation process serves as an opportunity to address any gaps or concerns and ensure that the final guidance takes into account a broad range of perspectives and expertise. By adopting a consultative approach, Ofcom aims to develop regulations that are both evidence-based and reflective of the diverse needs and challenges in the online safety landscape.

Timeline for consultation process

The consultation on the draft codes of practice runs for a defined period, during which interested parties can review the guidelines and submit their feedback and comments. The timeline allows ample time for thorough deliberation and consideration of all input received, reflecting Ofcom’s commitment to transparency and engagement in the regulatory process.

Release of final guidance in autumn 2022

Following the consultation, Ofcom will analyze the feedback received and make any necessary revisions to the draft codes of practice. The final guidance is anticipated in late 2024. This timeline allows for careful consideration of all factors and helps ensure that the regulations are robust, effective, and reflective of the evolving online safety landscape. The release of the final guidance will mark a critical milestone in the implementation of the Online Safety Act, giving digital service providers clear, actionable guidelines to enhance user safety.

Implementation and Compliance

Expected compliance deadline

Once the final guidance is released by Ofcom, digital service providers will be given a defined timeline to achieve compliance with the regulations; compliance is expected within roughly three months of publication. This approach recognizes the need for a smooth transition and gives service providers time to adapt their systems, policies, and procedures to the requirements.

Consequences for non-compliance

Non-compliance with the regulations outlined in the Online Safety Act and the associated codes of practice can have consequences for digital service providers. Ofcom is empowered to take appropriate enforcement measures against non-compliant services, which may include the imposition of financial penalties or blocking access to non-compliant platforms. These consequences serve as a deterrent and underscore the importance of adhering to the regulations to ensure the safety and well-being of users.

Enforcement through information notices

To ensure compliance, Ofcom has the authority to issue information notices to non-compliant services. These notices require the services to take specific actions or provide relevant information to demonstrate their commitment to meeting the required standards. By employing information notices, Ofcom can monitor and enforce compliance effectively, ensuring that digital service providers are accountable and responsive to the online safety guidelines.

Balancing User Safety and Freedom of Expression

Challenges of regulating online safety

Regulating online safety presents a range of challenges, particularly when it comes to striking a balance between user safety and freedom of expression. Online platforms have become vital spaces for individuals to express themselves, share diverse perspectives, and engage in public discourse. However, this openness also creates opportunities for the dissemination of harmful and illegal content. Regulators face the challenge of addressing these risks while upholding the fundamental principles of freedom of expression.

Striking a balance between safety and freedom of expression

The Online Safety Act and the draft codes of practice aim to navigate the delicate balance between user safety and freedom of expression. While prioritizing user safety and mitigating risks, the regulations strive to avoid overly restrictive measures that may stifle legitimate expression or innovation. By adopting a proportionate and tailored approach to risk mitigation, the aim is to ensure that user safety is upheld without compromising the fundamental principles of freedom of expression.

Consideration of controversial elements

In the process of developing the draft codes of practice, Ofcom has taken into account various controversial elements related to online safety. One such element is the impact on end-to-end encryption, which has raised concerns regarding privacy and security. To avoid potential conflicts and challenges, the draft codes have focused on other aspects of online safety, such as the mitigation of illegal content. By deferring certain contentious issues, Ofcom aims to ensure that the guidelines are pragmatic, effective, and balanced in addressing immediate concerns.

Enforcement and Sanctions

Enforcement mechanisms for the codes

To enforce compliance with the codes of practice, Ofcom has a range of mechanisms at its disposal. These enforcement mechanisms are designed to hold digital service providers accountable for meeting the required standards and promoting user safety. Through the use of information notices, compliance assessments, and monitoring systems, Ofcom ensures that service providers fulfill their obligations and take appropriate measures to mitigate risks.

Potential sanctions for non-compliant services

Non-compliant services may face a range of sanctions for failing to meet the required standards outlined in the codes of practice. Ofcom has the authority to impose financial penalties of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, which can act as a significant deterrent. In the most serious cases, Ofcom may also seek to have access to non-compliant platforms restricted. These sanctions emphasize the seriousness of safeguarding user safety and create strong incentives for digital service providers to prioritize compliance.

Ensuring accountability and adherence to the regulations

The enforcement and sanctions framework established by Ofcom ensures that there is accountability and adherence to the regulations. By levying penalties and taking appropriate enforcement measures, Ofcom sends a clear message that user safety is a priority and that non-compliance will not be tolerated. This framework creates a strong incentive for digital service providers to invest in robust safety measures, implement effective content moderation systems, and maintain a high standard of online safety for their users.
