The Digital Platform Regulators Forum (DP-REG)

The Digital Platform Regulators Forum (DP-REG) is an important information-sharing and collaboration initiative between Australian independent regulators focused on fostering a safe, trusted, fair, innovative and competitive digital economy in Australia. 

As technologies continue to evolve, it is vital that regulators work together on emerging issues so that Australians continue to benefit from new technologies. 

Background

This working paper, the third in a series exploring digital platform technologies, examines multimodal foundation models (MFMs) and their implications for consumer protection, competition, the media and information environment, privacy, and online safety within the digital platform context. 

A multimodal foundation model (MFM) is a type of generative AI that can process and output multiple types of data. Large Language Models (LLMs), as explored in our previous working paper, are an example of generative AI that focuses on a single data type: text. 

MFMs can generate multiple types of output, including text, images, video and audio. These AI models are trained on exceptionally large datasets spanning these formats, allowing them to process inputs and generate outputs that combine different forms.

Opportunities and risks

MFMs perform as supercharged AI creators. Give them a text prompt, and they can create an image to match. Feed them audio, and they might generate a corresponding video. Show them a picture and ask them to describe it, and they can produce a text description. These capabilities could open many opportunities for consumer and business adoption across various industries – from generating personalised content experiences to new ways of creating music and images. 
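As a minimal, purely illustrative sketch of the text-to-image and image-to-text behaviour described above, the Python snippet below chains two open-source models accessed through the diffusers and transformers libraries. The model identifiers shown are examples chosen for illustration only and are not drawn from the working paper; a true MFM would typically handle both directions within a single model, and many commercial MFMs are instead accessed through hosted, proprietary services.

  # Illustrative sketch only: model identifiers are examples, and a single MFM
  # would normally handle both directions; many MFMs are hosted services.
  from diffusers import StableDiffusionPipeline   # text -> image generation
  from transformers import pipeline               # image -> text description

  # Text prompt in, image out.
  text_to_image = StableDiffusionPipeline.from_pretrained(
      "runwayml/stable-diffusion-v1-5"            # example open model identifier
  )
  image = text_to_image("a koala surfing at sunset").images[0]
  image.save("generated.png")

  # Image in, text description out.
  image_to_text = pipeline(
      "image-to-text", model="Salesforce/blip-image-captioning-base"
  )
  caption = image_to_text("generated.png")[0]["generated_text"]
  print(caption)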

Many of the risks associated with MFMs are similar to the limitations considered by DP-REG members in our examination of large language models (LLMs) – for example, the potential to produce unexpected outputs or outputs that are inaccurate or harmful. Although MFMs offer potential opportunities to consumers and business, they also have the potential to amplify risks. The ability to generate multiple types of content, such as images, audio and video, also raises concerns about scams and deceptive practices, the spread of misinformation and disinformation, the generation of harmful content, and loss of control over personal information.

Applicable regulatory frameworks

MFMs impact various regulatory areas, raising issues that span the responsibilities of all DP-REG members. These include deepfakes, the collection of personal information, and scams. For example, deepfakes can have implications for online safety, privacy, misinformation, consumer protection, and trust in the digital economy.

Common themes emerge from the potential harms of MFMs, for example: 

  • Without clear disclosure and labelling, people may struggle to tell genuine content from AI-generated material (see the sketch after this list), and
  • MFMs can use personal information to produce highly personalised content at scale, making content more persuasive and increasing risks such as the spread and amplification of misinformation, terrorist propaganda, and scams.
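As a purely illustrative sketch of the disclosure and labelling point above – not a scheme discussed in the working paper – the Python snippet below uses the Pillow library to embed a simple machine-readable flag in a generated PNG file's metadata. The field names are invented for this example; real provenance approaches, such as cryptographically signed content credentials, are considerably more robust.

  # Purely illustrative: the metadata keys below are invented for this sketch
  # and are not part of any labelling standard.
  from PIL import Image
  from PIL.PngImagePlugin import PngInfo

  img = Image.open("generated.png")               # an AI-generated image
  label = PngInfo()
  label.add_text("ai_generated", "true")          # simple machine-readable flag
  label.add_text("generator", "example-mfm")      # hypothetical generator name
  img.save("generated_labelled.png", pnginfo=label)

  # A platform could check for the flag before displaying or redistributing.
  reloaded = Image.open("generated_labelled.png")
  print(reloaded.text.get("ai_generated"))        # -> "true"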

Further, the deployment of MFMs could result in growing challenges for enforcement, as well as regulatory challenges that span the remit of each DP-REG member. These challenges highlight the need for DP-REG members to consider this technology together. 

DP-REG members’ regulatory frameworks can apply to the harms arising from MFMs. As such, where frameworks apply, regulated entities using MFMs remain subject to the same consumer, competition, privacy, online safety and media laws or regulations and are expected to comply with their obligations. In some cases, new requirements also apply – for example, online safety codes and standards registered in 2023-24 apply to certain services deploying or providing access to MFMs. The full paper provides a detailed explanation of how MFMs intersect with the responsibilities of each DP-REG member.

Domestic and international developments 

The Australian Government is considering proposed reforms that could enhance the ability of regulators to tackle issues related to consumer protection, competition, privacy, online safety, and misinformation and disinformation. 

Both in Australia and around the world, this includes work to address the issues posed by artificial intelligence (AI). Internationally, regulators and policymakers have introduced dedicated AI legislation, self-regulatory principles and governance frameworks. The Australian Government is also investing in developing policies and capabilities that support the safe and responsible adoption and use of AI technology. This includes funding to support industry analytical capabilities, and to coordinate AI policy development, regulation and engagement activities across government. This work also involves reviewing and strengthening existing regulation in areas such as healthcare, consumer and copyright law. This working paper aims to complement and inform these broader government initiatives. DP-REG members will continue to apply our existing frameworks and engage with Government on these issues to ensure the digital economy is a safe, trusted, fair, innovative and competitive space.

Find out more 

The full working paper dives deeper into the world of MFMs. It explores key questions alongside a comprehensive analysis of the potential impacts and the regulatory frameworks that can address them, and examines the latest domestic and international developments in this rapidly evolving field.

Note: drafting of this paper was finalised in August 2024.

For more information about the regulatory remit and activities of each DP-REG member, visit the following websites:

  • Australian Competition and Consumer Commission (ACCC)
  • Australian Communications and Media Authority (ACMA)
  • eSafety Commissioner (eSafety)
  • Office of the Australian Information Commissioner (OAIC)

This document has been prepared by the ACCC, ACMA, eSafety Commissioner and OAIC (member regulators) in their capacity as members of the Digital Platform Regulators Forum (DP-REG). The information in this publication is for general guidance only. It does not constitute legal or other professional advice, and should not be relied on as a statement of the law in any jurisdiction. Because it is intended only as a general guide, it may contain generalisations. You should obtain professional advice if you have any specific concern. 

The member regulators have made every reasonable effort to provide current and accurate information, but do not make any guarantees regarding the accuracy, currency or completeness of that information. 

Parties who wish to re-publish or otherwise use the information in this publication must check this information for currency and accuracy prior to publication. This should be done prior to each publication edition, as member regulator guidance and relevant transitional legislation frequently change. Any queries should be addressed to DP-REG@esafety.gov.au.