In effect from: 1 July 2025

Agencies must handle personal information in accordance with the privacy principle requirements in the Information Privacy Act 2009 (Qld) (IP Act), including the Queensland Privacy Principles (QPPs).

This page explains how the privacy principle requirements in the IP Act apply to an agency’s use of software, data, computer systems and platforms, the internet and electronic devices.

Camera surveillance

Agencies that use cameras to collect personal information must comply with QPP 3 and QPP 5. Before using the cameras, agencies should consider:

  • Is there a clearly articulated business case for establishing or extending a camera surveillance system, i.e. what operational purpose do the cameras serve and is that purpose part of the agency’s responsibilities?
  • Will the cameras serve the purpose, e.g. if they are intended to detect and prosecute illegal dumping, will the type of camera, the quality of its recordings, and its location achieve that goal?
  • Is there an alternative to using cameras? Would live monitoring of the cameras serve the purpose without recording what the cameras capture?
  • Has a Privacy Impact Assessment been conducted?
  • Is use of the cameras lawful and fair?
  • Will the cameras record sensitive information?

What to tell people about the cameras

QPP 5 requires agencies to take reasonable steps to inform individuals of a range of relevant matters specified in QPP 5.2 when they collect personal information.

For CCTV or other fixed surveillance, the best way to comply with QPP 5 is by placing a sign near the cameras’ location. Detailed information about the QPP 5 matters can also be included on the agency’s website and/or in its QPP 1 privacy policy. Depending on the location and purpose of the cameras, agencies may prefer to place minimal, simple information on the sign along with details of where people can find out more information.

Agencies that use body worn cameras will need to consider the best way to comply with QPP 5, as signage will rarely be a suitable option. Alternatives could include, for example, a verbal statement by the officers wearing the cameras or a pre-printed pamphlet or card the officers can hand out.

There are exceptions in the IP Act for law enforcement agencies and activities. For example, an agency conducting covert surveillance does not need to advise the surveilled individuals of the QPP 5 matters. However, these agencies and activities are not automatically exempt from the QPP 5 requirements.

Securing camera recordings

Security measures are not one size fits all and must be adapted to suit the nature and sensitivity of the recording and the method by which it is collected, e.g., recorded to the camera's hard drive or transmitted wirelessly to the agency.

Security measures could include:

  • Access controls for monitoring rooms and data storage areas.
  • Ensuring the public and unauthorised agency officers cannot see or access screens on which the recordings are displayed.
  • Access and audit controls for viewing, accessing, or copying recordings.
  • Encryption of wireless recordings in transit and encryption or access controls on portable recordings, e.g. on body worn camera hard drives.

Agencies should also establish policies about access to and management of recordings, including:

  • limiting as-of-right access to officers who require it as part of their duties
  • protocols for requesting access, including procedures for disclosing recordings, e.g., to a law enforcement agency under QPP 6
  • audit protocols for monitoring and recording who accesses the recordings and when; and
  • protocols for disposal of recordings in compliance with the relevant Retention and Disposal Schedule.
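The audit protocol above can be sketched as a tamper-evident access log, where each entry is chained to a hash of the one before it so that later alteration of any entry is detectable. The field names and the hash-chain design below are illustrative assumptions, not a prescribed format:

```python
import hashlib
import json
from datetime import datetime, timezone

def append_access_event(log, officer_id, recording_id, action):
    """Append a tamper-evident entry recording who accessed a recording and when."""
    prev_hash = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "officer_id": officer_id,      # illustrative field names
        "recording_id": recording_id,
        "action": action,              # e.g. "view", "copy", "disclose"
        "prev_hash": prev_hash,
    }
    # Hash the entry together with the previous hash so any later edit
    # to an earlier entry invalidates every hash that follows it.
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry

def verify_log(log):
    """Recompute the hash chain; returns False if any entry was altered."""
    prev = "0" * 64
    for entry in log:
        if entry["prev_hash"] != prev:
            return False
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        if hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest() != entry["entry_hash"]:
            return False
        prev = entry["entry_hash"]
    return True
```

An auditor can run the verification periodically; altering or deleting an earlier entry breaks every hash after it.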

Using and disclosing the recordings

Under QPP 6, agencies can use or disclose personal information for the purpose it was collected or for any of the listed secondary purposes.

Any internal request to use recordings or external request to disclose recordings for a secondary purpose must comply with QPP 6.

Cloud computing

The phrase 'cloud computing' is simply a shorthand term for moving functions from an agency-owned computer or server to a server owned by someone else which is accessed online. Microsoft 365, which allows programs such as Teams, Word and Excel to be used through a browser, is an example of cloud computing.

Computing power, storage space, applications and programs may all be outsourced to 'the cloud', i.e. a remote provider whose services are accessed via the internet.

Contracted service provider requirements

In some circumstances an agency will have to take reasonable steps to bind a contracted service provider to comply with the privacy principles as if they were an agency. This obligation generally arises when, as part of the service agreement, personal information will travel between the agency and the contractor.

An agency planning to move to a cloud-based service may need to negotiate an alternative or additions to the cloud provider's standard terms and conditions in order to meet this obligation. A failure to take these reasonable steps may make the agency liable for any privacy breach by the cloud provider.

Disclosure out of Australia rules

Agencies must comply with section 33, which only permits personal information to be disclosed out of Australia in specific circumstances. Mere movement of personal information out of the country is not, by itself, a disclosure; the agency must also cease to have control over the personal information once it has left Australia.

Agencies should be aware of where a cloud provider operates and whether the terms and conditions of the agreement mean the agency will cease to have control over information stored by the cloud provider. If the provider's servers are located overseas and the agency will lose control of information transferred to those servers, the agency must ensure that it complies with section 33.

Protection and security

QPP 11 requires agencies to protect personal information against misuse, interference, loss, and unauthorised access, modification, or disclosure.

Agencies will need to consider the security a cloud provider will apply to their information and whether this complies with QPP 11. Agencies should also consider whether the agreement with any cloud computing service provider obliges, or should oblige, the provider to notify the agency if security is breached.

Access and amendment rights

QPPs 12 and 13 and the Right to Information Act 2009 (Qld) give individuals the right to access and amend their personal information. Agency information which is stored with a cloud provider will ordinarily be subject to these rights.

Agencies must ensure information stored in the cloud is not overlooked when searches are being undertaken to locate information relevant to an access or amendment request.

Use or disclosure

If an agency's agreement with a cloud provider allows the agency to retain control over its information, then the transfer of information from the agency computer to the cloud provider will be a use and not a disclosure.

However, if the agreement does not allow the agency to retain control over the information, or it allows the cloud provider to access the information for its own purposes, e.g., it permits scanning of the information for marketing purposes, transfer/storage of information will be a disclosure.

Agencies must ensure that the movement of personal information to a cloud provider complies with QPP 6.

Mandatory notification of a data breach

Agencies have mandatory notification of data breach (MNDB) obligations under chapter 3A of the IP Act. These obligations apply to personal information contained in documents held by an agency. Personal information stored by an agency on a cloud computing service will generally remain under the agency’s control, and will therefore be subject to the MNDB obligations in the event of a breach. When agency information is stored on a cloud provider's systems, the agency is generally reliant on the cloud provider to advise it of a security or data breach.

Agencies should consider including a mandatory breach notification clause in all agreements with cloud providers. This will oblige the cloud provider to tell the agency if there has been an incident which may have impacted on the security of the agency’s data. This will allow the agency to take steps to minimise the negative impacts of such a breach and meet the MNDB obligations.

Lawful access in other countries

If a cloud provider or its hardware is located in a country outside of Australia, an agency's information may be subject to the laws of that country. For example, information stored on a server physically located in the United States of America may be subject to the Patriot Act, which allows broad access by the government to data located in the country. An agency planning to use a cloud provider located in another country should consider the impact of any such laws on their information.

Generative AI and Microsoft Copilot

Note: this section focuses on Microsoft Copilot because it is integrated into the Microsoft products and services used extensively across the Queensland public sector. The issues discussed apply to any generative AI system used by Queensland government agencies.

Generative AI is the common term used to describe computer systems which generate content in response to user prompts. A user prompt is the information the user inputs into the generative AI system, eg a question or request to carry out a task, which prompts the system to generate output.

Microsoft Copilot and Microsoft 365 Copilot are generative AI systems. Microsoft 365 Copilot (M365 Copilot) is integrated into the Microsoft 365 environment and is available in Word, Excel, PowerPoint, Outlook and other applications.

Microsoft Copilot is similar to M365 Copilot, but instead of being integrated into Microsoft 365, it is available on the web and through the Edge browser, as well as being embedded in Windows.

Microsoft Copilot and M365 Copilot will be referred to collectively on this page as Copilot, unless there is a need to distinguish between them.

Copilot has the potential to be a useful tool, but its potential utility comes with privacy and security risks which must be managed.

Copilot and privacy

Agencies are responsible for their use of Copilot or other generative AI systems. If an agency's use of Copilot results in a privacy or information security incident, the agency cannot avoid responsibility by claiming the incident was 'caused by the generative AI'.

Any use of Copilot that involves personal information, whether as user input or generated output, must comply with the privacy principle requirements, including those governing the collection, accuracy, security, use, and disclosure (including overseas disclosure) of personal information.

Agencies should not use personal information in Copilot or other generative AI systems simply because those systems are available. They should only be used where they are the best way to address an identified problem.

Incorrect information

Generation of incorrect data, commonly referred to as ‘hallucinations’, is a known risk of using generative AI systems, which may be trained using inaccurate information.

The definition of personal information in the IP Act does not require it to be correct, which means the privacy principles apply regardless of its accuracy.

Collection

An agency collects personal information when it acquires it directly from the individual or indirectly from another source. If an agency uses Copilot to generate personal information that did not previously exist, for example by asking it to infer something about an individual based on existing data, it has collected personal information about that individual.

Collection of personal information in the context of Copilot or other generative AI systems must comply with QPP 3.

Complying with QPP 3 in the context of Copilot may present practical difficulties, as it is not possible to predict exactly what personal information Copilot will generate in response to a user prompt.

Agencies should consider the best way to limit the potential breadth of Copilot's response when crafting user prompts. Where Copilot generates personal information which is unnecessary for, or unrelated to, the intended purpose, agencies should consider whether the information can be discarded, subject to applicable public records obligations.

QPP 3 requires collection to be fair. Using Copilot to generate or infer personal information is inherently less fair than collecting it directly from an individual or indirectly from another entity who collected it from the individual.

For directly collected personal information, the individual would generally have been aware of its collection and had input into what they provided. Where an agency uses Copilot to generate or infer personal information, the individual does not know the information has been collected, has no control over what Copilot generates or infers, and has no opportunity to provide corrections or context, or to challenge incorrect information.

This does not mean that collection of personal information using Copilot will automatically be unfair, but agencies need to carefully consider the possibility.

Notification of collection

Under QPP 5, when an agency collects personal information, it must take the steps (if any) that are reasonable in the circumstances to make the individual aware of the relevant matters in QPP 5.2. This obligation applies whether the agency collects information directly from the individual or from someone else, including collection via Copilot generation.

Quality

Under QPP 10, agencies must take reasonable steps to ensure the personal information they collect, use, and disclose is accurate, up to date and complete.

In the context of Copilot or other generative AI systems, this will require an awareness of the potential for generative AI systems to produce incorrect or irrelevant data, an assessment of the risk that it will occur, and crafting user prompts which limit the possibility of incorrect or irrelevant data being returned.

Content generated by generative AI systems like Copilot must be carefully assessed to determine its accuracy and relevance before it is used, eg relied on to make a decision in relation to an individual, or disclosed.

Use and disclosure

Under QPP 6, personal information can only be used or disclosed for the purpose it was collected or for one of the permitted secondary purposes. Use and disclosure are defined in the IP Act, with use having a very broad definition, meaning most things an agency does with personal information will be a use. This includes inputting personal information into Copilot or other generative AI systems as part of a user prompt or asking them to generate or infer personal information about an individual.

Personal information can only be disclosed out of Australia if the disclosure complies with section 33 of the IP Act as well as QPP 6.

Agencies must ensure they comply with QPP 6 and section 33 when using Copilot.

Security of personal information

Under QPP 11, agencies must take reasonable steps to protect the personal information they hold from misuse, interference and loss, and from unauthorised access, modification or disclosure.

M365 Copilot is integrated into Microsoft 365. If an agency's Microsoft 365 environment complies with the IP Act's disclosure and security requirements, the use of M365 Copilot will generally reflect this compliance.

However, there may be a risk of uncontrolled or unauthorised data exposure, including by way of overseas disclosure, depending on the system configuration and which plugins are enabled. Agencies should refer to the Controlling the exposure of data with M365 copilot and Microsoft copilot in Queensland Government guideline for guidance on mitigating the possibility of uncontrolled or unauthorised data exposure.

Agencies could also consider disabling Copilot entirely.

Access and Correction

QPPs 12 and 13 give individuals the right to access and correct their personal information. This includes personal information which is part of a user prompt and personal information generated or inferred by Copilot or other generative AI systems.

Transparency

Under QPP 1, agencies have transparency obligations in relation to the personal information they collect, hold, and use. If an agency is routinely using Copilot or other generative AI systems, particularly where they are being used to make or inform decisions, they should consider including details in their QPP privacy policy or other transparency documents.

Agencies may also be subject to other transparency obligations in relation to the use of generative AI systems. For example, the Use of generative AI for government information sheet requires content produced using generative AI tools to be clearly identified as such.

Privacy Impact Assessments

Any agency considering adopting Copilot or other generative AI systems as part of the agency's day to day practice should also consider conducting a Privacy Impact Assessment (PIA).

A PIA will assist an agency understand and evaluate the potential privacy impacts of adopting such a system and enable the identification and mitigation of potential privacy risks.

Additional considerations - Human Rights and Anti-Discrimination

All generative AI systems have the potential to produce biased, discriminatory, or otherwise harmful materials. They reflect the cultural, economic, and social biases of the source materials they were trained on. The algorithms these systems use to parse and process content can also be a source of bias. The risks and dangers of algorithmic bias were a major focus of the Australian Human Rights Commission’s Human Rights and Technology Report in 2021.

Agencies should consider their obligations under the Human Rights Act 2019 (Qld) (HR Act) and the Anti-Discrimination Act 1991 (Qld) when using Copilot or other generative AI systems.

Agencies should also consider the HR Act, which contains a right to privacy, when assessing the privacy impacts of Copilot or other generative AI systems.

Data analytics

Data analytics is the process of examining data sets in order to draw conclusions about the information they contain. It involves analysing existing datasets to extract new insights about patterns, relationships, and connections.

When an agency wants to use data analytics on data that includes personal information, regardless of whether that personal information was collected by the agency or not, there can be significant privacy challenges.

De-identified data

The privacy principles only apply to personal information, which is information that can be linked to an identifiable individual. If the information can be de-identified, or broken down into aggregated unidentified data, such as statistics, then it will no longer be personal information and the privacy principles will not apply.

Agencies considering data analytics projects that use data containing personal information may want to consider whether de-identified data could be used. De-identifying personal information enables it to be used, shared, or made publicly available without the agency having to consider compliance with the privacy principles.

It is important to note that de-identification is a risk-management exercise, not an exact science. De-identified datasets always carry the risk of re-identification. Datasets that don’t contain obvious personal information could be linked with additional datasets, or subjected to deeper analysis, from which personal information could be re-identified.

It is recommended that agencies seek specialist expertise when undertaking a de-identification exercise, particularly if the de-identified information is to be made public.
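As a minimal illustration of aggregation-based de-identification, the sketch below reduces individual-level records to group counts and suppresses small cells, since a published count of one or two can effectively single a person out. The dataset, field names, and the threshold of five are all illustrative assumptions; real projects should involve the specialist expertise recommended above.

```python
from collections import Counter

def aggregate_and_suppress(records, group_key, min_cell_size=5):
    """Reduce individual-level records to group counts, suppressing any
    group so small that publishing its count could identify its members."""
    counts = Counter(r[group_key] for r in records)
    return {
        group: (count if count >= min_cell_size else "<%d" % min_cell_size)
        for group, count in counts.items()
    }

# Hypothetical individual-level records.
records = [
    {"suburb": "Cairns", "service": "housing"},
    {"suburb": "Cairns", "service": "housing"},
    {"suburb": "Cairns", "service": "housing"},
    {"suburb": "Cairns", "service": "housing"},
    {"suburb": "Cairns", "service": "housing"},
    # A published count of 1 for Mackay would single this person out,
    # so the cell is suppressed rather than released.
    {"suburb": "Mackay", "service": "housing"},
]

stats = aggregate_and_suppress(records, "suburb")
```

Suppression of small cells reduces, but does not eliminate, re-identification risk: linkage with other datasets can still narrow a suppressed cell down.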

For some data analytics activities, however, de-identified data may not be suitable. Where datasets contain personal information, agencies must comply with the privacy principles.

Using and disclosing personal information

QPP 6.2(g) allows agencies to use or disclose personal information for research or the compilation or analysis of statistics in the public interest.

Additionally, under QPP 6.2 and schedule 4, part 2 of the IP Act, health agencies can use health information without the consent of the individual for research, or the compilation or analysis of statistics, relevant to public health or public safety.

QPP 6 also allows personal information to be used or disclosed with the individual's consent, or for a related purpose (a directly related purpose for sensitive information) that the individual would reasonably expect.

Limits of authority

The IP Act operates subject to the provisions of other Acts relating to the use or disclosure of personal information, such as the Child Protection Act 1999 or the Hospital and Health Boards Act 2011. The QPPs do not override legislation that prohibits use or disclosure.

An agency intending to use or disclose data containing personal information must consider whether there are any legislative provisions that would impact or prohibit the use or disclosure.

Informing people about data analytics activities

QPP 5 requires an agency which collects personal information to take reasonable steps to inform the individual of the relevant matters listed in QPP 5.2. QPP 5 applies regardless of how the agency acquired the information.

If the agency knows when it collects personal information that it may use it for data analytics, it should include that purpose in the QPP 5 matters. However, this purpose should not be included as a matter of course.

Outsourcing data analytics

If an agency is considering outsourcing data analytics activities that involve personal information, it must take all reasonable steps to ensure the contracted service provider is bound to comply with the privacy principles.

If the contracting agency does not take all reasonable steps to bind the contracted service provider, the contracting agency will be responsible for any breach of privacy arising from the actions of the contracted service provider.

Security of personal information

QPP 11 requires agencies to protect personal information from misuse, interference and loss, and from unauthorised access, modification or disclosure. In most cases, agencies will already have safeguards in place to appropriately protect the personal information they hold. The same security considerations should apply to analytical outputs, which may be variations on the personal information the agency already holds.

Access to agency personal information holdings for data analytics purposes should be limited both to those who have a business need to do so, and to the specific information required.

Assessing the privacy risks

A privacy impact assessment (PIA) is an assessment tool that maps the data flows involved in a project to make sure that data can be collected, used, processed, stored and shared in line with an agency’s privacy principle obligations.

Drones

Drones are playing an increasing role in government service delivery, including for law enforcement, emergency and disaster management, infrastructure inspections and environmental monitoring.

Not all information collected by a drone will be personal information, but agencies that use drones which will, or are likely to, collect personal information must ensure they comply with QPP 3 and QPP 5.

A drone may collect personal information, for example, if it records audio or video of individuals:

  • An individual’s image or voice is unique to that particular individual. Whether a recording of an individual’s image or voice could reasonably identify that individual will depend on the quality of the recording. Quality is determined by factors including the image size and resolution, the person’s position relative to the camera, and the degree to which the individual’s face or other identifying characteristics are visible.
  • What an individual was doing, where they were at a particular time or what they said is clearly information about the individual as it reveals a fact or opinion about them.

Lawful collection

QPP 3 requires personal information to be collected lawfully. Agencies must ensure they comply with any relevant laws when collecting information by drone, for example:

  • Civil Aviation Safety Regulations 1988 (Cth)
  • Invasion of Privacy Act 1971 (Qld) in relation to audio recording of private conversations
  • section 227A of the Criminal Code 1899 (Qld) concerning observations or recordings in circumstances where a reasonable adult would expect to be afforded privacy
  • Maritime Safety Queensland and Australian Maritime Safety Authority requirements for underwater drones; and/or
  • Major Events Act 2014 (Qld) in relation to operating an aircraft above a major event area.

Security of personal information

Drones collect information in one of two ways:

  • recordings are stored on-board (for example, on a memory card or hard drive); or
  • recordings are transmitted back to a central device where they are then stored.

Both methods have vulnerabilities. If a drone with on-board storage is lost or captured by an unauthorised third party, so too is any information it carries. If the drone transmits information over a wireless connection, that connection can be intercepted and used to access or modify the information in transit. Adequate safeguards such as password protection and encryption should be used to address these vulnerabilities and meet an agency’s QPP 11 obligations.
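One safeguard against modification of wirelessly transmitted recordings is to send an authentication tag with each frame, which the ground station verifies on receipt. The sketch below uses an HMAC purely for illustration; it detects tampering but does not stop an interceptor reading the frames, so a real deployment would pair it with an encrypted, authenticated channel such as TLS.

```python
import hmac
import hashlib

# Shared secret between drone and ground station (illustrative only; key
# management in a real deployment would be handled by the secure channel).
KEY = b"per-deployment secret key"

def tag(payload: bytes) -> str:
    """Compute an authentication tag to send alongside the payload."""
    return hmac.new(KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, received_tag: str) -> bool:
    """Return False if the payload was modified in transit."""
    return hmac.compare_digest(tag(payload), received_tag)

frame = b"\x00\x01video-frame-bytes"
t = tag(frame)
assert verify(frame, t)                   # unmodified frame verifies
assert not verify(frame + b"tamper", t)   # any modification is detected
```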

Other safeguards could include:

  • limiting the staff who can access stored recordings to those who ‘need to know’
  • maintaining an audit log of who accesses stored recordings and when; and
  • establishing clear protocols for responding to requests for access to, or copies of, recordings (for example, who has authority to release recordings in response to a request from a law enforcement agency).

IP addresses

An IP address (internet protocol address) is a string of numbers separated by full stops (for example, 210.8.42.131) which identifies a specific piece of equipment, usually a computer, on the internet. IP addresses are generally assigned by an Internet Service Provider (ISP), either temporarily (a dynamic IP address) or permanently (a static IP address).

IP address locators

There are a number of IP address locator websites which will provide the name and geographical location of the entity to whom an IP address is registered. Because most internet users access the internet through an ISP or at their place of employment, which may use a third party ISP or act as its own ISP, the locator will reveal information about the ISP and not the individual internet user.

Are IP addresses personal information?

IP addresses are generally visible to any website visited by the internet user, and many websites will collect and store that IP address on a permanent or temporary basis. While any website may collect and hold IP addresses, generally only an ISP can link an IP address to the name of an individual account holder (State v. Reid (2008) 194 N.J. 386, 954 A.2d 503).

Because of this lack of ability to link an IP address to an identifiable individual, a number of authorities are of the view that an IP address in isolation is not personal information. See for example Canadian Federal Privacy Commissioner's findings in PIPEDA Case Summaries #2005-319 and #2009-010, the Irish High Court in EMI Records & ORS v Eircom Ltd [2010] IEHC 108 and the United States Court of Appeal in Klimas v Comcast Cable Communications Inc 465 F.3d 271.

The OIC supports the view that an IP address, in the absence of any information enabling it to be connected with an identifiable individual, is not personal information within the meaning of section 12 of the IP Act. However, if an IP address is linked to other information which would allow an individual to be reasonably identified, then it will become personal information and will be subject to the privacy principles, including the obligations limiting its disclosure out of Australia.
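Python's standard ipaddress module illustrates what an IP address alone reveals: its format and network properties, but nothing linking it to an account holder. The second address below is an assumed example from the private 192.168.0.0/16 range.

```python
import ipaddress

def describe(addr_text):
    """Classify an IP address string; raises ValueError if it is malformed."""
    addr = ipaddress.ip_address(addr_text)
    return {
        "version": addr.version,        # 4 or 6
        "is_private": addr.is_private,  # e.g. a home or office LAN range
        "is_global": addr.is_global,    # publicly routable on the internet
    }

print(describe("210.8.42.131"))  # the example address used above
print(describe("192.168.0.10"))  # a private-range address, never routed publicly
```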

Mobile apps

A mobile application or ‘mobile app’ is a software program designed to run on a smartphone, tablet computer or other mobile device. Mobile apps are increasingly being used as part of the government’s delivery of services to the community.

Does the IP Act apply to mobile apps?

Any system which involves the collection, storage, use or disclosure of personal information by a Queensland government agency is subject to the privacy principle requirements, including the Queensland Privacy Principles (QPPs) and the overseas disclosure rules in section 33.

Because mobile apps potentially capture information about their users, privacy must be taken into account in both the app’s design and the information provided to users.

Privacy challenges for mobile apps

Mobile app capabilities present unique challenges for privacy protection. Mobile apps have the potential to collect significant amounts of personal information about users, often without them being aware of the collection. Mobile apps may be able to access:

  • the user’s phone and email contacts
  • call logs
  • internet data
  • calendar data
  • data about the device’s location
  • the device’s unique IDs; and
  • information about how the user uses the app.

The scope of personal information which can potentially be collected, combined with the speed at which apps are developed and distributed, could result in the personal information of hundreds of thousands of users being collected in a short space of time.

Privacy considerations when developing mobile apps

Like any other project involving personal information, privacy should be included in the planning phase of an app’s development. It will also be an important consideration for the entire life cycle of the app.

Portable storage devices

Portable Storage Devices (PSDs) are small, lightweight, easily transportable devices capable of storing and transferring digital data. Common PSDs include removable devices such as USB thumb drives or flash drives, rewritable CD/DVDs, memory cards and external hard drives, and mobile devices with inbuilt storage such as tablets, laptops, and smartphones.

PSDs are capable of storing extremely large amounts of data. Because they are small, portable, and attractive to thieves, PSDs are susceptible to loss or theft. The potential damage arising from this risk increases if the PSD holds unsecured non-public data.

Misuse, loss or unauthorised access, use, modification or disclosure of personal information can also arise through:

  • a PSD infecting devices into which it is subsequently plugged with malware;
  • insecure disposal of, or deletion of information from, PSDs; and/or
  • unfettered access by third parties to the content of the PSD.

In addition, employees’ use of personal PSDs to access, transfer or store agency data may increase the likelihood of a privacy breach. For example:

  • the agency has less control over the use of security measures such as anti-virus and malware software, operating system and application updates and password, encryption and remote wipe capabilities
  • departing employees may accidentally take away personal PSDs containing agency information; and
  • unauthorised people are more likely to access personal PSDs at the employee's home, either inadvertently or by simply borrowing the device.

The damage arising out of a privacy breach involving a PSD increases where there is:

  • no classification of information which may and may not be transferred to a PSD
  • a lack of encryption or technical controls to protect data stored on the PSD
  • no obligation to report lost or stolen PSDs; and/or
  • a failure to promptly transfer agency records from the PSD to the agency network.

Managing the risk of using PSDs

A key strategy in minimising the risks of using PSDs is to develop and implement policies and procedures so that employees understand their obligations when using PSDs to access, store or transport agency data. Where possible, agencies should also use hardware and/or software controls to restrict or control the use of PSDs.

PSD policies and procedures should establish:

  • What types of PSDs are permitted and under what circumstances.
  • Whether personal PSDs are permitted and, if so, what conditions are placed on their use.
  • How the rules surrounding use of PSDs interact with remote access to the network.
  • Whether a central register of PSDs will be maintained and, if so, which approved devices must be registered.
  • What information may be transferred to a PSD, with details of any additional safeguards appropriate to the security classification or value of the information. Where PSDs will store personal information, agencies should require that it be encrypted.
  • What is considered acceptable use of PSDs for the transport and storage of agency data.
  • How to securely erase data from PSDs.
  • What to do with damaged or obsolete PSDs.
  • What to do in the case of a lost or stolen PSD or other suspected privacy breach.
  • What processes are in place to audit or monitor compliance with PSD policies and procedures.
  • Who employees can contact for advice on PSDs.

Social media

Social media are websites and apps that allow users to create and share content or to participate in social networking. Some of the more common social media platforms used by agencies are Facebook, YouTube, Instagram, and LinkedIn.

Agency social media plays an important part in agency engagement with the community; however, the nature of social media means its creation and use can have privacy implications. It is important that agencies build appropriate protections into their policies on social media use and the retention of records.

What is the purpose of the social media account?

The intended use of the social media account will impact the privacy precautions an agency has to take. For example, a Facebook account intended only to communicate news updates or emergency alerts which is set to disallow direct messages will require fewer privacy precautions than a Facebook account that allows people to ask questions and receive answers.

It is important that agencies define the purpose and limitations of the social media account as part of determining what steps must be taken to ensure it complies with the privacy principles. Agencies may want to consider a privacy impact assessment for social media accounts that are intended for more than just broadcasting information.

Social media policies

A social media policy that includes guidance on the handling and posting of personal information can be an important part of ensuring social media accounts are, and remain, privacy compliant.

Disclaimers and QPP 5 matters

When an agency collects personal information, it must take reasonable steps to make the individual aware of the relevant matters listed in QPP 5.2.

This obligation applies both when personal information is collected directly from the individual and when it is collected from someone else. It does not apply to unsolicited information; however, when a social media account allows people to submit information to it, the information they provide will generally be solicited.

Ideally, the QPP 5 matters will be posted on the social media account itself, but if the platform does not have sufficient space, it can be posted on the agency's website with a prominent link from the social media account. Agencies may also want to include this information in their privacy policies under a social media heading.

Agencies will also need to include a disclaimer addressing the overseas disclosure of personal information, e.g. one that ensures individuals interacting with the account understand that, by doing so, their personal information will be disclosed outside Australia.

Acceptable use policies: other people's personal information

An acceptable use policy, e.g. setting out what content will and will not be permitted, generally forms part of an agency's social media policy. Agencies may want to consider including in their acceptable use policy a request that people do not post the personal information of third parties.

Security of social media

Social media accounts should be secured with a strong password and only specifically authorised employees—familiar with the agency's privacy obligations and social media policies—should have access to those credentials. This will help ensure the agency complies with QPP 11.

The privacy and security settings of the social media account need to be set to a level appropriate to the purpose of the account. If the account is solely for one-way communication from the agency to the community, disabling direct or private messaging and/or disallowing comments on posts may be appropriate. However, different settings will be required for an account intended to facilitate two-way communication, e.g. answering questions and responding to issues.

Posting personal information to social media

Before posting personal information to social media, e.g. photos taken by agency officers at an event, the agency must ensure it is permitted to do so. This will generally require identifying what the individuals were told, and what they agreed to, when the photos were taken, or contacting them to ask for consent.

Agencies should develop a photo/image consent form that covers online publication and/or posting to social media, to be used when agency officers are taking photos or videos. This will ensure that the agency has the appropriate authorisation to use those images online.

Keep personal information to a minimum

Personal information published on social media can be harvested and reused by anyone. This can lead to annoyances, such as targeted marketing, or more damaging outcomes, such as identity theft, fraud, or harassment.

Even when an agency has authority to publish personal information to its social media account, it should limit that information to the minimum necessary to fulfil the purpose of the post.

Responding to social media enquiries

Many social media accounts are intended to provide a customer-centric platform through which people can interact with the agency and receive timely responses.

Given the immediacy of social media, and the general expectation that enquirers will receive a rapid response, social media activities should be conducted by staff who have relevant expertise. This includes knowledge of their agency's privacy obligations; care must be taken not to disclose personal information in breach of the privacy principles.

Enquiries which would require disclosing personal information, e.g. a comment asking for an update on the progress of the commenter's application, should not be answered, even if the message has been sent privately to the agency's account. This is because the agency has no way to verify that the person making the request is who they claim to be; even where a platform has a 'real name only' policy, that does not guarantee the identity of the enquirer. Enquirers should be advised to contact the agency in another way so their identity can be verified and the requested update provided.

Responding to more general enquiries, e.g. comments asking when the agency will complete a current project or about the opening hours of a pool, should not raise privacy issues, as answering them does not require the agency to confirm or disclose personal information beyond what the enquirer has already posted, e.g. their name or username.

Use and disclosure of information acquired through social media

Personal information collected through social media must be dealt with in accordance with QPP 6, the same as personal information collected through other channels.

Requests to access social media information

Social media records, including records of official business conducted through an employee's personal account, are documents of the agency. They may also be public records. Social media records can be applied for under the Right to Information Act 2009 (Qld) and may need to be retained in accordance with the Public Records Act 2023 (Qld).