By Leslie Veloz and Jennifer Ruehr
The Hong Kong Office of the Privacy Commissioner for Personal Data (“PCPD”) recently published its Checklist on Guidelines for the Use of Generative AI by Employees (“Checklist”). The goal of the Checklist is to help organizations draft internal policies and procedures governing employee use of generative AI (“GenAI”) tools, especially where GenAI is used to process personal data.
The Checklist recommends topics that should be covered in internal company policies and offers practical tips on supporting employee use of GenAI, as explained further below.
Internal GenAI Policies
The PCPD recommends organizations implement GenAI Policies to address the following topics:
Permissible Use of GenAI.
Describe the GenAI tools that can be used internally (e.g. publicly available, licensed, or internally developed).
Describe how employees can use GenAI, and the applicability of relevant policies or guidelines.
Data Privacy Protections.
Provide clear guidance on the types and amounts of data that can (or cannot) be input into GenAI tools, and how output data can or cannot be used.
Describe use cases in which privacy-protective measures (e.g. anonymization) should be applied to output data.
Provide data security requirements for output data generated by GenAI that align with the organization’s existing policies.
Review other internal policies (e.g., data retention, personal data handling, information security, etc.) to decide if updates are necessary.
Lawful and Ethical Use, including Prevention of Bias.
Emphasize that employees must verify the accuracy of generated outputs, report biased or discriminatory outputs, label outputs, and refrain from using GenAI tools for unlawful or harmful activities.
In addition to the above, the PCPD recommends that organizations provide employees with specific security guidance on the use of GenAI tools, addressing permitted devices and users, robust user credentials, and security settings, as well as an AI Incident Response Plan. Internal policies should also describe the possible consequences of violating these policies.
Practical Tips
To address these recommendations, companies should:
Enhance Policy Transparency. Routinely communicate updates to GenAI policies or guidelines, ensuring that employees understand how and when they can use GenAI tools.
Conduct Employee Training. Train employees on how to use GenAI tools effectively and responsibly, including explaining the tools’ capabilities and limitations.
Develop AI Support Teams. Establish an AI support team to assist employees using GenAI tools in their work, provide technical assistance, and address general AI-related employee concerns.
Implement Feedback Mechanisms. Create channels or processes for employees to provide feedback on the use of AI, so that governance teams can improve and further tailor applicable AI policies.
In addition to publishing the Checklist, the PCPD also launched an “AI Security Hotline” and a “Data Security Training Series for SMEs,” aimed at assisting organizations in driving high-quality AI development, expanding the diverse application of AI, and complying with the requirements of the Personal Data (Privacy) Ordinance (“PDPO”).
Hintze Law PLLC is a Chambers-ranked and Legal 500-recognized boutique law firm that provides counseling exclusively on global privacy, data security, and AI law. Its attorneys and data consultants support technology, ecommerce, advertising, media, retail, healthcare, and mobile companies, organizations, and industry associations in all aspects of privacy, data security, and AI law.
Leslie Veloz is an Associate at Hintze Law PLLC offering clients pragmatic and result-driven legal counsel for establishing, maintaining, and maturing effective privacy programs.
Jennifer Ruehr is Co-Managing Partner at Hintze Law PLLC and co-chair of the firm’s Workplace Privacy Group, Cybersecurity and Breach Response Group, and the Artificial Intelligence and Machine Learning Group.