How Private AI Can Help to Comply with Thailand’s PDPA


Thailand’s Personal Data Protection Act (PDPA) was signed into law in 2019 and came into force in mid-2022. It is in many ways inspired by Europe’s General Data Protection Regulation (GDPR), with some notable differences: for example, it provides for potential prison time in addition to criminal and administrative fines, and a broader social-good carve-out from the consent requirement when processing sensitive information.

Similarities include the distinction between data controllers and processors, various legal bases for data processing such as consent and legitimate interest, data breach notification requirements, and obligatory security measures.

This article draws out the data protection measures required under the PDPA and supplemental legislation and how Private AI’s privacy-enhancing technology can aid compliance. We focus on data minimization, cross-border data transfers, and breach prevention and reporting measures.

Additional Guidance 

The first thing to note is that the Act establishes a Personal Data Protection Committee (PDPC), which is tasked with providing more detailed guidance addressing, for example:

  • the protection of personal data with which the Data Controller and the Data Processor shall comply, 
  • risk assessment and notification of personal data breach,
  • criteria for providing protection of personal data sent to a foreign country, and
  • rules for the deletion or anonymization of personal data.

At the time of writing, the PDPC has announced 18 distinct secondary pieces of legislation and four guidelines clarifying different aspects of the Act. Still missing is guidance on the deletion and anonymization of personal data.

Data Minimization

Section 22 requires that the collection of personal data be limited to the extent necessary in relation to the lawful purpose of the data controller. In other words, a data controller must limit collection to the personal data required to achieve the processing purpose it determined at the outset, which must also have been communicated to the individual prior to collection.

The Act and the supplemental legislation and guidance are otherwise silent on precisely what the data controller must do to achieve data minimization. For example, there is no requirement to pseudonymize personal data; in fact, the PDPA does not mention this concept. The same is true for de-identification of data, and anonymization is merely mentioned as an alternative to deletion of data, without any details of what is required for data to be anonymized.

Yet, the data minimization principle stands. Considering the uncertainty of what the exact requirements are, taking recourse to global standards and industry best practices can help. Global standards propose de-identification of personal data as a helpful method to protect the data and to comply with data minimization requirements. The National Institute of Standards and Technology, for example, defines de-identification as a “general term for any process of removing the association between a set of identifying data and the data subject.” ISO/IEC 27559:2022(E), the Privacy enhancing data de-identification framework, provides clear guidance on how to achieve this.

Private AI’s machine-learning models are trained to recognize over 50 different types of personal data entities across 53 languages, including sensitive information categories as listed in Section 26 PDPA (e.g., ethnic origin and health information). This capability allows organizations to accurately identify the personal information they process, whether in structured, semi-structured, or unstructured form. Subsequently, Private AI can replace these data with placeholders or synthetic data resembling the original. Fine-grained selection of each data point that is not required for individual use cases facilitates compliance with the data minimization principle, which mandates an assessment of the data necessary for each processing purpose. 
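Placeholder-based redaction of detected entities can be sketched as follows. This is a minimal illustration, not Private AI’s actual API: the entity spans are hard-coded stand-ins for what a detection model would return.

```python
def redact(text, entities):
    """Replace detected personal-data spans with typed, numbered placeholders.

    `entities` is a list of (start, end, label) tuples sorted by start offset,
    as a detection model might produce them.
    """
    out, cursor, counts = [], 0, {}
    for start, end, label in entities:
        counts[label] = counts.get(label, 0) + 1
        out.append(text[cursor:start])          # keep text before the entity
        out.append(f"[{label}_{counts[label]}]")  # substitute a placeholder
        cursor = end
    out.append(text[cursor:])                   # keep the remainder
    return "".join(out)

text = "Somsak Jaidee was admitted to Bangkok Hospital on 2 May."
entities = [(0, 13, "NAME"), (30, 46, "LOCATION"), (50, 55, "DATE")]
print(redact(text, entities))
# [NAME_1] was admitted to [LOCATION_1] on [DATE_1].
```

Numbered placeholders preserve which mentions refer to the same entity, which keeps redacted text useful for downstream analytics while stripping the identifying content.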

Even though there is not yet any guidance on what is required for data anonymization as an alternative to deletion, a first step is always the removal of direct identifiers, which Private AI’s technology is built for. 

Cross-Border Data Transfers

For cross-border data transfers, the secondary legislation largely just restates what the law already says, but importantly adds that if a controller is unsure whether the recipient jurisdiction provides adequate data protection, a prerequisite for the transfer, it must notify the PDPC which can decide that the transfer must not take place. In doing so, the PDPC is required to consider standards for international data transfers in accordance with regulations the PDPC itself prescribes. 

Compared to the GDPR, this is considerably stricter. Under Section 28 of the PDPA, if the destination country does not have adequate data protection standards and the Committee finds the standards insufficient, the transfer may be blocked entirely. The law does not provide mechanisms like the GDPR’s Binding Corporate Rules (BCRs) or Standard Contractual Clauses (SCCs) as fallback options to guarantee the lawfulness of the transfer. Essentially, even if an organization has strong internal safeguards in place, such as contractual obligations or internal rules, the PDPC can prohibit the transfer if the external standards in the destination country are inadequate. This gives the PDPC the ultimate say, and controllers must comply with its decision without clear alternatives beyond exceptions like consent or contractual necessity.

Section 29 provides an exemption for cross-border data transfers within affiliated businesses or groups of undertakings. If a Data Controller or Data Processor in Thailand implements a Personal Data protection policy for transferring data within the same group, and that policy is reviewed and certified by the Office (the regulatory authority), they can transfer Personal Data abroad without needing to comply with the stricter requirements of Section 28. This allows for internal transfers between group entities across borders as long as the certified policy is followed.

Private AI’s technology may make it possible to share data across borders without triggering these transfer restrictions, provided the personal identifiers contained in the data are not required by the recipient. Removing personal identifiers may render personal data no longer identifiable, meaning it should fall outside the scope of the PDPA, which defines personal data as “any information relating to a Person, which enables the identification of such Person, whether directly or indirectly, but not including the information of the deceased Persons in particular.” Whether data can identify an individual is highly context-specific, and an expert should be consulted to determine whether re-identification remains possible after certain personal identifiers are removed.
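One simple heuristic such an assessment might start from is counting how many records share each combination of quasi-identifiers, in the spirit of k-anonymity. The sketch below uses illustrative field names; a real identifiability analysis considers far more than this one measure.

```python
from collections import Counter

def min_group_size(records, quasi_identifiers):
    """Smallest number of records sharing a quasi-identifier combination.

    A minimum of 1 means at least one individual is singled out by the
    remaining fields, so the data may still be indirectly identifying.
    """
    counts = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return min(counts.values())

records = [
    {"age_band": "30-39", "province": "Bangkok", "diagnosis": "flu"},
    {"age_band": "30-39", "province": "Bangkok", "diagnosis": "asthma"},
    {"age_band": "40-49", "province": "Phuket",  "diagnosis": "flu"},
]
print(min_group_size(records, ["age_band", "province"]))  # 1: the Phuket record stands alone
```

A low minimum signals that removing direct identifiers alone was not enough and that further generalization or suppression may be needed before data can be treated as non-identifiable.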

Security Measures and Breach Reporting

Personal data controllers are required to implement strict security measures to protect personal data from unauthorized access, use, alteration, modification, or disclosure. The 2022 announcements from the PDPC outline key security requirements that controllers and processors must meet, covering all formats of personal data—whether in document, electronic, or other forms. These measures must include both organizational and technical controls, such as access controls, identity proofing, and data integrity management, all tailored to the specific risk levels based on the type and purpose of data collection, use, and disclosure.

The PDPA mandates that security measures be implemented across the entire data lifecycle, from identifying risks to responding to and recovering from breaches. This includes ongoing monitoring, review, and updates as technology evolves. Personal data controllers must also ensure that their processors implement comparable security measures, ensuring that any third-party handling of data remains compliant with the minimum standards. Processors, in turn, are required to keep detailed processing records for each activity they undertake for the controller, for the Committee to review upon request.

Private AI’s solutions can significantly aid organizations in achieving compliance with these stringent security requirements. By using its advanced machine learning capabilities, Private AI helps organizations protect personal data by identifying and redacting sensitive information in real-time, even within unstructured data formats like text documents, emails, and images. This preemptive approach minimizes the volume of sensitive personal data stored and processed, reducing the risk of breaches. Additionally, Private AI’s technology can enhance data access controls by redacting specific data points that are not necessary for business operations, reducing exposure to unauthorized access. Data loss prevention tools can be enhanced by using Private AI’s contextually aware methods of detection as opposed to common regexes-based approaches.
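The limitation of regex-based detection can be illustrated with a short sketch. The pattern below (a hypothetical rule for 13-digit Thai-style ID numbers) catches well-formed numbers but has no way to recognize contextual identifiers such as names, which is where model-based detection earns its keep.

```python
import re

# Naive regex-based scan: flags 13-digit national-ID-like numbers,
# but cannot flag a name like "Somsak Jaidee" as personal data.
ID_PATTERN = re.compile(r"\b\d{13}\b")

text = "Somsak Jaidee, ID 1234567890123, called about his claim."
matches = ID_PATTERN.findall(text)
print(matches)  # ['1234567890123'] -- the name goes undetected
```

A contextual model, by contrast, classifies spans based on surrounding language rather than surface form, so it can catch names, addresses, and health details that no fixed pattern describes.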

When a breach does occur, organizations are required to report it to the PDPC within 72 hours of becoming aware of it, unless the breach poses no risk to individuals’ rights and freedoms. The breach report must include the nature of the incident, the affected individuals, the types of data compromised, and the remedial actions taken. Affected individuals must also be notified if the breach poses a high risk to their rights and freedoms.

Private AI’s redaction solution can play a crucial role in helping organizations comply with these breach reporting requirements. By automatically detecting and redacting over 50 types of personal information across multiple languages and formats, Private AI ensures that organizations do not retain more personal data than necessary, reducing the potential for breaches in the first place. In cases where retention of personal data is unavoidable, Private AI can assist with incident reporting by accurately identifying the types of personal data affected in a breach. This aids organizations in compiling comprehensive and accurate reports to the PDPC, while also helping them assess the severity of the breach and whether reporting is required. 

Conclusion

Thailand’s PDPA and additional guidance by the PDPC establish strict data protection rules, including data minimization, security measures, and breach reporting inspired by the GDPR. Private AI helps organizations meet these requirements by detecting and redacting sensitive data, reducing exposure, and assisting with accurate breach reporting. 

To see the tech in action, try our web demo, or get an API key to try it yourself on your own data.
