On February 8, 2023, the International Organization for Standardization adopted privacy by design in ISO 31700:2023, a voluntary standard for organizations to implement in their operations. The adoption of this standard marks a further shift in the field of data privacy. After Article 25 of the GDPR made privacy by design a mandatory requirement, the ISO standard is another clear sign that consumers' data privacy concerns are taken so seriously that businesses are expected to bake protections into the fabric of every product and make them the foundation of their services. Yet, far from being a trade-off with profitability, this development can be regarded as a wake-up call for organizations to invest in building a durable customer base by signaling trustworthiness. In an era where disclosing personal information is often a prerequisite for participation in society and the marketplace, responsible data processing makes companies stand out and gives them a competitive edge.
Privacy by design is defined as a set of design methodologies in which privacy is considered and integrated into the initial design stage and throughout the complete lifecycle of products, processes, or services that involve the processing of personally identifiable information (PII), including product retirement and the eventual deletion of any associated PII.
Privacy-enhancing technologies can help implement privacy by design, not only to ensure GDPR compliance but also to build a trustworthy brand reputation. Private AI can help companies that handle customer data meet the new standard, in particular by reducing the identifiability of individuals and supporting data minimization, but also by assisting with breach reporting, privacy risk assessments, and PII deletion.
Section 4.8 – Privacy Controls
A key component of privacy by design is the implementation of privacy controls. While transparency and communication with customers are important for building trust, robust technical solutions do the actual work of keeping PII safe and ensure that organizations can deliver on the commitments they make in their privacy policies. The ISO standard does not single out particular technologies as the industry's state of the art, but it lists “de-identification or anonymization tools, up-to-date PII inventory, [and] consumer PII locator,” among others, as privacy-enhancing services that organizations should consider implementing. Developers and managers looking to integrate privacy into their software pipelines and products can refer to our Privacy Enhancing Technologies Decision Tree to see what solution would best suit their needs.
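To make the idea of an “up-to-date PII inventory” concrete, here is a minimal sketch of what one inventory record could look like in code. The field names and entity categories are illustrative assumptions, not requirements of ISO 31700 or a description of any particular product.

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class PIIInventoryEntry:
    """One record in an organization's PII inventory (illustrative fields only)."""
    system: str                 # where the data lives, e.g. "crm-db"
    data_category: str          # e.g. "EMAIL_ADDRESS", "PHONE_NUMBER"
    purpose: str                # why the data was collected
    retention_until: date       # when it must be deleted or anonymized
    locations: list[str] = field(default_factory=list)  # tables, buckets, or file paths


# Example entry; a real inventory would be kept current by automated scans.
inventory = [
    PIIInventoryEntry("crm-db", "EMAIL_ADDRESS", "customer support",
                      date(2026, 1, 1), ["customers.email"]),
]
```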
Section 5.6 – Prepare Data Breach Communications
If a data breach occurs, most privacy laws now require organizations to report the breach to a privacy authority in a timely manner and, in severe cases, to their customers as well. An important part of informing the authorities and customers about a data breach is, of course, which types of data were affected. This may be less than straightforward if the organization holds unstructured data, such as emails, chat transcripts, or audio and video recordings. Private AI can generate a report that shows exactly where PII is found in the affected data and what type of PII it is. This saves a significant amount of time, which is critical given the potentially tight deadlines for reporting privacy breaches and the high demand on everyone's time during a breach and in its immediate aftermath.
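As a rough illustration of what such a breach report boils down to, the sketch below aggregates detected PII spans by document and entity type. The `detect_pii` function here is a toy, regex-based stand-in for a real detection engine and only finds email addresses; the entity-type names and report shape are assumptions made for the example.

```python
import re
from collections import defaultdict

# Toy stand-in for a PII detection engine; a real detector would cover many more entity types.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")


def detect_pii(text: str) -> list[dict]:
    """Return detected PII spans as dicts with an entity type and character offsets."""
    return [{"entity_type": "EMAIL_ADDRESS", "start": m.start(), "end": m.end()}
            for m in EMAIL_RE.finditer(text)]


def breach_report(documents: dict[str, str]) -> dict[str, dict[str, int]]:
    """Aggregate detected PII by document and entity type for breach notification."""
    report: dict[str, dict[str, int]] = defaultdict(lambda: defaultdict(int))
    for doc_id, text in documents.items():
        for entity in detect_pii(text):
            report[doc_id][entity["entity_type"]] += 1
    return {doc: dict(counts) for doc, counts in report.items()}


print(breach_report({"support_emails.txt": "Please reach me at jane@example.com for details."}))
# {'support_emails.txt': {'EMAIL_ADDRESS': 1}}
```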
Section 6.2 – Privacy Risk Assessments
The ISO standard suggests conducting a privacy risk assessment prior to the release or production of the consumer product. It identifies the development of a data map as a useful tool for tracing data flows and surfacing the unanticipated risks they may carry. Similar to the benefit explained in the previous section, knowing what types of PII are in the organization's possession, and where they reside in its systems, is critical for understanding risk exposure.
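The sketch below shows one minimal way a data map could be represented and queried during a risk assessment, flagging flows that send PII outside the organization for closer review. The systems, entity types, and the "external flow" rule are illustrative assumptions, not part of the standard.

```python
from dataclasses import dataclass


@dataclass
class DataFlow:
    """One edge in a data map: PII moving from one system to another."""
    source: str
    destination: str
    pii_types: list[str]
    external: bool  # does the flow leave the organization's environment?


# Illustrative data map; real entries would come from system scans and interviews.
data_map = [
    DataFlow("web-form", "crm-db", ["NAME", "EMAIL_ADDRESS"], external=False),
    DataFlow("crm-db", "analytics-vendor", ["EMAIL_ADDRESS"], external=True),
]

# Flag flows that send PII outside the organization for closer review in the risk assessment.
high_risk = [flow for flow in data_map if flow.external and flow.pii_types]
for flow in high_risk:
    print(f"Review: {flow.source} -> {flow.destination} sends {', '.join(flow.pii_types)} externally")
```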
Section 8.2 – Designing Privacy Controls for Retirement and End of Use
The obligation to dispose of data once it has served the purpose for which it was collected flows from the data minimization principle: there is no better way to keep data safe than not having it in your system in the first place. However, an acceptable alternative to destroying data under privacy regulations, as well as under the new ISO standard, is anonymization. This alternative matters for businesses that require large datasets to train ML models, for example. Private AI's solution can replace PII with contextually relevant synthetic data, so that enhanced privacy does not come at the cost of data accuracy.
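To illustrate the general technique of swapping PII for synthetic values of the same type (not Private AI's actual implementation), the sketch below uses the open-source Faker library to generate replacements for detected spans. The entity-type names and the input format of the detected entities are assumptions carried over from the earlier examples.

```python
from faker import Faker  # open-source library for generating synthetic values

fake = Faker()

# Map detected entity types to synthetic generators; the type names are illustrative.
GENERATORS = {
    "NAME": fake.name,
    "EMAIL_ADDRESS": fake.email,
    "PHONE_NUMBER": fake.phone_number,
}


def replace_with_synthetic(text: str, entities: list[dict]) -> str:
    """Replace detected PII spans with synthetic values of the same type.

    `entities` is expected as [{"entity_type": ..., "start": ..., "end": ...}, ...],
    e.g. the output of a PII detection step. Spans are replaced right to left so
    that earlier character offsets stay valid.
    """
    for ent in sorted(entities, key=lambda e: e["start"], reverse=True):
        generator = GENERATORS.get(ent["entity_type"])
        if generator is None:
            continue  # leave unknown types untouched (or redact them, per policy)
        text = text[:ent["start"]] + generator() + text[ent["end"]:]
    return text


print(replace_with_synthetic(
    "Contact Jane Doe at jane@example.com",
    [{"entity_type": "NAME", "start": 8, "end": 16},
     {"entity_type": "EMAIL_ADDRESS", "start": 20, "end": 36}],
))
```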
Conclusion
Many organizations will have to fundamentally rethink and redevelop their internal processes to comply with the voluntary ISO standard on privacy by design. There is no one-size-fits-all solution that can be implemented before returning to business as usual: privacy by design entails a commitment that touches virtually every aspect of a business's operations. However, sophisticated tools can help with important parts of this work. Private AI, as we have seen, can provide effective privacy controls, facilitate breach reporting and risk assessments, and assist with the anonymization of data to meet data minimization obligations. Try our web demo to see for yourself, or talk to an expert today.