After the outbreak of COVID-19, the risk of privacy leakage is increasing
At the beginning of 2020, the sudden outbreak of the COVID-19 pandemic had a huge impact on everyone’s work and life. On the one hand, working from home accelerated the development of the digital office: many companies had to move their materials and assets online within a short time. On the other hand, to meet the demands of public health and pandemic control, personal health data is collected and processed digitally for use in areas like pandemic prevention and control, health services, and contact-tracing inquiries. For instance, travel health codes have been used for COVID tracing in countries like China and Spain, and many countries around the world have provided online services for test appointments and test-result inquiries. The outbreak of the pandemic has prompted people to rethink the construction and development of digital infrastructure. The large-scale data transmission and exchange during the pandemic requires support from new technologies. In the global context, in order to adapt to the ever-increasing demand for data exchange and promote digitalization, many countries have prioritized and invested in cutting-edge technologies such as 5G, Big Data, Artificial Intelligence, and Cloud Computing.
Considering the abnormal conditions of the pandemic, the use of this data by relevant agencies and organizations can be understood and accepted to a certain extent. But after the pandemic, when life is back to normal, can data privacy be properly protected in everyday data collection, storage, and computing? Can the basic rights of users be properly protected? As new technologies develop, and large amounts of data are collected, transmitted, analyzed, and processed from end devices through Internet of Things (IoT) and 5G technologies, the risk of data breaches will increase accordingly. How, then, can data privacy be effectively protected?
Privacy Computing paves the road
Fingerprint recognition, face recognition, voice recognition, and other technologies collect user information to provide convenient and efficient services, while also putting this basic personal data at risk of being leaked and abused. With the successive promulgation and implementation of the European Union’s GDPR and California’s CCPA, relevant parties need to reconsider their attitudes toward, and handling of, data resources. In this context, Privacy Computing technology has gradually caught researchers’ attention.
Privacy Computing refers to a class of information security technologies based on modern cryptography, represented by techniques such as Secure Multi-party Computation (MPC), Homomorphic Encryption, Zero-knowledge Proof, Differential Privacy, and Trusted Execution Environment (TEE). Privacy Computing enables data computation and analysis while ensuring the security and privacy of the source data.
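To make the MPC idea concrete, here is a minimal sketch of additive secret sharing, a basic building block behind many MPC protocols. This is an illustrative toy, not any particular product’s implementation; the modulus and the number of parties are arbitrary choices for the example.

```python
import secrets

P = 2**61 - 1  # modulus for the sharing field (an arbitrary large prime)

def share(value, n_parties):
    """Split `value` into n random shares that sum to `value` mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recover the secret by summing all shares mod P."""
    return sum(shares) % P

# Each party adds its shares of two secrets locally; the sum of the
# inputs can then be reconstructed without any single party ever
# seeing either original value.
a_shares = share(120, 3)
b_shares = share(45, 3)
sum_shares = [(x + y) % P for x, y in zip(a_shares, b_shares)]
assert reconstruct(sum_shares) == 165
```

Each share on its own is a uniformly random number and reveals nothing about the secret; only the full set of shares reconstructs it, which is what allows computation to proceed without exposing the inputs.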
So far, three major families of technologies are available to achieve Privacy Computing: cryptography, Trusted Execution Environments, and Federated Learning. Among them, the cryptography-based privacy protection scheme is a computing approach that protects private information throughout its entire life cycle, supporting analysis and computation without leaking the data itself to a third party. Its core idea is to establish a set of symbolic, formulaic, and quantitative evaluation criteria for Privacy Computing when processing information flows. Compared with the key-management burden and performance impact of traditional end-to-end encryption, Privacy Computing can fundamentally protect data across its entire life cycle. In large-scale public infrastructure, the introduction and deployment of Privacy Computing supports systematic data privacy protection.
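As one concrete instance of the cryptographic approach, additively homomorphic encryption lets a third party compute on ciphertexts without ever seeing the plaintexts. Below is a toy Paillier sketch; the small hard-coded primes are assumptions purely for illustration, whereas real deployments use keys of 2048 bits or more.

```python
import math
import secrets

# Toy Paillier key material (small primes chosen only for illustration).
p, q = 1_000_003, 1_000_033
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)          # modular inverse of lambda mod n

def encrypt(m):
    """Encrypt integer m < n with fresh randomness r."""
    r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Recover the plaintext from ciphertext c."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts,
# so an untrusted server can sum values it cannot read.
c = (encrypt(20) * encrypt(22)) % n2
assert decrypt(c) == 42
```

The multiply-to-add property is what makes schemes like this usable for aggregate statistics over encrypted data, at the cost of significantly heavier computation than plaintext arithmetic.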
At present, in addition to in-depth academic research on Privacy Computing, realistic application scenarios and large-scale commercial solutions are also being actively explored by industry. Take PlatON Network as an example. PlatON has been researching and developing Privacy Computing since 2016. The Privacy Computing AI network provided by PlatON is a solution for implementing privacy contracts based on cryptographic algorithms such as Secure Multi-party Computation (MPC). In general, the Privacy Computing algorithm is released through a contract; the data providers and the computing nodes that require privacy protection cooperate to execute the MPC protocol, thereby realizing collaborative data computation. In this way, data privacy is protected even as data is shared, allowing users to obtain the economic benefits of data reuse while retaining data ownership.
In the pandemic era, Artificial Intelligence (AI) has also run into privacy issues during data reuse. The development of AI technology turns data into a special kind of asset, whose scale affects both model accuracy and the reliability of services, which in turn affects business costs and risks. From the perspective of improving AI model accuracy, the more data collected, the better the model performs, but collecting large amounts of data also brings higher risks of privacy exposure. If AI is to be applied in large-scale commercial scenarios, privacy is a crucial issue that cannot be bypassed.
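Differential Privacy, one of the techniques listed earlier, directly targets this trade-off between data scale and exposure: noise calibrated to a privacy budget ε is added to aggregate statistics so that no individual record can be inferred from the result. A minimal sketch follows; the dataset, query, and ε value are illustrative assumptions.

```python
import math
import random

def laplace_sample(scale):
    """Draw from Laplace(0, scale): the difference of two exponentials."""
    e1 = -math.log(1.0 - random.random())
    e2 = -math.log(1.0 - random.random())
    return scale * (e1 - e2)

def dp_count(records, predicate, epsilon):
    """Counting query (sensitivity 1) made epsilon-differentially private."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_sample(1.0 / epsilon)

# Illustrative example: a noisy count of users over 60 in a toy dataset.
ages = [23, 64, 71, 35, 68, 50]
noisy = dp_count(ages, lambda a: a > 60, epsilon=1.0)
```

A smaller ε means stronger privacy but noisier answers, which is exactly the accuracy-versus-exposure tension described above, made tunable.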
Privacy Computing technology is naturally suited to addressing the privacy flaws of AI. Large amounts of data are computed in encrypted form, which not only solves the privacy problem of traditional AI models but also ensures efficient data management and utilization. Taking the personal health IoT data involved in this pandemic as an example, PlatON’s “DataBank” product can effectively protect core data and provide life-cycle data-management services. Working as a data-asset value-trading platform, “DataBank” guarantees the privacy and security of personal health data during verification and transmission based on Privacy Computing technology. With AI technology introduced into “DataBank”, multi-source and multi-mode local joint verification can be achieved efficiently. The results obtained from Privacy Computing can provide accurate delivery services among organizations and agencies, which not only addresses the issue of mutual recognition between different regions, but also supports the sharing of data marketing channels among various agents. Together, this improves the accuracy of health data verification and helps industries establish public-health economic data services amid the rapid digitalization driven by 5G, Cloud Computing, and other technologies.

As a pioneer in the privacy protection area, PlatON has researched Privacy Computing technologies for many years. The Privacy Computing network infrastructure developed by PlatON can guarantee the secure and free flow of data across every corner of the world. All kinds of applications that rely on data exchange and data computation as core capabilities can enjoy the convenience brought by global data collaboration and computing-power sharing on PlatON.
Disclaimer: The views, suggestions, and opinions expressed here are the sole responsibility of the experts. No People Reportage journalist was involved in the writing and production of this article.