Five Safes, One TRUSTed Platform: How TRUST is Securing Data Exchange
Singapore has ambitious plans for precision medicine—with large-scale implementation and comprehensive Singaporean genomic databases by 2030. But before that can happen, large volumes of data must be collated, standardised, shared and analysed.
TRUST (Trusted Research and Real-World Data Utilisation and Sharing Tech) is Singapore's national data exchange platform that aims to facilitate this process, enabling health-related research and anonymised data sharing between public institutions, as well as between the public and private sectors. But given the immutability and sensitivity of healthcare information such as genetic sequences and biometrics, how does TRUST facilitate data exchange while ensuring its security?
Adopting the Five Safes
The team tapped on an established framework. Dr Sue-Ann Lee, Director (HealthTech Policy Division), Ministry of Health (MOH), explained, “MOH has already adopted the Five Safes Framework—an internationally recognised system for organisations to design safeguards around data security and protection. As TRUST falls under the Ministry’s purview, we naturally extended the use of Five Safes to TRUST’s data governance.”
But while the Framework is applicable to a wide range of purposes, it still had to be adapted to TRUST's context. Dr Low Pin Yan, Lead, Planning & Policy, TRUST Office, said, "Within the Framework, there are many controls which can be adapted to TRUST's objectives. For instance, TRUST enables research and drives innovation. This means that while our controls should ensure the safe and secure use of data, they shouldn't perturb data to the point where it loses all utility."
Implementing the Five Safes
Dr Low continued, "That said, the Five Safes should not be seen as merely a set of standalone controls. Instead, it is a synergistic deployment of many different measures."
● Safe Purpose. TRUST's Data Access Committee (DAC) is responsible for reviewing data requests to assess whether they serve the public interest, offer social value, and comply with the data permissions established with the data owners. The DAC comprises members from government, research institutions and healthcare clusters, domain experts in law and ethics, as well as laypersons.
● Safe People. Only researchers or employees of partner organisations bound by the TRUST Data Request Agreement can access data on TRUST, and users must also abide by TRUST's code of conduct. Internally, TRUST maintains user governance for the access and processing of data.
● Safe Settings. TRUST is hosted on the government commercial cloud with government-standard safeguards. While TRUST can be accessed remotely via a virtual desktop interface, partner organisations ensure that their users log in from a secure environment. For more sensitive data, users are required to access TRUST via certified micro access labs.
● Safe Data. TRUST has implemented policies and safeguards to ensure data anonymisation according to MOH standards. For example, direct and indirect identifiers such as NRIC numbers, names and dates of birth are anonymised. TRUST also applies legal and technical safeguards on the use of its data.
● Safe Output. Users are not allowed to extract micro-level data from TRUST; only approved aggregated analytical output can be downloaded. Research publication drafts that use TRUST data are also reviewed by TRUST for re-identification risks and alignment with approved research purposes. A simplified sketch of what these Safe Data and Safe Output controls can look like in practice follows this list.
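To make the Safe Data and Safe Output controls more concrete, here is a minimal sketch of one common pattern: direct identifiers are replaced with keyed hashes, indirect identifiers such as date of birth are coarsened, and only aggregated counts with small groups suppressed are released. This illustrates the general technique rather than TRUST's actual pipeline; the field names, salt and suppression threshold are assumptions made for the example.

```python
import hashlib
from collections import defaultdict

# Illustrative values only: the salt and threshold are assumptions, not TRUST settings.
SALT = b"example-project-salt"   # a per-project secret in a real deployment
MIN_CELL_SIZE = 5                # suppress any group smaller than this

def pseudonymise(record):
    """Replace direct identifiers and coarsen indirect ones; names are dropped."""
    token = hashlib.sha256(SALT + record["nric"].encode()).hexdigest()[:16]
    return {
        "pid": token,                      # keyed hash stands in for the NRIC
        "birth_year": record["dob"][:4],   # full date of birth reduced to year
        "diagnosis": record["diagnosis"],  # analysis variable retained
    }

def aggregate(records):
    """Release only grouped counts, suppressing cells below the threshold."""
    counts = defaultdict(int)
    for r in records:
        counts[(r["birth_year"], r["diagnosis"])] += 1
    return {k: v for k, v in counts.items() if v >= MIN_CELL_SIZE}

# Hypothetical records, invented for this sketch
raw = [
    {"nric": "S1234567A", "name": "Tan Ah Kow", "dob": "1980-04-12", "diagnosis": "T2DM"},
    {"nric": "S7654321B", "name": "Lim Bee Leng", "dob": "1981-09-30", "diagnosis": "T2DM"},
]
safe = [pseudonymise(r) for r in raw]
print(aggregate(safe))  # {} here: both groups fall below the threshold and are suppressed
```

In a real platform the salt would be a protected secret and the suppression threshold set by policy; the point of the sketch is simply how anonymisation at ingestion and aggregation at release work together.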
Safeguarding the Five Safes
Taken as a whole, the Five Safes Framework requires the cooperation of multiple parties, including TRUST, the government, partner institutions, data owners and researchers, to function well. Yet studies show that almost nine in 10 data breaches are caused by human error.[1] With so many parties involved at every stage of the Framework, how does TRUST intend to guard against bad actors and human error?
Dr Low said, "The Framework doesn't just rely on one lever to ensure data safety and security. The multiple measures, spanning policy, legal and technological safeguards, act as failsafes, and we are constantly communicating best practices to our community to ensure that everyone is on the same page. But I'm glad that, thus far, the research community has been responsive and is working together to make this happen."
However, the risk of bad actors still exists. Dr Lee said, "It would be impossible and impractical to completely eradicate risks, but what we can do is reduce them to an acceptable level. For example, TRUST uses a combination of controls to ensure that users cannot exfiltrate data without a great amount of effort. Add to that the contracts and undertakings placed on organisations and users before they can access TRUST data, and these combined technical and legal controls serve to deter unauthorised behaviour."
Refining the Five Safes
Nonetheless, four years since the conceptualisation of TRUST, the team emphasises that the platform remains a work in progress on multiple fronts. One such aspect is further strengthening data anonymisation through privacy-enhancing technologies. Dr Lee said, "We are currently actively exploring homomorphic encryption. In future, users could potentially run analyses on encrypted data without needing the raw data at all."
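To give a sense of what that could look like, the toy example below uses the Paillier cryptosystem, an additively homomorphic scheme: ciphertexts can be multiplied together so that decrypting the result yields the sum of the underlying values, meaning an analyst could compute a total without ever handling the raw numbers. This is purely illustrative; the article does not specify which scheme or library TRUST is evaluating, and the tiny key below is nowhere near production strength.

```python
import math
import random

# Toy key: tiny primes for demonstration only; real keys are thousands of bits.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
# mu is the modular inverse of L(g^lam mod n^2), where L(x) = (x - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m):
    """Encrypt integer m (0 <= m < n) under the public key (n, g)."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Decrypt ciphertext c using the private key (lam, mu)."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# An analyst could sum encrypted values without ever seeing the raw data:
readings = [120, 135, 128]                   # hypothetical measurements
ciphertexts = [encrypt(v) for v in readings]
encrypted_sum = math.prod(ciphertexts) % n2  # multiplying ciphertexts adds plaintexts
assert decrypt(encrypted_sum) == sum(readings)
print(decrypt(encrypted_sum))                # 383, recovered only by the key holder
```

Real deployments would rely on vetted libraries and far larger keys, and fully homomorphic schemes additionally support multiplication on ciphertexts, enabling richer analyses on encrypted data.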
"Adding on to that, we are also looking to better support industry partnerships," said Dr Low. "TRUST was only opened up to the wider research community at the end of 2022, and we are still learning how best to support the ecosystem and enable public-private partnerships. With feedback from our partners, we will continue to refine and redefine the current measures under the Five Safes Framework to bring health-related research and real-world data together to support innovation and improve health outcomes."
[1] Psychology of human error could help businesses prevent security breaches, CISOMAG