Healthcare Data Safe Havens enable the sharing of healthcare data within a secure infrastructure for the large, data-intensive experiments demanded by precision and stratified medicine. A key architectural challenge is how to keep patient data under the governance of local data jurisdictions, while still allowing those jurisdictions to participate in experiment designs that, because they must scale to large population sizes, may require analyses spanning several jurisdictions.
Safe Haven providers must follow a meticulous process to ensure that privacy and security concerns are addressed. This includes vetting the experimenter and the purpose of their research, cleaning and de-identifying the data, and providing a controlled yet capable environment in which to perform the experiments.
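As a concrete illustration of the de-identification step (a minimal sketch, not the actual pipeline of any Safe Haven provider), direct identifiers can be replaced with salted one-way hashes so that records remain linkable within a project without exposing identity. The field names and salt below are hypothetical:

```python
import hashlib

def pseudonymise(record: dict, id_fields: list, salt: str) -> dict:
    """Replace direct identifiers with truncated salted SHA-256 digests.

    The same (salt, value) pair always yields the same pseudonym, so
    records can still be linked within one project, while the original
    identifier cannot be recovered from the output.
    """
    out = dict(record)
    for field in id_fields:
        if field in out:
            digest = hashlib.sha256((salt + str(out[field])).encode()).hexdigest()
            out[field] = digest[:16]  # truncated pseudonym
    return out

# Hypothetical patient record; only the identifying fields are transformed.
patient = {"nhs_number": "943 476 5919", "name": "J. Smith", "age_band": "40-49"}
safe = pseudonymise(patient, ["nhs_number", "name"], salt="per-project-secret")
```

In practice the salt would be a per-project secret, so that pseudonyms from different projects cannot be linked to each other.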
At the Farr Institute, one of the primary providers of healthcare data in the UK, most of this process is manual and lengthy, taking several months from data request to actual data provision. We are working with the Institute to automate parts of its workflow, reducing the human effort and time required while maintaining the high levels of rigour and privacy in its data governance.
This work includes, among others, the following tasks:
- Task analysis of as-is and to-be processes in close collaboration with primary stakeholders.
- Formalisation of workflows.
- Integration of ontology-based data management technologies developed by researchers at the University of Trento, Italy.