From science to practice, my work revolves around bridging the gap between causality theory and its practical applications, particularly in the context of healthcare data, addressing the intricate nuances that arise when dealing with multiple data sets.
In many instances, data are collected at different points in time: baseline information is gathered in a prior study and the outcome data are collected later.
At the intersection of causal inference and record linkage, I seek to develop statistical methods that propagate the uncertainty inherent in record linkage procedures to ensure reliable causal estimates.
I will present a poster at the European Causal Inference Meeting 2025 in Gent on "Causal Record Linkage: Critical Issues and Novel Approaches to False Discovery Propagation". [abstract]
Mar 2025
COMEECON is a seminar series organised in the Netherlands by Nuria Senar Villadeamigo and me, as part of the VVSOR. We bring together young researchers to discuss computational statistics: how the methods have developed over time, the thought processes behind them, and their theoretical and practical frameworks. More details to come soon.
Combining data from various sources empowers researchers to explore innovative questions, for example those raised by conducting healthcare monitoring studies. However, the lack of a unique identifier often poses challenges. Record linkage procedures determine whether pairs of observations collected on different occasions belong to the same individual, using partially identifying variables (e.g. birth year, postal code). Existing methodologies typically involve a compromise between computational efficiency and accuracy. Traditional approaches simplify this task by condensing information, yet they neglect dependencies among linkage decisions and disregard the one-to-one relationship required to establish coherent links. Modern approaches offer a comprehensive representation of the data generation process, at the expense of computational overhead and reduced flexibility. We propose a flexible method that adapts to varying data complexities, addressing registration errors and accommodating changes in the identifying information over time. Our approach balances accuracy and scalability, estimating the linkage with a Stochastic Expectation Maximization algorithm on a latent variable model. We illustrate the ability of our methodology to connect observations in large real-data applications and demonstrate the robustness of our model to the quality of the linking variables in a simulation study. The proposed algorithm, FlexRL, is implemented and available in an open-source R package.
@article{robachetal2025,
  author  = {Robach, Kayané and {van der} Pas, Stéphanie L and {van de} Wiel, Mark A and Hof, Michel H},
  title   = {A flexible model for record linkage},
  journal = {Journal of the Royal Statistical Society Series C: Applied Statistics},
  pages   = {qlaf016},
  year    = {2025},
  month   = feb,
  issn    = {0035-9254},
  doi     = {10.1093/jrsssc/qlaf016},
  url     = {https://doi.org/10.1093/jrsssc/qlaf016},
  eprint  = {https://academic.oup.com/jrsssc/advance-article-pdf/doi/10.1093/jrsssc/qlaf016/62206007/qlaf016.pdf},
}
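To give a rough sense of how a Stochastic Expectation Maximization scheme can estimate a latent variable linkage model, here is a minimal toy sketch in R. It is not the FlexRL implementation and uses none of its functions: the model is reduced to independent agreement indicators on two linking variables, registration errors over time and the one-to-one constraint are ignored, and all names (stoch_em_step, agree, theta) are illustrative assumptions.

## Toy Stochastic EM sketch for a latent variable record linkage model.
## NOT FlexRL: a reduced model on agreement indicators, for illustration only.
set.seed(1)

## Simulated agreement indicators for record pairs between two files,
## on two partially identifying variables (e.g. birth year, postal code).
n_pairs    <- 5000
true_match <- rbinom(n_pairs, 1, 0.05)                      # latent match status
agree <- cbind(
  birth_year  = rbinom(n_pairs, 1, ifelse(true_match == 1, 0.95, 0.10)),
  postal_code = rbinom(n_pairs, 1, ifelse(true_match == 1, 0.90, 0.05))
)

## Parameters: match prevalence, agreement probabilities given (non-)match.
theta <- list(p = 0.5, m = c(0.8, 0.8), u = c(0.2, 0.2))

stoch_em_step <- function(agree, theta) {
  ## Stochastic E-step: draw the latent match indicators from their
  ## conditional distribution given the current parameter values.
  lik_m <- apply(agree, 1, function(g) prod(theta$m^g * (1 - theta$m)^(1 - g)))
  lik_u <- apply(agree, 1, function(g) prod(theta$u^g * (1 - theta$u)^(1 - g)))
  post  <- theta$p * lik_m / (theta$p * lik_m + (1 - theta$p) * lik_u)
  z     <- rbinom(nrow(agree), 1, post)

  ## M-step: maximise the complete-data likelihood given the sampled indicators.
  list(
    p = mean(z),
    m = colMeans(agree[z == 1, , drop = FALSE]),
    u = colMeans(agree[z == 0, , drop = FALSE])
  )
}

## Run a few Stochastic EM iterations and inspect the parameter estimates.
for (iter in 1:30) theta <- stoch_em_step(agree, theta)
str(theta)

In each iteration the latent match indicators are sampled rather than averaged over, which is what distinguishes the stochastic variant from classical EM.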
Contact: k dot c dot robach at amsterdamumc dot nl