
Technology with purpose: Tune Insight’s co-founder & CTO reveals the vision behind the company’s technological choices

03/03/2026 · 7 min read
Written by Tune Insight
Expert in data collaboration

Three technologies, one vision: using health data without compromising on privacy

A new approach to health data

As health systems go digital, institutions are producing vast amounts of data – data that is both extremely sensitive, requiring the highest levels of protection, and highly valuable for advancing research, accelerating therapeutic innovation and optimizing hospital organization. Progress in the sector has been stalled for years: data either remains locked away in silos or is centralized in environments that raise issues of sovereignty, trust and compliance.

"The only way to collaborate across jurisdictions is to federate," emphasizes Romain Bouyé, Tune Insight's CTO. This belief is not merely a technical principle; it is at the very heart of Tune Insight's approach: combining three technologies, rather than relying on any one alone, to maintain maximum data security while leveraging the data's full potential.

But what does this actually look like in practice? Romain Bouyé shares how Tune Insight has had to make bold technological choices – sometimes going against industry trends – to solve a deceptively simple problem with game-changing potential: allowing healthcare actors to share sensitive data without ever compromising privacy, sovereignty or analytical quality.

Combining the three building blocks of differential privacy, homomorphic encryption and federated architecture is the revolution health data sharing needs.

1. Differential privacy: protecting the person without destroying the information

At first glance, differential privacy looks like a simple data anonymization operation. But it’s much more subtle than that. In reality, the more variables we add, the greater the risk of a patient being re-identified – even with aggregated data.

"Anonymous statistics are possible without differential privacy if there are many patients and little information… But as soon as you add variables, aggregation alone just isn't enough," Romain Bouyé explains.

Differential privacy provides an extra layer of protection: controlled statistical noise is injected after aggregation, which avoids degrading the data more than necessary. This preserves both the analytical value of the data and individual privacy.

Another challenge lies in applying the restrictions needed to comply with data contributors' privacy policies. This may mean guaranteeing a sufficiently large aggregation or limiting the number of queries.
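The aggregate-then-noise idea can be illustrated with the classic Laplace mechanism. This is a minimal sketch, not Tune Insight's actual implementation; the function names are invented for the example:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(values, predicate, epsilon: float) -> float:
    """Differentially private count: aggregate first, then add noise.

    A counting query has sensitivity 1 (adding or removing one patient
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon is enough for epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)
```

Because the noise is calibrated to the query's sensitivity rather than applied to the raw records, a statistic such as `dp_count(ages, lambda a: a > 65, epsilon=1.0)` stays close to the true value while bounding what any single patient's presence can reveal.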

Implementing these restrictions is particularly complex, not just technically but also in terms of user experience: the rules must be expressed clearly, which is crucial and far from trivial.
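To make the two restrictions mentioned above concrete, here is a hypothetical policy layer (again an illustration, not Tune Insight's API) that enforces a minimum cohort size and a per-requester query budget:

```python
class QueryGuard:
    """Illustrative policy check for aggregate queries: reject results
    computed on cohorts below a minimum size, and cap the number of
    queries each requester may run against the dataset."""

    def __init__(self, min_cohort_size: int = 5, max_queries: int = 10):
        self.min_cohort_size = min_cohort_size
        self.max_queries = max_queries
        self._queries_used: dict[str, int] = {}

    def check(self, requester: str, cohort_size: int) -> None:
        """Raise if the query violates the policy; otherwise record it."""
        used = self._queries_used.get(requester, 0)
        if used >= self.max_queries:
            raise PermissionError(
                f"{requester} has exhausted their query budget"
            )
        if cohort_size < self.min_cohort_size:
            raise ValueError(
                f"cohort of {cohort_size} is below the minimum of "
                f"{self.min_cohort_size}"
            )
        self._queries_used[requester] = used + 1
```

Expressing such rules in code is the easy part; as the article notes, surfacing them to users in an understandable way is where much of the real work lies.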

This first building block provides a mathematical guarantee: no individual information can be extracted, not even indirectly.

2. Homomorphic encryption: performing calculations without ever seeing the data
