In recent years, a range of innovative solutions have emerged that enable organizations to collaborate using their own data—and that of their partners—without unnecessarily exposing sensitive information. Technologies such as clean rooms, trusted research environments, and tokenization are paving the way for what’s known as collaborative intelligence. This approach recognizes a key limitation: relying solely on internal data often leads to incomplete or unrepresentative datasets.
No single organization has access to all the data it needs. This creates a significant opportunity—especially in highly regulated sectors like life sciences and financial services—for institutions to work together. By diversifying datasets through partnerships, everyone benefits: better data leads to better insights. Equally valuable is the ability to share real-time intelligence—such as cybersecurity threats or fraud signals—with collaborators through an “always-on” network.
Given the immense potential, how can companies collaborate effectively while staying within regulatory boundaries, addressing information security concerns, and ensuring data remains protected?
The solution lies in Federated Computing.
In an exclusive interview, we spoke with Chris Laws, Chief Commercial Officer at Rhino Federated Computing, about why traditional centralized data models are no longer viable—and how federating data can unlock value across siloed systems within organizations.

Federated computing
Federated Computing enables organizations to connect disparate data sources without ever exposing the raw data. Only the resulting insights are shared—keeping sensitive information and intellectual property (IP) securely in place.
Founded in 2021, Rhino works with leading institutions including Eli Lilly (via its TuneLab program), the Cancer AI Alliance (CAIA)—a collaboration among Dana-Farber Cancer Institute, Fred Hutch Cancer Center, Memorial Sloan Kettering Cancer Center, Johns Hopkins Medicine, and the Johns Hopkins Whiting School of Engineering—and the Society for Worldwide Interbank Financial Telecommunication (SWIFT), all of which aim to analyze data from multiple sources for research and AI model training.
Federated Learning (FL) has existed for over a decade. It allows AI models to be trained directly where the data resides, eliminating the need to collect and centralize it. Google first used FL in 2017 to improve predictive typing on mobile devices. Since then, data scientists have increasingly adopted the technology to solve real-world business challenges. Notable examples include:
- The FAITE Consortium, which uses federated and active learning to predict biologics properties.
- NVIDIA Merlin, used by Toshiba Tec and McKinsey to turn retail data into real-time business decisions.
- MELLODDY, a consortium developing predictive drug discovery models using Machine Learning Ledger Orchestration.
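The core mechanic behind all of these projects is federated averaging: each site trains on its own data and only model updates—never raw records—leave the premises. A minimal sketch of that loop, using a toy linear-regression model (all data and names here are invented for illustration; this is not code from any of the projects above):

```python
# Minimal federated averaging (FedAvg) sketch: each "site" takes a local
# training step on its private data, and only model weights are shared
# and averaged into the global model. Data never moves.
import numpy as np

rng = np.random.default_rng(0)

def local_step(weights, X, y, lr=0.1):
    """One gradient step of linear regression on a site's private data."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

# Three sites, each holding data that never leaves its boundary.
true_w = np.array([2.0, -1.0])
sites = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    sites.append((X, X @ true_w))

weights = np.zeros(2)              # shared global model
for _ in range(200):               # each round: local training, then averaging
    local_models = [local_step(weights, X, y) for X, y in sites]
    weights = np.mean(local_models, axis=0)   # only weights cross the boundary

print(np.round(weights, 2))        # converges toward [ 2. -1.]
```

In a production system the averaging step would also apply privacy controls (such as secure aggregation or differential privacy), but the data flow is the same: updates out, aggregated model back.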
Rhino developed the Rhino Federated Computing Platform (Rhino FCP) to overcome common challenges associated with open-source FL frameworks. It offers a production-grade system with robust security and privacy controls, designed to integrate seamlessly into existing enterprise technology stacks—making it ready to tackle real business problems from day one.
“AI and data science leaders are increasingly being asked by business stakeholders: ‘How can we solve problems that require cross-silo collaboration—while keeping Legal, InfoSec, and Compliance teams comfortable?’” said Chris. “Often, they don’t even know the term ‘federated computing’—but that’s exactly what they need.”
Rhino FCP answers those needs. It's not a solution in search of a problem—it's built to deliver immediate, practical value.
A platform built on local control
Organizations don’t need to centralize or merge their data to collaborate effectively. Traditional data lakes are often static and rely on data replication. In contrast, Rhino FCP operates alongside or on top of existing infrastructure—without requiring data to move.
“We’re designed to work well with other technologies,” Chris explained. For instance, if a company has invested heavily in an Azure-based data pipeline and needs to incorporate a supplier’s data, Rhino FCP provides a secure environment for joint analysis—without either party exposing their underlying datasets.
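The pattern Chris describes can be reduced to a simple shape: each party computes a local aggregate, and only those aggregates—not row-level records—cross the organizational boundary. A toy illustration, with invented numbers (this is a conceptual sketch, not Rhino FCP code):

```python
# Two parties compute a joint statistic by exchanging only local
# aggregates, never row-level data. Values are invented for illustration.

party_a_values = [120.0, 95.0, 110.0]   # e.g. buyer order values; stays with A
party_b_values = [80.0, 130.0]          # e.g. supplier order values; stays with B

def local_aggregate(values):
    """What a party is willing to share: a sum and a count, not the rows."""
    return sum(values), len(values)

# Only these small aggregates cross the boundary between the parties.
sum_a, n_a = local_aggregate(party_a_values)
sum_b, n_b = local_aggregate(party_b_values)

joint_mean = (sum_a + sum_b) / (n_a + n_b)
print(joint_mean)  # 107.0 -- a joint insight neither party could compute alone
```

Real deployments wrap this exchange in authentication, auditing, and privacy controls, but the division of labor is the same: computation goes to the data, and only results travel.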
One major roadblock in collaboration arises when different teams or organizations use incompatible data schemas—or when strict privacy regulations make data integration complex. To solve this, Rhino created the Data Harmonization Engine.
This engine automatically translates one data model into another, enabling all parties to work with a consistent structure—without requiring either side to manually normalize, cleanse, or anonymize their data.
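Conceptually, harmonization means mapping each site's field names and units onto one shared target model so that federated code sees a single structure. Rhino's actual Data Harmonization Engine is not public; the field names, mappings, and conversion below are invented purely to illustrate the idea:

```python
# Hypothetical schema-harmonization sketch: translate site-local records
# into a shared target model via per-site field mappings and unit
# conversions. All names here are illustrative, not Rhino's API.

# Shared target model agreed by all collaborators.
TARGET_FIELDS = ("patient_id", "age_years", "weight_kg")

# Per-site mapping: target field -> (source field, conversion function)
SITE_A_MAP = {
    "patient_id": ("pid", str),
    "age_years":  ("age", int),
    "weight_kg":  ("weight_lbs", lambda lbs: round(lbs * 0.4536, 1)),
}

def harmonize(record, mapping):
    """Translate one site-local record into the shared target model."""
    return {target: fn(record[src]) for target, (src, fn) in mapping.items()}

row = {"pid": 101, "age": 54, "weight_lbs": 180.0}
print(harmonize(row, SITE_A_MAP))
# {'patient_id': '101', 'age_years': 54, 'weight_kg': 81.6}
```

Each site maintains only its own mapping, so no party has to restructure its source systems to match a partner's schema.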
“We’ve intentionally built a platform that’s as flexible as possible,” Chris said. “It doesn’t matter what your data looks like—structured tables, images, video, or waveform data—it can all be federated.” This flexibility allows organizations to achieve targeted business outcomes using diverse, distributed data sources.
Teams can even bring their own applications to the platform as containers and run them directly against federated data. Developers and data scientists can explore Rhino’s documentation library and continue using familiar tools like SQL, Python, NVIDIA FLARE, or Rhino’s Generalized Compute Code via the SDK.
Whether running existing algorithms or preparing data for AI model training, Rhino FCP eliminates a core limitation of traditional architectures: the need to physically move or centralize sensitive data before analysis can begin.
Practical and realistic
Chris takes a grounded, pragmatic approach. He acknowledges that Rhino FCP and its Data Harmonization Engine get customers “90% of the way there” toward a unified data resource—some customization is usually needed. This stands in stark contrast to competitors who claim a single black-box large language model can handle everything.
This measured perspective comes from Rhino’s deep roots in highly regulated industries—life sciences, healthcare, and the public sector—where intellectual property and compliance are non-negotiable. As a result, the platform is now attracting interest from new sectors including energy, financial services, automotive, and agriculture.
By providing a collaborative environment where data normalization and federation are built in, Rhino enables organizations to finally put their data to work—generating insights that address their most pressing challenges.
For companies looking to move beyond the limitations of centralized data while maintaining strong security, we recommend visiting the upcoming TechEx Edge Computing North America Expo in San Jose, May 18–19. Meet Chris Laws at booth #269 to learn more. If this article has sparked your curiosity, explore Rhino’s website for further details.
(Image sources: Pixabay (header image) under licence; Rhino FCP (interviewee))


Want to learn more about IoT from industry leaders? Check out IoT Tech Expo taking place in Amsterdam, California, and London. The comprehensive event is part of TechEx and co-located with other leading technology events.
IoT News is powered by TechForge Media. Explore other upcoming enterprise technology events and webinars from TechForge.



