Beyond Data Sharing: Architecting for Digital Sovereignty

Author: Denis Avetisyan


A new approach to distributed computing allows organizations to collaborate securely without relinquishing control of their data.

The architecture binds core services and business interactions through proof-carrying artifacts derived from declarative contracts. Admission is decided by a runtime boundary check on artifact validity and capability matching, a verification step distinct from online policy evaluation, yielding a shared declarative model for infrastructure and policy automation while keeping workflow and domain logic within core and business components.

Federated Computing as Code leverages cryptographic governance and capability-based security to enforce data sovereignty at execution boundaries.

While federated computing promises collaborative data analysis without centralizing sensitive information, existing systems often rely on fragile runtime policies and trusted intermediaries to enforce data sovereignty constraints. This paper introduces ‘Federated Computing as Code (FCaC): Sovereignty-aware Systems by Design’, an architectural discipline that compiles authority and delegation into cryptographically verifiable artifacts, enabling secure boundary admission and decentralized authorization. FCaC achieves this through Virtual Federated Platforms (VFPs) leveraging proof-carrying capabilities and a cryptographic trust chain, effectively separating constitutional governance from procedural controls. Could this approach unlock truly sovereign, scalable, and auditable federated ecosystems, fostering collaboration without compromising data control?


The Inevitable Friction of Boundaries

Conventional security frameworks, designed for isolated systems and centralized control, often clash with the requirements of modern, collaborative environments. These models typically prioritize protecting data within defined boundaries, creating significant friction when that data needs to be shared or jointly processed across organizational or national borders. This tension arises because traditional approaches cannot easily reconcile the need for access with the imperative of maintaining data sovereignty: the principle that data is subject to the laws and governance structures of the jurisdiction in which it is collected. Consequently, multi-party systems attempting to leverage shared data for innovation or efficiency frequently encounter legal complexities, operational hurdles, and increased security risks as they navigate conflicting regulatory requirements and struggle to establish mutually acceptable trust mechanisms.

Current data security protocols often depend on a central authority to verify identities and permissions, utilizing extensive access control lists that detail who can access what information. This centralized approach, while seemingly robust, introduces significant limitations in dynamic, multi-party environments. The reliance on a single point of control creates bottlenecks, slowing down data sharing and collaborative workflows, and hindering organizational agility. Furthermore, this architecture represents a concentrated risk; a compromise of the central authority exposes the entire system, and maintaining these complex access control lists is prone to errors and administrative overhead. Consequently, organizations struggle to balance security requirements with the need for seamless data exchange, ultimately limiting their ability to participate in collaborative ventures and innovate effectively.

The escalating demand for secure data sharing and distributed computation is fundamentally challenging established security frameworks. Modern data-intensive applications, from collaborative scientific research to global supply chain management, necessitate the seamless exchange of information across organizational boundaries. However, conventional centralized security models, reliant on strict access controls and single points of trust, prove increasingly inadequate for these dynamic, multi-party environments. This inadequacy isn’t simply a matter of scalability; it represents a paradigm shift. The current approaches often create friction, impede innovation, and introduce vulnerabilities as data proliferates beyond traditional perimeters. Consequently, a new security paradigm, built on principles of decentralized trust, fine-grained access control, and verifiable computation, is no longer just desirable; it is essential for enabling secure and efficient collaboration in the digital age.

Federated governance regimes range from stateful-only federations (a simplified precursor) through hybrid approaches to full Federated Computing as Code (FCaC), which is specifically designed to ensure portable admission control using locally verifiable artifacts.

Constructing Boundaries in a Decentralized World

A Virtual Federated Platform (VFP) establishes a structural framework for secure inter-organizational interaction by defining explicit boundaries and interaction protocols. This framework moves beyond perimeter-based security, enabling granular control over data and service access. Instead of relying on network-level trust, a VFP facilitates agreements defining what each organization can access and how, fostering a trust-but-verify approach. These boundaries are not necessarily physical or network-defined, but logical constructs enforced through contractual agreements and technical mechanisms, allowing organizations to collaborate securely without requiring full network integration or exposing internal systems directly. The VFP architecture aims to reduce the attack surface by limiting exposure and establishing clear accountability for data access and usage.

A Virtual Federated Platform (VFP) establishes interaction boundaries through three contract types. Core contracts define foundational technical requirements for interoperability, such as data exchange formats and communication protocols. Business contracts detail the specific services provided and consumed between organizations, including service level agreements and pricing models. Finally, Governance contracts outline the policies and procedures governing data usage, security, and compliance, establishing accountability and dispute resolution mechanisms. These contracts collectively define the parameters of interaction, ensuring secure and controlled data exchange within the federated environment.
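To make the three contract types concrete, they can be modeled as plain data with a trivial admission check at the boundary. This is a minimal sketch under assumed field names (`kind`, `allowed_operations`, and so on), not the paper's actual contract schema:

```python
# Hypothetical sketch: a VFP's three contract types as plain data, plus a
# trivial boundary admission check. All field names are illustrative.
CORE = {"kind": "core", "format": "JSON", "protocol": "HTTPS"}
BUSINESS = {"kind": "business", "service": "risk-scoring", "sla_ms": 200}
GOVERNANCE = {"kind": "governance", "retention_days": 30,
              "allowed_operations": {"read:aggregate"}}

VFP = [CORE, BUSINESS, GOVERNANCE]

def admit(operation: str) -> bool:
    """A request is admitted only if the governance contract permits it."""
    gov = next(c for c in VFP if c["kind"] == "governance")
    return operation in gov["allowed_operations"]

print(admit("read:aggregate"))  # True
print(admit("read:raw"))        # False
```

The point of the sketch is the separation of concerns: the core contract fixes interoperability, the business contract fixes the service terms, and only the governance contract gates what may actually be done with the data.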

Capability-based security fundamentally differs from access control lists (ACLs) by shifting the focus from who has access to what they are authorized to do. Instead of centralizing permissions based on user or group identity, capability-based systems utilize unforgeable ‘capabilities’ (essentially tokens) that explicitly grant permission to perform a specific action on a particular resource. These capabilities are transferable and can be delegated, allowing for fine-grained access control. Critically, capabilities are time-bound, automatically expiring after a pre-defined duration, which minimizes the impact of compromised or stolen tokens and enforces the principle of least privilege. Verification relies on cryptographic signatures ensuring the capability’s authenticity and integrity, without requiring a central authorization server for every access request.
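A minimal sketch of such a token follows, with illustrative claim names and HMAC standing in for the asymmetric signatures a real issuer would use. It shows the two checks the paragraph describes: signature validity and time-bound, per-resource scope:

```python
# Minimal sketch of a time-bound, signed capability token. HMAC is a
# stand-in for asymmetric signatures; all names are illustrative.
import hmac, hashlib, json, time

SECRET = b"issuer-signing-key"  # placeholder for the issuer's private key

def issue(resource: str, action: str, ttl_s: int) -> dict:
    claims = {"resource": resource, "action": action,
              "exp": time.time() + ttl_s}
    body = json.dumps(claims, sort_keys=True).encode()
    return {"claims": claims,
            "sig": hmac.new(SECRET, body, hashlib.sha256).hexdigest()}

def verify(token: dict, resource: str, action: str) -> bool:
    body = json.dumps(token["claims"], sort_keys=True).encode()
    good = hmac.compare_digest(
        token["sig"], hmac.new(SECRET, body, hashlib.sha256).hexdigest())
    c = token["claims"]
    return (good and c["exp"] > time.time()
            and c["resource"] == resource and c["action"] == action)

tok = issue("dataset:orders", "read", ttl_s=60)
print(verify(tok, "dataset:orders", "read"))   # True
print(verify(tok, "dataset:orders", "write"))  # False: capability mismatch
```

Because the signature covers the claims, a holder cannot widen the scope or extend the expiry without invalidating the token, which is what makes the capability unforgeable.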

This fragment of the policy.json file demonstrates the core elements (operation sets, capability profiles, default assignments, caveats, and metadata) used by FCaC to define and manage system capabilities.
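Since the figure itself is not reproduced here, a hypothetical fragment containing those five elements might look like the following; every value is illustrative rather than taken from the paper's policy.json:

```python
# Hypothetical policy structure mirroring the five elements named in the
# caption above. Values are invented for illustration.
import json

policy = {
    "metadata": {"version": "1.0", "issuer": "org-a"},
    "operation_sets": {"read_ops": ["read", "list"]},
    "capability_profiles": {"analyst": {"operations": "read_ops"}},
    "default_assignments": {"new_members": "analyst"},
    "caveats": {"analyst": {"max_ttl_s": 3600}},
}
print(json.dumps(policy, indent=2))
```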

Anchoring Trust in a Fragmented Landscape

ā€œKey Your Organizationā€ utilizes X.509 certificates as the foundation for establishing a root of trust, enabling the issuance of verifiable credentials to each participating entity. These digital certificates, conforming to the PKI standard, cryptographically bind a public key to an identity, allowing for authentication and non-repudiation. Verifiable credentials, issued based on these certificates, provide a digitally signed assertion of attributes about the credential holder. This system ensures unique identification of each participant through the certificate’s subject field and allows for selective disclosure of verified information, bolstering trust and enabling secure interactions without relying on centralized authorities.

Trusted Execution Environments (TEEs) enhance the security of verifiable credential processes by providing a hardware-isolated execution environment. These environments, such as ARM TrustZone and Intel SGX, protect cryptographic keys used for signing and verifying credentials from software-based attacks. Sensitive operations, including key generation, storage, and cryptographic calculations, are performed within the TEE, minimizing the attack surface and preventing unauthorized access even if the main operating system is compromised. This isolation ensures the integrity and confidentiality of credentials and strengthens the overall trust framework by providing a robust root of trust for cryptographic material.

Envelope Capability Tokens utilize the JSON Web Token (JWT) standard to securely convey authorization claims. These tokens encapsulate specific capabilities, defining precisely what actions a credential holder is permitted to perform. This approach moves beyond broad permissions, enabling fine-grained authorization where access is granted on a per-operation basis. The JWT structure includes digitally signed claims, ensuring both authenticity and integrity of the capability being asserted. This encapsulation minimizes the scope of potential compromise; even if a token is intercepted, its limited scope restricts the attacker’s ability to perform unauthorized actions beyond the defined capability.
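The JWT shape described above can be sketched with only the standard library. This uses HS256 (symmetric HMAC), the simplest JWT algorithm, purely for self-containment; a deployed ECT would more plausibly use asymmetric signatures, and the `cap` claim name is an assumption, not the paper's schema:

```python
# Sketch of a JWT-shaped Envelope Capability Token: base64url-encoded
# header, payload, and signature joined by dots, signed with HS256.
import base64, hashlib, hmac, json, time

KEY = b"issuer-key"  # placeholder signing key

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def mint_ect(capability: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(
        {"cap": capability, "exp": int(time.time()) + 300}).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(KEY, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def cap_of(token: str) -> str:
    """Return the capability claim, rejecting tokens with bad signatures."""
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = b64url(hmac.new(KEY, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    pad = payload + "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(pad))["cap"]

token = mint_ect("invoke:model-train")
print(cap_of(token))  # invoke:model-train
```

Because the signature covers header and payload together, any attempt to broaden the `cap` claim in transit is detected at verification time, which is what limits the blast radius of an intercepted token.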

The FCaC cryptographic trust chain leverages Key Your Organization (KYO), Envelope Capability Tokens (ECTs), and Proof-of-Possession (PoP) to establish secure and verifiable trust.

Encoding Sovereignty in a Dynamic System

Federated Computing as Code defines an architectural discipline focused on explicitly representing and mechanically enforcing sovereignty constraints (the limitations on data access and processing dictated by legal, regulatory, or organizational policies) through the use of verifiable artifacts. These artifacts, which can include policies, access control lists, and cryptographic attestations, are treated as code, allowing for automated verification and enforcement at runtime. This approach shifts from relying on trust between federated parties to a system based on cryptographic proofs and auditable evidence, ensuring that data remains within defined boundaries and under the control of its owner. The “as code” element enables versioning, testing, and continuous integration/continuous delivery (CI/CD) practices to be applied to sovereignty constraints, improving reliability and facilitating updates to policies as needed.

Proof of Possession (PoP) is a security mechanism that establishes cryptographic proof of access to a private key associated with a token, without revealing the key itself. Specifically, techniques like Demonstrating Proof of Possession (DPoP) allow a client to cryptographically sign requests using a private key, generating a proof that demonstrates ownership of the corresponding public key, and thus authorization to use the associated token, for each individual request. This is achieved through the creation of a digital signature attached to the request, verifiable by the relying party using the public key. The scope of the proof is limited to that specific request, preventing replay attacks and mitigating the risk of token theft, as possession of the token alone is insufficient for authorization.
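A DPoP-style proof can be sketched as follows. Real DPoP proofs (RFC 9449) are JWTs signed with the client's private key; here HMAC with a client-held secret stands in so the sketch needs only the standard library, while keeping the characteristic `htm`/`htu`/`jti`/`iat` claims and a single-use replay check:

```python
# Sketch of DPoP-style per-request proof of possession. HMAC is a stand-in
# for the client's asymmetric key; the replay cache is a plain set.
import hashlib, hmac, json, time, uuid

CLIENT_SECRET = b"client-held-key"  # stand-in for the client's private key
seen_jti = set()                    # server-side replay cache

def make_proof(method: str, url: str) -> dict:
    claims = {"htm": method, "htu": url,
              "jti": str(uuid.uuid4()), "iat": int(time.time())}
    body = json.dumps(claims, sort_keys=True).encode()
    return {"claims": claims,
            "sig": hmac.new(CLIENT_SECRET, body, hashlib.sha256).hexdigest()}

def check_proof(proof: dict, method: str, url: str) -> bool:
    body = json.dumps(proof["claims"], sort_keys=True).encode()
    ok = hmac.compare_digest(
        proof["sig"],
        hmac.new(CLIENT_SECRET, body, hashlib.sha256).hexdigest())
    c = proof["claims"]
    if not (ok and c["htm"] == method and c["htu"] == url
            and c["jti"] not in seen_jti):
        return False
    seen_jti.add(c["jti"])  # each proof is single-use
    return True

p = make_proof("POST", "https://org-b.example/compute")
print(check_proof(p, "POST", "https://org-b.example/compute"))  # True
print(check_proof(p, "POST", "https://org-b.example/compute"))  # False (replay)
```

Binding the proof to the HTTP method and URL, and rejecting reused `jti` values, is what confines a stolen proof to a single already-spent request.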

The integration of Federated Computing as Code, Proof of Possession, and cryptographic token binding allows for secure cross-organizational data exchange governed by defined policies. This approach establishes verifiable trust by cryptographically linking requests to the entity possessing the associated key, as demonstrated through technologies like DPoP. Consequently, interactions are not only secured via established cryptographic methods, but also auditable through the verifiable artifacts generated, providing a clear record of access and data flow. This enables organizations to confidently share data and collaborate while maintaining sovereignty and adherence to pre-defined governance rules.

Federated Computing as Code (FCaC) establishes identity, defines authorization, and binds actions under verifiable authority using locally held artifacts.

Towards an Interoperable Future of Sovereign Systems

The evolving landscape of data collaboration is witnessing a fundamental shift, moving away from centralized data silos towards a model where organizations can securely interact without relinquishing control over their information. This new paradigm enables entities to participate in shared initiatives such as joint research, supply chain optimization, or fraud detection while maintaining complete sovereignty over their datasets. Instead of transferring data across organizational boundaries, computations are brought to the data, allowing insights to be generated without ever exposing the raw information itself. This approach not only addresses growing privacy concerns and regulatory requirements, like GDPR, but also unlocks the potential for innovation by fostering trust and enabling previously impossible collaborations, ultimately redefining how organizations share value and build partnerships in a data-driven world.

The emergence of Federated Computing as Code is catalyzing a shift towards decentralized, collaborative business models. This approach treats the rules governing data sharing and computation as executable code, fostering trust through transparency and verifiability. Rather than relying on centralized intermediaries, organizations can now define precisely how their data is used and combined with others’, unlocking new revenue streams from data assets without relinquishing control. This paradigm enables secure data ecosystems where value is created through interoperability, allowing for innovations like collaborative research, supply chain optimization, and personalized services, all built on a foundation of cryptographic guarantees and automated policy enforcement. Ultimately, it transforms data from a potential liability into a dynamic, trusted resource for growth and collaboration.

The advent of federated systems necessitates a fundamental shift in how policies are enforced and data integrity is maintained, increasingly relying on cryptographic governance. This approach moves beyond traditional, centralized control mechanisms by leveraging cryptographic proofs and verifiable computation to ensure adherence to pre-defined rules across participating organizations. Instead of relying on trust between systems, cryptographic governance establishes trust in the system itself, through mathematically verifiable guarantees. Data access, modification, and sharing become subject to cryptographic conditions, meaning actions are permitted only if accompanied by valid proofs demonstrating policy compliance. This not only strengthens security against malicious actors, but also provides auditable trails and facilitates automated enforcement, creating a robust framework for collaboration where data sovereignty is paramount and integrity is continuously assured.

The pursuit of decentralized authorization, as detailed in this exploration of Federated Computing as Code, inherently acknowledges the transient nature of control. Systems, much like any complex entity, inevitably evolve and adapt. As John von Neumann observed, “The best way to predict the future is to invent it.” This sentiment resonates deeply with the core concept of FCaC, which proactively designs for data sovereignty rather than reacting to its erosion. The architecture doesn’t seek to prevent change, but to manage it through cryptographically verifiable artifacts at execution boundaries, allowing systems to age gracefully while maintaining integrity and enabling secure collaboration. Sometimes, observing and preparing for the process is more valuable than attempting to halt it altogether.

What Lies Ahead?

The discipline of Federated Computing as Code, as presented, isn’t a destination, but a versioning strategy for distributed trust. Every boundary admission, every proof-carrying artifact, represents a negotiation with the arrow of time, an attempt to constrain entropy. The elegance of capability-based security, codified and cryptographically governed, merely delays the inevitable refactoring required as system contexts inevitably diverge. The true measure of its success won’t be initial deployment, but graceful degradation: how well these sovereign systems accommodate obsolescence.

Current limitations are not technical, but conceptual. FCaC assumes a clarity of intent regarding data sovereignty that rarely exists in practice. The boundaries themselves, though technically enforceable, are still defined by human actors, each with their own evolving understanding of ‘ownership’ and ‘control’. Future work must address the meta-problem of boundary definition: a kind of recursive sovereignty, where the rules governing boundary creation are themselves subject to cryptographic governance.

The field’s trajectory isn’t toward ‘solving’ data sovereignty, a static ideal, but toward building systems that remember their provenance and constraints. Each artifact becomes a historical record, a testament to past agreements. This memory, encoded in cryptographic form, is not merely about security, but about accountability: a way to trace the lineage of data and the evolution of trust across distributed landscapes. The challenge isn’t to eliminate risk, but to build systems that can gracefully absorb and adapt to its constant presence.


Original article: https://arxiv.org/pdf/2603.17331.pdf

Contact the author: https://www.linkedin.com/in/avetisyan/


2026-03-19 15:59