Author: Denis Avetisyan
A new analysis shows that upgrading the 5G core network with post-quantum cryptography is feasible and won’t significantly impact performance.

This review details the implementation of quantum-resistant algorithms – including KEMs and digital signatures – within the 5G core network to safeguard against future quantum computing attacks.
The looming threat of quantum computing necessitates a proactive shift in cryptographic standards, yet assessing the practical implications of this transition within complex network infrastructure remains a significant challenge. This paper, ‘Post-Quantum Cryptography in the 5G Core’, investigates the feasibility of replacing conventional cryptographic algorithms with post-quantum alternatives within the 5G core network. Our simulations demonstrate that deploying these new algorithms introduces a measurable, though modest, performance overhead without substantially impacting network usability or function efficiency. Will this pave the way for widespread adoption of quantum-resistant protocols in future generations of wireless communication?
The Looming Quantum Disruption: A System’s Inevitable Cascade
The foundation of secure online transactions, data protection, and digital communication rests heavily on public-key cryptography, most notably systems like RSA and Diffie-Hellman. These algorithms enable secure exchange of information by employing a pair of mathematically linked keys – a public key for encryption, widely distributed, and a private key, kept secret by the owner. This asymmetry allows anyone to encrypt a message for a recipient, but only the holder of the corresponding private key can decrypt it. From securing HTTPS connections when browsing the web and protecting email confidentiality, to facilitating secure financial transactions and enabling virtual private networks, these cryptographic methods are interwoven into the fabric of modern digital life. Their widespread adoption, however, creates a single point of vulnerability: a practical break of these systems would have cascading consequences for global cybersecurity.
Shor’s algorithm, developed by mathematician Peter Shor in 1994, represents a fundamental challenge to the security of widely-used public-key cryptographic systems. The best known classical algorithms require super-polynomial time to factor large numbers – the basis of RSA encryption – rendering the problem intractable at practical key sizes. However, Shor’s algorithm, leveraging the principles of quantum mechanics such as superposition and quantum entanglement, can achieve factorization in polynomial time. This means a sufficiently powerful quantum computer – one with enough stable qubits – could break RSA encryption in a timeframe rendering current security measures ineffective. The algorithm’s efficiency stems from its ability to exploit the wave-like properties of qubits to simultaneously explore numerous potential factors, dramatically reducing the computational effort. While building such a quantum computer remains a significant technological hurdle, the theoretical existence of Shor’s algorithm necessitates a preemptive transition to quantum-resistant cryptographic alternatives.
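To see why order-finding breaks RSA, consider the classical reduction that Shor's algorithm exploits: factoring N reduces to finding the multiplicative order r of a random base a modulo N. A quantum computer finds r in polynomial time; the sketch below (an illustration, not from the paper) finds it by brute force for a tiny modulus, then recovers a factor from it.

```python
# Illustrative sketch: Shor's algorithm reduces factoring N to finding the
# multiplicative order r of a random base a mod N. Here r is found by brute
# force for a tiny N, to show how a factor is recovered once r is known.
import math
import random

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n), found classically by iteration."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_demo(n: int) -> int:
    """Recover a nontrivial factor of an odd composite n via order-finding."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                      # lucky: a already shares a factor
        r = order(a, n)                   # the step a quantum computer speeds up
        if r % 2:                         # need an even order
            continue
        y = pow(a, r // 2, n)
        if y == n - 1:                    # trivial square root of 1; retry
            continue
        return math.gcd(y - 1, n)         # gcd yields a nontrivial factor

factor = shor_classical_demo(15)          # 15 = 3 * 5
print(factor)
```

The classical loop above takes exponential time in the bit-length of N; replacing `order` with quantum period-finding is precisely what makes the whole procedure polynomial.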
The anticipated arrival of quantum computing compels a fundamental reassessment of contemporary cryptographic practices. Current public-key systems, relied upon for secure communication and data protection, are increasingly vulnerable to algorithms like Shor’s, which, when executed on a sufficiently powerful quantum computer, can efficiently factor large numbers and solve discrete logarithms – the mathematical foundations of RSA and Diffie-Hellman encryption. Consequently, the cybersecurity community is actively developing and standardizing post-quantum cryptography (PQC) – a suite of algorithms believed to be resistant to both classical and quantum attacks. These emerging methods, encompassing lattice-based cryptography, code-based cryptography, multivariate cryptography, and hash-based signatures, represent a crucial step towards ensuring long-term data security in a post-quantum world, and their widespread implementation is no longer a matter of future preparedness but an immediate necessity to safeguard sensitive information against evolving threats.

Building the Bastion: A New Foundation for Secure Exchange
Post-Quantum Cryptography (PQC) addresses the potential threat posed by quantum computers to currently deployed public-key cryptosystems, such as RSA and ECC. These algorithms rely on the computational hardness of mathematical problems that are efficiently solvable by quantum algorithms, specifically Shor’s algorithm. PQC seeks to develop cryptographic algorithms that are resistant to attacks from both classical and quantum computers. This is achieved by basing security on mathematical problems believed to be hard for both types of computers, such as lattice-based cryptography, code-based cryptography, multivariate cryptography, and hash-based signatures. The goal is not to replace symmetric-key algorithms, which are not directly threatened by quantum computers, but to provide quantum-resistant alternatives for asymmetric cryptography, ensuring continued confidentiality, integrity, and authentication of digital communications and data.
Post-Quantum Cryptography (PQC) relies fundamentally on two cryptographic primitives: Key Encapsulation Mechanisms (KEMs) and Digital Signature Algorithms. KEMs are used to establish shared secret keys between parties, functioning by encapsulating a symmetric key using public key encryption and providing the encapsulated key to the intended recipient. Digital Signature Algorithms, conversely, provide authentication and non-repudiation by allowing a party to digitally sign data, verifying both the origin and integrity of the message. These algorithms differ from traditional public-key cryptography, which is vulnerable to attacks from quantum computers utilizing algorithms like Shor’s algorithm. The selection and standardization of robust KEMs and digital signature schemes are critical for building a future-proof cryptographic infrastructure resilient against both classical and quantum threats.
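The KEM contract described above – keygen, encapsulate, decapsulate – can be sketched with a classical Diffie-Hellman instantiation. This is an illustration of the API shape shared by post-quantum KEMs such as ML-KEM; the modulus and generator below are toy values chosen for brevity, not a secure construction.

```python
# Sketch of the KEM interface (keygen / encapsulate / decapsulate) using a
# classical Diffie-Hellman instantiation. Toy parameters, illustration only.
import hashlib
import secrets

P = 2**255 - 19   # a known prime (Curve25519's field prime), used here only as a toy modulus
G = 2

def keygen():
    sk = secrets.randbelow(P - 2) + 1
    pk = pow(G, sk, P)
    return sk, pk

def encapsulate(pk: int):
    """Return (ciphertext, shared_secret) derived from the recipient's pk."""
    eph = secrets.randbelow(P - 2) + 1
    ct = pow(G, eph, P)                       # ephemeral public value
    ss = hashlib.sha256(pow(pk, eph, P).to_bytes(32, "big")).digest()
    return ct, ss

def decapsulate(sk: int, ct: int) -> bytes:
    """Recover the same shared secret from the ciphertext and private key."""
    return hashlib.sha256(pow(ct, sk, P).to_bytes(32, "big")).digest()

sk, pk = keygen()
ct, ss_sender = encapsulate(pk)
ss_receiver = decapsulate(sk, ct)
assert ss_sender == ss_receiver               # both sides hold the same key
```

A post-quantum KEM exposes exactly this three-function interface, which is why protocols can swap the underlying hard problem without changing the surrounding handshake logic.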
Several algorithms have been evaluated as potential standards for post-quantum cryptography. FrodoKEM and BIKE are Key Encapsulation Mechanisms (KEMs): FrodoKEM is based on the plain Learning With Errors (LWE) problem, while BIKE is a code-based scheme built on quasi-cyclic moderate-density parity-check (QC-MDPC) codes; the two offer differing performance characteristics and security assumptions. ML-DSA and SLH-DSA are digital signature schemes: ML-DSA (derived from CRYSTALS-Dilithium) is based on module lattices, while SLH-DSA (derived from SPHINCS+) is a stateless hash-based design; both were standardized by NIST in 2024 as FIPS 204 and FIPS 205, respectively. Falcon is a lattice-based digital signature algorithm designed for compact signature sizes and efficient verification. These algorithms have undergone rigorous analysis by the National Institute of Standards and Technology (NIST) as part of its post-quantum cryptography standardization process, with assessments focusing on security, performance, and implementation feasibility.

5G as a Testbed: Integrating Resilience into the Core
The 5G Core network fundamentally depends on secure communication protocols to establish trust and protect data in transit. Transport Layer Security (TLS) and Internet Protocol Security (IPsec) are integral to multiple network functions, including authentication, session management, and data encryption between network elements. Specifically, TLS is used for securing control plane signaling, while IPsec often secures user plane data transmission. These protocols rely on asymmetric cryptography for key exchange and digital signatures, creating vulnerabilities if the underlying algorithms are compromised. The widespread adoption of these protocols within the 5G architecture means their security is paramount to the overall integrity and confidentiality of the network.
The continued reliance of 5G Core networks on TLS and IPsec for secure communication necessitates proactive mitigation of potential threats posed by advancements in quantum computing. Current implementations of these protocols utilize public-key cryptographic algorithms, such as RSA and ECC, which are vulnerable to attacks from sufficiently powerful quantum computers. Replacing these algorithms with post-quantum cryptographic (PQC) alternatives, like those integrated into KEMTLS, is therefore essential for maintaining long-term confidentiality and integrity. KEMTLS, for example, replaces the signature-based authentication of the TLS handshake with post-quantum Key Encapsulation Mechanisms (KEMs), authenticating the server implicitly through its ability to decapsulate against its long-term key, thereby hardening the handshake against quantum-based decryption and forgery attacks. This proactive replacement is crucial: while the timeline for practical quantum computing capabilities remains uncertain, the potential for “store now, decrypt later” attacks necessitates immediate action to protect sensitive data transmitted over 5G networks.
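The KEMTLS idea can be sketched in a few lines: the server never signs anything during the handshake; instead, the client encapsulates to the server's certified long-term KEM key, and only the genuine server can decapsulate and derive the session key. The toy DH-based KEM below stands in for a post-quantum KEM; this is an assumed, simplified message flow, not the full protocol.

```python
# Simplified KEMTLS-style flow: server authentication is implicit (via
# decapsulation against a long-term KEM key) rather than signature-based.
# The DH-based toy KEM stands in for a post-quantum KEM; illustration only.
import hashlib
import secrets

P, G = 2**255 - 19, 2   # toy modulus and generator, not a real group choice

def kem_keygen():
    sk = secrets.randbelow(P - 2) + 1
    return sk, pow(G, sk, P)

def kem_encap(pk):
    e = secrets.randbelow(P - 2) + 1
    return pow(G, e, P), hashlib.sha256(pow(pk, e, P).to_bytes(32, "big")).digest()

def kem_decap(sk, ct):
    return hashlib.sha256(pow(ct, sk, P).to_bytes(32, "big")).digest()

# Server's long-term KEM keypair; the public half would sit in its certificate.
srv_sk, srv_pk = kem_keygen()

# 1. Client sends an ephemeral KEM public key for forward secrecy.
cli_esk, cli_epk = kem_keygen()

# 2. Server encapsulates to the client's ephemeral key...
ct_e, ss_e_server = kem_encap(cli_epk)
ss_e_client = kem_decap(cli_esk, ct_e)

# 3. ...and the client encapsulates to the server's certified long-term key.
ct_s, ss_s_client = kem_encap(srv_pk)
ss_s_server = kem_decap(srv_sk, ct_s)    # only the real server can do this

# 4. Both secrets feed the session key; deriving it proves server identity.
session_client = hashlib.sha256(ss_e_client + ss_s_client).digest()
session_server = hashlib.sha256(ss_e_server + ss_s_server).digest()
assert session_client == session_server
```

Avoiding online signatures matters for PQC deployment because post-quantum signatures are typically much larger than post-quantum KEM ciphertexts, so this substitution trims handshake bandwidth.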
Performance evaluations conducted within the 5G Core network demonstrated that integrating post-quantum cryptographic algorithms yields minimal impact on network usability. Testing focused on Key Encapsulation Mechanisms (KEMs) and digital signature schemes, revealing performance differences of less than 200 milliseconds when compared to conventional cryptographic implementations. These results indicate that transitioning to post-quantum cryptography within the 5G Core can be achieved without introducing substantial latency or disrupting existing network operations, suggesting a feasible path toward quantum-resistant security.
The National Institute of Standards and Technology (NIST) plays a critical role in the adoption of post-quantum cryptography (PQC) by establishing standardized algorithms for secure communication. NIST’s ongoing standardization process involves rigorous evaluation of candidate PQC algorithms based on security, performance, and implementation characteristics. This process is designed to identify algorithms suitable for long-term security against both classical and quantum computing attacks. Successful completion of standardization, as evidenced by the selection of algorithms for protocols like TLS and IPsec, is essential for ensuring interoperability between different systems and fostering trust in the security of 5G infrastructure. Without standardized algorithms, widespread adoption would be hindered by compatibility issues and a lack of confidence in the security assurances provided by various implementations.
Beyond Immediate Threats: A System’s Inevitable Evolution
The escalating threat of “Harvest Now, Decrypt Later” attacks fundamentally shifts the perspective on cryptographic security. This model posits that malicious actors are actively collecting encrypted data today, anticipating the future availability of quantum computers capable of breaking currently used public-key algorithms. Consequently, long-term data confidentiality isn’t merely about the strength of encryption today, but about its resilience against decryption capabilities that may emerge decades from now. This necessitates a proactive approach, prioritizing cryptographic agility – the ability to quickly and efficiently transition to new, more secure algorithms as threats evolve. Organizations must move beyond solely focusing on immediate security needs and instead implement systems designed for sustained cryptographic protection, acknowledging that data lifecycles often extend far beyond the current technological horizon. The potential for retroactive decryption demands a forward-looking strategy, emphasizing algorithm diversification and streamlined key management to safeguard data against future computational advances.
Hybrid cryptography represents a strategically balanced approach to securing digital infrastructure during the ongoing transition to post-quantum cryptography. Rather than immediately abandoning well-established classical algorithms – those like RSA and Elliptic Curve Cryptography currently safeguarding much of the internet – this method layers post-quantum algorithms alongside them. This creates a system where communication is protected by at least one secure algorithm, even if a future quantum computer were to break the classical components. The benefit lies in minimizing disruption; systems can adopt post-quantum defenses incrementally, and if a post-quantum algorithm proves vulnerable before widespread adoption, the classical layer continues to provide security. This pragmatic strategy acknowledges the current uncertainties surrounding post-quantum standardization and implementation, offering a robust and adaptable path toward long-term cryptographic resilience.
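The layered protection described above is usually realized by a key combiner: the session key is derived from the concatenation of a classical shared secret and a post-quantum shared secret, so it stays safe while either input remains unbroken. The sketch below is an assumed HKDF-style construction in the spirit of draft hybrid TLS designs, not a specific standard.

```python
# Sketch of a hybrid key combiner: extract-then-expand (HKDF-style) over the
# concatenation of a classical and a post-quantum shared secret. Assumed
# construction for illustration, not a specific standardized combiner.
import hashlib
import hmac

def hybrid_combine(ss_classical: bytes, ss_postquantum: bytes,
                   context: bytes = b"hybrid-handshake") -> bytes:
    """Derive one session key; secure while either input secret holds."""
    ikm = ss_classical + ss_postquantum                   # combined key material
    prk = hmac.new(b"\x00" * 32, ikm, hashlib.sha256).digest()        # extract
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()  # expand

# Example with stand-in secrets, e.g. from an ECDH exchange and an ML-KEM
# encapsulation (placeholder byte strings, not real protocol outputs).
key = hybrid_combine(b"\x11" * 32, b"\x22" * 32)
assert len(key) == 32
```

Because the combiner is a one-way derivation over both inputs, an attacker who later breaks the classical exchange still learns nothing about the session key without also breaking the post-quantum KEM.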
Recent performance analysis reveals a notable increase in network registration times as user equipment (UE) density grows. Specifically, a 200-millisecond jump was observed in the 95th percentile registration duration when 40 UEs were simultaneously active, indicating potential bottlenecks within queuing mechanisms as the system scales. Furthermore, investigations into post-quantum cryptographic algorithms demonstrated a significant divergence in outlier – or 100th percentile – registration times compared to conventional algorithms; this suggests that while typical performance remains comparable, a small fraction of users might experience considerably slower connections, highlighting a need for optimization to ensure consistently smooth network access for all.
The longevity of cryptographic security hinges on the underlying mathematical principles governing algorithms. Unlike empirical security – where robustness is demonstrated through surviving repeated attacks – algorithms founded on well-established mathematical problems offer sustained resilience. Structures like module lattices, a complex area of abstract algebra, and stateless hash-based signatures, which derive their security solely from the one-wayness of hash functions, represent this approach. These aren’t simply code implementations; they are embodiments of mathematical hardness assumptions, meaning a successful attack would require breaking the underlying mathematics itself – a significantly higher barrier than exploiting a software vulnerability. Consequently, algorithms leveraging these foundations, such as those being developed for post-quantum cryptography, offer a more enduring defense against future computational advances and the potential for cryptanalysis, ultimately ensuring long-term data confidentiality and integrity.
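The hash-based approach can be made concrete with the classic Lamport one-time signature, the primitive family underlying stateless designs like SLH-DSA: its security rests only on the one-wayness of the hash function. The sketch below is a minimal textbook construction (one keypair must sign at most one message), not the SLH-DSA algorithm itself.

```python
# Minimal Lamport one-time signature: the textbook hash-based primitive
# behind schemes like SLH-DSA. Each keypair may sign at most one message.
import hashlib
import secrets

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random preimages; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(sk, msg: bytes):
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return [sk[i][bit] for i, bit in enumerate(bits)]   # reveal one preimage per bit

def verify(pk, msg: bytes, sig) -> bool:
    digest = H(msg)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(bits))

sk, pk = keygen()
sig = sign(sk, b"5G core handshake transcript")
assert verify(pk, b"5G core handshake transcript", sig)
assert not verify(pk, b"tampered", sig)
```

Forging a signature here requires inverting SHA-256 on at least one unrevealed preimage; schemes like SLH-DSA layer many such one-time keys into a tree so that a single public key can sign many messages statelessly.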
The pursuit of post-quantum cryptography within the 5G core reveals a familiar pattern. The article details a transition, an adaptation of existing structures rather than a wholesale creation. This echoes a fundamental truth: systems aren’t built, they evolve. The demonstrated minimal performance impact of integrating post-quantum algorithms isn’t a victory of engineering, but a testament to the inherent resilience of well-designed networks. As Ada Lovelace observed, “The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform.” This aligns perfectly with the work; the 5G core doesn’t become quantum-resistant, it’s ordered to be, utilizing known cryptographic principles adapted for a new threat landscape. The study foresees the eventual necessity of this adaptation, acknowledging that even robust architectures are subject to entropy and require proactive evolution.
What Lies Ahead?
The demonstrated feasibility of integrating post-quantum cryptography into the 5G core offers little comfort. It merely postpones the inevitable reckoning with entropy. One successfully replaces one set of assumptions with another, trading known vulnerabilities for the unknown. The performance impact, while presently acceptable, is a fleeting victory; the network will invariably grow, demands will escalate, and optimization will once more become a frantic chase after diminishing returns.
The focus now shifts, predictably, to standardization and deployment. But to believe this is about selecting the ‘right’ algorithms is to fundamentally misunderstand the nature of systems. Technologies change, dependencies remain. The true challenge isn’t mathematical elegance, but the logistical nightmare of key management at scale, and the quiet, persistent threat of implementation errors – the flaws that no theorem can foresee.
One might envision a future of ‘crypto-agility’, of dynamically swapping algorithms as threats evolve. A charming notion, but architecture isn’t structure – it’s a compromise frozen in time. Such flexibility will introduce its own fragility, a cascade of potential failures masked by layers of abstraction. The network will adapt, yes, but not through design, rather through the slow, relentless pressure of necessity.
Original article: https://arxiv.org/pdf/2512.20243.pdf
Contact the author: https://www.linkedin.com/in/avetisyan/
2025-12-24 07:15