
Data Encrypted Today, Breached Tomorrow: "Harvest Now, Decrypt Later" and the Time Dimension of Data Protection Rights

Giovanni Piccirillo

Apr 20, 2026



An intelligence service intercepting the encrypted communications of a foreign ministry today doesn't need to break the encryption in real time: it simply archives the traffic and waits for quantum computers, in ten or twenty years, to render the underlying mathematics obsolete. The cost of this operation, given the current price of large-scale storage, is negligible. The potential information gain, however, could be enormous. This asymmetry, simple in logic and devastating in its implications, has a precise name: "Harvest Now, Decrypt Later" (HNDL). And it's already underway.

A paradigm that rewrites the premises of law

The right to personal data protection is fundamentally built on a present-day approach. Security measures are assessed at the time they are adopted; regulatory compliance is assessed against the current state of the art; damage is detected, reported, and sanctioned when it occurs. The General Data Protection Regulation (GDPR, Regulation (EU) 2016/679) consistently follows this approach: Article 32 requires data controllers and processors to implement technical and organizational measures "appropriate" to the risk, but the assessment of adequacy is, in established regulatory practice, anchored to the current situation.

The HNDL paradigm challenges this fundamental premise. It introduces a temporal dimension into the very concept of data security: protection is no longer a matter of maintaining a defensive perimeter today, but of ensuring cryptographic resilience throughout the entire lifespan of the information. For health data, which can be retained for forty years or more in various European jurisdictions, or for genomic data, which retains its sensitivity for generations, this horizon extends far beyond any reasonable forecast of short-term technological developments. The question that legislators and regulators have not yet systematically addressed is this: can data that is predictably vulnerable tomorrow be considered secure today?

The threat that cannot be seen

What makes HNDL structurally different from any other known cyber threat is its invisibility. A ransomware attack manifests itself: systems crash, data becomes inaccessible, and the affected organization suffers immediately perceptible damage. A traditional data breach leaves traces: anomalous access logs, detectable network exfiltrations, notifications that must be sent to supervisory authorities within seventy-two hours pursuant to Article 33 of the GDPR. HNDL leaves none of this. The adversary simply passively copies the encrypted traffic passing through public or semi-public infrastructures, without altering anything, without triggering any alarms, and without causing any immediately verifiable damage. The breach has already occurred, but its concrete effect—the loss of confidentiality of information—has been deferred indefinitely.

The U.S. National Security Agency (NSA), the Cybersecurity and Infrastructure Security Agency (CISA), and the European Union Agency for Cybersecurity (ENISA) have published technical advisories explicitly recognizing this threat as operational and not speculative (CISA, "Preparing for Post-Quantum Cryptography," 2022; ENISA, "Post-Quantum Cryptography: Current State and Quantum Mitigation," 2021). State actors with significant passive interception capabilities, able to access global communications transit nodes, have every rational incentive to conduct systematic encrypted traffic collection campaigns. The strategic calculation is elementary: the marginal cost of storing a terabyte of encrypted data is currently less than a few cents; the potential information value of that data, once decipherable, may be incalculable. HNDL is, in essence, a near-zero-cost financial option on future intelligence.

The most disturbing aspect, from a legal perspective, is that the incident notification and response system established by the GDPR is structurally inadequate to detect this threat. The data controller will never know, directly and promptly, that their encrypted communications have been intercepted and archived. The seventy-two-hour deadline for notifying the supervisory authority will never begin to run. Data subjects will never receive notification of the breach. The entire enforcement architecture built around the notification requirement is being circumvented not because the system is flawed, but because the threat operates on a timescale not contemplated by the legislator.

Quantum computing and the end of classical cryptography

To understand why HNDL represents a real threat rather than a technological dystopia, it is necessary to understand the mechanics of the cryptographic vulnerability that makes it possible. Modern asymmetric cryptography, which underpins nearly all secure digital communications, relies on assumptions of computational difficulty: certain mathematical problems, such as factoring very large integers or calculating discrete logarithms on elliptic curves, are virtually unsolvable by any classical computer within relevant time horizons. RSA, the elliptic curve cryptography (ECC) used in TLS, SSH, and digital signatures, and the Diffie-Hellman key exchange protocol derive their security entirely from this difficulty.

Quantum computers operate on radically different physical principles, exploiting quantum superposition, entanglement, and interference to perform certain classes of computations exponentially more efficiently than any classical machine. The algorithm formulated by mathematician Peter Shor in 1994, running on a sufficiently large quantum computer with adequate error correction, can factor arbitrarily large integers and compute discrete logarithms in polynomial time, destroying the security guarantees of RSA, ECC, and Diffie-Hellman (Shor, P.W., "Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer," SIAM Journal on Computing, 1997). A second relevant algorithm, Lov Grover's, provides a quadratic speedup for brute-force search, effectively halving the security level of symmetric keys and weakening, but not catastrophically compromising, standards like AES-256.
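The asymmetric impact of the two algorithms can be summarized in a toy lookup table. The figures below are the widely cited security-level equivalences; the `survives_quantum` helper is a hypothetical illustration, not part of any standard.

```python
# Illustrative table: effective security (in bits) of common primitives
# before and after a cryptographically relevant quantum computer (CRQC).
# Figures are the commonly cited approximations, for illustration only.
SECURITY_BITS = {
    # algorithm: (classical security bits, post-quantum security bits)
    "RSA-2048":  (112, 0),    # broken outright by Shor's algorithm
    "ECC P-256": (128, 0),    # broken outright by Shor's algorithm
    "AES-128":   (128, 64),   # Grover's quadratic speedup halves the bits
    "AES-256":   (256, 128),  # still considered adequate after Grover
}

def survives_quantum(algorithm: str, minimum_bits: int = 128) -> bool:
    """Return True if the primitive retains at least `minimum_bits` of
    security against a quantum-equipped adversary."""
    _, pq_bits = SECURITY_BITS[algorithm]
    return pq_bits >= minimum_bits

for name in SECURITY_BITS:
    print(f"{name}: survives the CRQC era: {survives_quantum(name)}")
```

The table makes the divide visible: symmetric ciphers degrade gracefully under Grover, while the asymmetric primitives that protect data in transit collapse entirely under Shor.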

The scientific debate over the timeframe for the availability of cryptographically relevant quantum computers (CRQCs) remains open. Estimates vary widely, reflecting profound uncertainties about progress in quantum error correction, qubit coherence, and processor scalability. However, uncertainty about when doesn't postpone the risk; it merely postpones knowledge of its manifestation. National investments in quantum research in the United States, China, the European Union, and the United Kingdom collectively amount to tens of billions of dollars (McKinsey Global Institute, "Quantum Technology Monitor," 2023), and roadmaps published by Google and IBM project processors of sufficient scale for experimental demonstrations within the current decade. For an organization handling data that will remain confidential for the next fifteen to twenty years, this uncertainty isn't reassuring; it's a risk factor that must be quantified and managed.
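That quantification has a well-known formalization in Michele Mosca's inequality: if the number of years the data must remain confidential (x) plus the number of years migration will take (y) exceeds the estimated number of years until a CRQC exists (z), the data is already at risk today. A minimal sketch, with hypothetical figures:

```python
def mosca_at_risk(shelf_life_years: float,
                  migration_years: float,
                  crqc_horizon_years: float) -> bool:
    """Mosca's inequality: data is at risk if the time it must remain
    confidential (x) plus the time needed to migrate to PQC (y)
    exceeds the estimated time until a CRQC arrives (z)."""
    return shelf_life_years + migration_years > crqc_horizon_years

# Hypothetical scenario: health records retained for 20 years, a
# 5-year migration programme, and a CRQC assumed to be 15 years away.
print(mosca_at_risk(20, 5, 15))  # True: the data is already exposed
```

Note that the inequality holds even under optimistic CRQC estimates whenever retention periods are long: the uncertainty about z does not rescue data whose x is measured in decades.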

The technical answer exists, but migration is a governance problem

The international cryptography community has responded to the quantum challenge with a decade-long effort of research and standardization that has produced concrete results. The field of post-quantum cryptography (PQC) involves the development of algorithms based on mathematical problems, such as lattice-based ones, which are believed to be resistant to attacks by both classical and quantum computers. Unlike quantum key distribution (QKD), which requires specialized and expensive hardware infrastructure, PQC algorithms are implementable via software on conventional computing architectures, making them the most immediately viable technical solution on an industrial scale.

The most significant moment in this process was the publication, in August 2024, of the first definitive standards by the US National Institute of Standards and Technology (NIST), concluding a standardization process initiated in 2016 with an open international call. The published standards include FIPS 203 (ML-KEM, based on CRYSTALS-Kyber) for key encapsulation mechanisms, FIPS 204 (ML-DSA, based on CRYSTALS-Dilithium), and FIPS 205 (SLH-DSA, based on SPHINCS+) for digital signatures (NIST, "Post-Quantum Cryptography Standards," August 2024). These algorithms are based on mathematical problems, particularly lattice problems and hash functions, which the current scientific consensus considers resistant to quantum attacks.

The availability of mature standards, however, does not equate to the availability of a ready-to-use solution. Cryptographic migration is one of the most complex systems engineering exercises a medium or large organization can undertake. Cryptographic algorithms are embedded in hardware, firmware, software libraries, communication protocols, and operational infrastructure at every level of the technology stack. Before migrating, an organization must develop a comprehensive cryptographic inventory, often called a "cryptographic bill of materials," to identify all dependencies on vulnerable algorithms. This process alone can take months. Subsequently replacing legacy algorithms with PQC alternatives, while maintaining interoperability and performance, represents an infrastructure transition measured in years, not weeks.
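The inventory step admits a deliberately simplified illustration. The sketch below flags mentions of quantum-vulnerable public-key algorithms in asset descriptions; the asset names and configuration strings are hypothetical, and real cryptographic bill-of-materials tooling inspects certificates, firmware, protocol negotiations, and library dependencies rather than raw text.

```python
import re

# Quantum-vulnerable public-key primitives to flag (illustrative list).
VULNERABLE = re.compile(r"\b(RSA|ECDSA|ECDH|DSA|DH)\b", re.IGNORECASE)

def inventory(assets: dict) -> dict:
    """Map each asset to the quantum-vulnerable algorithms it mentions.
    A toy first pass at a cryptographic bill of materials."""
    findings = {}
    for name, config in assets.items():
        hits = sorted({m.group(0).upper() for m in VULNERABLE.finditer(config)})
        if hits:
            findings[name] = hits
    return findings

# Hypothetical asset descriptions, for illustration only.
assets = {
    "vpn-gateway": "IKEv2 with DH group 14, auth RSA-4096",
    "web-frontend": "TLS 1.3, X25519 key exchange, ECDSA P-256 certs",
    "backup-store": "AES-256-GCM at rest, keys wrapped with RSA-2048",
}
print(inventory(assets))
```

Even this toy pass shows why the exercise takes months at scale: vulnerable primitives surface in every layer, from the VPN handshake to the key-wrapping of data at rest.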

ENISA has strongly emphasized the concept of "cryptographic agility"—the ability of a system to change cryptographic algorithms without requiring a complete rearchitecture—as a design imperative, not simply a best practice (ENISA, "Post-Quantum Cryptography: Preparation for the Quantum Era," 2022). Organizations that have not built this flexibility into their infrastructure will face a painful choice: bear retrofit costs far higher than those of adoption by design, or remain exposed for a prolonged period. The direct consequence, from a strategic planning perspective, is that the time to begin PQC migration is not when CRQCs become operational, but today.

GDPR is not blind, but regulators read it short-sightedly

The European data protection regulatory framework is not without the interpretative resources needed to address the temporal dimension of cryptographic security. The problem lies not in the lack of legal instruments, but in the enforcement inertia of supervisory authorities and the established tendency to view security obligations as static assessments, anchored to the moment of processing.

Article 32 of the GDPR requires data controllers and processors to implement technical and organizational measures appropriate to the risk, taking into account, among other factors, the "state of the art." This concept, inherently dynamic and evolving, is the most promising interpretative key for incorporating HNDL risk into the current regulatory framework. In product liability law, the "state of the art" refers to the highest level of development of a process, product, or scientific field at the relevant time. Transposed to data protection law, this concept highlights that security obligations are not static but must evolve in parallel with the evolution of the technological landscape and threats. A forward-looking reading of Article 32 would require assessing the state of the art not only with respect to current threats, but also with respect to those that are reasonably foreseeable throughout the entire data processing lifecycle.

This interpretation is supported by the principle of data protection by design and by default enshrined in Article 25 of the GDPR, which requires the controller to consider the risks to the rights and freedoms of natural persons both when determining the means of processing and during the processing itself. If an organization today decides to process health data with a retention period of fifteen years, and the risk of quantum decryption is reasonably foreseeable within that period, then the assessment of appropriate security measures should cover the entire risk window, not just the initial moment. European supervisory authorities, however, have not yet systematically implemented this interpretation. Encryption guidelines tend to point to current standards, such as AES-256 for data at rest or TLS 1.3 for data in transit, without explicitly addressing the question of how those standards will hold up throughout the data's entire lifecycle. This interpretative gap is, for now, the most significant gap in the European enforcement system.

NIS2, Cyber Resilience Act, and the European Regulatory Framework

The GDPR does not operate in isolation. In recent years, the European Union has built an increasingly complex cybersecurity regulatory architecture, within which data protection obligations are intertwined with broader security governance requirements. Two instruments are particularly relevant to the HNDL challenge.

The NIS2 Directive (EU Directive 2022/2555), which entered into force in January 2023 with a transposition deadline for Member States of October 2024, significantly expands the scope and depth of European cybersecurity obligations. Article 21 of NIS2 explicitly includes "the use of cryptography and, where appropriate, encryption" among the risk management measures required of essential and important entities. This provision, when combined with growing public documentation on quantum risks, creates a legal basis for requiring covered entities to consider and plan for the post-quantum transition in their cryptographic risk management frameworks. NIS2's emphasis on supply chain security is also directly relevant: the quantum threat does not respect organizational boundaries, and the security of encrypted communications depends on the cryptographic posture of each intermediary in the chain.

The Cyber Resilience Act (EU Regulation 2024/2847), which entered into force in 2024, introduces horizontal cybersecurity requirements for products with digital elements placed on the EU market. The regulation requires manufacturers to implement security-by-design principles and ensure that products remain secure throughout their intended lifecycle. This "lifetime" security obligation has profound implications in the quantum context: a connected device with an expected operational life of fifteen or twenty years, such as a smart meter, a medical device, or an industrial control system, will need cryptographic solutions capable of resisting quantum attacks throughout that lifetime. Products placed on the market today with cryptographic implementations vulnerable to quantum computers, and which cannot be updated via software mechanisms, may already be non-compliant with the prospective requirements of the CRA, depending on how those requirements are interpreted by regulators across jurisdictions.

The emerging regulatory landscape therefore presents a multilayered framework in which the temporal dimension of cryptographic security is already, at least implicitly, present. What is missing is explicit and authoritative guidance from supervisory authorities, ENISA, or the European Data Protection Board, capable of translating these general principles into concrete expectations for organizations.

Legal liability for deferred damages

The problem of legal liability in the HNDL context is as practically urgent as it is theoretically complex, because it forces an enforcement system built around the contemporaneity of the damage to collide with a threat whose damage is structurally deferred.

Article 83 of the GDPR provides for administrative fines of up to four percent of annual worldwide turnover for violations of Article 32, and Article 82 recognizes the right to compensation for material or non-material damage suffered as a result of a violation of the regulation. The first interpretative difficulty concerns the classification of interception in a HNDL scenario. Article 4, paragraph 12, of the GDPR defines a "personal data breach" as any breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to, personal data. On a first reading, the interception of encrypted data already constitutes, in itself, unauthorized access to the data, even if the data has not yet been decrypted. On a second reading, the relevant event is decryption, when the confidentiality of the data is effectively compromised and the harm to the data subjects becomes concrete. This temporal ambiguity has immediate consequences for notification obligations: if the controller never becomes aware that the encrypted communications it protects have been intercepted, the seventy-two-hour deadline never begins to run.

The second dimension of analysis concerns the applicable standard of care. If it is demonstrated that, at the time of processing, the risk of HNDL attacks was foreseeable and addressable through available technical measures, including the adoption of PQC algorithms or hybrid schemes, a supervisory authority could conclude that the controller's failure to comply constituted a breach of the obligation to provide security appropriate to the risk. The foreseeability of the quantum threat is increasingly documented: NIST launched its PQC standardization program in 2016; ENISA, the NSA, and national cybersecurity agencies have published guidelines; and the academic literature on the HNDL threat has proliferated. An organization that processes highly sensitive long-term data and has not conducted a quantum cryptographic risk assessment will have a difficult time arguing that the threat was not reasonably foreseeable.

The third dimension concerns causation and damages in civil actions under Article 82. A data subject seeking compensation for damages resulting from the quantum decryption of their data must demonstrate that the data controller's failure to take adequate measures caused the breach and the resulting damage. In the HNDL context, causal argumentation is complicated by the involvement of multiple actors over extended time horizons: the adversary who collected the data, the cryptographic state of the art at the time of encryption, the data controller's security choices, and the subsequent development of quantum computing capabilities. This framework recalls the doctrinal solutions developed in environmental law and product liability for delayed-onset latent damages, where traditional mechanisms for establishing causality have been adapted to address the specific nature of harm that manifests years or decades after the causal act.

The cross-border dimension adds further complexity. HNDL campaigns are more likely to be attributed to state-sponsored actors operating under foreign jurisdictions, making traditional law enforcement and civil litigation largely impractical. The primary legal response therefore lies not in ex post accountability, but in ex ante regulation: ensuring that regulatory requirements are sufficient to incentivize organizations to adopt the cryptographic measures necessary to render HNDL attacks futile, before they produce results.

What does all this mean, taken together?

The HNDL paradigm isn't a technical problem that law can delegate to engineers while waiting for the situation to be clarified. It's a structural information governance problem that requires a coordinated response at three distinct levels.

In terms of interpretation, national supervisory authorities and the European Data Protection Board should explicitly state that the notion of "state of the art" in Article 32 of the GDPR incorporates a prospective temporal dimension: for highly sensitive data processing operations with long retention periods, the adequacy of security measures must be assessed over the entire lifespan of the data, not just at the time of its initial processing. This does not require a regulatory change; it requires a regulatory will to consistently apply existing tools.

At the institutional level, the concept of "cryptographic adequacy" could emerge as a distinct regulatory standard, similar to the concept of adequacy in international data transfers under Chapter V of the GDPR, but applied to the prospective sufficiency of cryptographic protections with respect to the expected retention horizon. Sectoral authorities in finance, healthcare, and critical infrastructure, already further along in the quantum transition than data protection authorities, could be the most agile laboratory for developing these standards before they are generalized.

Organizationally, companies and institutions that process long-term sensitive data can no longer afford to treat PQC migration as a matter of routine monitoring. They must conduct a harvest risk assessment that cross-references the categories of data processed with their respective retention periods and estimated time horizons for CRQC availability. They must build, or acquire, cryptographic agility in their infrastructure. Where technically mature, they must adopt hybrid schemes that combine classical and PQC algorithms, providing protection against both classes of attacks during the transition.
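The logic of a hybrid scheme can be sketched in its simplest form: the shared secrets from a classical key exchange and from a PQC key encapsulation both feed a single key-derivation step, so the session key remains safe as long as either component resists attack. The secrets below are placeholders; actual deployments combine, for example, X25519 with ML-KEM inside the protocol handshake itself.

```python
import hashlib
import hmac

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF (RFC 5869) extract-and-expand over SHA-256."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()  # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                            # expand
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Placeholder secrets standing in for real handshake outputs.
classical_secret = b"\x01" * 32  # e.g. an X25519 shared secret
pqc_secret = b"\x02" * 32        # e.g. an ML-KEM decapsulated secret

# Concatenating both secrets before derivation means an attacker must
# break BOTH the classical and the post-quantum component to recover
# the session key.
session_key = hkdf_sha256(classical_secret + pqc_secret,
                          salt=b"hybrid-demo", info=b"session-key")
print(session_key.hex())
```

The design choice is deliberately conservative: if ML-KEM were ever weakened by classical cryptanalysis, the classical component still protects the key today, and vice versa once a CRQC arrives.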

The window is narrowing

Data intercepted today won't wait for the law to catch up. The most insidious feature of HNDL is precisely this: the damage is invisible in the present and irreversible in the future. Every year of regulatory inertia, every procurement cycle that puts quantum-vulnerable devices on the market, every critical infrastructure operator that postpones a PQC risk assessment creates exposure that no future regulation will be able to undo. European data protection law already possesses, in its current architecture, the conceptual resources to address this challenge. The question is not whether the regulatory framework is adequate in principle: the question is whether regulators, legislators, and organizations have the will to embrace the temporal dimension that the HNDL paradigm has made impossible to ignore.
