Post-Quantum Cryptography and Storage: Why “Ready” Is the Wrong Question

There’s a growing tension in cybersecurity conversations, one that leads to a deceptively simple question: Is your storage PQC ready?

Key takeaways:

    • Why it’s important to begin preparing for quantum computing now.
    • Why quantum computing is closer than many customers think.
    • Why storage is central to PQC adoption.
    • What Dell is doing in its storage platforms to help customers prepare for the PQC transition.

There’s a growing tension in cybersecurity conversations. Customers are asking about post-quantum cryptography. Executives are asking for roadmaps. And vendors are feeling pressure to answer a deceptively simple question:

“Are you PQC ready?”

The problem is that the question assumes PQC readiness is a feature. It isn’t. It’s a transition, and one that no single vendor can complete alone. Nowhere is that more obvious than in storage.

The timeline is moving – Faster than expected

For years, quantum computing was treated as a distant concern. That assumption is breaking down.

Recent research is reducing the resources required to break widely used public-key cryptography like RSA, ECDSA and ECDH. At the same time, industry timelines are compressing, with some organizations now targeting post-quantum readiness by the end of this decade rather than the mid-2030s.

The exact date of “Q-Day” is still debated.

However, the timeline is shifting left—and possibly faster than many expect.

The risk is already in motion

While timelines get debated, attackers are not waiting.

In a “Harvest Now, Decrypt Later” (HNDL) attack, adversaries collect encrypted data today with the expectation that it can be decrypted in the future. For storage, this risk is acute: storage systems are where long-lived data resides (backups, archives, compliance records, IP). If data meant to persist for decades is captured today, its future confidentiality can no longer be guaranteed.

But there is a second threat to consider: “Trust Now, Forge Later” (TNFL).

Quantum capabilities don’t just threaten encryption—they threaten digital trust. Digital signatures underpin firmware validation, secure boot, software updates and certificate chains. If those signatures can be forged, attackers can introduce malicious firmware or compromise trusted update mechanisms, with immediate operational impact.

For storage, a forged firmware signature doesn’t expose data in the future; it compromises the system.

This is why the conversation is shifting from protecting data to protecting the platform itself.
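To make the TNFL risk concrete, here is a minimal sketch of the kind of gate a storage controller applies before accepting a firmware update. Everything here is an assumption for illustration, not Dell’s implementation: the function names, the manifest format and the keyed-hash stand-in for a real digital signature (production secure boot uses asymmetric schemes such as RSA or ECDSA today, and ML-DSA going forward, with the public key anchored in a hardware root of trust).

```python
import hashlib
import hmac

# Stand-in for the platform's trusted verification key. Real secure boot
# anchors an asymmetric public key in hardware; a symmetric HMAC key is
# used here only so the sketch runs with the standard library alone.
TRUSTED_KEY = b"platform-root-of-trust-key"

def sign_firmware(image: bytes, key: bytes = TRUSTED_KEY) -> bytes:
    """Produce the signature a legitimate vendor would ship with the image."""
    return hmac.new(key, hashlib.sha256(image).digest(), hashlib.sha256).digest()

def verify_firmware(image: bytes, signature: bytes, key: bytes = TRUSTED_KEY) -> bool:
    """Gate applied before an update installs: reject on any mismatch."""
    expected = sign_firmware(image, key)
    return hmac.compare_digest(expected, signature)

image = b"controller-firmware-v2.4.1"
good_sig = sign_firmware(image)
assert verify_firmware(image, good_sig)             # legitimate update passes
assert not verify_firmware(image + b"x", good_sig)  # tampered image is rejected
```

The whole point of TNFL is that if an attacker can forge a valid signature for an arbitrary image, which quantum computing threatens for today’s asymmetric schemes, this gate waves malicious firmware through, and the platform itself, not just future data, is compromised.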

Storage is where PQC migration becomes real

With NIST finalizing the first post-quantum cryptographic standards in 2024 (ML-KEM for key encapsulation, ML-DSA for digital signatures and SLH-DSA for hash-based signatures), the building blocks now exist. The challenge is deploying them across real systems.

Storage platforms are layered systems of hardware, firmware and software. They protect data at rest and in flight, manage keys and certificates, validate firmware and establish trust through secure boot. Every one of these functions depends on cryptography. And every one of them must evolve, without breaking the system.

PQC ready is NOT a product claim – It’s a dependency problem

A storage platform is an ecosystem—controllers, drives, firmware, silicon, networking and software—all with their own cryptographic dependencies.

Because each layer carries its own cryptographic dependencies, a single non-quantum-resistant component can undermine the integrity of the entire system.

That’s why “PQC ready” is the wrong framing.

Readiness is not defined by the platform. It is defined by the entire dependency chain—and that chain includes suppliers.

Foundational elements like firmware authentication, system BIOS, and hardware trust modules are evolving first. Peripherals—network interfaces, I/O subsystems and storage media—follow on different timelines, each dependent on upstream silicon and firmware ecosystems.

This is not a delay. It is what rebuilding trust across a distributed system actually looks like.

Regulatory pressure is catching up

Government directives are establishing concrete requirements with specific deadlines and criteria.

Financial services, telecom, healthcare and critical infrastructure organizations are already being asked to demonstrate PQC migration strategies as a condition of doing business.

The only strategy that holds up: Cryptographic agility

Most organizations are still asking when to switch to PQC.

The better question is whether they can adapt.

Standards are evolving, hybrid models are emerging and best practices are still being defined.

The organizations that succeed will be the ones that can evolve as the landscape changes. That is cryptographic agility.

In storage environments, that agility has to exist across multiple layers simultaneously. It means supporting hybrid cryptographic models across encryption libraries, certificate validation and key lifecycle management. It means integrating with external key management systems that may transition at different speeds. And it means enabling core storage workflows, such as replication, backup and clustering, to adopt new cryptographic methods without forcing disruptive redesigns.
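One common way to get that kind of agility is an algorithm registry with a hybrid verification policy: callers name a policy, not an algorithm, so schemes can be added or retired without redesigning the workflows that use them. The sketch below is hypothetical, assuming made-up scheme names and using HMAC with two different hashes purely as runnable stand-ins for a classical scheme (e.g. ECDSA) and a post-quantum scheme (e.g. ML-DSA).

```python
import hmac
from typing import Dict, List, Tuple

# Hypothetical crypto-agility registry; the names and structure are
# illustrative, not a product API. Each scheme is a (sign, verify) pair.
def _make_scheme(hash_name: str) -> Tuple:
    def sign(key: bytes, data: bytes) -> bytes:
        return hmac.new(key, data, hash_name).digest()
    def verify(key: bytes, data: bytes, sig: bytes) -> bool:
        return hmac.compare_digest(sign(key, data), sig)
    return sign, verify

SCHEMES = {
    "classical-demo": _make_scheme("sha256"),   # stand-in for e.g. ECDSA
    "pq-demo": _make_scheme("sha3_256"),        # stand-in for e.g. ML-DSA
}

def sign_all(policy: List[str], key: bytes, data: bytes) -> Dict[str, bytes]:
    """Sign with every scheme the policy names (hybrid = sign with both)."""
    return {name: SCHEMES[name][0](key, data) for name in policy}

def verify_all(policy: List[str], key: bytes, data: bytes,
               sigs: Dict[str, bytes]) -> bool:
    """Hybrid rule: the message is trusted only if *every* scheme verifies."""
    return all(SCHEMES[name][1](key, data, sigs.get(name, b"")) for name in policy)

policy = ["classical-demo", "pq-demo"]   # move to PQ-only by editing one list
key, msg = b"demo-key", b"replication-handshake"
sigs = sign_all(policy, key, msg)
assert verify_all(policy, key, msg, sigs)
```

The design choice to note is that the policy, not the call sites, decides which algorithms run. Replication, backup and clustering code written against `sign_all`/`verify_all` keeps working when the policy drops a classical scheme or adds a new post-quantum one.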

Why Dell: Engineering the transition across the stack

There is a temptation to make bold claims about being “PQC ready.” It sounds decisive. It’s also incomplete.

At Dell, PQC migration is treated as an architectural transition.

Work is underway to enable quantum-resistant firmware and software validation across platform components. Dell prioritizes transitioning firmware authentication and hardware root-of-trust mechanisms ahead of data encryption, because if the system foundation is compromised, encrypted data cannot be reliably protected.

Because Dell designs and integrates infrastructure from server platforms and storage controllers through to the software that manages data, the transition can be coordinated across layers rather than leaving organizations to stitch together vendor dependencies on their own.

The bottom line

Post-quantum cryptography is not a distant milestone. The timelines are compressing, the threats are evolving, and the data being stored today is already part of the risk equation.

For storage, this isn’t about flipping a switch. It’s about managing dependencies, maintaining system integrity, and designing architectures that can evolve.

Because in the quantum era, the real question isn’t whether your storage platform is ready.

The real measure is whether your environment can execute the transition across the systems and dependencies it relies on—without disruption.

Steve Kenniston

About the Author: Steve Kenniston

Steve Kenniston has been in the storage industry for over 25 years. From startups to Global 2000 companies, Steve has been part of a number of storage inflection points in his career. He has worked for several startups, including Connected Corp, Avamar and Storwize, all of which led to acquisitions.

He later worked for Iron Mountain, EMC, IBM and now Dell, where he currently leads cybersecurity messaging.