Nilesh Dhande

Why National Data, Networks, and Systems Could Be at Risk

Updated: Oct 31, 2023

Renowned mathematician and cryptographer Daniel Bernstein has raised concerns about the collaboration between NIST and the NSA in defining post-quantum cryptography (PQC) standards. Bernstein has taken legal action, filing a lawsuit that highlights the lack of transparency in the selection process, which could potentially compromise global security systems. The lawsuit mirrors a past incident where NIST unknowingly standardized a flawed algorithm championed by the NSA. The intentional insertion of backdoors in cryptographic algorithms introduces significant vulnerabilities and poses a substantial risk. The critical question arises: Will history repeat itself, allowing a faulty PQC algorithm, potentially supported by the NSA, to be standardized? Decision-makers must address these concerns and establish a transparent process for selecting cryptographic algorithms.



The National Institute of Standards and Technology (NIST) and the National Security Agency (NSA) have been collaborating to define new security standards based on post-quantum cryptography (PQC). These PQC algorithms hold the promise of safeguarding vital systems and protecting national security interests against cryptanalytic attacks by quantum computers.
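To make this concrete, here is a minimal sketch of a post-quantum key exchange using the open-source liboqs Python bindings. This is an illustration, not an endorsement of any particular library: the module name (oqs), the API calls, and the algorithm identifier ("Kyber512", one of the NIST-selected lattice-based schemes) are assumptions that depend on which liboqs version is installed.

import oqs  # liboqs-python bindings (assumed installed)

kem_alg = "Kyber512"  # NIST-selected lattice KEM; newer builds may call it "ML-KEM-512"

with oqs.KeyEncapsulation(kem_alg) as receiver:
    public_key = receiver.generate_keypair()  # receiver publishes this

    with oqs.KeyEncapsulation(kem_alg) as sender:
        # Sender encapsulates a fresh shared secret against the public key.
        ciphertext, secret_sender = sender.encap_secret(public_key)

    # Receiver recovers the same shared secret from the ciphertext.
    secret_receiver = receiver.decap_secret(ciphertext)

assert secret_sender == secret_receiver  # both parties now share a key

If both sides end up with the same secret, they can derive symmetric keys from it, exactly as after a classical Diffie-Hellman exchange; the difference is that the underlying lattice problem is believed to resist attack by quantum computers.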

The intent is noble: to secure sensitive data, networks, and systems. However, amidst this pursuit, the renowned Daniel Bernstein sounded the alarm. What could possibly be wrong? The answer lies within the hidden corridors of these cryptographic algorithms: backdoors, the concealed vulnerabilities that could compromise the very security they are meant to enhance.


The Backdoor Conundrum


A backdoor is a hidden trapdoor: an intentional flaw in a cryptographic algorithm that allows an individual to bypass the very security mechanisms put in place to provide protection. It is a secret way in, granting access that would otherwise be impossible. Backdoors in cryptographic algorithms are a security vulnerability, allowing hackers and other nefarious actors to exploit the system with ease.
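To see how simple the idea is, consider this deliberately contrived sketch; the function and the master value are hypothetical, written only to illustrate the pattern. The routine looks like an ordinary salted-hash password check, but one hardcoded value quietly bypasses it:

import hmac
import hashlib

_MASTER = "debug-override-42"  # the hidden trapdoor value (hypothetical)

def check_password(supplied: str, stored_hash: bytes, salt: bytes) -> bool:
    # Looks like an ordinary salted-hash comparison...
    if supplied == _MASTER:
        return True  # ...but this branch silently bypasses the check.
    candidate = hashlib.pbkdf2_hmac("sha256", supplied.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_hash)

Backdoors in cryptographic algorithms are rarely this blunt; they hide in the choice of constants and parameters, which is precisely what makes them so hard to detect.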


The concept of a backdoor has cast a shadow over the security industry for a long time. But why create these backdoors? Motivations vary. Sometimes a backdoor is a shortcut left by programmers to bypass their own security systems for debugging. What is more concerning is when backdoors are intentionally inserted to weaken a system. Government agencies have been known to insert such trapdoors into commonly used software. Their intention? Mass surveillance.



National Data Security Risks


The Lawsuit: Bernstein, NSA, NIST, and Crypto


Daniel Bernstein is a renowned mathematician, cryptographer, and professor at the University of Illinois at Chicago. In August 2022, he filed a lawsuit against NIST and the NSA, challenging the integrity of the ongoing PQC standardization project. The lawsuit centers on Bernstein's claim that the cryptographic algorithm selection process lacks transparency and accountability, a process that could compromise our global security systems if left unchecked.

Bernstein writes about these discrepancies and his work on PQC on his own site, cr.yp.to.



Learning from the Past


Intriguingly, Bernstein's lawsuit against NIST and the NSA echoes a familiar story, one where doubts shook the foundation of trust. It goes back to the standardization of the Dual EC random number generator, which, unbeknownst to NIST, harbored a hidden trapdoor. The algorithm, championed by the NSA, concealed vulnerabilities that compromised security and fueled skepticism within the cryptographic community.


NIST standardized the Dual EC random number generator in 2006, initially unaware that it contained a trapdoor inserted by NSA researchers. In 2013, leaked intelligence documents indicated that the algorithm had been deliberately weakened, and NIST subsequently withdrew it. Bernstein's 2015 paper shed further light on the NSA's involvement and its influence in pushing a flawed algorithm toward standardization. Furthermore, evidence emerged that the NSA had paid software vendors to adopt the algorithm by default, thereby compromising global security systems.
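The mechanics of the Dual EC trapdoor can be sketched with a toy analog. The real generator works over an elliptic curve; the version below substitutes exponentiation in a small multiplicative group so the structure is easier to see, and every parameter is made up for illustration. The key point is that the two public constants P and Q are secretly related (P = Q^d mod p), and whoever knows the secret exponent d can turn a single raw output into the generator's next internal state:

# Toy analog of the Dual EC trapdoor; all parameters are illustrative.
p = 2_147_483_647          # small prime modulus (2**31 - 1)
Q = 5                      # public constant
d = 123_456_789            # the designer's secret trapdoor exponent
P = pow(Q, d, p)           # public constant, secretly P = Q^d mod p

def step(state):
    # One round of the generator: emit an output, advance the state.
    output = pow(Q, state, p)       # analog of x(s * Q) in Dual EC
    next_state = pow(P, state, p)   # analog of x(s * P) in Dual EC
    return output, next_state

def backdoor_recover(output):
    # Knowing d, one output reveals the NEXT internal state:
    # output^d = (Q^s)^d = (Q^d)^s = P^s = next_state (mod p)
    return pow(output, d, p)

state = 31_337                          # seed, unknown to attackers
out1, state = step(state)               # attacker observes out1 only
assert backdoor_recover(out1) == state  # ...yet now knows the state
out2, _ = step(state)                   # so every future output is predictable

The real Dual EC truncates its outputs, so an attacker must also guess a handful of missing bits, but that only modestly raises the cost of the attack; the trapdoor relationship does the heavy lifting.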


The consequences of such actions are far-reaching, as exemplified by the revelations made by whistleblower Edward Snowden.


Bernstein’s Argument


At the heart of Bernstein's argument is the assertion that NIST, in its call for proposals, promised transparency and openness in evaluating and selecting cryptographic algorithms. Specifically, it stated that applicants would be required to provide comprehensive details about their work, and that the reasons for not selecting particular algorithms would be publicly disclosed.

However, Bernstein contends that NIST has failed to fulfill this commitment. Despite repeated email requests, NIST has not provided the paper or electronic records justifying the exclusion of certain algorithms from the final rounds. This lack of transparency raises concerns about the integrity of the selection process and casts doubt on the trustworthiness of the algorithms that will be standardized.

The ongoing debate surrounding backdoors in cryptography and the use of end-to-end encryption is one of the defining discussions of the 21st century. And Bernstein has been consistent in his efforts to strengthen security and reduce the vulnerabilities exploited by cybercriminals.


Charting a Secure Path Forward


In the realm of digital security, the role of robust and trustworthy cryptographic algorithms cannot be overstated.

Now, as we stand at the precipice of standardizing a new generation of cryptographic algorithms, the stakes are higher than ever. The critical question looms before us: Will history repeat itself? Will the lack of transparency surrounding NIST's selection process pave the way for the standardization of a faulty Post Quantum Cryptography (PQC) algorithm, potentially backed by the NSA, undermining the global security infrastructure?


The nation's decision-makers have a crucial role in ensuring the integrity and security of its data, networks, and systems. It is therefore imperative to address the concerns raised by Daniel Bernstein and to take proactive steps toward a transparent, accountable process for evaluating and selecting cryptographic algorithms.




