A couple of reflections on the quantum computing breakthrough we just announced... Most of us grew up learning there are three main states of matter that matter: solid, liquid, and gas. Today, that changed. After a nearly 20-year pursuit, we’ve created an entirely new state of matter, unlocked by a new class of materials, topoconductors, that enable a fundamental leap in computing. It powers Majorana 1, the first quantum processing unit built on a topological core. We believe this breakthrough will allow us to create a truly meaningful quantum computer not in decades, as some have predicted, but in years. The qubits created with topoconductors are faster, more reliable, and smaller. At 1/100th of a millimeter each, they give us a clear path to a million-qubit processor. Imagine a chip that fits in the palm of your hand yet is capable of solving problems that all the computers on Earth today combined could not! Sometimes researchers have to work on things for decades to make progress possible. It takes patience and persistence to have a big impact in the world. And I am glad we get the opportunity to do just that at Microsoft. This is our focus: when productivity rises, economies grow faster, benefiting every sector and every corner of the globe. It’s not about hyping tech; it’s about building technology that truly serves the world. Read more about our discovery, and why it matters, here: https://aka.ms/AAu76rr
Quantum Computing Developments
-
🚨 New OMB Report on Post-Quantum Cryptography (PQC) 🚨
The Office of Management and Budget (OMB) has released a critical report detailing the strategy for migrating federal information systems to Post-Quantum Cryptography. The report responds to the growing threat that future quantum computers could break existing cryptographic systems.
**Key Points from the Report:**
🔑 **Start Migration Early**: Migration to PQC should begin before quantum computers capable of breaking current encryption become operational. This proactive approach is essential to mitigate "record-now-decrypt-later" attacks, in which adversaries capture encrypted traffic today to decrypt once quantum computers allow it.
🔑 **Focus on High-Impact Systems**: Priority should be given to high-impact systems and high-value assets. Ensuring these critical components are secure is paramount.
🔑 **Identify Early**: Systems that cannot support PQC must be identified early in the process, allowing timely planning and avoiding migration delays.
🔑 **Cost Estimates**: The transition is estimated at approximately $7.1 billion over 2025-2035. This significant investment underscores the scale and importance of the task.
🔑 **Cryptographic Module Validation Program (CMVP)**: The CMVP will play a vital role in ensuring proper implementation, validating that new cryptographic modules meet the necessary standards.
The full report outlines a comprehensive strategy and underscores the federal government’s commitment to maintaining robust cybersecurity in the quantum computing era. This is a critical step in safeguarding our digital infrastructure against future threats.
#Cybersecurity #PQC #QuantumComputing #FederalGovernment #Cryptography #DigitalSecurity #OMB #NIST
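The report's "prioritize high-impact, quantum-vulnerable systems first" logic can be sketched in a few lines. This is a minimal, hypothetical triage of a cryptographic inventory: the system names, the algorithm sets, and the priority rule are illustrative assumptions, not taken from the OMB report.

```python
# Hypothetical PQC-migration triage sketch. Algorithm names follow NIST
# conventions (FIPS 203/204/205 for the PQC side); everything else is invented
# for illustration.

QUANTUM_VULNERABLE = {"RSA-2048", "ECDSA-P256", "DH-2048", "ECDH-P256"}
PQC_READY = {"ML-KEM-768", "ML-DSA-65", "SLH-DSA-128s"}

def triage(inventory):
    """Order systems for migration: quantum-vulnerable, high-impact first."""
    def priority(item):
        name, algorithm, high_impact = item
        vulnerable = algorithm in QUANTUM_VULNERABLE
        # Lower tuple sorts first: vulnerable + high-impact gets (0, 0, ...)
        return (0 if vulnerable else 1, 0 if high_impact else 1, name)
    return sorted(inventory, key=priority)

# (system name, algorithm in use, is it a high-impact system?)
inventory = [
    ("payroll-db", "RSA-2048", False),
    ("citizen-portal", "ECDSA-P256", True),
    ("internal-wiki", "ML-KEM-768", False),
]
for name, alg, _ in triage(inventory):
    print(name, alg)
```

The same ordering rule scales to a real inventory once the discovery phase has catalogued which algorithms each system actually uses.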
-
World’s First Quantum Large Language Model (QLLM) Launched, Advancing AI
A UK-based company, SECQAI, has developed and launched the world’s first Quantum Large Language Model (QLLM), integrating quantum computing with traditional AI models to enhance efficiency, problem-solving, and linguistic understanding. This marks a major step forward in AI and quantum machine learning, with the potential to transform multiple industries.
Key Features of the QLLM
• Quantum-Enhanced Computation: Applies quantum computing principles to improve efficiency and decision-making in AI models.
• Quantum Attention Mechanism: Introduces gradient-based learning and a quantum attention mechanism, allowing for more complex and nuanced AI responses.
• In-House Quantum Simulator: SECQAI developed a custom quantum simulator to train and refine the QLLM, bridging classical AI with quantum advantages.
Why This Is Significant
• More Powerful AI Capabilities: Quantum computing could enable dramatically faster solutions to certain problem classes, unlocking new applications in natural language processing, data analysis, and optimization.
• Revolutionizing AI Efficiency: Traditional LLMs require massive computational resources; quantum-enhanced models could reduce energy consumption and improve scalability.
• Cross-Industry Impact: The QLLM could redefine AI applications in finance, healthcare, cybersecurity, and scientific research, offering new levels of precision and adaptability.
What’s Next?
• SECQAI plans to continue refining QLLM capabilities, exploring how quantum computing can further enhance AI performance.
• Future developments may include real-world applications of quantum-enhanced AI, pushing the boundaries of what AI systems can achieve.
• As quantum hardware advances, QLLMs could become mainstream AI solutions, setting a new industry standard for efficiency and intelligence.
This landmark achievement in Quantum Machine Learning signals the beginning of a new AI era, where quantum-enhanced models could redefine AI’s capabilities and computational efficiency.
-
Thought you knew which #quantumcomputers were best for #quantum optimization? The latest results from Q-CTRL have reset expectations for what is possible on today's gate-model machines.
Q-CTRL today announced newly published results that demonstrate a boost of more than 4X in the size of an optimization problem that can be accurately solved, and show for the first time that a utility-scale IBM quantum computer can outperform competitive annealer and trapped-ion technologies. Full, correct solutions at 120+ qubit scale for classically nontrivial optimizations!
Quantum optimization is one of the most promising quantum computing applications, with the potential to deliver major enhancements to critical problems in transport, logistics, machine learning, and financial fraud detection. McKinsey suggests that quantum applications in logistics alone could be worth $200-500B/y by 2035 – if the quantum sector can successfully solve them.
Previous third-party benchmark experiments have indicated that, despite their promise, gate-based quantum computers have struggled to live up to their potential because of hardware errors. In previous tests of optimization algorithms, the outputs of gate-based quantum computers were little different from random outputs, or provided modest benefits under limited circumstances. As a result, an alternative architecture known as a quantum annealer was believed – and shown in experiments – to be the preferred choice for exploring industrially relevant optimization problems. Today’s quantum computers were thought to be far from able to solve the quantum optimization problems that matter to industry.
Q-CTRL’s recent results upend this broadly accepted industry narrative by addressing the error challenge. Our methods combine innovations in the problem’s hardware execution with the company’s performance-management infrastructure software run on IBM’s utility-scale quantum computers.
This combination delivered performance previously limited by errors, with no changes to the hardware. Direct tests showed that, using Q-CTRL’s novel technology, a quantum optimization problem run on a 127-qubit IBM quantum computer was up to 1,500 times more likely than an annealer to return the correct result, and over 9 times more likely to achieve the correct result than previously published work using trapped ions. These results enable quantum optimization algorithms to more consistently find the correct solution to a range of challenging optimization problems at larger scales than ever before.
Check out the technical manuscript! https://lnkd.in/gRYAFsRt
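For readers new to quantum optimization: the problems benchmarked in work like this are typically Max-Cut/Ising-style combinatorial optimizations, where "the correct result" means the true optimum. A minimal sketch of that notion at toy scale, brute-forced classically on a hypothetical 4-node graph (the manuscript's problems are far larger and classically nontrivial):

```python
import itertools

# Toy Max-Cut instance: partition the nodes into two sets (bit 0 / bit 1)
# so that as many edges as possible cross the partition. At this size we can
# brute-force the optimum, which is exactly what a quantum solver's samples
# would be checked against. The graph is an illustrative assumption.

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # 4-cycle plus one chord
n = 4

def cut_value(bits):
    """Number of edges whose endpoints land in different sets."""
    return sum(1 for i, j in edges if bits[i] != bits[j])

best = max(itertools.product([0, 1], repeat=n), key=cut_value)
print(best, cut_value(best))
```

At 120+ qubits the search space has more than 2^120 assignments, which is why brute force is hopeless and why returning the full, correct solution there is notable.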
-
The (possible) future of cyber security… Where Quantum Key Distribution (QKD) has completely replaced today’s Public Key Infrastructure (PKI), and within 5-15 years asymmetric cryptographic algorithms are rendered entirely or partially unusable (Forrester)… but it’s not Armageddon, we can be prepared 😅
Thank you Yvette Lejins and ADAPT for a fantastic ’fireside chat’ and discussion about what CIOs and CISOs can do now to prepare for quantum:
🔒 Know your risk appetite: what is your migration time (to new cryptography or QKD); your security/data shelf life (how long data needs to stay protected); your risk-exposure timeframe (i.e. when quantum computers will run Shor’s algorithm at scale – take your pick of expert probabilities!)
🔒 Re-design your infrastructure for cryptographic agility. Reduce the number of data encryption/decryption points to shrink the threat surface and the complexity of cryptographic migration.
🔒 Implement post-quantum algorithms. Adopt algorithms approved by NIST or an equivalent standards body to ensure the smoothest transition.
🔒 Invest in capability. Less than 50% of quantum computing jobs are expected to be filled by 2025 (McKinsey & Company)
Tenar Larsen Jim Berry Matt Boon Maushumi (Maya) Mazid Jenny Francis David Gee GAICD Nick Haigh Jayden Cooke Gabby Fredkin
#adaptsecurityedge #cyberrisk #riskappetite #quantumcomputing
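The "cryptographic agility" point can be made concrete with a small sketch: callers bind to a named suite rather than a hard-coded algorithm, so the active suite can be swapped in one place during migration. The suite names and the HMAC stand-ins are illustrative assumptions; a real PQC migration swaps in NIST-approved schemes, not hash choices.

```python
import hashlib
import hmac

# Agility sketch: all signing goes through one indirection point. Swapping
# algorithms means changing the registry, not hunting through the codebase.
# The HMAC constructions here are placeholders, not post-quantum schemes.
SUITES = {
    "legacy": lambda key, msg: hmac.new(key, msg, hashlib.sha256).hexdigest(),
    "next": lambda key, msg: hmac.new(key, msg, hashlib.sha3_256).hexdigest(),
}
ACTIVE_SUITE = "legacy"  # flip to "next" when the migration window opens

def sign(key: bytes, msg: bytes) -> str:
    """Callers never name an algorithm; they use whatever suite is active."""
    return SUITES[ACTIVE_SUITE](key, msg)

print(sign(b"secret", b"payload"))
```

The fewer places that call cryptographic primitives directly, the smaller this registry's blast radius, which is exactly the "reduce encryption/decryption points" advice above.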
-
This image is from an Amazon Braket slide deck that just did the rounds of all the Deep Tech conferences I've been at recently (this one from Eric Kessler). It's more profound than it might seem.
As technical leaders, we're constantly evaluating how emerging technologies will reshape our computational strategies. Quantum computing is prominent in these discussions, but clarity on its practical integration is... emerging. It's becoming clear, however, that the path forward isn't quantum versus classical, but quantum and classical working together. This will be a core theme for the year ahead.
As someone now on the implementation-partner side of this work, getting the chance to build specific quantum-classical hybrid workloads, I think of it this way: Quantum Processing Units (QPUs) are specialised engines capable of tackling calculations that are currently intractable for even the largest supercomputers. That's the "quantum 101" explanation you've heard over and over. What's missing from that usual story is that they require significant classical infrastructure for:
- Control and calibration
- Data preparation and readout
- Error mitigation and correction frameworks
- Executing the parts of algorithms not suited for quantum speedup
Therefore, the near-to-medium-term future involves integrating QPUs as accelerators within a broader classical computing environment. Much like GPUs accelerate specific AI/graphics tasks alongside CPUs, QPUs are a promising resource for accelerating specific quantum-suited operations within larger applications.
What does this mean for technical decision-makers?
Focus on Integration: Strategic planning should center on identifying how and where quantum capabilities can be integrated into existing or future HPC workflows, not on replacing them entirely.
Identify Target Problems: The key is pinpointing high-value business or research problems where the unique capabilities of quantum computation could provide a substantial advantage.
Prepare for Hybrid Architectures: Consider architectures and software platforms designed explicitly to manage these complex hybrid workflows efficiently.
PS: Some companies like Quantum Brilliance are focused on this space from the hardware side from the outset, working with Pawsey Supercomputing Research Centre and Oak Ridge National Laboratory. On the software side there's the likes of Q-CTRL, Classiq Technologies, Haiqu and Strangeworks, all tackling the challenge of managing actual workloads (with different levels of abstraction). Speaking to these teams will give you a good feel for the topic and the approaches. Get to it.
#QuantumComputing #HybridComputing #HPC
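The QPU-as-accelerator pattern described above can be sketched as a classical driver that routes only quantum-suited subtasks to a quantum backend, the same shape as CPU/GPU offload. Both backends below are trivial stand-ins (no real QPU or circuit execution), and the task schema is an assumption for illustration.

```python
# Hybrid-workflow sketch: a classical orchestrator dispatches each subtask
# to the backend suited to it. In a real stack the "qpu_backend" would submit
# a circuit to a managed service and the results would feed back into the
# surrounding classical pipeline.

def cpu_backend(task):
    return sum(task["data"])  # placeholder for classical pre/post-processing

def qpu_backend(task):
    return f"qpu-result({task['circuit']})"  # placeholder for a circuit run

def run_workflow(tasks):
    results = []
    for task in tasks:
        # Route to the QPU only when the task carries a circuit to execute.
        backend = qpu_backend if task.get("circuit") else cpu_backend
        results.append(backend(task))
    return results

tasks = [
    {"data": [1, 2, 3]},                   # classical step
    {"circuit": "bell_pair", "data": []},  # quantum-suited step
]
print(run_workflow(tasks))
```

The design point is that the orchestration, scheduling, and data movement all live on the classical side; the QPU is one resource among several, not the center of the system.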
-
Is Quantum Machine Learning useful?
When we think about this question, we tend to ask whether quantum computing could accelerate our known ML algorithms. But that may be the wrong way to go about it. A quantum processing unit is a different type of hardware with different computation principles, and as such, it is a great candidate for developing new ML algorithms built on purely quantum principles.
Quantum ML can actually mean multiple things. There are two components to ML, the data-generation process and the data-processing device, and each can be quantum or classical:
- If both the data-generation process and the data-processing device are classical, that is typical machine learning as we know it.
- Typically, when people think about QML, they think of the data-generation process being classical and the data processing being done on a quantum computer. The data could be text, images, or time series, and we need a quantum-classical interface to convert that data into quantum states. The quantum computer can only process quantum data, and a quantum algorithm generates outputs that need to be converted back into classical data. Converting the data back and forth requires at least linear time in the size of the data, preventing any exponential speedup on the learning task. Many people doubt this process will ever be beneficial.
- One interesting avenue for QML is when the data generation is intrinsically quantum. In the Physics, Chemistry, or Biology departments, researchers deal with quantum "data" on a daily basis: the electrons in your CPU and the molecules of a medication obey quantum mechanical laws. A typical way to study those phenomena is to build numerical simulations of the quantum particles using synthetic classical data, run on a classical computer. This is very slow, and we can only simulate a limited number of particles at once.
But if we could use quantum data to simulate quantum particles, we could run quantum ML algorithms directly on those data. There is evidence that this would lead to a quantum speed-up, and QML could lead to huge scientific leaps in the near future! A few hybrid quantum-classical architectures have been proposed in which models are spread across classical and quantum processing units. This allows quantum data to be processed while still benefiting from well-understood computations on classical computers. For example, you can use classical computers as outer-loop optimizers for quantum neural networks. An example is TensorFlow Quantum (https://lnkd.in/eziVB4q9), which is mainly intended for applications with quantum data generation but can also be used for classical data. Here is an example of how to run a ConvNet on quantum data: https://lnkd.in/dx9nmY9n
--
👉 Early-bird deal for my ML Fundamentals Bootcamp: https://lnkd.in/gasbhQSk
--
#machinelearning #datascience #artificialintelligence
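The "classical outer-loop optimizer for a quantum model" idea can be shown at toy scale. Here a classical gradient-descent loop tunes the single parameter of a one-qubit circuit Ry(theta)|0>, simulated via its closed-form expectation value <Z> = cos(theta). This is a pedagogical stand-in, not TensorFlow Quantum; the learning rate and step count are arbitrary choices.

```python
import math

# Hybrid loop sketch: the "quantum" part is the expectation value of a
# parameterized one-qubit circuit (simulated classically here in closed
# form); the classical part is an ordinary gradient-descent optimizer.

def expectation_z(theta):
    # Ry(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>, so <Z> = cos(theta).
    # On real hardware this number would come from repeated circuit runs.
    return math.cos(theta)

def gradient(theta, eps=1e-5):
    """Central finite difference, standing in for a parameter-shift rule."""
    return (expectation_z(theta + eps) - expectation_z(theta - eps)) / (2 * eps)

theta = 0.3
for _ in range(200):
    theta -= 0.1 * gradient(theta)  # classical update of the quantum parameter

print(round(theta, 3), round(expectation_z(theta), 3))
```

The loop converges to theta = pi, where <Z> reaches its minimum of -1: the classical optimizer steers the quantum circuit's parameter exactly as an outer-loop optimizer steers a quantum neural network's weights.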
-
Is this the "Attention Is All You Need" moment for Quantum Computing?
Oxford University scientists have demonstrated in Nature the first working example of a distributed quantum computing (DQC) architecture. It consists of two modules, two meters apart, which "act as a single, fully connected universal quantum processor." This architecture "provides a scalable approach to fault-tolerant quantum computing".
Just as the famous "Attention Is All You Need" paper from Google scientists introduced the Transformer architecture as an alternative to the recurrent networks of the day, this paper introduces quantum gate teleportation (QGT) as an alternative to the direct transfer of quantum information across quantum channels. The benefit? Lossless communication. And not only communication, but computation too: this is the first execution of a distributed quantum algorithm (Grover’s search algorithm) comprising several non-local two-qubit gates.
The paper contains many pointers to the future, which I am sure will be pored over by other labs, startups and VCs. I am excited to follow developments in:
- Quantum repeaters to increase the distance between modules
- Removal of channel noise through entanglement purification
- Scaling up the number of qubits in the architecture
Amid all the AI developments, this may be the most important innovation happening in computing now.
https://lnkd.in/e8qwh9zp
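Grover's search, the algorithm the Oxford team distributed across two modules, is easy to simulate classically at tiny scale. The sketch below runs one Grover iteration on 2 qubits (N = 4), a size at which a single iteration finds the marked item with certainty; the marked index is an arbitrary choice for illustration.

```python
# Classical simulation of one Grover iteration on 2 qubits (N = 4 items).
# In the distributed experiment, the two-qubit gates inside these steps were
# executed non-locally via quantum gate teleportation between modules.

N, marked = 4, 2
amps = [1 / N**0.5] * N  # uniform superposition over all 4 basis states

# Oracle: flip the sign of the marked state's amplitude
amps[marked] *= -1

# Diffusion operator: reflect every amplitude about the mean amplitude
mean = sum(amps) / N
amps = [2 * mean - a for a in amps]

probs = [round(a * a, 6) for a in amps]
print(probs)  # measurement probability concentrates on the marked index
```

For N = 4, one iteration is exact; at larger N, roughly (pi/4)*sqrt(N) iterations are needed, which is the quadratic speedup over classical search.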
-
𝗤𝘂𝗮𝗻𝘁𝘂𝗺 𝗖𝗼𝗺𝗽𝘂𝘁𝗶𝗻𝗴: 𝗔 𝗥𝗲𝘃𝗼𝗹𝘂𝘁𝗶𝗼𝗻 𝗼𝗻 𝘁𝗵𝗲 𝗛𝗼𝗿𝗶𝘇𝗼𝗻 🚀
Quantum computing represents a paradigm shift in how we approach computation. Unlike classical computers that use bits (0 or 1), quantum computers leverage qubits, which can exist in multiple states simultaneously thanks to superposition. This lets quantum computers explore many possibilities at once and solve certain complex problems exponentially faster, opening doors to breakthroughs in fields from medicine and materials science to finance and artificial intelligence.
𝗪𝗶𝗹𝗹𝗼𝘄 (𝗚𝗼𝗼𝗴𝗹𝗲)
Google's "Willow" chip showcases substantial progress in both quantum error correction and performance. Willow has achieved "below threshold" error rates, meaning that as the number of qubits in the code scales up, logical errors decrease exponentially. It also completed a standard benchmark computation in under five minutes that would take one of today's fastest supercomputers an unfathomable amount of time. Google's strategy revolves around improving qubit quality and error correction to achieve practical quantum advantage, with a clear focus on demonstrating real-world applications.
𝗠𝗮𝗷𝗼𝗿𝗮𝗻𝗮 𝟭 (𝗠𝗶𝗰𝗿𝗼𝘀𝗼𝗳𝘁)
Microsoft is taking a bold step with its "Majorana 1" chip, built on a Topological Core architecture. This innovative design harnesses topoconductors to control Majorana particles, creating more stable and scalable qubits. Microsoft envisions this as the "transistor for the quantum age," paving the way for million-qubit systems capable of tackling industrial-scale challenges like breaking down microplastics or designing self-healing materials. Their strategy is to create inherently stable qubits that require less error correction, a significant hurdle in quantum computing.
𝗢𝗰𝗲𝗹𝗼𝘁 (𝗔𝗺𝗮𝘇𝗼𝗻 𝗪𝗲𝗯 𝗦𝗲𝗿𝘃𝗶𝗰𝗲𝘀)
Amazon Web Services (AWS) is addressing quantum error correction directly with its "Ocelot" chip. Ocelot employs a novel architecture built on 'cat qubits' designed to significantly reduce the cost of error correction.
This is a crucial advancement, as quantum computers are incredibly sensitive to noise and error correction is essential for reliable computation. AWS's strategy is to lower the barrier to entry for quantum computing through its Amazon Braket service, providing access to diverse quantum hardware and tools while focusing on making quantum computing more cost-effective and accessible.
𝗤𝘂𝗮𝗻𝘁𝘂𝗺 𝗖𝗼𝗺𝗽𝘂𝘁𝗶𝗻𝗴 𝗮𝗻𝗱 𝗔𝗜: 𝗕𝗲𝘆𝗼𝗻𝗱 𝘁𝗵𝗲 𝗟𝗶𝗺𝗶𝘁𝘀 𝗼𝗳 𝗚𝗣𝗨𝘀
While GPUs have revolutionized AI by accelerating the training of complex models, quantum computing offers the potential for an even greater leap in AI capabilities. By harnessing superposition and entanglement, quantum computers can potentially solve optimization, machine learning, and simulation problems that are intractable for even the most powerful GPUs.
#QuantumComputing #AI #GPU
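The "below threshold" idea, error rates falling as the code scales up, can be illustrated with the simplest error-correcting code. The sketch uses a distance-d repetition code under independent bit-flips; Willow actually uses surface codes, so the model and numbers here are purely illustrative.

```python
from math import comb

# Below-threshold scaling sketch: with a distance-d repetition code, a logical
# error requires a majority of the d copies to flip. If the physical error
# rate p is below the code's threshold (1/2 here), growing d makes the
# logical error rate shrink rather than grow.

def logical_error(p, d):
    """Probability that at least ceil((d+1)/2) of d copies flip."""
    k0 = d // 2 + 1  # smallest number of flips that defeats majority vote
    return sum(comb(d, k) * p**k * (1 - p)**(d - k) for k in range(k0, d + 1))

p = 0.01  # illustrative physical error rate, well below threshold
for d in (3, 5, 7):
    print(d, logical_error(p, d))
```

Each increase in distance multiplies the logical error rate down by roughly another factor of p, which is the exponential suppression the "below threshold" milestone refers to (for surface codes the threshold is around 1%, not 1/2).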
-
What is Google’s Quantum Chip "Willow" and Why Does It Matter for Healthcare?
Google’s latest breakthrough in quantum computing, Willow, represents a significant leap forward. Unlike classical computers that use bits (0s and 1s), quantum computers use qubits, which can exist in multiple states simultaneously. This allows them to explore an exponentially large space of possibilities at once, making them uniquely suited to certain problems that classical computers can't even approach.
🔬 What Willow Achieved:
Willow completed a complex benchmark, Random Circuit Sampling (RCS), in just 5 minutes – a task that would take the fastest classical supercomputers 10 septillion years. For context, our universe has existed for only 13.8 billion years. This level of computational power opens the door to applications we’ve only dreamed of.
The Potential for Healthcare:
Quantum computing could transform the way we approach medical research, diagnostics, and treatment:
1️⃣ Accelerated Drug Discovery: Simulating molecular interactions to develop new medicines could take days instead of years, drastically reducing costs and timelines. For diseases like cancer or Alzheimer’s, this means faster access to life-saving treatments.
2️⃣ Personalized Medicine: By analyzing genomic, proteomic, and environmental data together, quantum computing could help create highly individualized treatment plans tailored to each patient’s biology.
3️⃣ Real-Time Pandemic Modeling: Imagine predicting the spread of pandemics in real time, optimizing resources, and even simulating the outcomes of interventions before deploying them.
While Willow is still experimental, its advances in error correction are paving the way for scalable, practical quantum computers by the end of the decade. This technology is not a replacement for AI but a complement, enabling AI to process data and solve problems on an unprecedented scale.
The fusion of AI and quantum computing could create a future where healthcare is more personalized, predictive, and efficient—solving humanity’s biggest medical challenges faster than ever. What do you think of Willow? #QuantumComputing #AI #DigitalHealth #HealthcareInnovation #FutureOfMedicine #MedTech #HealthTech #QuantumAI #TechForGood #ScienceBreakthrough