
Small, diamond-based quantum computers could be in our hands within five years – Cosmos Magazine

§ August 27th, 2021 § Filed under Quantum Computer

Small, affordable, plug-and-play quantum computing is one step closer. An Australian startup has won $13 million to make its diamond-based computing cores shine. Now it needs to grow.

ANU research spinoff Quantum Brilliance has found a way to use synthetic diamonds to drive quantum calculations. Now it's on a five-year quest to produce commercially viable quantum accelerators. The goal is a card capable of being plugged into any existing computer system, similar to the way graphics cards are now.

"We're not deluding ourselves," says CEO Dr Andrew Horsley. "There's still a lot of work to do. But we've now got a five-year pathway to produce a lunchbox-sized device."

To do this, Quantum Brilliance is hiring 20 engineers and scientists, including physicists, software engineers and control engineers. The resulting quantum accelerator card will be valuable for self-driving car manufacturers, materials research labs, logistics hubs and financial services firms.

"We've understood electricity and magnetism for a long time," Dr Horsley says. "We now understand quantum phenomena and are in the process of turning that into technology. It's very exciting. And it's not just an iterative improvement. This is a whole new way of computing. And we're doing it here, in Australia."

It's about big-time boosts in performance.

"If you've got one inside your self-driving car, it will be much better able to interpret its environment and make smarter decisions," Dr Horsley says. "Or you could have a stack of them in a supercomputer, working through combinations of chemical properties to quickly simulate new battery materials or drugs."

The goal is to demonstrate a 50-qubit accelerator card by 2025. A qubit is the quantum equivalent of a traditional computer's basic unit of data, the bit.

Quantum Brilliance's success has been using diamond as the engine for quantum processing, in the same way silicon drives existing chips.

Most importantly, this can be done at room temperature with relatively simple control systems.

Competing techniques need cryogenic cooling or complex lasers to calm subatomic vibrations that can disrupt fragile quantum states.

"Diamond is so rigid that, even at room temperature, we have long-lived quantum properties," Dr Horsley says. "That's the key. We have a diamond with ultra-high-density qubits inside of it, sitting there, in ambient conditions."

The technology is ready. Now the challenge is to turn it into a commercially viable reality.

"We need to scale up the number of qubits that we've got while at the same time shrinking down the size of the control systems into a portable package," he says.

At the same time, different companies and institutions will be acting as testbeds for simulated quantum computing to design the software needed for the real thing.

"This is helping Australian companies understand quantum computing and their own applications so that they're ready to commercially exploit these powerful devices as soon as they become available," he adds.

The $13 million investment is led by QxBranch's founders and the Main Sequence investment consortium.

Read more from the original source:

Small, diamond-based quantum computers could be in our hands within five years - Cosmos Magazine

Quantum computers could read all your encrypted data. This ‘quantum-safe’ VPN aims to stop that – ZDNet

§ August 27th, 2021 § Filed under Quantum Computer

To protect our private communications from future attacks by quantum computers, Verizon is trialing the use of next-generation cryptography keys to protect the virtual private networks (VPNs) that are used every day by companies around the world to prevent hacking.

Verizon implemented what it describes as a "quantum-safe" VPN between one of the company's labs in London in the UK and a US-based center in Ashburn, Virginia, using encryption keys that were generated with post-quantum cryptography methods, meaning that they are robust enough to withstand attacks from a quantum computer.

According to Verizon, the trial successfully demonstrated that it is possible to replace current security processes with protocols that are quantum-proof.

VPNs are a common security tool used to protect connections made over the internet, by creating a private network from a public internet connection. When a user browses the web with a VPN, all of their data is redirected through a specifically configured remote server run by the VPN host, which acts as a filter that encrypts the information.

This means that the user's IP address and any of their online activities, from sending emails to paying bills, come out as gibberish to potential hackers even on insecure networks like public WiFi, where eavesdropping is much easier.

Especially in the last few months, which have seen many employees switching to full-time working from home, VPNs have become an increasingly popular tool to ensure privacy and security on the internet.

The technology, however, is based on cryptography protocols that are not un-hackable. To encrypt data, VPN hosts use encryption keys that are generated by well-established algorithms such as RSA (Rivest-Shamir-Adleman). The difficulty of cracking the key, and therefore of reading the data, is directly linked to the algorithm's ability to create as complicated a key as possible.

In other words, encryption protocols as we know them are essentially a huge math problem for hackers to solve. With existing computers, cracking the equation is extremely difficult, which is why VPNs, for now, are still a secure solution. But quantum computers are expected to bring about huge amounts of extra computing power and with that, the ability to hack any cryptography key in minutes.
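
To make that "huge math problem" concrete, here is a minimal sketch of textbook RSA in Python, using toy-sized primes; it is an illustration only, not Verizon's implementation, and real keys use moduli of 2048 bits or more. The security rests entirely on the difficulty of factoring the public modulus, which is exactly the task Shor's algorithm would make tractable on a large quantum computer.

    # Textbook RSA with toy primes (illustration only; requires Python 3.8+).
    p, q = 61, 53              # secret primes; real ones are hundreds of digits
    n = p * q                  # public modulus (3233)
    phi = (p - 1) * (q - 1)    # totient; computable only if n can be factored
    e = 17                     # public exponent
    d = pow(e, -1, phi)        # private exponent, derived from the factors

    message = 42
    ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
    recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
    assert recovered == message

    # A classical attacker must factor n to obtain d; Shor's algorithm on a
    # sufficiently large quantum computer does that in polynomial time.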

"A lot of secure communications rely on algorithms which have been very successful in offering secure cryptography keys for decades," Venkata Josyula, the director of technology at Verizon, tells ZDNet. "But there is enough research out there saying that these can be broken when there is a quantum computer available at a certain capacity. When that is available, you want to be protecting your entire VPN infrastructure."

One approach that researchers are working on consists of developing algorithms that can generate keys that are too difficult to hack, even with a quantum computer. This area of research is known as post-quantum cryptography, and is particularly sought after by governments around the world.

In the US, for example, the National Institute of Standards and Technology (NIST) launched a global research effort in 2016 calling on researchers to submit ideas for algorithms that would be less susceptible to a quantum attack. A few months ago, the organization selected a group of 15 algorithms that showed the most promise.

"NIST is leading a standardization process, but we didn't want to wait for that to be complete because getting cryptography to change across the globe is a pretty daunting task," says Josyula. "It could take 10 or even 20 years, so we wanted to get into this early to figure out the implications."

Verizon has significant amounts of VPN infrastructure and the company sells VPN products, which is why the team started investigating how to start enabling post-quantum cryptography right now and in existing services, Josyula adds.

One of the 15 algorithms identified by NIST, called Saber, was selected for the test. Saber generated quantum-safe cryptography keys that were delivered to the London and Ashburn endpoints of a typical IPsec VPN through an extra layer of infrastructure, which was provided by a third-party vendor.

Whether Saber makes it to the final rounds of NIST's standardization process, in this case, doesn't matter, explains Josyula. "We tried Saber here, but we will be trying others. We are able to switch from one algorithm to the other. We want to have that flexibility, to be able to adapt in line with the process of standardization."

In other words, Verizon's test has shown that it is possible to implement post-quantum cryptography candidates on infrastructure links now, with the ability to migrate as needed between different candidates for quantum-proof algorithms.

This is important because, although a large-scale quantum computer could be more than a decade away, there is still a chance that the data that is currently encrypted with existing cryptography protocols is at risk.

The threat is known as "harvest now, decrypt later" and refers to the possibility that hackers could collect huge amounts of encrypted data and sit on it while they wait for a quantum computer to come along that could read all the information.

"If it's your Amazon shopping cart, you may not care if someone gets to see it in ten years," says Josyula. "But you can extend this to your bank account, personal number, and all the way to government secrets. It's about how far into the future you see value for the data that you own and some of these have very long lifetimes."

For this type of data, it is important to start thinking about long-term security now, which includes the risk posed by quantum computers.

A quantum-safe VPN could be a good start even though, as Josyula explains, many elements still need to be smoothed out. For example, Verizon still relied on standard mechanisms in its trial to deliver quantum-proof keys to the VPN end-points. This might be a sticking point, if it turns out that this phase of the process is not invulnerable to quantum attack.

The idea, however, is to take proactive steps to prepare, instead of waiting for the worst-case scenario to happen. Connecting London to Ashburn was a first step, and Verizon is now looking at extending its quantum-safe VPN to other locations.

Link:

Quantum computers could read all your encrypted data. This 'quantum-safe' VPN aims to stop that - ZDNet

IBM partners with the University of Tokyo on quantum computer – Illinoisnewstoday.com

§ August 27th, 2021 § Filed under Quantum Computer

Tokyo: IBM and the University of Tokyo have announced one of Japan's most powerful quantum computers.

According to IBM, IBM Quantum System One is part of the Japan-IBM quantum partnership between the University of Tokyo and IBM, advancing Japan's quest for quantum science, business and education.

IBM Quantum System One is currently in operation for researchers at both Japanese scientific institutions and companies, and access is controlled by the University of Tokyo.

"IBM is committed to growing the global quantum ecosystem and facilitating collaboration between different research communities," said Dr. Dario Gil, director of IBM Research.

According to IBM, quantum computers combine quantum resources with classical processing to provide users with access to reproducible and predictable performance from high-quality qubits and precision control electronics. Users can safely execute algorithms that require iterative quantum circuits in the cloud.

IBM Quantum System One in Japan is IBM's second system built outside the United States. In June, IBM unveiled an IBM Quantum System One in Munich, Germany, managed by the scientific research institute Fraunhofer-Gesellschaft.

IBM's commitment to quantum is aimed at advancing quantum computing and fostering a skilled quantum workforce around the world.

"We are thrilled to see Japan's contributions to research by world-class academics, the private sector, and government agencies," Gil said.

"Together, we can take a big step towards accelerating scientific progress in different areas," he added.

Teruo Fujii, President of the University of Tokyo, said, "In the field of rapidly changing quantum technology, it is very important not only to develop elements and systems related to quantum technology, but also to develop the next generation of human resources, in order to achieve a high degree of social implementation."

"Our university has a wide range of research capabilities and has always promoted high-level quantum education from the undergraduate level. Now, with IBM Quantum System One, we will develop, and further refine, the next generation of quantum-native skill sets."

In 2020, IBM and the University of Tokyo launched the Quantum Innovation Initiative Consortium (QIIC), which aims to strategically accelerate the research and development of quantum computing in Japan by bringing together academic talent from universities, research groups and industries nationwide.

Last year, IBM also announced partnerships with several organizations focusing on quantum information science and technology: the Cleveland Clinic, the Science and Technology Facilities Council in the United Kingdom, and the University of Illinois at Urbana-Champaign.

Read more from the original source:

IBM partners with the University of Tokyo on quantum computer - Illinoisnewstoday.com

Who will dominate the tech arms race? – The Jerusalem Post

§ August 27th, 2021 § Filed under Quantum Computer

"It is almost impossible to overstate what a quantum computer will be able to do," Christopher Monroe told the Magazine in a recent interview.

Monroe, a professor at both the University of Maryland and Duke University as well as co-founder of the quantum computing company IonQ, discussed how quantum computing will change the face of the planet, even if this might take some more time.

The Magazine also interviewed four other experts in the quantum field and visited seven of their labs at the University of Maryland.

These labs, the full likes of which do not yet exist in Israel, hosted all kinds of qubits (the basis of quantum computers), lasers blasting targets to cause plasma to come off and form distinctive films, infrared lasers, furnaces reaching 2,000°C, a tetra-arc furnace for growing silicon crystals, special dilution refrigerators to achieve cryostorage (deep freezing) and a variety of vacuum chambers that would seem like an alternate reality to the uninitiated.

Before entering each lab, there needed to be a conversation about whether this reporter should be wearing the special goggles that were handed out to avoid getting blinded.

One top quantum official at Maryland, Prof. Dr. Johnpierre Paglione, assured the Magazine that the ultrahazardous materials warning on many of the lab doors was not a concern at that moment.

From cracking the Internet as we know it, to military and economic dominance, to changing the way people manage their lives, quantum computers are predicted to make mincemeat of today's supercomputers. Put simply, they are made out of, and operate from, a completely different kind of material and set of principles connected to qubits and quantum mechanics, with computing potential that dwarfs classical computers' capabilities.

But let's say the US wins the race: who in the US would win it? Would it be giants like Google, Microsoft, Amazon, IBM and Honeywell? Or might it be a lean and fast, solely quantum-focused challenger like Monroe's IonQ?

At first glance, Google has no real challenger. In 2019, Google said it achieved quantum supremacy when its quantum computer became the first to perform a calculation that would be practically impossible for a classical machine, by checking the outputs from a quantum random-number generator.

The search-engine giant has already built a 54-qubit computer, whereas IonQ's largest quantum computer has only 32 qubits. Google has also promised to achieve the holy grail of quantum computing, a system large enough to revolutionize the Internet, military and economic issues, by 2029. Although China recently reproduced Google's experiment, Google is still regarded as ahead of the game.

Why is a 32-qubit quantum computer better than a 54-qubit one?

So why is Monroe so confident that his company will finish the race long before Google?

First, he takes a shot at the Google 2019 experiment.

"It was a fairly academic exercise. The problem they attacked was one of those rare problems where you can prove something and you can prove the supercomputer cannot do it. Quantum mechanics works. It is not a surprise. The problem Google tackled was utterly useless. The system was not flexible enough to program to hit other problems. So a big company did a big academic demonstration," he said with a sort of whoop-dee-do tone and expression on his face.

"Google had to repeat its experiment millions of times. The signal went down by orders of magnitude. There are special issues to get the data. There are general problems where it cannot maintain [coherence]. The Google experiment's qubits decayed by seven times the constant. We gauge on one time for the constant, and we can do 100 operations with IonQ's quantum computers."

In radioactive decay, the time constant is related to the decay constant and essentially represents the average lifetime of a decaying system, such as an atom. Some of the tactics for potentially overcoming decay go back to the lasers, vacuum chambers and cryostorage refrigerators mentioned above.
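
For reference, the standard exponential-decay relation behind that terminology (this formula is added here; it does not appear in the article) links the decay constant λ to the time constant τ = 1/λ, the mean lifetime of the decaying state:

    N(t) = N_0 e^{-\lambda t} = N_0 e^{-t/\tau}, \qquad \tau = \frac{1}{\lambda}

After one time constant the surviving signal is 1/e, about 37 per cent, of its initial value; a signal that has decayed through seven time constants retains only e^{-7}, roughly 0.1 per cent, which is one way to read Monroe's complaint about the Google experiment.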

Monroe said that from a business perspective, the experiment was "a big distraction, and you will hear this from Google computer employees. They had to run simulations to prove how hard it would be to do what they were doing with old computers, instead of building better quantum computers and solving useful algorithms."

"We believe quantum computers work; now it is time to build them," he stressed.

Describing IonQ's quantum computers, Monroe said, "The 32-qubit computer is fifth generation. The third and fourth generations are available to [clients of] Microsoft, Amazon and Google Cloud. It is 11 qubits, which is admittedly small, but it still runs more than any IBM machine can run. An 11-qubit computer is very clean operationally. It can run 100 or so ops [operations] before the laser noise causes coherence to be lost [before the qubits stop working]. That is many more ops than superconductors. If [a computer] has one million qubits, but can only run a few ops, it is boring. But with trapped ions, adding more qubits at the same time makes things cheaper."

He added, "The 32-qubit computer is not yet on the cloud. We are working in private with customers' financials," noting that a future publication will discuss the baby version of an algorithm which could be "very interesting when you start to scale it up. Maybe in the next generation, we can engineer it to solve an optimization problem, something we don't get from the cloud, where we don't get any telemetry, which would be an unusual benefit for clients."

According to Monroe, the fact that he will be able to build a 1,000-qubit computer by 2025 (practically tomorrow in the sphere of new inventions) will in and of itself be game-changing. This is true even if it is not yet capable of accomplishing all the extreme miracles that much larger quantum computers may someday accomplish.

A major innovation, or risk (depending on your worldview), by Monroe is how he treats the paramount challenge of quantum computers: error correction. This is the idea that for quantum computers to work, some process must be conceived to prevent qubits from decaying at the rate they currently decay at; otherwise, crucial calculations get interrupted mid-calculation.

Here, Monroe both critiques the Google approach and responds to criticism from some of his academic colleagues about his own approach to error correction. Google, he says, "is trying to get to one million qubits that do not work well together."

In contrast, a special encoding process could allow IonQ to create what Monroe called a single sort of "super qubit," which would eliminate 99.9% of native errors. This, he argues, is the easiest way to get better at quantum computing, as opposed to the quantity-over-quality path Google is pursuing.

But he has to defend himself from others poking holes in his approach as unrealistic, including some of his colleagues at the University of Maryland (all sides still express great respect for each other). Confronted by this criticism, he responded that their line of attack "was based on the theory of error correction. It implies that you will do indefinitely long computations, [but] no one will ever need this high a standard to do business."

"We do not use error correction on our CPU [central processing unit] because silicon is so stable. We call it OK if it fails in one year, since that is more than enough time to be economically worthwhile." Instead of trying to eliminate errors, his strategy is to gradually add more qubits, which achieves slightly more substantial results. His goal is to work around the error-correction problem.

Part of the difference between Monroe and his academic colleagues relates to his having crossed over into a mix of business and academia. Monroe's view on this issue? "Industry and academia do not always see things the same way. Academics are trained to prove everything we do. But if a computer works better to solve a certain problem, we do not need to prove it."

For example, if a quantum computer doubled the value of a financial portfolio compared to a supercomputer's financial recommendations, the client is thrilled, even if no one knows how it was done.

He said that when shortcuts solve problems, and certain things cannot be proven but quantum computing nonetheless finds value, "academics hate it. They are trained to be pessimists. I do believe quantum computers will find narrow applications within five years."

Besides error correction, another question is what the qubits themselves, the basis of different kinds of quantum computers, should be made out of. The technique that many of his competitors are using to make computers out of a particular kind of qubit has the benefit of being easy to do, inexpensive and representing beautiful physics.

However, he warned, "No one knows where to find it, if it exists... So stay in solid-state physics and build computers out of solid-state systems. Google, Amazon and others are all invested in solid-state computers. But I don't see it happening without fundamental physics breakthroughs. If you want to build and engineer a device, if you want to have a business, you should not be reliant on physics breakthroughs."

Instead of following the path of his competitors, Monroe emphasized working with natural quantum atoms, tricking and engineering them to act how he wants, using low pressure instead of low temperatures.

"I work with charged atoms, or ions. We levitate them inside a vacuum chamber which is getting smaller every year. We have a silicon chip. Just electrodes; electric force fields are holding up these atoms. There are no solids and no air in the vacuum chamber, which means the atoms remain extremely well isolated. They are the most perfect atoms we know, so we can scale without worrying about the top of the noise [the threshold where qubits decay]. We can pick qubit levels that do not yet decay."

"Why aren't Google and IBM investing in natural qubits? Because they have a blind spot. They have been first in solid-state physics and engineering for 50 years. If there is a silicon solid-state quantum computer, Intel will make that, but I don't see how it will be scaled," he declared.

MONROE IS far from the full quantum show at Maryland.

Paglione has been a professor at the University of Maryland for 13 years and the director of the Maryland Quantum Materials Center for the last five years.

In 1986, the center was working on high-temperature superconductors, Paglione said, noting that work on quantum computers is a more recent development. That development has not merely altered the focus of the center's research. According to Paglione, it has also helped grow the center from around seven staff members 30 years ago to around 100 staff members when all of the affiliate members, students and administrative staff are taken into account.

Similarly, Dr. Gretchen Campbell, director of the Joint Quantum Institute, told the Magazine that a big part of her institution's role, and her personal role, has been first to bring together people from atomic physics and condensed-matter physics ("even within physics, we do not always talk to each other"), followed by connecting these experts with computer science experts.

Campbell explained that it was crucial to explore the interaction between the quantum realm and quantum algorithms, for which they needed more math and computer science backgrounds, and to continue to move from laboratories to real-world applications, translating research into technology and interacting more with industry.

She also guided the Magazine, goggles donned, through a lab with a digital micromirror device and laser beams relating to atom clouds and light projectors.

Add in some additional departments at Maryland, as well as a partnership with the National Institute of Standards and Technology (NIST), and the number of staff swells way past 100. What are their many different teams working on? The lab studies and experiments are as varied as the different disciplines, with Paglione talking about possibilities for making SQUIDs, sensitive magnetic sensors constructed using a superconducting quantum interference device.

Paglione said magnetometer systems could be used with SQUIDs to sense the magnetic field of samples. These could be used as detectors in water. If they were made sensitive enough, they could sense changes in a magnetic field, such as when a submarine passes by and generates a changed magnetic field.

This has drawn attention from the US Department of Defense.

A multidisciplinary mix of Paglione's team recently captured the most direct evidence to date of a quantum quirk which permits particles to tunnel through a barrier as if it were not even there. The upshot could be assisting engineers in designing more uniform components to build both future quantum computers and quantum sensors (reported applications could detect not only submarines but aircraft).

Paglione's team, headed by Ichiro Takeuchi, a professor of materials science and engineering at Maryland, successfully carried out a new experiment in which they observed Klein tunneling. In the quantum world, tunneling enables particles, such as electrons, to pass through a barrier even if they lack sufficient energy to actually climb over it. A taller barrier usually makes climbing over harder, and fewer particles are able to cross through. The phenomenon known as Klein tunneling happens when the barrier becomes completely transparent, opening up a portal that particles can traverse regardless of the barrier's height.

Scientists and engineers from Maryland's Center for Nanophysics and Advanced Materials, the Joint Quantum Institute and the Condensed Matter Theory Center, along with the Department of Materials Science and Engineering and the Department of Physics, succeeded in making the most compelling measurements of the phenomenon to date.

Given that Klein tunneling was initially predicted to occur in the world of high-energy quantum particles moving close to the speed of light, observing the effect was viewed as impossible. That was until scientists revealed that some of the rules governing fast-moving quantum particles can also apply to the comparatively sluggish particles traveling near the surface of some highly unusual materials.

"It was a piece of serendipity that the unusual material and an elemental relative of sorts shared the same crystal structure," said Paglione. "However, the multidisciplinary team we have was one of the keys to this success. Having experts on topological physics, thin-film synthesis, spectroscopy and theoretical understanding really got us to this point."

Bringing this back to quantum computing, the idea is that interactions between superconductors and other materials are central ingredients in some quantum computer architectures and precision-sensing devices. Yet there has always been a problem: the junction, or crossover spot, where they interact is slightly different in every device. Takeuchi said this sucked up countless amounts of time and energy in tuning and calibrating to reach the best performance.

Takeuchi said Klein tunneling could eliminate this variability, which has played havoc with device-to-device interactions.

AN ENTIRELY separate quantum application is physics department chairman Prof. Steve Rolston's work on establishing a quantum communications network. Rolston explained that when a pair of photons are quantum entangled, you can achieve quantum encryption over a communications network by using the entangled particles to create secure keys that cannot be hacked. There are varying paths to achieving such a quantum network, and Rolston is skeptical of others in the field who could be seen as cutting corners.

He also is underwhelmed by China's achievements in this area. According to Rolston, no one has figured out how to extend a secure quantum network over any space sizable enough to make the network usable and marketable in practical terms.

Rather, he said, existing quantum networks are either limited to very small spaces, or, to extend their range, they must employ gimmicks that usually impair how secure they are. Because of these limitations, Rolston went so far as to say that, in his view, the US National Security Agency views the issue as a distraction.

In terms of export trade barriers or issues with China, he said he opposes controls and believes cooperation in the quantum realm should continue, especially since all of his centers research is made public anyway.

Rolston also lives up to Monroe's framing of the difference between academics and industry-focused people. He said that even Monroe would have to admit that no one is close to the true holy grail of quantum computing (computers with a massive number of qubits), and that the IonQ founder is instead banking on interesting optimization problems being solvable for industry to an extent that will justify the hype.

In contrast, Rolston remained pessimistic that such smaller quantum computers would achieve sufficient superiority on optimization problems in business to justify a rushed prediction that transforming the world is just around the corner.

In Rolston's view, the longer, more patient and steadier path is the one that will eventually reap rewards.

For the moment, we do not know whether Google or IonQ, or those like Monroe or Rolston, will eventually be able to declare they were right. We do know that whoever is right, and whoever is first, will radically change the world as we know it.

See the article here:

Who will dominate the tech arms race? - The Jerusalem Post

Why Quantum Resistance Is the Next Blockchain Frontier – Tech Times

§ August 27th, 2021 § Filed under Quantum Computer

As decentralized networks secured by potentially thousands of miners and/or nodes, blockchains are widely considered to be an incredibly secure example of distributed ledger technology.

On the back of this, they also have dozens of potential applications - ranging from decentralized content storage networks, to medical records databases, and supply chain management. But to this day, they're most commonly thought of as the ideal platform hosting the financial infrastructure of tomorrow - such as decentralized exchanges and payment settlement networks.

But there's a problem. While the blockchains of today are practically unhackable - due to the type of encryption they use to secure private keys and transactions - this might not be the case for much longer. This is due to the advent of so-called "quantum computers", that is, computers that can leverage the properties of quantum mechanics to solve problems that would be impossible with traditional computers... such as breaking the cryptography that secures current generation blockchains.

Many blockchains of today use at least two types of cryptographic algorithms - asymmetric key algorithms and hash functions.

The first kind, also known as public-key cryptography, is used to produce pairs of private and public keys that are provably cryptographically linked. In Bitcoin, this private key is used to spend UTXOs - thereby transferring value from one person to another. The second kind - the hash function - is used to securely process raw transaction data into a block in a way that is practically irreversible.
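
As a concrete illustration of the second primitive, the sketch below shows Bitcoin's double application of SHA-256 in Python (the transaction bytes are invented placeholders for the example; this is not any particular client's code). The point is that the digest is easy to compute but practically irreversible:

    import hashlib

    def double_sha256(data: bytes) -> bytes:
        # Bitcoin hashes block data with SHA-256 applied twice.
        return hashlib.sha256(hashlib.sha256(data).digest()).digest()

    raw_transactions = b"example raw transaction bytes"  # placeholder data
    digest = double_sha256(raw_transactions)
    print(digest.hex())  # 64 hex chars; recovering the input is infeasible

Notably, hash functions are considered less quantum-vulnerable than public-key signatures: Grover's algorithm offers only a quadratic speedup against them, whereas Shor's algorithm breaks the number theory underlying schemes like ECDSA outright.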

As you might imagine, a sufficiently powerful quantum computer capable of breaking either of these security mechanisms could have devastating consequences for susceptible blockchains - since they could be used to potentially derive private keys or even mine cryptocurrency units much faster than the expected rate (leading to supply inflation).

So, just how far away from this are we? Well, according to recent estimates, a quantum computer possessing 4,000 qubits of processing power could be the minimum necessary to break the public key cryptography that secures Bitcoin user funds. A sufficiently flexible quantum computer with this processing power could, theoretically, take over the funds contained in any Bitcoin p2pk address - that's a total of around 2 million BTC (circa $67 billion at today's rates).

Fortunately, this isn't an immediate concern. As it stands, the world's most powerful quantum computer - the Zuchongzhi quantum computer - currently clocks in at an impressive (albeit insufficient) 66 qubits. However, given the rapid pace of development in the quantum computing sector, some experts predict that Bitcoin's Elliptic Curve Digital Signature Algorithm (ECDSA) could meet its quantum match within a decade.

An algorithm that could potentially be used to break ECDSA has already been developed: it is widely thought that Peter Shor's polynomial-time quantum algorithm, if generalized and run on a powerful enough quantum computer, would be able to attack the Bitcoin blockchain, while similar algorithms could be applied to other forms of traditional encryption.

But this might not be a concern for much longer, thanks to the introduction of what many consider to be the world's first truly quantum-resistant blockchain. The platform, known as QANplatform, is built to resist all known quantum attacks by using lattice cryptography. QAN manages to achieve quantum resistance while simultaneously tackling the energy concerns that come with some other blockchains through its highly efficient consensus mechanism known as Proof-of-Randomness (PoR).

Unlike some other so-called quantum-resistant blockchains, QAN is unusual in that it also supports decentralized applications (DApps) - allowing developers to launch quantum-resistant DApps within minutes using its free developer tools.

Besides platforms like QAN, the development communities behind several popular blockchains are already beginning to consider implementing their own quantum-resistance solutions, such as the recently elaborated commit-delay-reveal scheme - which could be used to transition Bitcoin to a quantum-resistant state. Nonetheless, the future of post-quantum cryptography still remains up in the air, as none of the top ten blockchains by user count have yet committed to a specific quantum-resistant signature scheme.

Follow this link:

Why Quantum Resistance Is the Next Blockchain Frontier - Tech Times

Life, the universe and everything: Physics seeks the future – The Economist

§ August 27th, 2021 § Filed under Quantum Computer

Aug 25th 2021

A WISE PROVERB suggests not putting all your eggs in one basket. Over recent decades, however, physicists have failed to follow that wisdom. The 20th century, and indeed the 19th before it, were periods of triumph for them. They transformed understanding of the material universe and thus people's ability to manipulate the world around them. Modernity could not exist without the knowledge won by physicists over those two centuries.

In exchange, the world has given them expensive toys to play with. The most recent of these, the Large Hadron Collider (LHC), which occupies a 27km-circumference tunnel near Geneva and cost $6bn, opened for business in 2008. It quickly found a long-predicted elementary particle, the Higgs boson, that was a hangover from calculations done in the 1960s. It then embarked on its real purpose, to search for a phenomenon called Supersymmetry.

This theory, devised in the 1970s and known as Susy for short, is the all-containing basket into which particle physics' eggs have until recently been placed. Of itself, it would eliminate many arbitrary mathematical assumptions needed for the proper working of what is known as the Standard Model of particle physics. But it is also the vanguard of a deeper hypothesis, string theory, which is intended to synthesise the Standard Model with Einstein's general theory of relativity. Einstein's theory explains gravity. The Standard Model explains the other three fundamental forces (electromagnetism and the weak and strong nuclear forces) and their associated particles. Both describe their particular provinces of reality well. But they do not connect together. String theory would connect them, and thus provide a so-called theory of everything.

String theory proposes that the universe is composed of minuscule objects which vibrate in the manner of the strings of a musical instrument. Like such strings, they have resonant frequencies and harmonics. These various vibrational modes, string theorists contend, correspond to various fundamental particles. Such particles include all of those already observed as part of the Standard Model, the further particles predicted by Susy (which posits that the Standard Model's mathematical fragility will go away if each of that model's particles has a heavier supersymmetric partner particle, or "sparticle"), and also particles called gravitons, which are needed to tie the force of gravity into any unified theory, but are not predicted by relativity.

But, no Susy, no string theory. And, 13 years after the LHC opened, no sparticles have shown up. Even two as-yet-unexplained results announced earlier this year (one from the LHC and one from a smaller machine) offer no evidence directly supporting Susy. Many physicists thus worry they have been on a wild-goose chase.

They have good reason to be nervous. String theory already comes with a disturbing conceptual price tag: that of adding six (or in one version seven) extra dimensions to the universe, over and above the four familiar ones (three of space and one of time). It also describes about 10^500 possible universes, only one of which matches the universe in which human beings live. Accepting all that is challenging enough. Without Susy, though, string theory goes bananas. The number of dimensions balloons to 26. The theory also loses the ability to describe most of the Standard Model's particles. And it implies the existence of weird stuff such as particles called tachyons that move faster than light and are thus incompatible with the theory of relativity. Without Susy, string theory thus looks pretty much dead as a theory of everything. Which, if true, clears the field for non-string theories of everything.

The names of many of these do, it must be conceded, torture the English language. They include causal dynamical triangulation, asymptotically safe gravity, loop quantum gravity and the amplituhedron formulation of quantum theory. But at the moment the bookies' favourite for unifying relativity and the Standard Model is something called entropic gravity.

Entropy is a measure of a system's disorder. Famously, the second law of thermodynamics asserts that it increases with time (ie, things have a tendency to get messier as they get older). What that has to do with a theory of gravity, let alone of everything, is not, perhaps, immediately obvious. But the link is black holes. These are objects which have such strong gravitational fields that even light cannot escape from them. They are predicted by the mathematics of general relativity. And even though Einstein remained sceptical about their actual existence until the day he died in 1955, subsequent observations have shown that they are indeed real. But they are not black.

In 1974 Stephen Hawking, of Cambridge University, showed that quantum effects at a black hole's boundary allow it to radiate particles, especially photons, which are the particles of electromagnetic radiation, including light. This has peculiar consequences. Photons carry radiant heat, so something which emits them has a temperature. And, from its temperature and mass, it is possible to calculate a black hole's entropy. This matters because, when all these variables are plugged into the first law of thermodynamics, which states that energy can be neither created nor destroyed, only transformed from one form (say, heat) into another (say, mechanical work), what pops out are Einstein's equations of general relativity.
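
For reference, the standard formulas behind that chain of reasoning (they are supplied here and do not appear in the article) are the Hawking temperature and Bekenstein-Hawking entropy of a black hole of mass M and horizon area A:

    T_H = \frac{\hbar c^3}{8 \pi G M k_B}, \qquad S_{BH} = \frac{k_B c^3 A}{4 G \hbar}

Note how they mix the characteristic constants of quantum theory (ħ), relativity (c and G) and thermodynamics (k_B), which is why black holes sit at the junction of all three subjects.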

That relationship was discovered in 2010 by Erik Verlinde of the University of Amsterdam. It has serious implications. The laws of thermodynamics rely on statistical mechanics. They involve properties (temperature, entropy and so on) which emerge from probabilistic descriptions of the behaviour of the underlying particles involved. These are also the particles described by quantum mechanics, the mathematical theory which underpins the Standard Model. That Einstein's equations can be rewritten thermodynamically implies that space and time are also emergent properties of this deeper microscopic picture. The existing forms of quantum mechanics and relativity thus do indeed both seem derivable in principle from some deeper theory that describes the underlying fabric of the universe.

String theory is not so derivable. Strings are not fundamental enough entities. But entropic gravity claims to describe the very nature of space and time, or, to use Einsteinian terminology, spacetime. It asserts this is woven from filaments of quantum entanglement linking every particle in the cosmos.

The idea of quantum entanglement, another phenomenon pooh-poohed by Einstein that turned out to be true, goes back to 1935. It is that the properties of two or more objects can be correlated ("entangled") in a way which means they cannot be described independently. This leads to weird effects. In particular, it means that two entangled particles can appear to influence each other's behaviour instantaneously even when they are far apart. Einstein dubbed this "spooky action at a distance", because it seems to violate the premise of relativity theory that the universe has a speed limit: the speed of light.

As with black holes, Einstein did not live long enough to see himself proved wrong. Experiments have nevertheless shown he was. Entanglement is real, and does not violate relativity because, although the influence of one particle on another can be instantaneous, there is no way to use the effect to pass information faster than light-speed. And, in the past five years, Brian Swingle of Harvard University and Sean Carroll of the California Institute of Technology have begun building models of what Dr Verlinde's ideas might mean in practice, using ideas from quantum information theory. Their approach employs bits of quantum information (so-called qubits) to stand in for the entangled particles. The result is a simple but informative analogue of spacetime.

Qubits, the quantum equivalent of classical bits (the ones and zeros on which regular computing is built), will be familiar to those who follow the field of quantum computing. They are the basis of quantum information theory. Two properties distinguish qubits from the regular sort. First, they can be placed in a state of superposition, representing both a one and a zero at the same time. Second, several qubits can become entangled. Together, these properties let quantum computers accomplish feats such as performing multiple calculations at once, or completing certain classes of calculation in a sensible amount of time, that are difficult or impossible for a regular computer.
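
Those two properties have a compact numerical form. The Python sketch below (an illustration of the textbook definitions, not of any particular quantum computer or vendor library) represents qubits as plain state vectors:

    import numpy as np

    zero = np.array([1, 0], dtype=complex)   # the basis state |0>
    one  = np.array([0, 1], dtype=complex)   # the basis state |1>

    # Superposition: equal parts |0> and |1> in a single qubit.
    plus = (zero + one) / np.sqrt(2)

    # Entanglement: the two-qubit Bell state (|00> + |11>) / sqrt(2).
    bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

    # Squared amplitudes give measurement probabilities.
    print(np.abs(plus) ** 2)   # -> [0.5 0.5]: equal chance of 0 and 1
    print(np.abs(bell) ** 2)   # -> [0.5 0. 0. 0.5]: the outcomes 01 and 10
                               # never occur, so the qubits are perfectly
                               # correlated and cannot be described separately.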

And, because of their entanglement, qubits can also, according to Dr Swingle and Dr Carroll, be used as stand-ins for how reality works. More closely entangled qubits represent particles at points in spacetime that are closer together. So far, quantum computers being a work in progress, this modelling can be done only with mathematical representations of qubits. These do, though, seem to obey the equations of general relativity. That supports entropic gravity's claims.

All of this modelling puts entropic gravity in pole position to replace strings as the long-sought theory of everything. But the idea that spacetime is an emergent property of the universe rather than being fundamental to it has a disturbing consequence. It blurs the nature of causality.

In the picture built by entropic gravity, spacetime is a superposition of multiple states. It is this which muddies causality. The branch of maths that best describes spacetime is a form of geometry that has four axes at right angles to each other instead of the more familiar three. The fourth represents time, so, like the position of objects, the order of events in spacetime is determined geometrically. If different geometric arrangements are superposed, as entropic gravity requires, it can therefore sometimes happen that the statements "A causes B" and "B causes A" are both true.
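
That four-axis geometry is Minkowski spacetime; its line element, in standard notation (added here for reference, as the article gives no formula), is:

    ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2

The minus sign that sets the time axis apart from the three space axes is what encodes the ordering of events; superposing different geometric arrangements, as entropic gravity requires, can therefore leave that ordering ambiguous.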

This is not mere speculation. In 2016 Giulia Rubino of the University of Bristol, in England, constructed an experiment involving polarised photons and prisms which achieved exactly that. This spells trouble for those who have old-fashioned notions about causality's nature.

However, Lucien Hardy of the Perimeter Institute, in Canada, has discovered a way to reformulate the laws of quantum mechanics to get around this. In his view, causality as commonly perceived is like data compression in computing: it is a concept that gives you more bang for your buck. With a little bit of information about the present, causality can infer a lot about the future, compressing the amount of information needed to capture the details of a physical system in time.

But causality, Dr Hardy thinks, may not be the only way to describe such correlations. Instead, he has invented a general method for building descriptions of the patterns in correlations from scratch. This method, which he calls the causaloid framework, tends to reproduce causality but it does not assume it, and he has used it to reformulate both quantum theory (in 2005) and general relativity (in 2016). Causaloid maths is not a theory of everything. But there is a good chance that if and when such a theory is found, causaloid principles will be needed to describe it, just as general relativity needed a geometry of four dimensions to describe spacetime.

Entropic gravity has, then, a lot of heavy-duty conceptual work to back it up. But it is not the only candidate to replace string theory. Others jostling for attention include an old competitor called loop quantum gravity, originally proposed in 1994 by Carlo Rovelli, then at the University of Pittsburgh, and Lee Smolin, of the Perimeter Institute. This, and causal dynamical triangulation, a more recent but similar idea, suggest that spacetime is not the smooth fabric asserted by general relativity but, rather, has a structure: either elementary loops or triangles, according to which of the two theories you support.

A third option, asymptotically safe gravity, goes back still further, to 1976. It was suggested by Steven Weinberg, one of the Standard Model's chief architects. A natural way to develop a theory of quantum gravity is to add gravitons to the model. Unfortunately, this approach got nowhere, because when the interactions of these putative particles were calculated at higher energies, the maths seemed to become nonsensical. However, Weinberg, who died in July, argued that this apparent breakdown would go away (in maths speak, the calculations would be "asymptotically safe") if sufficiently powerful machines were used to do the calculating. And, with the recent advent of supercomputers of such power, it looks, from early results, as if he might have been right.

One of the most intriguing competitors of entropic gravity, though, is the amplituhedron formulation of quantum theory. This was introduced in 2013 by Nima Arkani-Hamed of the Institute for Advanced Study at Princeton and Jaroslav Trnka of the University of California, Davis. They have found a class of geometric structures dubbed amplituhedrons, each of which encodes the details of a possible quantum interaction. These, in turn, are facets of a master amplituhedron that encodes every possible type of physical process. It is thus possible to reformulate all of quantum theory in terms of the amplituhedron.

Most attempts at a theory of everything try to fit gravity, which Einstein describes geometrically, into quantum theory, which does not rely on geometry in this way. The amplituhedron approach does the opposite, by suggesting that quantum theory is actually deeply geometric after all. Better yet, the amplituhedron is not founded on notions of spacetime, or even statistical mechanics. Instead, these ideas emerge naturally from it. So, while the amplituhedron approach does not as yet offer a full theory of quantum gravity, it has opened up an intriguing path that may lead to one.

That space, time and even causality are emergent rather than fundamental properties of the cosmos is a radical idea. But this is the point. General relativity and quantum mechanics, the physics revolutions of the 20th century, were viewed as profound precisely because they overthrew common sense. To accept relativity meant abandoning a universal notion of time and space. To take quantum mechanics seriously meant getting comfortable with ideas like entanglement and superposition. Embracing entropic gravity or its alternatives will require similar feats of the imagination.

No theory, though, is worth a damn without data. That, after all, is the problem with Supersymmetry. Work like Dr Rubino's points the way. But something out of a particle-physics laboratory would also be welcome. And, though their meaning is obscure, the past few months have indeed seen two experimentally induced cracks in the Standard Model.

On March 23rd a team from CERN, the organisation that runs the LHC, reported an unexpected difference in behaviour between electrons and their heavier cousins, muons. These particles differ from one another in no known properties but their masses, so the Standard Model predicts that when other particles decay into them, the two should each be produced in equal numbers. But this appears not to be true. Interim results from the LHC suggest that a type of particle called a B-meson is more likely to decay into an electron than a muon. That suggests an as-yet-undescribed fundamental force is missing from the Standard Model. Then, on April 7th, Fermilab, America's biggest particle-physics facility, announced the interim results of its own muon experiment, Muon g-2.

In the quantum world, there is no such thing as a perfect vacuum. Instead, a froth of particles constantly pops in and out of existence everywhere in spacetime. These are virtual rather than real particles; that is, they are transient fluctuations which emerge straight out of quantum uncertainty. But, although they are short-lived, during the brief periods of their existence they still have time to interact with more permanent sorts of matter. They are, for example, the source of the black-hole radiation predicted by Hawking.

The strengths of their interactions with types of matter more conventional than black holes are predicted by the Standard Model, and to test these predictions, Muon g-2 shoots muons in circles around a powerful superconducting magnetic-storage ring. The quantum froth changes the way the muons wobble, which detectors can pick up with incredible precision. The Muon g-2 experiment suggests that the interactions causing these wobbles are slightly stronger than the Standard Model predicts. If confirmed, this would mean the model is missing one or more elementary particles.

There is a slim chance that these are the absent sparticles. If so, it is the supporters of supersymmetry who will have the last laugh. But nothing points in this direction and, having failed thus far to stand their ideas up, they are keeping sensibly quiet.

Whatever the causes of these two results, they do show that there is something out there which established explanations cannot account for. Similarly unexplained anomalies were starting points for both quantum theory and relativity. It looks possible, therefore, that what has seemed one of physicss darkest periods is about to brighten into a new morning.

This article appeared in the Science & technology section of the print edition under the headline "Bye, bye, little Susy"

See the rest here:

Life, the universe and everything Physics seeks the future - The Economist

Aussie innovations: mealworm snack packs, in-mouth robots, drone weeding and more – Sydney Morning Herald

§ August 27th, 2021 § Filed under Quantum Computer

"We have deals with larger food manufacturers at the moment for our bulk ingredients, and we've had a really good response from retail," says founder Skye Blackburn, a food scientist and edible-bug evangelist. "In our new facility we'll be able to use all the technology we've been developing over the past 14 years, including applying artificial intelligence to the feeding, cleaning and monitoring side of things."

According to the CSIRO, the global market for edible insects is expected to grow to $1.4 billion by 2023. More than 2100 insect species are currently eaten, its report says, noting that there are 14 Australian insect-based businesses. While the industry's growth is limited by the current state of consumer attitudes, Blackburn thinks that will change as people realise that dried crickets are 68 per cent protein and packed with essential micronutrients. "Everything your body needs in a tiny little package."

Solar will raise the standard of living around the world. (Illustration: Simon Letch)

Get ready for insanely cheap power as the price of renewables tumbles, says University of NSW professor Martin Green, inventor of the PERC solar cell used in about 85 per cent of the world's solar module production.

"Last year, the International Energy Agency said solar now provides the cheapest electricity ever seen, and the cost is still going down," Green says. "Australia has more rooftop solar than any other country, even not normalising for population, and the average size of the systems is going up."

Green is director of the Australian Centre for Advanced Photovoltaics, where the next generation of solar is being developed. He says more powerful home systems will charge electric cars and vice versa, with those cars providing a bank of home energy when needed. But he doesn't see each house being self-contained and off the grid.

"Storage is done most cheaply at the centralised level," he says. Green believes the revolution will happen due to economics, and will raise the standard of living around the world. "Solar is the most viable way of getting a reliable electricity supply to the couple of billion people in the world who still don't have access to it."

Humans and robots will be collaborating further in the future. Credit: Illustration by Simon Letch

Drones and other robots (or "cobots", the term for those designed to collaborate or interact with humans) will play an ever larger role in our futures. Imagine rescue work at a collapsed building being aided by purpose-built drones, or electronic lizards capable of scaling sheer walls and slithering through tiny openings to detect survivors. The latter are currently being developed at the University of the Sunshine Coast. Or robots such as the AI-controlled drone developed by Israel's Tevel Aerobotics Technology that can identify ripe fruit and pick it, around the clock.

New Zealand's AgResearch led a three-year study into drone-based weeding, with the aim of identifying unwanted plants based on their unique chemical signatures and how they reflect light, and precisely mapping their locations using GPS. Program leader Dr Kioumars Ghamkhar has said the drone could then destroy the weeds with lasers.

The business applications of this so-called "map and zap" research are still being investigated, with more of them to be revealed this year.

It is believed phones in the future will be able to charge within a minute and last three days. Credit: Illustration by Simon Letch

The spruikers of graphene say this one-atom-thick material is 200 times stronger than steel, harder than diamond and has extraordinary electrical conductivity. Craig Nicol, chair of the Australian Graphene Industry Association, is convinced it will change the world the way silicon did with the advent of the silicon microchip that powers mobile phones and computers. "We will likely see graphene used in electronics, filtration, ultra-sensitive sensors, lubrication and all manner of materials."

Nicol is also founder and CEO of GMG, which produces coatings that use graphene's heat transfer properties to make airconditioners run more efficiently. The Brisbane-based company is also working with the University of Queensland to bring energy-dense graphene aluminium-ion batteries to market, which they hope will one day power everything from watches to phones, and eventually cars and aircraft, while also backing up power grids. Nicol plans to debut a prototype watch-camera coin cell by the end of this year, and in a phone in 2022. He believes we'll eventually see phones that charge in less than a minute and run for three days. Others including Samsung are also working on graphene batteries, so the race is on.

The future will involve turning raw waste resources into high-value products. Credit: Illustration by Simon Letch

The circular economy is on the way and, according to KPMG, it will add more than $200 billion and 17,000 full-time jobs to the Australian economy by 2047-48. And there's no bigger and smarter advocate of the "it's not waste, it's a resource" mantra than University of NSW Professor Veena Sahajwalla, a pioneer of micro recycling, which creates, as she puts it, "a whole new range of very sophisticated recycling solutions that really didn't exist before".

Way beyond turning aluminium cans into more aluminium cans, the future will involve turning raw waste resources such as car tyres and beer bottles into high-value products such as green steel and home furnishings. Sahajwalla says the micro factories (buildings with a handful of staff) which her team have designed, with backing from the Australian Research Council, use a range of proprietary techniques, such as thermal isolation, to unpick complex structures.

They can therefore extract manganese and zinc from dead batteries, and create filament for 3D printers from mixed plastic structures such as old laser printers. Even more impressively, they can transform fabric into ceramic tiles. "A soft material is now becoming part of a hard, durable green ceramic," Sahajwalla says. "You're combining that with waste glass and heat to create this integrated structure. That's what we do in our micro factories."

Robots will allow those in the city to visit remote communities. Credit: Illustration by Simon Letch

Canberra-based Dentroid is working on an in-mouth robot that could allow city-bound dentists to visit remote communities. Co-founder and CEO Omar Zuaiter says the robot uses laser heads, micro cameras and other controllers to end the need for drills and needles. "They look at the tooth, analyse it and remove the decayed materials. Laser is really, really good at that." Zuaiter says as communication infrastructure improves, the system will be able to reach further into distant areas. A commercial release is hoped for in 2024.

Quantum computing is set to solve complex corporate, governmental and defence problems. Credit: Illustration by Simon Letch

Imagine a machine that could, in almost real time, complete calculations that would take thousands of years on the fastest iMac. Commercial versions could be available this decade, with Sydney-based Silicon Quantum Computing further advanced than most.

Silicon Quantum Computing founding director Michelle Simmons says quantum computers will work by exploiting the power of quantum physics, and initially will likely solve complex corporate, governmental and defence problems such as logistics, financial analysis, software optimisation, machine learning and bioinformatics, including early disease detection and prevention.

Although few of us will use a quantum computer any time soon (you need a controlled environment for a start), the indirect results will be profound. "Radically enhanced molecular models will mean faster processes in the development of new and better drugs," says Simmons. "If you think classical computing has transformed the world, you haven't seen anything yet."

Excerpt from:

Aussie innovations: mealworm snack packs, in-mouth robots, drone weeding and more - Sydney Morning Herald

Read the Rest...

Finland’s top startups and scaleups to watch – Sifted

§ August 27th, 2021 § Filed under Quantum Computer Comments Off on Finland’s top startups and scaleups to watch – Sifted

It's a banner year so far for investment in Finnish startups. According to Dealroom statistics, €1.2bn of VC funding has flowed into the country so far this year across 70 rounds, compared with a solid €1bn the previous year.

This is the most VC funding that the country known for its happiness, reindeer and saunas has ever received.

Whilst 39 of those rounds were at pre-seed and seed stage level, there have also been a few megarounds of $100m+, notably Wolt's huge $530m raise and Aiven's $100m Series C.

Here we spotlight fourteen startups that have caught our eye because they look set for an upward trajectory.

For those looking for the big picture, read the full list of over 100 Finnish startups here.

HQ: Helsinki

Founded: 2014

The food delivery company raised this year's biggest funding round for a Finnish company, a $530m round back in January, and the next step is likely to be an IPO.

The scaleup saw its headcount go from 700 at the beginning of 2020 to 2,200 employees at the beginning of 2021. It is now in 23 countries and 129 cities, and saw revenue triple in 2020 to $345m.

"Covid has changed our perspective on how big a business like us can be," says Miki Kuusi, Wolt's CEO and cofounder.

Wolt is expanding beyond just food delivery to groceries, electronics, flowers, clothes and more, although it has steered away from building its own dark stores, preferring to work with partners, such as Spar in Poland and Carrefour in Georgia.

HQ: Helsinki

Founded: 2016

Founded in 2016, Aiven manages companies' open-source data infrastructure in the cloud, so that developers can focus on building applications without worrying about managing background tasks like security and maintenance. The company has some 1,000 customers, including big corporations like Comcast and Toyota, and has a workforce of around 200 people.

The company raised a $100m Series C funding round in March, giving it a valuation of around $800m and making it one of Finland's "soonicorns". Aiven says it is planning to double its headcount over the next 12 months.

HQ: Oulu

Founded: 2013

The health-tracking ring has had a blisteringly good marketing run, having won over a number of celebrity fans such as Prince Harry, cyclist Lance Armstrong and Hollywood A-lister Will Smith. It got an added boost after studies showed that the ring, which tracks biometrics like body temperature, pulse and sleep patterns, could predict the onset of Covid-19 symptoms up to three days before they showed up.

This has helped the company win big corporate clients, such as the Las Vegas Sands hotel, as well as NASCAR and Ultimate Fighting Championship.

Oura has raised a total of €140m to date, including an €85m Series C round in May.

HQ: Helsinki and Berlin

Founded: 2017

This four-year-old company has global-sized ambitions to take on the biggest US tech companies like Google, IBM and Microsoft in the field of AI-powered customer service agents.

The technology focuses specifically on customer service, which can mean anything from building chatbots to systems that can automatically respond to questions sent in via simple contact forms and short emails. It is used in the customer service centres of large companies including Finnair, Telia, Deezer and Elisa. Up to 80% of customer interactions can be automated this way, the company says.

The company raised a $20m Series A round in December, which has allowed them to grow the headcount to more than 100 staff. Although the headquarters have moved to Berlin, a substantial part of the company's development work is still done in Finland.

The next big project is a plan to expand into the US market.

HQ: Espoo

Founded: 2016

Circular economy startup Infinited Fiber takes waste materials such as old textiles, used cardboard and even crop residues like rice and wheat straw, and uses a patented process to turn them into a textile fibre with a similar feel to cotton. In technical terms, the fibre is cellulose carbamate.

A number of fashion brands, including H&M, Patagonia and Adidas, are customers, and in July a number of these customers, notably Adidas, Bestseller and H&M, chipped into a €30m funding round for the startup.

This funding will help build a flagship factory in Finland that will turn household textile waste into a new, regenerated textile fibre, Infinna. The factory is expected to be operational in 2024 and will have the capacity to produce 30,000 metric tonnes of fibre.

Infinited Fiber is also looking to license the technology to other producers; it says any existing pulp or viscose factory can be retrofitted to produce the fibre.

HQ: Helsinki

Founded: 2018

A spin-out from Aalto University and the VTT Technical Research Centre, IQM is building quantum computers based on superconducting technology, setting itself up as a European challenger to Google, IBM and Rigetti. IQM is building Finland's first quantum computer, together with VTT, which will be operational by the end of the year. This will have just 5 qubits, far lower than the 60-70 qubit machines that Google and IBM have assembled, but IQM has plans for a 20-qubit computer by the end of next year and a 50-qubit computer by the end of 2023.

IQM has operations in Germany and recently announced the opening of a lab in Bilbao, which will focus on designing quantum software and hardware specifically to solve problems for the financial services sector. The company is one of the biggest quantum computing teams in Europe, with 50 people in Finland and a further 20 elsewhere in Europe.

IQM has raised some €71m in funding to date, including a €39m Series A round at the end of 2020.

HQ: Helsinki

Founded: 2016

This five-year-old food waste startup has seen its revenues grow threefold during the pandemic, from €3.7m to €12m, as consumers embraced ordering food online. The startup, whose name translates as "smart food", sells food that is close to its sell-by date and about to become food waste, offering it to customers at heavily discounted prices. Unlike some of the food waste startups that rely on customers going to pick up waste food from restaurants and shops, Fiksu Ruoka offers home delivery.

In addition to food, Fiksu has also started stocking homewares and clothing. The business raised a €19m VC round in May.

HQ: Helsinki

Founded: 2017

Upright is building a new type of quantification model to calculate the net impact of companies on the environment, on the health of people and on society as a whole. It uses a neural network to assess the entire value chain surrounding a business. It is intended as a tool to help investors and consumers to make more informed decisions about the companies they back.

In June, Upright signed a partnership with Nasdaq, which will enable investors to get Upright data easily through the Nasdaq API and combine these with financial data. This is handy for investors building portfolios with impact goals.

Upright has taken a tiny amount of seed funding but mostly finances its operations with revenue from clients. Its ultimate aim is to make enough from the sale of its investor and corporate tools so that it can give the impact data to consumers and employees for free.

HQ: Helsinki

Founded: 2010

Neither Chris Thür nor his cofounder Mikko Kaipainen were musicians or music teachers. They were just two techies who were keen to learn to play musical instruments, and that was the whole point of Yousician: a mobile app that can teach beginners how to play guitar, piano, ukulele, bass or to sing. The company started as a service focused on children's music lessons but later pivoted to a less age-specific focus.

Users can get one free lesson a day, but can pay for a premium subscription to get more lessons and access to a bigger library of songs. The company has seen strong user growth during the pandemic as many people became interested in taking up an instrument while stuck at home.

Monthly users grew from 14.5m to 20m, while subscriptions increased by 80%. The company reported revenue of $50m last year, and in April Yousician raised a €24m Series B funding round.

HQ: Helsinki

Founded: 2013

Aiforia is developing cloud-based deep learning software to help scientists and clinicians with image analysis. The technology can increase the speed and precision with which medical images can be analysed in fields ranging from oncology to neuroscience. The company is planning, for example, to launch tools for breast and lung cancer diagnosis later this year.

Aiforia has some 3,000 users in 50 countries and raised a €25.2m Series B funding round in June.

HQ: Helsinki

Founded: 2016

Flowhaven aims to streamline the way companies manage their licensing partnerships. This is a huge market, but it's still largely done manually through emails and clunky spreadsheets.

It's a problem that Flowhaven founder and CEO Kalle Törmä experienced first-hand when he worked on licensing at Rovio, the Finnish company behind the Angry Birds game. Törmä left his job to create a solution to this.

Flowhaven now has more than 100 customers using its system, including names like Nintendo and Games Workshop. The company raised a $16m Series A funding round in January and at the time reported 400% year-on-year growth. It's aiming to increase headcount to close to 100 by the end of the year.

HQ: Uusimaa

Founded: 2021

One of the newest arrivals on the Finnish startup scene, this spinout from the VTT Technical Research Centre is focused on farming black soldier flies to create animal feed, pet food and ingredients for cosmetics. Cofounders Matti Tähtinen and Tuure Parviainen met while working on a black soldier fly farming project at VTT, while COO Jarna Hyvönen has a background in managing circular economy projects. Their idea is to take agricultural waste products and byproducts from breweries and mills and turn them into high-value, usable protein, creating a circular economy for these parts of the food industry.

Volare has so far raised a €700k seed round from Maki.vc, allowing them to build a pilot facility for the black soldier fly breeding. They have plans for a first commercial-scale facility to be ready by 2023. The focus is currently on fish feed and pet food, but Volare's intention is to produce products for people too, once European regulations allow for black soldier fly-based protein to be used for human consumption. The EU has already ruled certain types of mealworm safe to eat.

Volare's biggest competitor is Dutch company Protix, which has also focused on farming the black soldier fly. French startup Ynsect, which raised a €304m funding round last year, focuses on mealworms.

Mealworms may be further along the food approval route, but CTO Matti Tähtinen says in the long run black soldier flies are a better proposition for the circular economy, as they can be fed a far wider range of foods.

HQ: Espoo

Founded: 2021

Another new startup, only recently out of stealth mode, Pixieray is making active glasses that sense what the user is looking at and adjust to give the perfect focus at all times. The principle is similar to the way a mobile phone camera automatically adjusts to the focal point of the shot, and it would mean an end to people having to use varifocal lenses or switch glasses between different activities.

Other companies have tried and failed at the active glasses challenge in the past, but CEO Niko Eiden and CTO Klaus Meklari come from Varjo, the VR glasses company, and have a strong background in eyewear. One of the biggest challenges has been getting the technology and the batteries small enough to fit into the frame of a normal-sized pair of glasses, but miniaturisation is now reaching the point at which this is becoming possible.

The company so far has just a prototype but expects to start shipping a commercial product in 2023. Pixieray raised a €3.74m seed round from investors including Maki.vc in June.

HQ: Helsinki

Founded: 2018

The company began as an award-winning XR and gaming studio 17 years ago, but morphed into Glue in 2017, focusing on building VR remote collaboration tools for businesses.

Up to 30 people wearing VR headsets can work together in a virtual collaboration space, appearing as head-and-arms avatars and able to work together on documents, share presentations and videos, as well as break out into smaller groups.

The company raised a €3.5m seed round in 2019 and hasn't raised since. However, it is now getting income from genuine paying customers. Some 100-150 big corporations, including Deutsche Telekom and Axel Springer, are using the system, although many of these relationships are still at the pilot stage.

Maija Palmer is Sifted's innovation editor. She covers deeptech and corporate innovation, and tweets from @maijapalmer

Read the original here:

Finland's top startups and scaleups to watch - Sifted

Read the Rest...

What is quantum computing? Everything you need to know about the strange world of quantum computers – ZDNet

§ July 30th, 2021 § Filed under Quantum Computer Comments Off on What is quantum computing? Everything you need to know about the strange world of quantum computers – ZDNet

Quantum computing exploits the puzzling behavior that scientists have been observing for decades in nature's smallest particles (think atoms, photons or electrons). At this scale, the classical laws of physics cease to apply, and instead we shift to quantum rules.

While researchers don't understand everything about the quantum world, what they do know is that quantum particles hold immense potential, in particular to hold and process large amounts of information. Successfully bringing those particles under control in a quantum computer could trigger an explosion of compute power that would phenomenally advance innovation in many fields that require complex calculations, like drug discovery, climate modelling, financial optimization or logistics.

As Bob Sutor, chief quantum exponent at IBM, puts it: "Quantum computing is our way of emulating nature to solve extraordinarily difficult problems and make them tractable," he tells ZDNet.

Quantum computers come in various shapes and forms, but they are all built on the same principle: they host a quantum processor where quantum particles can be isolated for engineers to manipulate.

The nature of those quantum particles, as well as the method employed to control them, varies from one quantum computing approach to another. Some methods require the processor to be cooled down to freezing temperatures, others require playing with quantum particles using lasers, but all share the goal of finding out how best to exploit the value of quantum physics.

The systems we have been using since the 1940s in various shapes and forms (laptops, smartphones, cloud servers, supercomputers) are known as classical computers. Those are based on bits, a unit of information that powers every computation that happens in the device.

In a classical computer, each bit can take on either a value of one or zero to represent and transmit the information that is used to carry out computations. Using bits, developers can write programs, which are sets of instructions that are read and executed by the computer.

Classical computers have been indispensable tools in the last few decades, but the inflexibility of bits is limiting. As an analogy, if tasked with looking for a needle in a haystack, a classical computer would have to be programmed to look through every single piece of hay straw until it reached the needle.

There are still many large problems, therefore, that classical devices can't solve. "There are calculations that could be done on a classical system, but they might take millions of years or use more computer memory than exists in total on Earth," says Sutor. "These problems are intractable today."

At the heart of any quantum computer are qubits, also known as quantum bits, which can loosely be compared to the bits that process information in classical computers.

Qubits, however, have very different properties to bits, because they are made of the quantum particles found in nature, those same particles that have been obsessing scientists for many years.

One of the properties of quantum particles that is most useful for quantum computing is known as superposition, which allows quantum particles to exist in several states at the same time. The best way to imagine superposition is to compare it to tossing a coin: instead of being heads or tails, quantum particles are the coin while it is still spinning.

By controlling quantum particles, researchers can load them with data to create qubits, and thanks to superposition, a single qubit doesn't have to be either a one or a zero, but can be both at the same time. In other words, while a classical bit can only be heads or tails, a qubit can be, at once, heads and tails.
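
To make the spinning-coin picture concrete, here is a minimal numerical sketch (ours, not the article's, assuming only Python with NumPy) of a qubit in equal superposition and what repeatedly measuring it looks like:

    import numpy as np

    # A qubit's state is a pair of complex amplitudes (alpha, beta) with
    # |alpha|^2 + |beta|^2 = 1. Equal amplitudes are the "spinning coin".
    state = np.array([1.0, 1.0]) / np.sqrt(2)

    # Born rule: a measurement yields 0 or 1 with probability |amplitude|^2.
    probs = np.abs(state) ** 2  # [0.5, 0.5]

    # 1,000 simulated measurements: roughly half "heads" (0), half "tails" (1).
    rng = np.random.default_rng(seed=0)
    outcomes = rng.choice([0, 1], size=1000, p=probs)
    print(np.bincount(outcomes))

Each measurement forces the coin to land: the result is a definite 0 or 1 and the superposition is gone, which is also why a qubit's extra information cannot simply be read out wholesale.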

This means that, when asked to solve a problem, a quantum computer can use qubits to run several calculations at once to find an answer, exploring many different avenues in parallel.

So in the needle-in-a-haystack scenario above, unlike a classical machine, a quantum computer could in principle browse through all hay straws at the same time, finding the needle in a matter of seconds rather than looking for years, even centuries, before it found what it was searching for.

What's more, qubits can be physically linked together thanks to another quantum property called entanglement, meaning that with every qubit that is added to a system, the device's capabilities increase exponentially, whereas adding more bits only generates a linear improvement.

Every time we use another qubit in a quantum computer, we double the amount of information and processing ability available for solving problems. So by the time we get to 275 qubits, we can compute with more pieces of information than there are atoms in the observable universe. And the compression of computing time that this could generate could have big implications in many use cases.
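
The arithmetic behind that 275-qubit figure is easy to check for yourself. This quick sketch (ours, not the article's) compares 2^275 with the commonly cited estimate of about 10^80 atoms in the observable universe:

    # 275 qubits span 2**275 basis states; the observable universe is
    # commonly estimated to contain about 10**80 atoms.
    print(f"{2**275:.2e}")  # about 6.07e+82
    print(2**275 > 10**80)  # True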

"There are a number of cases where time is money. Being able to do things more quickly will have a material impact in business," Scott Buchholz, managing director at Deloitte Consulting, tells ZDNet.

The gains in time that researchers are anticipating as a result of quantum computing are not of the order of hours or even days. We're rather talking about potentially being capable of calculating, in just a few minutes, the answer to problems that today's most powerful supercomputers couldn't resolve in thousands of years, ranging from modelling hurricanes all the way to cracking the cryptography keys protecting the most sensitive government secrets.

And businesses have a lot to gain, too. According to recent research by Boston Consulting Group (BCG), the advances that quantum computing will enable could create value of up to $850 billion in the next 15 to 30 years, $5 to $10 billion of which will be generated in the next five years if key vendors deliver on the technology as they have promised.

Programmers write problems in the form of algorithms for classical computers to resolve; similarly, quantum computers will carry out calculations based on quantum algorithms. Researchers have already identified that some quantum algorithms would be particularly suited to the enhanced capabilities of quantum computers.

For example, quantum systems could tackle optimization algorithms, which help identify the best solution among many feasible options, and could be applied in a wide range of scenarios ranging from supply chain administration to traffic management. ExxonMobil and IBM, for instance, are working together to find quantum algorithms that could one day manage the 50,000 merchant ships crossing the oceans each day to deliver goods, to reduce the distance and time traveled by fleets.

Quantum simulation algorithms are also expected to deliver unprecedented results, as qubits enable researchers to handle the simulation and prediction of complex interactions between molecules in larger systems, which could lead to faster breakthroughs in fields like materials science and drug discovery.

With quantum computers capable of handling and processing much larger datasets, AI and machine learning applications are set to benefit hugely, with faster training times and more capable algorithms. And researchers have also demonstrated that quantum algorithms have the potential to crack traditional cryptography keys, which for now are too mathematically difficult for classical computers to break.

To create qubits, which are the building blocks of quantum computers, scientists have to find and manipulate the smallest particles of nature, tiny parts of the universe that can be accessed through different mediums. This is why there are currently many types of quantum processors being developed by a range of companies.

One of the most advanced approaches consists of using superconducting qubits, which are made of electrons, and come in the form of the familiar chandelier-like quantum computers. Both IBM and Google have developed superconducting processors.

Another approach that is gaining momentum is trapped ions, which Honeywell and IonQ are leading the way on, and in which qubits are housed in arrays of ions that are trapped in electric fields and then controlled with lasers.

Major companies like Xanadu and PsiQuantum, for their part, are investing in yet another method that relies on quantum particles of light, called photons, to encode data and create qubits. Qubits can also be created out of silicon spin qubits (which Intel is focusing on), as well as cold atoms or even diamonds.

Quantum annealing, an approach that was chosen by D-Wave, is a different category of computing altogether. It doesn't rely on the same paradigm as other quantum processors, known as the gate model. Quantum annealing processors are much easier to control and operate, which is why D-Wave has already developed devices that can manipulate thousands of qubits, whereas virtually every other quantum hardware company is working with about 100 qubits or fewer. On the other hand, the annealing approach is only suitable for a specific set of optimization problems, which limits its capabilities.
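
For a feel of the kind of problem annealers target, such optimization tasks are commonly phrased as a QUBO: minimizing a quadratic function of binary variables. The toy sketch below (our illustration, with made-up coefficients, brute-forced classically rather than through D-Wave's actual tooling) finds the cheapest assignment of two binary variables:

    from itertools import product

    # QUBO: minimize the sum of Q[i, j] * x[i] * x[j] over binary x.
    # These coefficients reward picking either variable but penalize both.
    Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}

    def energy(x):
        return sum(c * x[i] * x[j] for (i, j), c in Q.items())

    best = min(product([0, 1], repeat=2), key=energy)
    print(best, energy(best))  # (0, 1) with energy -1.0

Brute force like this doubles in cost with every extra variable; an annealer explores the same energy landscape physically, which is where the hardware is meant to help.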

What can you do with a quantum computer today?

Right now, with a mere 100 qubits being the state of the art, there is very little that can actually be done with quantum computers. For qubits to start carrying out meaningful calculations, they will have to be counted in the thousands, and even millions.

"While there is a tremendous amount of promise and excitement about what quantum computers can do one day, I think what they can do today is relatively underwhelming," says Buchholz.

Increasing the qubit count in gate-model processors, however, is incredibly challenging. This is because keeping the particles that make up qubits in their quantum state is difficult, a little bit like trying to keep a coin spinning without falling on one side or the other, except much harder.

Keeping qubits spinning requires isolating them from any environmental disturbance that might cause them to lose their quantum state. Google and IBM, for example, do this by placing their superconducting processors in temperatures that are colder than outer space, which in turn require sophisticated cryogenic technologies that are currently near-impossible to scale up.

In addition, the instability of qubits means that they are unreliable, and still likely to cause computation errors. This has given rise to a branch of quantum computing dedicated to developing error-correction methods.

Although research is advancing at pace, quantum computers are for now stuck in what is known as the NISQ era: noisy, intermediate-scale quantum computing. The end goal, though, is to build a fault-tolerant, universal quantum computer.

As Buchholz explains, it is hard to tell when this is likely to happen. "I would guess we are a handful of years from production use cases, but the real challenge is that this is a little like trying to predict research breakthroughs," he says. "It's hard to put a timeline on genius."

In 2019, Google claimed that its 54-qubit superconducting processor called Sycamore had achieved quantum supremacy: the point at which a quantum computer can solve a computational task that is impossible to run on a classical device in any realistic amount of time.

Google said that Sycamore had calculated, in only 200 seconds, the answer to a problem that would have taken the world's biggest supercomputers 10,000 years to complete.

More recently, researchers from the University of Science and Technology of China claimed a similar breakthrough, saying that their quantum processor had taken 200 seconds to achieve a task that would have taken 600 million years to complete with classical devices.

This is far from saying that either of those quantum computers are now capable of outstripping any classical computer at any task. In both cases, the devices were programmed to run very specific problems, with little usefulness aside from proving that they could compute the task significantly faster than classical systems.

Without a higher qubit count and better error correction, proving quantum supremacy for useful problems is still some way off.

Organizations that are investing in quantum resources see this as the preparation stage: their scientists are doing the groundwork to be ready for the day that a universal and fault-tolerant quantum computer is ready.

In practice, this means that they are trying to discover the quantum algorithms that are most likely to show an advantage over classical algorithms once they can be run on large-scale quantum systems. To do so, researchers typically try to prove that quantum algorithms perform comparably to classical ones on very small use cases, and theorize that as quantum hardware improves, and the size of the problem can be grown, the quantum approach will inevitably show some significant speed-ups.

For example, scientists at Japanese steel manufacturer Nippon Steel recently came up with a quantum optimization algorithm that could compete against its classical counterpart for a small problem that was run on a 10-qubit quantum computer. In principle, this means that the same algorithm equipped with thousands or millions of error-corrected qubits could eventually optimize the company's entire supply chain, complete with the management of dozens of raw materials, processes and tight deadlines, generating huge cost savings.

The work that quantum scientists are carrying out for businesses is therefore highly experimental, and so far there are fewer than 100 quantum algorithms that have been shown to compete against their classical equivalents which only points to how emergent the field still is.

With most use cases requiring a fully error-corrected quantum computer, just who will deliver one first is the question on everyone's lips in the quantum industry, and it is impossible to know the exact answer.

All quantum hardware companies are keen to stress that their approach will be the first one to crack the quantum revolution, making it even harder to discern noise from reality. "The challenge at the moment is that it's like looking at a group of toddlers in a playground and trying to figure out which one of them is going to win the Nobel Prize," says Buchholz.

"I have seen the smartest people in the field say they're not really sure which one of these is the right answer. There are more than half a dozen different competing technologies and it's still not clear which one will wind up being the best, or if there will be a best one," he continues.

In general, experts agree that the technology will not reach its full potential until after 2030. The next five years, however, may start bringing some early use cases as error correction improves and qubit counts start reaching numbers that allow for small problems to be programmed.

IBM is one of the rare companies that has committed to a specific quantum roadmap, which defines the ultimate objective of realizing a million-qubit quantum computer. In the nearer term, Big Blue anticipates that it will release a 1,121-qubit system in 2023, which might mark the start of the first experimentations with real-world use cases.

Developing quantum hardware is a huge part of the challenge, and arguably the most significant bottleneck in the ecosystem. But even a universal fault-tolerant quantum computer would be of little use without the matching quantum software.

"Of course, none of these online facilities are much use without knowing how to 'speak' quantum," Andrew Fearnside, senior associate specializing in quantum technologies at intellectual property firm Mewburn Ellis, tells ZDNet.

Creating quantum algorithms is not as easy as taking a classical algorithm and adapting it to the quantum world. Quantum computing, rather, requires a brand-new programming paradigm that can only be run on a brand-new software stack.

Of course, some hardware providers also develop software tools, the most established of which is IBM's open-source quantum software development kit Qiskit. But on top of that, the quantum ecosystem is expanding to include companies dedicated exclusively to creating quantum software. Familiar names include Zapata, QC Ware and 1QBit, which all specialize in providing businesses with the tools to understand the language of quantum.
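
As a flavor of what programming against such a stack looks like, here is a minimal sketch using Qiskit (assuming a recent Qiskit installation; the Bell-state circuit is the standard textbook example, not any vendor's production code):

    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    # Hadamard puts qubit 0 into superposition; CNOT entangles it with qubit 1.
    qc = QuantumCircuit(2)
    qc.h(0)
    qc.cx(0, 1)

    # The resulting Bell state has amplitude only on |00> and |11>.
    print(Statevector.from_instruction(qc))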

And increasingly, promising partnerships are forming to bring together different parts of the ecosystem. For example, the recent alliance between Honeywell, which is building trapped-ion quantum computers, and quantum software company Cambridge Quantum Computing (CQC), has got analysts predicting that a new player could be taking a lead in the quantum race.

The complexity of building a quantum computer (think ultra-high vacuum chambers, cryogenic control systems and other exotic quantum instruments) means that the vast majority of quantum systems are currently firmly sitting in lab environments, rather than being sent out to customers' data centers.

To let users access the devices to start running their experiments, therefore, quantum companies have launched commercial quantum computing cloud services, making the technology accessible to a wider range of customers.

The four largest providers of public cloud computing services currently offer access to quantum computers on their platform. IBM and Google have both put their own quantum processors on the cloud, while Microsoft's Azure Quantum and AWS's Braket service let customers access computers from third-party quantum hardware providers.

The jury remains out on which technology will win the race, if any at all, but one thing is for certain: the quantum computing industry is developing fast, and investors are generously funding the ecosystem. Equity investments in quantum computing nearly tripled in 2020, and according to BCG, they are set to rise even more in 2021 to reach $800 million.

Government investment is even more significant: the US has unlocked $1.2 billion for quantum information science over the next five years, while the EU announced a €1 billion ($1.20 billion) quantum flagship. The UK also recently reached the £1 billion ($1.37 billion) budget milestone for quantum technologies, and while official numbers are not known in China, the government has made no secret of its desire to aggressively compete in the quantum race.

This has caused the quantum ecosystem to flourish over the past years, with new start-ups increasing from a handful in 2013 to nearly 200 in 2020. The appeal of quantum computing is also increasing among potential customers: according to analysis firm Gartner, while only 1% of companies were budgeting for quantum in 2018, 20% are expected to do so by 2023.

Although not all businesses need to be preparing themselves to keep up with quantum-ready competitors, there are some industries where quantum algorithms are expected to generate huge value, and where leading companies are already getting ready.

Goldman Sachs and JP Morgan are two examples of financial behemoths investing in quantum computing. That's because in banking, quantum optimization algorithms could give a boost to portfolio optimization, by better picking which stocks to buy and sell for maximum return.

In pharmaceuticals, where the drug discovery process is on average a $2 billion, ten-year-long deal that largely relies on trial and error, quantum simulation algorithms are also expected to make waves. This is also the case in materials science: companies like OTI Lumionics, for example, are exploring the use of quantum computers to design more efficient OLED displays.

Leading automotive companies including Volkswagen and BMW are also keeping a close eye on the technology, which could impact the sector in various ways, ranging from designing more efficient batteries to optimizing the supply chain, through to better management of traffic and mobility. Volkswagen, for example, pioneered the use of a quantum algorithm that optimized bus routes in real time by dodging traffic bottlenecks.

As the technology matures, however, it is unlikely that quantum computing will be limited to a select few. Rather, analysts anticipate that virtually all industries have the potential to benefit from the computational speedup that qubits will unlock.

Quantum computers are expected to be phenomenal at solving a certain class of problems, but that doesn't mean that they will be a better tool than classical computers for every single application. Particularly, quantum systems aren't a good fit for fundamental computations like arithmetic, or for executing commands.

"Quantum computers are great constraint optimizers, but that's not what you need to run Microsoft Excel or Office," says Buchholz. "That's what classical technology is for: for doing lots of maths, calculations and sequential operations."

In other words, there will always be a place for the way that we compute today. It is unlikely, for example, that you will be streaming a Netflix series on a quantum computer anytime soon. Rather, the two technologies will be used in conjunction, with quantum computers being called for only where they can dramatically accelerate a specific calculation.

Buchholz predicts that, as classical and quantum computing start working alongside each other, access will look like a configuration option. Data scientists currently have a choice of using CPUs or GPUs when running their workloads, and it might be that quantum processing units (QPUs) join the list at some point. It will be up to researchers to decide which configuration to choose, based on the nature of their computation.

Although the precise way that users will access quantum computing in the future remains to be defined, one thing is certain: they are unlikely to be required to understand the fundamental laws of quantum computing in order to use the technology.

"People get confused because the way we lead into quantum computing is by talking about technical details," says Buchholz. "But you don't need to understand how your cellphone works to use it."

"People sometimes forget that when you log into a server somewhere, you have no idea what physical location the server is in or even if it exists physically at all anymore. The important question really becomes what it is going to look like to access it."

And as fascinating as qubits, superposition, entanglement and other quantum phenomena might be, for most of us this will come as welcome news.

View post:

What is quantum computing? Everything you need to know about the strange world of quantum computers - ZDNet

Read the Rest...

IBM’s newest quantum computer is now up-and-running: Here’s what it’s going to be used for – ZDNet

§ July 30th, 2021 § Filed under Quantum Computer Comments Off on IBM’s newest quantum computer is now up-and-running: Here’s what it’s going to be used for – ZDNet

IBM has unveiled a brand-new quantum computer in Japan, thousands of miles away from the company's quantum computation center in Poughkeepsie, New York, in another step towards bringing quantum technologies out of Big Blue's labs and directly to partners around the world.

A Quantum System One, IBM's flagship integrated superconducting quantum computer, is now available on-premises in the Kawasaki Business Incubation Center in Kawasaki City, for Japanese researchers to run their quantum experiments in fields ranging from chemistry to finance.

Most customers to date can only access IBM's System One over the cloud, by connecting to the company's quantum computation center in Poughkeepsie.

Recently, the company unveiled the very first quantum computer that was physically built outside of the computation center's data centers, when the Fraunhofer Institute in Germany acquired a System One. The system that has now been deployed to Japan is therefore IBM's second quantum computer that is located outside of the US.

The announcement comes as part of a long-standing relationship with Japanese organizations. In 2019, IBM and the University of Tokyo inaugurated the Japan-IBM Quantum Partnership, a national agreement inviting universities and businesses across the country to engage in quantum research. It was agreed then that a Quantum System One would eventually be installed at an IBM facility in Japan.

Building on the partnership, Big Blue and the University of Tokyo launched the Quantum Innovation Initiative Consortium last year to further bring together organizations working in the field of quantum. With this, the Japanese government has made it clear that it is keen to be at the forefront of the promising developments that quantum technologies are expected to bring about.

Leveraging some physical properties that are specific to quantum mechanics, quantum computers could one day be capable of carrying out calculations that are impossible to run on the devices that are used today, known as classical computers.

In some industries, this could have big implications; and as part of the consortium, together with IBM researchers, some Japanese companies have already identified promising use cases. Mitsubishi Chemical's research team, for example, has developed quantum algorithms capable of understanding the complex behavior of industrial chemical compounds with the goal of improving OLED displays.

A recent research paper published by the scientists highlighted the potential of quantum computers when it comes to predicting the properties of OLED materials, which could eventually lead to more efficient displays with lower power consumption.

Similarly, researchers from Mizuho Financial Group and Mitsubishi Financial Group have been developing quantum algorithms that could speed up financial operations like Monte Carlo simulations, which could allow for optimized portfolio management thanks to better risk analysis and option pricing.
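
For context, the sketch below (ours, with illustrative parameters) shows a bare-bones classical Monte Carlo pricing of a European call option, the kind of sampling-heavy workload in question; quantum amplitude-estimation algorithms aim to reach comparable accuracy with quadratically fewer samples:

    import numpy as np

    # Illustrative inputs: spot price, strike, risk-free rate, volatility,
    # time to maturity in years, and number of simulated paths.
    s0, k, r, sigma, t, n = 100.0, 105.0, 0.01, 0.2, 1.0, 100_000

    rng = np.random.default_rng(1)
    z = rng.standard_normal(n)
    # Terminal prices under geometric Brownian motion.
    s_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    # The discounted average payoff estimates the option price.
    price = np.exp(-r * t) * np.maximum(s_t - k, 0.0).mean()
    print(f"Estimated call price: {price:.2f}")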

With access to IBM's Quantum System One, research in those fields is now expected to accelerate. But other industry leaders exploring quantum technologies as part of the partnership extend from Sony to Toyota, through Hitachi, Toshiba or JSR.

Quantum computing is still in its very early stages, and it is not yet possible to use quantum computers to perform computations that are of any value to a business. Rather, scientists are currently carrying out proofs-of-concept, by attempting to identify promising applications and testing them at a very small scale, to be prepared for the moment that the hardware is fully ready.

This is still some way off. Building and controlling the components of quantum computers is a huge challenge, which has so far been limited to the confines of specialist laboratories such as IBM's Poughkeepsie computation center.

It is significant, therefore, that IBM's Quantum System One is now mature enough to be deployed outside of the company's lab.

"Thousands of meticulously engineered components have to work together flawlessly in extreme temperatures within astonishing tolerances," said IBM in a blog post.

Back in the US, too, quantum customers are showing interest in building quantum hardware in their own facilities. The Cleveland Clinic, for example, recently invested $500 million for Big Blue to build quantum hardware on-premises.

Continued here:

IBM's newest quantum computer is now up-and-running: Here's what it's going to be used for - ZDNet

Read the Rest...

Quantum Cash and the End of Counterfeiting – IEEE Spectrum

§ July 30th, 2021 § Filed under Quantum Computer Comments Off on Quantum Cash and the End of Counterfeiting – IEEE Spectrum

Illustration: Emily Cooper

Since the invention of paper money, counterfeiters have churned out fake bills. Some of their handiwork, created with high-tech inks, papers, and printing presses, is so good that it's very difficult to distinguish from the real thing. National banks combat the counterfeiters with difficult-to-copy watermarks, holograms, and other sophisticated measures. But to give money the ultimate protection, some quantum physicists are turning to the weird quirks that govern nature's fundamental particles.

At the moment, the idea of quantum money is very much on the drawing board. That hasn't stopped researchers from pondering what encryption schemes they might apply for it, or from wondering how the technologies used to create quantum states could be shrunk down to the point of fitting it in your wallet, says Scott Aaronson, an MIT computer scientist who works on quantum money. "This is science fiction, but it's science fiction that doesn't violate any of the known laws of physics."

The laws that govern subatomic particles differ dramatically from those governing everyday experience. The relevant quantum law here is the no-cloning theorem, which says it is impossible to copy a quantum particle's state exactly. That's because reproducing a particle's state involves making measurements, and the measurements change the particle's overall properties. In certain cases, where you already know something about the state in question, quantum mechanics does allow you to measure one attribute of a particle. But in doing so you've made it impossible to measure the particle's other attributes.

This rule implies that if you use money that is somehow linked to a quantum particle, you could, in principle, make it impossible to copy: It would be counterfeit-proof.

The visionary physicist Stephen Wiesner came up with the idea of quantum money in 1969. He suggested that banks somehow insert a hundred or so photons, the quantum particles of light, into each banknote. He didn't have any clear idea of how to do that, nor do physicists today, but never mind. It's still an intriguing notion, because the issuing bank could then create a kind of minuscule secret watermark by polarizing the photons in a special way.

To validate the note later, the bank would check just one attribute of each photon (for example, its vertical or horizontal polarization), leaving all other attributes unmeasured. The bank could then verify the note's authenticity by checking its records for how the photons were set originally for this particular bill, which the bank could look up using the bill's printed serial number.

Thanks to the no-cloning theorem, a counterfeiter couldn't measure all the attributes of each photon to produce a copy. Nor could he just measure the one attribute that mattered for each photon, because only the bank would know which attributes those were.
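
The logic of Wiesner's scheme is easy to mimic classically. In the toy sketch below (our illustration; it ignores the disturbance a forger's measurements would also inflict on the original note), each photon is a (basis, bit) pair. A forger who must guess bases passes the bank's per-photon check only three times out of four, so a 100-photon forgery survives with probability (3/4)^100, roughly 3 in 10 trillion:

    import random

    N = 100  # photons per banknote

    def mint(serial, ledger):
        """The bank prepares each photon in a random basis with a random bit
        and secretly records the settings against the serial number."""
        note = [(random.choice('+x'), random.randint(0, 1)) for _ in range(N)]
        ledger[serial] = list(note)
        return note

    def measure(photon, basis):
        """Measuring in the preparation basis is faithful; the wrong basis
        gives a 50/50 random outcome (the no-cloning obstacle in miniature)."""
        prep_basis, bit = photon
        return bit if basis == prep_basis else random.randint(0, 1)

    def verify(serial, note, ledger):
        """The bank re-measures every photon in the basis it recorded."""
        return all(measure(photon, basis) == bit
                   for photon, (basis, bit) in zip(note, ledger[serial]))

    def forge(note):
        """A counterfeiter must guess a basis per photon before copying."""
        return [(g, measure(photon, g))
                for photon, g in zip(note, (random.choice('+x') for _ in note))]

    ledger = {}
    genuine = mint('SN-0001', ledger)
    print(verify('SN-0001', genuine, ledger))         # True
    print(verify('SN-0001', forge(genuine), ledger))  # almost surely False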

But beyond the daunting engineering challenge of storing photons, or any other quantum particles, there's another basic problem with this scheme: it's a private encryption. Only the issuing bank could validate the notes. "The ideal is quantum money that anyone can verify," Aaronson says, just the way every store clerk in the United States can hold a $20 bill up to the light to look for the embedded plastic strip.

That would require some form of public encryption, and every such scheme researchers have created so far is potentially crackable. But it's still worth exploring how that might work. Verification between two people would involve some kind of black box: a machine that checks the status of a piece of quantum money and spits out only the answer "valid" or "invalid". Most of the proposed public-verification schemes are built on some sort of mathematical relationship between a bank note's quantum states and its serial number, so the verification machine would use an algorithm to check the math. This verifier, and the algorithm it follows, must be designed so that even if they were to fall into the hands of a counterfeiter, he couldn't use them to create fakes.

As fast as quantum money researchers have proposed encryption schemes, their colleagues have cracked them, but it's clear that everyone's having a great deal of fun. Most recently, Aaronson and his MIT collaborator Paul Christiano put forth a proposal [PDF] in which each banknote's serial number is linked to a large number of quantum particles, which are bound together using a quantum trick known as entanglement.

All of this is pie in the sky, of course, until engineers can create physical systems capable of retaining quantum states within money, and that will perhaps be the biggest challenge of all. Running a quantum economy would require people to hold information encoded in the polarization of photons or the spin of electrons, say, for as long as they required cash to sit in their pockets. But quantum states are notoriously fragile: they decohere and lose their quantum properties after frustratingly short intervals of time. "You'd have to prevent it from decohering in your wallet," Aaronson says.

For many researchers, that makes quantum money even more remote than useful quantum computers. "At present, it's hard to imagine having practical quantum money before having a large-scale quantum computer," says Michele Mosca of the Institute for Quantum Computing at the University of Waterloo, in Canada. And these superfast computers must also overcome the decoherence problem before they become feasible.

If engineers ever do succeed in building practical quantum computers, ones that can send information through fiber-optic networks in the form of encoded photons, quantum money might really have its day. On this quantum Internet, financial transactions would not only be secure, they would be so ephemeral that once the photons had been measured, there would be no trace of their existence. In today's age of digital cash, we have already relieved ourselves of the age-old burden of carrying around heavy metal coins or even wads of banknotes. With quantum money, our pockets and purses might finally be truly empty.

Michael Brooks, a British science journalist, holds a Ph.D. in quantum physics from the University of Sussex, which prepared him well to tackle the article "Quantum Cash and the End of Counterfeiting". He says he found the topic of quantum money "absolutely fascinating", and adds, "I just hope I get to use some in my lifetime." He is the author, most recently, of Free Radicals: The Secret Anarchy of Science (Profile Books, 2011).

Read the original:

Quantum Cash and the End of Counterfeiting - IEEE Spectrum

Read the Rest...

Will the NSA Finally Build Its Superconducting Spy Computer? – IEEE Spectrum

§ July 30th, 2021 § Filed under Quantum Computer Comments Off on Will the NSA Finally Build Its Superconducting Spy Computer? – IEEE Spectrum

Today, silicon microchips underlie every aspect of digital computing. But their dominance was never a foregone conclusion. Throughout the 1950s, electrical engineers and other researchers explored many alternatives to making digital computers.

One of them seized the imagination of the U.S. National Security Agency (NSA): a superconducting supercomputer. Such a machine would take advantage of superconducting materials that, when chilled to nearly the temperature of deep space, just a few degrees above absolute zero, exhibit no electrical resistance whatsoever. This extraordinary property held the promise of computers that could crunch numbers and crack codes faster than transistor-based systems while consuming far less power.

For six decades, from the mid-1950s to the present, the NSA has repeatedly pursued this dream, in partnership with industrial and academic researchers. Time and again, the agency sponsored significant projects to build a superconducting computer. Each time, the effort was abandoned in the face of the unrelenting pace of Moore's Law and the astonishing increase in performance and decrease in cost of silicon microchips.

Now Moore's Law is stuttering, and the world's supercomputer builders are confronting an energy crisis. Nuclear weapon simulators, cryptographers, and others want exascale supercomputers, capable of 1,000 petaflops (1 million trillion floating-point operations per second) or greater. The world's fastest known supercomputer today, China's 34-petaflop Tianhe-2, consumes some 18 megawatts of power. That's roughly the amount of electricity drawn instantaneously by 14,000 average U.S. households. Projections vary depending on the type of computer architecture used, but an exascale machine built with today's best silicon microchips could require hundreds of megawatts.

The exascale push may be superconducting computing's opening. And the Intelligence Advanced Research Projects Activity (IARPA), the U.S. intelligence community's arm for high-risk R&D, is making the most of it. With new forms of superconducting logic and memory in development, IARPA has launched an ambitious program to create the fundamental building blocks of a superconducting supercomputer. In the next few years, the effort could finally show whether the technology really can beat silicon when given the chance.

Cold Calling: In the 1950s, Dudley Buck envisioned speedy, energy-efficient computers. These would be driven by his superconducting switch, the cryotron. Photo: Gjon Mili/The LIFE Picture Collection/Getty Images

The NSA's dream of superconducting supercomputers was first inspired by the electrical engineer Dudley Buck. Buck worked for the agency's immediate predecessor on an early digital computer. When he moved to MIT in 1950, he remained a military consultant, keeping the Armed Forces Security Agency, which quickly became the NSA, abreast of new computing developments in Cambridge.

Buck soon reported on his own work: a novel superconducting switch he named the cryotron. The device works by switching a material between its superconducting state, where electrons couple up and flow as a supercurrent with no resistance at all, and its normal state, where electrons flow with some resistance. A number of superconducting metallic elements and alloys reach that state when they are cooled below a critical temperature near absolute zero. Once the material becomes superconducting, a sufficiently strong magnetic field can drive the material back to its normal state.

In this, Buck saw a digital switch. He coiled a tiny control wire around a gate wire, and plunged the pair into liquid helium. When current ran through the control, the magnetic field it created pushed the superconducting gate into its normal resistive state. When the control current was turned off, the gate became superconducting again.

Buck thought miniature cryotrons could be used to fashion powerful, fast, and energy-efficient digital computers. The NSA funded work by him and engineer Albert Slade on cryotron memory circuits at the firm A.D. Little, as well as a broader project on digital cryotron circuitry at IBM. Quickly, GE, RCA, and others launched their own cryotron efforts.

Engineers continued developing cryotron circuits into the early 1960s, despite Buck's sudden and premature death in 1959. But liquid-helium temperatures made cryotrons challenging to work with, and the time required for materials to transition from a superconducting to a resistive state limited switching speeds. The NSA eventually pulled back on funding, and many researchers abandoned superconducting electronics for silicon.

Even as these efforts faded, a big change was under way. In 1962 British physicist Brian Josephson made a provocative prediction about quantum tunneling in superconductors. In typical quantum-mechanical tunneling, electrons sneak across an insulating barrier, assisted by a voltage push; the electrons' progress occurs with some resistance. But Josephson predicted that if the insulating barrier between two superconductors is thin enough, a supercurrent of paired electrons could flow across with zero resistance, as if the barrier were not there at all. This became known as the Josephson effect, and a switch based on the effect, the Josephson junction, soon followed.

Junction Exploration: 1970s-era Josephson circuitry. Image: IBM

IBM researchers developed a version of this switch in the mid-1960s. The active part of the device was a line of superconducting metal, interrupted by a thin oxide barrier cutting across it. A supercurrent would freely tunnel across the barrier, but only up to a point; if the current rose above a certain threshold, the device would saturate and unpaired electrons would trickle across the junction with some resistance. The threshold could be tuned by a magnetic field, created by running current through a nearby superconducting control line. If the device operated close to the threshold current, a small current in the control could shift the threshold and switch the gate out of its supercurrent-tunneling state. Unlike in Buck's cryotron, the materials in this device always remained superconducting, making it a much faster electronic switch.

As explored by historian Cyrus Mody, by 1973 IBM was working on building a superconducting supercomputer based on Josephson junctions. The basic building block of its circuits was a superconducting loop with Josephson junctions in it, known as a superconducting quantum interference device, or SQUID. The NSA covered a substantial fraction of the costs, and IBM expected the agency to be its first superconducting-supercomputer customer, with other government and industry buyers to follow.

IBM's superconducting supercomputer program ran for more than 10 years, at a cost of about US $250 million in today's dollars. It mainly pursued Josephson junctions made from lead alloy and lead oxide. Late in the project, engineers switched to a niobium oxide barrier, sandwiched between a lead alloy and a niobium film, an arrangement that produced more-reliable devices. But while the project made great strides, company executives were not convinced that an eventual supercomputer based on the technology could compete with the ones expected to emerge with advanced silicon microchips. In 1983, IBM shut down the program without ever finishing a Josephson-junction-based computer, super or otherwise.

Japan persisted where IBM had not. Inspired by IBM's project, Japan's industrial ministry, MITI, launched a superconducting computer effort in 1981. The research partnership, which included Fujitsu, Hitachi, and NEC, lasted for eight years and produced an actual working Josephson-junction computer, the ETL-JC1. It was a tiny, 4-bit machine, with just 1,000 bits of RAM, but it could actually run a program. In the end, however, MITI came to share IBM's opinion about the prospect of scaling up the technology, and the project was abandoned.

Critical new developments emerged outside these larger superconducting-computer programs. In 1983, Bell Telephone Laboratories researchers formed Josephson junctions out of niobium separated by thin aluminum oxide layers. The new superconducting switches were extraordinarily reliable and could be fabricated using a simplified patterning process much in the same way silicon microchips were.

On The Move: Magnetic flux ejected from a superconducting loop through a Josephson junction can take the form of tiny voltage pulses. The presence or absence of a pulse in a given period of time can be used to perform computations. Image: Hypres

Then in 1985, researchers at Moscow State University proposed [PDF] a new kind of digital superconducting logic. Originally dubbed "resistive," then renamed rapid single-flux-quantum logic, or RSFQ, it took advantage of the fact that a Josephson junction in a loop of superconducting material can emit minuscule voltage pulses. Integrated over time, each pulse carries a quantized area: an integer multiple of a tiny value called the flux quantum, measured in microvolt-picoseconds.
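
For concreteness, the quantization works like this: the time integral of each pulse's voltage is pinned to an integer multiple of the magnetic flux quantum,

$$\int V(t)\,dt = n\,\Phi_0, \qquad \Phi_0 = \frac{h}{2e} \approx 2.07\times10^{-15}\ \mathrm{V{\cdot}s} \approx 2.07\ \mathrm{mV{\cdot}ps},$$

which is why a single-flux-quantum pulse lasting about a picosecond has an amplitude of only a couple of millivolts.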

By using such ephemeral voltage pulses, each lasting a picosecond or so, RSFQ promised to boost clock speeds to greater than 100 gigahertz. What's more, a Josephson junction in such a configuration would expend energy in the range of just a millionth of a picojoule, considerably less than that consumed by today's silicon transistors.

Together, Bell Labs' Josephson junctions and Moscow State University's RSFQ rekindled interest in superconducting electronics. By 1997, the U.S. had launched the Hybrid Technology Multi-Threaded (HTMT) project, which was supported by the National Science Foundation, the NSA, and other agencies. HTMT's goal was to beat conventional silicon to petaflop-level supercomputing, using RSFQ integrated circuits among other technologies.

It was an ambitious program that faced a number of challenges. The RSFQ circuits themselves limited potential computing efficiency. To achieve tremendous speed, RSFQ used resistors to provide electrical biases to the Josephson junctions in order to keep them close to the switching threshold. In experimental RSFQ circuitry with several thousand biased Josephson junctions, the static power dissipation was negligible. But in a petaflop-scale supercomputer, with possibly many billions of such devices, it would have added up to significant power consumption.

The HTMT project ended in 2000. Eight years later, a conventional silicon supercomputer, IBM's Roadrunner, was touted as the first to reach petaflop operation. It contained nearly 20,000 silicon microprocessors and consumed 2.3 megawatts.

For many researchers working on superconducting electronics, the period around 2000 marked a shift to an entirely different direction: quantum computing. This new direction was inspired by the 1994 work of mathematician Peter Shor, then at Bell Labs, which suggested that a quantum computer could be a powerful cryptanalytical tool, able to rapidly decipher encrypted communications. Soon, projects in superconducting quantum computing and superconducting digital circuitry were being sponsored by the NSA and the U.S. Defense Advanced Research Projects Agency. They were later joined by IARPA, which was created in 2006 by the Office of the Director of National Intelligence to sponsor intelligence-related R&D programs, collaborating across a community that includes the NSA, the Central Intelligence Agency, and the National Geospatial-Intelligence Agency.

Single-Flux Quantum: Current in a superconducting loop containing a Josephson junction (a nonsuperconducting barrier) generates a magnetic field with a tiny, quantized value.

Nobody knew how to build a quantum computer, of course, but lots of people had ideas. At IBM and elsewhere, engineers and scientists turned to the mainstays of superconducting electronics, SQUIDs and Josephson junctions, to craft the building blocks. A SQUID exhibits quantum effects under normal operation, and it was fairly straightforward to configure it to operate as a quantum bit, or qubit.

One of the centers of this work was the NSA's Laboratory for Physical Sciences. Built near the University of Maryland, College Park, outside the fence of NSA headquarters in Fort Meade, the laboratory is a space where the NSA and outside researchers can collaborate on work relevant to the agency's insatiable thirst for computing power.

In the early 2010s, Marc Manheimer was head of quantum computing at the laboratory. As he recently recalled in an interview, he saw an acute need for conventional digital circuits that could physically surround quantum bits in order to control them and correct errors on very short timescales. The easiest way to do this, he thought, would be with superconducting computer elements, which could operate with voltage and current levels that were similar to those of the qubit circuitry they would be controlling. Optical links could be used to connect this cooled-down, hybrid system to the outside worldand to conventional silicon computers.

At the same time, Manheimer says, he became aware of the growing power problem in high-performance silicon computing, for supercomputers as well as the large banks of servers in commercial data centers. "The closer I looked at superconducting logic," he says, "the more it became clear that it had value for supercomputing in its own right."

Manheimer proposed a new direct attack on the superconducting supercomputer. Initially, he encountered skepticism. "There's this history of failure," he says. Past pursuers of superconducting supercomputers had gotten burned, so people were very cautious. But by early 2013, he says, he had convinced IARPA to fund a multisite industrial and academic R&D program, dubbed the Cryogenic Computing Complexity (C3) program. He moved to IARPA to lead it.

The first phase of C3 (its budget is not public) calls for the creation and evaluation of superconducting logic circuits and memory systems. These will be fabricated at MIT Lincoln Laboratory, the same lab where Dudley Buck once worked.

Manheimer says one thing that helped sell his C3 idea was recent progress in the field, which is reflected in IARPA's selection of performers, publicly disclosed in December 2014.

One of those teams is led by the defense giant Northrop Grumman Corp. The company participated in the late-1990s HTMT project, which employed fairly power-hungry RSFQ logic. In 2011, Northrop Grumman's Quentin Herr reported an exciting alternative, a different form of single-flux-quantum logic called reciprocal quantum logic. RQL replaces RSFQ's DC resistors with AC inductors, which bias the circuit without constantly drawing power. An RQL circuit, says Northrop Grumman team leader Marc Sherwin, consumes 1/100,000 the power of the best equivalent CMOS circuit and far less power than the equivalent RSFQ circuit.

A similarly energy-efficient logic called ERSFQ has been developed by superconducting electronics manufacturer Hypres, whose CTO, Oleg Mukhanov, is the coinventor of RSFQ. Hypres is working with IBM, which continued its fundamental superconducting device work even after canceling its Josephson-junction supercomputer project and was also chosen to work on logic for the program.

Hypres is also collaborating with a C3 team led by a Raytheon BBN Technologies laboratory that has been active in quantum computing research for several years. There, physicist Thomas Ohki and colleagues have been working on a cryogenic memory system that uses low-power superconducting logic to control, read, and write to high-density, low-power magnetoresistive RAM. This sort of memory is another change for superconducting computing. RSFQ memory cells were fairly large. Today's more compact nanomagnetic memories, originally developed to help extend Moore's Law, can also work well at low temperatures.

The world's most advanced superconducting circuitry uses devices based on niobium. Although such devices operate at temperatures of about 4 kelvins, or 4 degrees above absolute zero, Manheimer says supplying the refrigeration is now "a trivial matter." That's thanks in large part to the multibillion-dollar industry based on magnetic resonance imaging machines, which rely on superconducting electromagnets and high-quality cryogenic refrigerators.

One big question has been how much the energy needed for cooling will increase a superconducting computer's energy budget. But advocates suggest it might not be much. The power drawn by commercial cryocoolers "leaves considerable room for improvement," Elie Track and Alan Kadin of the IEEE's Rebooting Computing initiative recently wrote. Even so, they say, the power dissipated in a superconducting computer is so small that it remains 100 times more efficient than a comparable silicon computer, even after taking into account the present inefficient cryocooler.

For now, C3's focus is on the fundamental components. This first phase, which will run through 2017, aims to demonstrate core components of a computer system: a set of key 64-bit logic circuits capable of running at a 10-GHz clock rate and cryogenic memory arrays with capacities up to about 250 megabytes. If this effort is successful, a second, two-year phase will integrate these components into a working cryogenic computer of as-yet-unspecified size. If that prototype is deemed promising, Manheimer estimates it should be possible to create a true superconducting supercomputer in another 5 to 10 years.

Go For Power: Performance demands power. Today's most powerful supercomputers consume multiple megawatts (red), not including cooling. Superconducting computers, cryocoolers included, are projected to dramatically drop those power requirements (blue). Source: IEEE Transactions on Applied Superconductivity, vol. 23, #1701610; Marc Manheimer

Such a system would be much smaller than CMOS-based supercomputers and require far less power. Manheimer projects that a superconducting supercomputer produced in a follow-up to C3 could run at 100 petaflops and consume 200 kilowatts, including the cryocooling. It would be 1/20 the size of Titan, currently the fastest supercomputer in the United States, but deliver more than five times the performance for 1/40 of the power.
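
Those ratios are easy to sanity-check against Titan's published figures (roughly 17.6 petaflops sustained at about 8.2 megawatts; values approximate):

    # Rough check of Manheimer's projection against Titan's published
    # figures (~17.6 petaflops sustained, ~8.2 MW); values approximate.
    titan_pflops, titan_kw = 17.6, 8200
    projected_pflops, projected_kw = 100, 200

    print(projected_pflops / titan_pflops)  # ~5.7x the performance
    print(titan_kw / projected_kw)          # ~41, i.e. about 1/40 the power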

A supercomputer with those capabilities would obviously represent a big jump. But as before, the fate of superconducting supercomputing strongly depends on what happens with silicon. While an exascale computer made from today's silicon chips may not be practical, great effort and billions of dollars are now being expended on continuing to shrink silicon transistors as well as on developing on-chip optical links and 3-D stacking. Such technologies could make a big difference, says Thomas Theis, who directs nanoelectronics research at the nonprofit Semiconductor Research Corp. In July 2015, President Barack Obama announced the National Strategic Computing Initiative and called for the creation of an exascale supercomputer. IARPA's work on alternatives to silicon is part of this initiative, but so is conventional silicon. The mid-2020s has been targeted for the first silicon-based exascale machine. If that goal is met, the arrival of a superconducting supercomputer would likely be pushed out still further.

But it's too early to count out superconducting computing just yet. Compared with the massive, continuous investment in silicon over the decades, superconducting computing has had meager and intermittent support. Yet even with this subsistence diet, physicists and engineers have produced an impressive string of advances. The support of the C3 program, along with the wider attention of the computing community, could push the technology forward significantly. If all goes well, superconducting computers might finally come in from the cold.

This article appears in the March 2016 print issue as "The NSA's Frozen Dream."

A historian of science and technology, David C. Brock recently became director of the Center for Software History at the Computer History Museum. A few years back, while looking into the history of microcircuitry, he stumbled across the work of Dudley Buck, a pioneer of speedy cryogenic logic. He wrote about Buck in our April 2014 issue. Here he explores what happened after Buck, including a new effort to build a superconducting computer. This time, he says, the draw is energy efficiency, not performance.

Excerpt from:

Will the NSA Finally Build Its Superconducting Spy Computer? - IEEE Spectrum

Read the Rest...

Is Bitcoin (BTC) Safe from Grover’s Algorithm? – Yahoo Finance

§ July 30th, 2021 § Filed under Quantum Computer Comments Off on Is Bitcoin (BTC) Safe from Grover’s Algorithm? – Yahoo Finance

When crypto investors discuss quantum computing, they invariably worry about its potential to undermine encryption. Quantum computers alone do not pose such a mortal threat, however. It's their capacity to exploit Shor's algorithm that makes them formidable.

That's because Shor's algorithm can efficiently factor the large composite numbers whose prime factors secure asymmetric encryption.

Another quantum algorithm can potentially undermine the blockchain as well. Grover's algorithm provides quantum search capabilities, enabling users to quickly find a value among billions of unstructured data points.

Unlike Shor's algorithm, Grover's algorithm is more of a threat to cryptographic hashing than encryption. When cryptographic hashes are compromised, both blockchain integrity and block mining suffer.

Collision Attacks

One-way hash functions help to make a blockchain cryptographically secure. Classical computers cannot easily reverse-engineer them. They would have to find the correct arbitrary input that maps to a specific hash value.

Using Grover's algorithm, a quantum attacker could hypothetically find two inputs that produce the same hash value. This phenomenon is known as a hash collision.

By solving this search, a blockchain attacker could surreptitiously replace a valid block with a falsified one. That's because, in a Proof-of-Work system, the current block's hash can verify the authenticity of all past blocks.

This kind of attack remains a distant threat, however. Indeed, achieving a cryptographic collision is far more challenging than breaking asymmetric encryption.
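
Back-of-the-envelope work factors show why. Assuming an idealized 256-bit hash such as SHA-256, and ignoring all hardware constants, the known quantum speedups shave exponents but still leave collision-finding enormously expensive:

    import math

    n = 256  # output bits of an idealized SHA-256-like hash

    attacks = {
        "classical preimage": 2 ** n,          # brute-force guessing
        "Grover preimage": 2 ** (n // 2),      # quadratic speedup
        "classical collision": 2 ** (n // 2),  # birthday bound
        "quantum collision (BHT)": 2 ** (n // 3),
    }
    for name, ops in attacks.items():
        print(f"{name}: ~2^{int(math.log2(ops))} operations")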

Mining Threats

A somewhat easier attack to pull off using Grover's algorithm involves Proof-of-Work mining.

Using Grover's search algorithm, a quantum miner could mine at a much faster rate than a traditional miner. Such a miner could generate as much Proof-of-Work as the rest of the network combined. Consequently, the attacker could effectively take over the blockchain and force consensus on any block they selected.


A quantum miner might also use Grover's search algorithm to speed up the guessing of a nonce, the number that blockchain miners are solving for in order to receive cryptocurrency. That's because Grover's algorithm provides a quadratic speedup over a classical computer (for now, though, ASIC-based mining remains considerably faster).

How fast is a quadratic speedup? Roughly stated, if a classical computer can solve a complex problem in time T, Grover's algorithm will be able to solve the problem in a time on the order of the square root of T (√T).
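
To make that concrete, here is an idealized count of the hash evaluations needed to find one valid nonce among N candidates, classically versus with Grover's roughly (π/4)·√N iterations. Gate speeds and error-correction overhead, which currently dominate in practice, are ignored:

    import math

    # Idealized operation counts for finding one valid nonce among N
    # candidates; real-world hardware constants are ignored.
    for bits in (32, 64, 80):
        N = 2 ** bits
        classical = N // 2                     # expected brute-force guesses
        grover = (math.pi / 4) * math.sqrt(N)  # Grover iterations
        print(f"N = 2^{bits}: classical ~{classical:.2e}, Grover ~{grover:.2e}")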

Thus, any miner who can solve the nonce faster than other miners will be able to mine the blockchain faster as well.

Grover's algorithm could also be used to speed up the generation of nonces. This capability would allow an attacker to quickly reconstruct the chain from a previously modified block, faster than the true chain grows. In the end, a savvy attacker could substitute this reconstructed chain for the true one.

Grover's algorithm may ultimately help make Proof-of-Work obsolete. That's because, as one analysis puts it, "there is no possible PoW system that is not susceptible to Grover speed-up. In the end, quantum actors will always have an advantage over classical ones in PoW-based blockchains, [allowing them] to either mine more effectively or [instigate] an attack" (source).

Proof-of-Work Weaknesses

As bitcoin matures, the weaknesses inherent in PoW become ever more evident. Miners are pitted against each other as if in a never-ending arms race. This arms race is incentivized by the ability of larger mining pools to achieve economies of scale, a cost advantage that quickly erodes the capacity of individual miners to survive.

Of course, Proof-of-Stake is not without flaws. For instance, critics assert that it favors larger stakeholders (hence the claim that it enables the rich to get richer). These critics neglect to note that PoW is amenable to the same strategy (albeit with miners).

As this arms race comes to a head, any miner with the resources to do so will use quantum computing to achieve a competitive advantage. Combined with Grover's algorithm, a quantum-based miner would outperform other miners (most likely, small- and medium-sized miners).

With access to quadratic speedup, any PoW coin will inevitably fall under the control of mega-cap institutions and governments. If so, regular investors and mid- to large-cap enterprises risk getting priced out of the market. In particular, their devices will be either too expensive or prone to excessive regulation (much the same way that PGP encryption once was).

Summary

Shor's algorithm undoubtedly poses the most immediate threat to bitcoin (namely, the potential to break ECDSA, its digital signature algorithm). Grover's algorithm is a distant second in this respect.

Grover's algorithm may someday pose a formidable challenge to PoW mining, however. And it could conceivably threaten cryptographic hashing as well. Any algorithm powerful enough to reverse-engineer hash values would invariably undermine PoW itself.

Quantum Resistant Ledger (QRL) will ultimately offer protection against both.

For instance, a quantum-safe digital signature scheme named XMSS safeguards the coin from Shor's algorithm.

Likewise, the QRL team will rely on Proof-of-Stake to head off mining-based attacks using Grover's search algorithm.

As you can see, the QRL team is thoroughly preparing for a post-quantum future. Their mission is an increasingly urgent one, as quantum computing continues to advance by leaps and bounds.

See more from Benzinga

2021 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.

Read the original:

Is Bitcoin (BTC) Safe from Grover's Algorithm? - Yahoo Finance

Read the Rest...

Quantum Computing Inc. to list on Nasdaq, expand Qatalyst visibility – ZDNet

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Quantum Computing Inc. to list on Nasdaq, expand Qatalyst visibility – ZDNet

Quantum Computing Inc. (QCI) will list on the Nasdaq on Thursday in a graduation from the over-the-counter market.

The move will give QCI more visibility for its flagship Qatalyst platform, which aims to deliver quantum computing without the need for complex programming, custom code, or in-house quantum experts.

QCI's listing comes as the quantum computing space is heating up. IonQ will soon be public via a special purpose acquisition company (SPAC) deal. In addition, Honeywell is merging with Cambridge Quantum. QCI is pre-revenue, but its platform is available on Amazon Web Services through the Braket quantum computing service.
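
For readers who have not used Braket: it is AWS's quantum computing service, driven from a Python SDK, and Qatalyst sits above this layer. A minimal sketch of the underlying API, using the free local simulator that ships with the amazon-braket-sdk package (this illustrates Braket generally, not Qatalyst's own interface):

    from braket.circuits import Circuit
    from braket.devices import LocalSimulator

    # Build a two-qubit Bell-state circuit and sample it 1,000 times.
    circuit = Circuit().h(0).cnot(0, 1)
    device = LocalSimulator()
    result = device.run(circuit, shots=1000).result()
    print(result.measurement_counts)  # e.g. Counter({'00': 508, '11': 492})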

According to QCI, Qatalyst gives enterprises the ability to use quantum computing to solve supply chain, logistics, drug discovery, cybersecurity and transportation issues. QCI will trade under the QUBT ticker, which was used for its over-the-counter listing.

Here are some key points about Qatalyst: the platform's components include APIs, services, portals, and access to compute resources.

Robert Liscouski, CEO of QCI, said in a recent shareholder letter:

Much of the market continues to focus on pure quantum for quantum's sake. However, the simple reality is that delivering business value with quantum in the near term will not come from quantum alone. It can only be derived from the sophisticated combination of classical and quantum computing techniques that is enabled today with Qatalyst.

In June, QCI said it entered a three-year agreement with Los Alamos National Laboratory to run exascale and petascale simulations.

Liscouski said the Nasdaq listing will bring more liquidity, shareholders and visibility to the company. As of Dec. 31, QCI had $15.2 million in cash, a net loss of $24.73 million and no revenue.

Original post:

Quantum Computing Inc. to list on Nasdaq, expand Qatalyst visibility - ZDNet

Read the Rest...

Quantum Computing Is Coming. What Can It Do? – Harvard Business Review

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Quantum Computing Is Coming. What Can It Do? – Harvard Business Review

Digital computing has limitations with regard to an important category of calculation called combinatorics, in which the order of data is important to the optimal solution. These complex, iterative calculations can take even the fastest computers a long time to process. Computers and software that are predicated on the assumptions of quantum mechanics have the potential to perform combinatorics and other calculations much faster; as a result, many firms are already exploring the technology, whose known and probable applications already include cybersecurity, bio-engineering, AI, finance, and complex manufacturing.


Quantum technology is approaching the mainstream. Goldman Sachs recently announced that it could introduce quantum algorithms to price financial instruments in as soon as five years. Honeywell anticipates that quantum will form a $1 trillion industry in the decades ahead. But why are firms like Goldman taking this leap, especially with commercial quantum computers possibly still years away?

To understand what's going on, it's useful to take a step back and examine what exactly it is that computers do.

Let's start with today's digital technology. At its core, the digital computer is an arithmetic machine. It made performing mathematical calculations cheap, and its impact on society has been immense. Advances in both hardware and software have made possible the application of all sorts of computing to products and services. Today's cars, dishwashers, and boilers all have some kind of computer embedded in them, and that's before we even get to smartphones and the internet. Without computers we would never have reached the moon or put satellites in orbit.

These computers use binary signals (the famous 1s and 0s of code), which are measured in bits or bytes. The more complicated the code, the more processing power required and the longer the processing takes. What this means is that, for all their advances, from self-driving cars to beating grandmasters at chess and Go, there remain tasks that traditional computing devices struggle with, even when the task is dispersed across millions of machines.

A particular problem they struggle with is a category of calculation called combinatorics. These calculations involve finding an arrangement of items that optimizes some goal. As the number of items grows, the number of possible arrangements grows exponentially. To find the best arrangement, today's digital computers basically have to iterate through each permutation to find an outcome and then identify which does best at achieving the goal. In many cases this can require an enormous number of calculations (think about breaking passwords, for example). The challenge of combinatorics calculations, as we'll see in a minute, applies in many important fields, from finance to pharmaceuticals. It is also a critical bottleneck in the evolution of AI.
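
A toy example makes the blowup concrete. The count of orderings grows factorially with the number of items, and brute force means touching every one of them. The sketch below uses a made-up four-city route-length problem purely for illustration:

    import math
    from itertools import permutations

    # The number of possible orderings grows factorially with item count.
    for n in (5, 10, 15, 20):
        print(f"{n} items -> {math.factorial(n):,} orderings")

    # Brute force over every ordering of four cities, with made-up
    # pairwise distances, to find the shortest route.
    dist = {("A", "B"): 2, ("A", "C"): 4, ("A", "D"): 7,
            ("B", "C"): 3, ("B", "D"): 5, ("C", "D"): 1}

    def leg(a, b):
        return dist.get((a, b), dist.get((b, a)))

    def route_length(order):
        return sum(leg(a, b) for a, b in zip(order, order[1:]))

    best = min(permutations("ABCD"), key=route_length)
    print(best, route_length(best))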

And this is where quantum computers come in. Just as classical computers reduced the cost of arithmetic, quantum computing promises a similar cost reduction for calculating daunting combinatoric problems.

Quantum computers (and quantum software) are based on a completely different model of how the world works. In classical physics, an object exists in a well-defined state. In the world of quantum mechanics, objects only occur in a well-defined state after we observe them. Prior to our observation, two objects' states and how they are related are matters of probability. From a computing perspective, this means that data is recorded and stored in a different way, through non-binary qubits of information rather than binary bits, reflecting the multiplicity of states in the quantum world. This multiplicity can enable faster and lower-cost calculation for combinatoric arithmetic.
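
That multiplicity has a compact mathematical form: a qubit is a vector of complex amplitudes, and the squared magnitude of each amplitude gives the probability of the corresponding measurement outcome. A minimal NumPy sketch, for illustration only:

    import numpy as np

    # A single qubit is a length-2 vector of complex amplitudes;
    # |amplitude|^2 gives the probability of each measurement outcome.
    ket0 = np.array([1, 0], dtype=complex)
    hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    state = hadamard @ ket0    # equal superposition of 0 and 1
    print(np.abs(state) ** 2)  # [0.5 0.5]

    # n qubits require a 2**n-entry amplitude vector, which is the
    # exponential state space the article alludes to.
    print(2 ** 50)             # already ~1e15 amplitudes at 50 qubits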

If that sounds mind-bending, it's because it is. Even particle physicists struggle to get their minds around quantum mechanics and the many extraordinary properties of the subatomic world it describes, and this is not the place to attempt a full explanation. But what we can say is that quantum mechanics does a better job of explaining many aspects of the natural world than classical physics does, and it accommodates nearly all of the theories that classical physics has produced.

Quantum translates, in the world of commercial computing, to machines and software that can, in principle, do many of the things that classical digital computers can and, in addition, do one big thing classical computers can't: perform combinatorics calculations quickly. As we describe in our paper, "Commercial Applications of Quantum Computing," that's going to be a big deal in some important domains. In some cases, the importance of combinatorics is already known to be central to the domain.

As more people turn their attention to the potential of quantum computing, applications beyond quantum simulation and encryption are emerging.

The opportunity for quantum computing to solve large-scale combinatorics problems faster and cheaper has encouraged billions of dollars of investment in recent years. The biggest opportunity may be in finding more new applications that benefit from the solutions offered through quantum. As professor and entrepreneur Alan Aspuru-Guzik said, "There is a role for imagination, intuition, and adventure. Maybe it's not about how many qubits we have; maybe it's about how many hackers we have."

Continued here:

Quantum Computing Is Coming. What Can It Do? - Harvard Business Review

Read the Rest...

Startup hopes the world is ready to buy quantum processors – Ars Technica

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Startup hopes the world is ready to buy quantum processors – Ars Technica

Early in its history, computing was dominated by time-sharing systems. These systems were powerful machines (for their time, at least) that multiple users connected to in order to perform computing tasks. To an extent, quantum computing has repeated this history, with companies like Honeywell, IBM, and Rigetti making their machines available to users via cloud services. Companies pay based on the amount of time they spend executing algorithms on the hardware.

For the most part, time-sharing works out well, saving companies the expenses involved in maintaining the machine and its associated hardware, which often includes a system that chills the processor down to nearly absolute zero. But there are several customerscompanies developing support hardware, academic researchers, etc.for whom access to the actual hardware could be essential.

The fact that companies aren't shipping out processors suggests that the market isn't big enough to make production worthwhile. But a startup from the Netherlands is betting that the size of the market is about to change. On Monday, a company called QuantWare announced that it will start selling quantum processors based on transmons, superconducting loops of wire that form the basis of similar machines used by Google, IBM, and Rigetti.

Transmon-based qubits are popular because they're compatible with the standard fabrication techniques used for more traditional processors; they can also be controlled using microwave-frequency signals. Their big downside is that they operate only at temperatures that require liquid helium and specialized refrigeration hardware. These requirements complicate the hardware needed to exchange signals between the very cold processor and the room-temperature hardware that controls it.

Startup companies like D-Wave and Rigetti have set up their own fabrication facilities, but Matthijs Rijlaarsdam, one of QuantWare's founders, told Ars that his company is taking advantage of an association with TU Delft, the host of the Kavli Nanolab. This partnership lets QuantWare do the fabrication without investing in its own facility. Rijlaarsdam said the situation shouldn't be a limiting factor, since he expects that the total market likely won't exceed tens of thousands of processors over the entirety of the next decade. Production volumes don't have to scale dramatically.

The initial processor the company will be shipping contains only five transmon qubits. Although this is well below anything on offer via one of the cloud services, Rijlaarsdam told Ars that the fidelities of each qubit will be 99.9 percent, which should keep the error rate manageable. He argued that, for now, a low qubit count should be sufficient based on the types of customers QuantWare expects to attract.
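
A rough way to see why 99.9 percent matters, under the simplifying assumption that errors strike independently per operation (readout error and crosstalk are ignored here):

    # With 99.9% fidelity per operation, the probability that a circuit
    # of depth d completes without any error is about 0.999**d.
    for depth in (10, 100, 1000):
        print(depth, round(0.999 ** depth, 3))
    # 10 -> 0.99, 100 -> 0.905, 1000 -> 0.368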

These customers include universities interested in studying new ways of using the processor and companies that might be interested in developing support hardware needed to turn a chip full of transmons into a functional system. Intel, for example, has been developing transmon hardware control chips that can tolerate the low temperatures required (although the semiconductor giant can also easily make its own transmons as needed).

That last aspectdeveloping a chip around which others could build a platformfeatures heavily in the press release that QuantWare shared with Ars. The announcement makes frequent mention of the Intel 4004, an early general-purpose microprocessor that found a home in a variety of computers.

Rijlaarsdam told Ars that he expects the company to increase its qubit count by two- to four-fold each year for the next few years. That's good progress, but it will still leave the company well behind the roadmap of competitors like IBM for the foreseeable future.

Rijlaarsdam also suggested that quantum computing will reach what he called "an inflection point" before 2025. Once this point is reached, quantum computers will regularly provide answers to problems that can't be practically calculated using classical hardware. Once that point is reached, "the market will be a multibillion-dollar market," Rijlaarsdam told Ars. "It will also grow rapidly, as the availability of large quantum computers will accelerate application development."

But if that point is reached before 2025, it will arrive at a time when QuantWare's qubit count is suited for the current market, which he accurately described as "an R&D market." QuantWare's solution to the awkward timing will be to develop quantum processors specialized for specific algorithms, which can presumably be done using fewer qubits. But those won't be available at the company's launch.

Obviously, it's debatable whether there's a large market of companies anxiously awaiting the opportunity to install liquid helium dilution refrigerators in their office/lab/garage. But the reality is that there is almost certainly some market for an off-the-shelf quantum processorat least partly composed of other quantum computing startups.

That's not quite equivalent to the situation that greeted the Intel 4004. But it may be significant in that we seem to be getting close to the point where some of Ars' quantum-computing coverage will need to move out of the science section and over to IT, marking a clear shift in how the field is developing.

Listing image by QuantWare

Go here to see the original:

Startup hopes the world is ready to buy quantum processors - Ars Technica

Read the Rest...

Rigetti Computing Partners with Riverlane, Astex Pharmaceuticals on Quantum Computing for Drug Discovery – HPCwire

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Rigetti Computing Partners with Riverlane, Astex Pharmaceuticals on Quantum Computing for Drug Discovery – HPCwire

LONDON and CAMBRIDGE, England, July 13, 2021 – Rigetti UK announced today it will partner with Riverlane and Astex Pharmaceuticals to develop an integrated application for simulating molecular systems using Rigetti Quantum Cloud Services, paving the way for a commercial application that could transform drug discovery in pharmaceutical R&D.

"Our consortium brings together a complete quantum supply chain from hardware to end-user, allowing us to develop a tailor-made solution to address a problem of real value to the pharmaceutical sector," says Mandy Birch, SVP of Technology Partnerships at Rigetti. "This project lays the groundwork for the commercial application of Rigetti Quantum Cloud Services in the pharmaceutical industry."

The average cost of discovering a new drug and bringing it to market has tripled since 2010, reaching almost $3bn in 2018. However, soaring R&D costs have not translated into shorter times to market or higher numbers of newly approved drugs.

"We want to solve this problem by using quantum computers to speed up the process of drug discovery," says Chris Murray, SVP Discovery Technology at Astex. "Quantum computers provide a fundamentally different approach that could enable pharmaceutical companies to identify, screen, and simulate new drugs rather than using expensive, trial-and-error approaches in the laboratory."

To design more efficient drugs and shorten the time to market, researchers rely on advanced computational methods to model molecular structures and the interactions with their targets. While classical computers are limited to modelling simple structures, quantum computers have the potential to model more complex systems that could drastically improve the drug discovery process. However, today's quantum computers remain too noisy for results to evolve past proof-of-concept studies.

"Building on previous work with Astex, our collaboration aims to overcome this technological barrier and address a real business need for the pharmaceutical sector," says Riverlane CEO Steve Brierley. The project will leverage Riverlane's algorithm expertise and existing technology for high-speed, low-latency processing on quantum computers, using Rigetti's commercially available quantum systems. The team will also develop error mitigation software to help optimise the performance of the hardware architecture, which they expect to result in up to a threefold reduction in errors and runtime improvements of up to 40x. "This is an important first step in improving the performance of quantum computers so that they can solve commercially relevant problems," Brierley adds.

Science Minister Amanda Solloway says, "The UK has bold ambitions to be the world's first quantum-ready economy, harnessing the transformative capabilities of the technology to tackle global challenges such as climate change and disease outbreak.

"This government-backed partnership will explore how the power of quantum could help boost drug discovery, with the aim of shortening the time it takes potentially life-saving drugs to transfer from lab to market, all while cementing the UK's status as a science superpower."

The 18-month feasibility study is facilitated by a grant through the Quantum Challenge at UK Research and Innovation (UKRI). Rigetti UK has previously received funding from UKRI to develop the first commercially available quantum computer in the UK. Riverlane has also received funding from UKRI to develop an operating system that makes quantum software portable across qubit technologies.

About Rigetti UK

Rigetti UK Limited is a wholly owned subsidiary of Rigetti Computing, based in Berkeley, California. Rigetti builds superconducting quantum computing systems and delivers access to them over the cloud. These systems are optimized for integration with existing computing infrastructure and tailored to support the development of practical software and applications. Learn more at rigetti.com.

About Riverlane

Riverlane builds ground-breaking software to unleash the power of quantum computers. Backed by leading venture-capital funds and the University of Cambridge, it develops software that transforms quantum computers from experimental technology into commercial products. Learn more at riverlane.com.

About Astex

Astex is a leader in innovative drug discovery and development, committed to the fight against cancer and diseases of the central nervous system. Astex is developing a proprietary pipeline of novel therapies and has a number of partnered products being developed under collaborations with leading pharmaceutical companies. Astex is a wholly owned subsidiary of Otsuka Pharmaceutical Co. Ltd., based in Tokyo, Japan.

For more information about Astex Pharmaceuticals, please visit https://astx.com For more information about Otsuka Pharmaceutical, please visit http://www.otsuka.co.jp/en/

Source: Rigetti UK

See more here:

Rigetti Computing Partners with Riverlane, Astex Pharmaceuticals on Quantum Computing for Drug Discovery - HPCwire

Read the Rest...

Quantum Computing on a Chip: Brace for the Revolution – Tom’s Hardware

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Quantum Computing on a Chip: Brace for the Revolution – Tom’s Hardware

In a moment of triumph that's being hailed as equivalent to the move from room-scale silicon technology down to desk-sized machines, quantum computing has now gone chip-scale, down from the room-scale contraptions you might have seen elsewhere, including in science fiction.

The development has been spearheaded by Cambridge-based quantum specialist Riverlane's work with New York- and London-based digital quantum company Seeqc. They're the first to deploy a quantum computing chip that has an integrated operating system for workflow and qubit management (qubits are comparable to classical computing's transistors, but are capable of entangling with one another, sharing information via quantum states, and representing both a 0 and a 1 at once). The last time we achieved this level of miniaturization in a computing technology, we started the computing revolution. Now, expectations for a quantum revolution are on the table as well, and the world will have to adapt to the new reality.

The new chip ushers in scalable quantum computing, and the companies hope to scale the design by increasing surface area and qubit count. The aim is to bring qubits up to millions, a far cry from their current deployed maximum of a (comparatively puny, yet still remarkably complex) 76-qubit system that enabled China to claim quantum supremacy. There are, of course, other ways to scale besides increased qubit counts. Deployment of multiple chips in a single self-contained system or through multiple, inter-connectable systems could provide easier paths to quantum coherency. And on that end, a quantum OS is paramount. Enter Deltaflow.OS.

Deltaflow.OS is a hardware- and platform-agnostic OS (think Linux, which populates everything from smartphones to IoT to supercomputers), meaning that it can serve as the control mechanism for various quantum deployment technologies currently being pursued around the globe. And even as multiple independent companies, such as Google, Microsoft, and IBM, to name a few, pursue the holy grail of quantum supremacy, Riverlane's Deltaflow.OS is an open-source, GitHub-available OS that's taking the open approach for market penetration.

And this makes sense, since the more than 50 quantum computers already built around the world all operate on independently developed software. It's such a nascent field that there are still no standards regarding deployment and control systems. An easily deployable, quantum-hardware-agnostic OS will undoubtedly accelerate development of applications that take advantage of quantum computing's strengths, which, as China's 76-qubit system showed, already enable certain workloads to be crunched millions of times faster than the fastest classical, Turing-type supercomputer could ever hope to achieve.

To achieve this, Riverlane has effectively created a layered Digital Quantum Management (DQM) SoC (System-on-Chip) that pairs classical computing capabilities with quantum mechanics. The company's diagrams demonstrate what it calls an SFQ (Single Flux Quantum) co-processor as the base layer of the design, which enables the OS to be exposed to developers with a relatively familiar interface for interaction with the qubits. This offers the capability to perform digital qubit control, readout, and classical data processing functions, as well as being a platform for error correction.

There are numerous advantages to be taken from this approach, as the SFQ's resources are "(...) proximally co-located and integrated with qubit chips in a cryo-cooled environment to drastically reduce the complexity of input/output connections and maximize the benefits of fast, precise, low-noise digital control and readout, and energy-efficient classical co-processing." Essentially, some tenets of classical computing still apply, in that the closer the processing parts are, the more performant they are. This enables the OS to run, and it is layered next to an active qubit sheet that actually performs the calculations.

Quantum computing has long been the holy grail in development for new processing technologies. However, the complexity of this endeavour can't be overstated. The physics for quantum computing are essentially being written as we go, and while that is true, in a way, for many technological and innovation efforts, nowhere does it happen as much as here.

There are multiple questions related to quantum computing and its relationship to classical computing. Thanks to the efforts of Riverlane and Seeqc, the quantum computing ecosystem can now align their needles and collectively problem-solve for deployment and operation of quantum-computing-on-a-chip solutions.

More here:

Quantum Computing on a Chip: Brace for the Revolution - Tom's Hardware

Read the Rest...

Harvard-led physicists have taken a major step in the competition with quantum computing – Illinoisnewstoday.com

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Harvard-led physicists have taken a major step in the competition with quantum computing – Illinoisnewstoday.com

Image: Dolev Bluvstein (from left), Mikhail Lukin, and Sepehr Ebadi have developed a special type of quantum computer known as a programmable quantum simulator. Credit: Rose Lincoln/Harvard Staff Photographer

A team of physicists at the Harvard-MIT Center for Ultracold Atoms and other universities has developed a special type of quantum computer known as a programmable quantum simulator that can operate with 256 quantum bits, or qubits.

The system sheds light on a host of complex quantum processes and marks a big step toward machines that could ultimately bring real-world breakthroughs in materials science, communications technology, finance, and many other areas, overcoming research hurdles that are beyond the capabilities of today's fastest supercomputers. Qubits are the basic building blocks of quantum computers and are the source of their enormous processing power.

"This moves the field into a new territory that no one has ever been to," said Mikhail Lukin, the George Vasmer Leverett Professor of Physics, co-director of the Harvard Quantum Initiative, and one of the senior authors of the study, published today in the journal Nature. "We are entering a whole new part of the quantum world."

According to Sepehr Ebadi, a physics student in Harvard's Graduate School of Arts and Sciences and the lead author of the study, it is the system's unprecedented combination of size and programmability that puts it at the forefront of the race for a quantum computer, which harnesses the mysterious properties of matter at extremely small scales to greatly advance processing power. Under the right circumstances, increasing the qubit count means the system can store and process exponentially more information than the classical bits on which standard computers run.

"The number of quantum states that are possible with just 256 qubits exceeds the number of atoms in the solar system," Ebadi said, explaining the system's vast size.
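
That claim is easy to sanity-check, taking the commonly cited order-of-magnitude estimate of about 10^57 atoms in the solar system:

    # 256 qubits span a state space of 2**256 basis states.
    states = 2 ** 256
    print(f"{states:.3e}")    # ~1.158e+77
    print(states > 10 ** 57)  # True: dwarfs the ~1e57 atoms in the solar system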

Already, the simulator has allowed researchers to observe several exotic quantum states of matter that had never before been realized experimentally, and to perform a quantum phase transition study so precise that it serves as a textbook example of how magnetism works at the quantum level.

These experiments provide powerful insights into the quantum physics that underlie material properties and help scientists show how to design new materials with exotic properties.

The project uses a significantly upgraded version of a platform the researchers developed in 2017, which was able to reach a size of 51 qubits. The old system allowed researchers to capture ultracold rubidium atoms and place them in a particular order using a one-dimensional array of individually focused laser beams called optical tweezers.

The new system allows the atoms to be assembled into a two-dimensional array of optical tweezers, increasing the achievable system size from 51 to 256 qubits. The tweezers let researchers arrange the atoms in defect-free patterns and create programmable shapes, such as square, honeycomb, or triangular lattices, to engineer different interactions between the qubits.

"The workhorse of this new platform is a device called the spatial light modulator, which is used to shape the light wavefront and generate hundreds of individually focused optical tweezer beams," Ebadi said. "These devices are essentially the same as those used in computer projectors to display images on a screen, but we have adapted them to be a critical component of our quantum simulator."

The initial loading of the atoms into the optical tweezers is random, so the researchers must move the atoms into their target positions. They use a second set of moving optical tweezers to drag the atoms to the desired locations, eliminating the initial randomness. Lasers give the researchers complete control over the positioning of the atomic qubits and their coherent quantum manipulation.

Other senior authors of the study include Harvard Professors Subir Sachdev and Markus Greiner, who worked on the project along with Professor Vladan Vuletić of the Massachusetts Institute of Technology, and scientists from Stanford University, the University of California, Berkeley, the University of Innsbruck in Austria, the Austrian Academy of Sciences, and QuEra Computing Inc. in Boston.

"Our work is part of a very intense, highly visible global competition to build larger, better quantum computers," said Tout Wang, a physics researcher at Harvard and one of the paper's authors. "The overall effort, beyond our own, involves leading academic research institutions and major private-sector investment from Google, IBM, Amazon, and many others."

The researchers are currently working to improve the system by refining laser control over the qubits and making the system more programmable. They are also actively exploring how the system can be used for new applications, ranging from probing exotic forms of quantum matter to solving challenging real-world problems that can be naturally encoded in the qubits.

"This study enables a huge number of new scientific directions," Ebadi said. "We are far from the limits of what can be done with these systems."


Continue reading here:

Harvard-led physicists have taken a major step in the competition with quantum computing - Illinoisnewstoday.com

Read the Rest...

Quantum computing: this is how quantum programming works using the example of random walk – Market Research Telecast

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Quantum computing: this is how quantum programming works using the example of random walk – Market Research Telecast


See original here:

Quantum computing: this is how quantum programming works using the example of random walk - Market Research Telecast

Read the Rest...
