
You are currently browsing the Quantum Computer category

Life, the universe and everything: Physics seeks the future – The Economist

§ August 27th, 2021 § Filed under Quantum Computer

Aug 25th 2021

A WISE PROVERB suggests not putting all your eggs in one basket. Over recent decades, however, physicists have failed to follow that wisdom. The 20th century, and indeed the 19th before it, were periods of triumph for them. They transformed understanding of the material universe and thus people's ability to manipulate the world around them. Modernity could not exist without the knowledge won by physicists over those two centuries.

In exchange, the world has given them expensive toys to play with. The most recent of these, the Large Hadron Collider (LHC), which occupies a 27km-circumference tunnel near Geneva and cost $6bn, opened for business in 2008. It quickly found a long-predicted elementary particle, the Higgs boson, that was a hangover from calculations done in the 1960s. It then embarked on its real purpose, to search for a phenomenon called Supersymmetry.

This theory, devised in the 1970s and known as Susy for short, is the all-containing basket into which particle physics's eggs have until recently been placed. Of itself, it would eliminate many arbitrary mathematical assumptions needed for the proper working of what is known as the Standard Model of particle physics. But it is also the vanguard of a deeper hypothesis, string theory, which is intended to synthesise the Standard Model with Einstein's general theory of relativity. Einstein's theory explains gravity. The Standard Model explains the other three fundamental forces (electromagnetism and the weak and strong nuclear forces) and their associated particles. Both describe their particular provinces of reality well. But they do not connect together. String theory would connect them, and thus provide a so-called theory of everything.

String theory proposes that the universe is composed of minuscule objects which vibrate in the manner of the strings of a musical instrument. Like such strings, they have resonant frequencies and harmonics. These various vibrational modes, string theorists contend, correspond to various fundamental particles. Such particles include all of those already observed as part of the Standard Model; the further particles predicted by Susy, which posits that the Standard Model's mathematical fragility will go away if each of that model's particles has a heavier supersymmetric partner particle, or sparticle; and also particles called gravitons, which are needed to tie the force of gravity into any unified theory, but are not predicted by relativity.

But, no Susy, no string theory. And, 13 years after the LHC opened, no sparticles have shown up. Even two as-yet-unexplained results announced earlier this year (one from the LHC and one from a smaller machine) offer no evidence directly supporting Susy. Many physicists thus worry they have been on a wild-goose chase.

They have good reason to be nervous. String theory already comes with a disturbing conceptual price tag: that of adding six (or in one version seven) extra dimensions to the universe, over and above the four familiar ones (three of space and one of time). It also describes about 10^500 possible universes, only one of which matches the universe in which human beings live. Accepting all that is challenging enough. Without Susy, though, string theory goes bananas. The number of dimensions balloons to 26. The theory also loses the ability to describe most of the Standard Model's particles. And it implies the existence of weird stuff such as particles called tachyons that move faster than light and are thus incompatible with the theory of relativity. Without Susy, string theory thus looks pretty much dead as a theory of everything. Which, if true, clears the field for non-string theories of everything.

The names of many of these do, it must be conceded, torture the English language. They include causal dynamical triangulation, asymptotically safe gravity, loop quantum gravity and the amplituhedron formulation of quantum theory. But at the moment the bookies' favourite for unifying relativity and the Standard Model is something called entropic gravity.

Entropy is a measure of a system's disorder. Famously, the second law of thermodynamics asserts that it increases with time (ie, things have a tendency to get messier as they get older). What that has to do with a theory of gravity, let alone of everything, is not, perhaps, immediately obvious. But the link is black holes. These are objects which have such strong gravitational fields that even light cannot escape from them. They are predicted by the mathematics of general relativity. And even though Einstein remained sceptical about their actual existence until the day he died in 1955, subsequent observations have shown that they are indeed real. But they are not black.

In 1974 Stephen Hawking, of Cambridge University, showed that quantum effects at a black hole's boundary allow it to radiate particles, especially photons, which are the particles of electromagnetic radiation, including light. This has peculiar consequences. Photons carry radiant heat, so something which emits them has a temperature. And, from its temperature and mass, it is possible to calculate a black hole's entropy. This matters because, when all these variables are plugged into the first law of thermodynamics, which states that energy can be neither created nor destroyed, only transformed from one form (say, heat) into another (say, mechanical work), what pops out are Einstein's equations of general relativity.
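
For reference, the standard textbook formulas behind that calculation (background physics the article alludes to but does not spell out) are the Hawking temperature and the Bekenstein-Hawking entropy of a black hole of mass M and horizon area A:

    T_H = \frac{\hbar c^3}{8 \pi G M k_B}, \qquad S_{BH} = \frac{k_B c^3 A}{4 G \hbar}

Feeding these thermodynamic quantities into the first law is the step the article describes as recovering the equations of general relativity.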

That relationship was discovered in 2010 by Erik Verlinde of Amsterdam University. It has serious implications. The laws of thermodynamics rely on statistical mechanics. They involve properties (temperature, entropy and so on) which emerge from probabilistic descriptions of the behaviour of the underlying particles involved. These are also the particles described by quantum mechanics, the mathematical theory which underpins the Standard Model. That Einstein's equations can be rewritten thermodynamically implies that space and time are also emergent properties of this deeper microscopic picture. The existing forms of quantum mechanics and relativity thus do indeed both seem derivable in principle from some deeper theory that describes the underlying fabric of the universe.

String theory is not so derivable. Strings are not fundamental enough entities. But entropic gravity claims to describe the very nature of space and time, or, to use Einsteinian terminology, spacetime. It asserts this is woven from filaments of quantum entanglement linking every particle in the cosmos.

The idea of quantum entanglement, another phenomenon pooh-poohed by Einstein that turned out to be true, goes back to 1935. It is that the properties of two or more objects can be correlated ("entangled") in a way which means they cannot be described independently. This leads to weird effects. In particular, it means that two entangled particles can appear to influence each other's behaviour instantaneously even when they are far apart. Einstein dubbed this "spooky action at a distance", because it seems to violate the premise of relativity theory that the universe has a speed limit: the speed of light.

As with black holes, Einstein did not live long enough to see himself proved wrong. Experiments have nevertheless shown he was. Entanglement is real, and does not violate relativity because although the influence of one particle on another can be instantaneous, there is no way to use the effect to pass information faster than light-speed. And, in the past five years, Brian Swingle of Harvard University and Sean Carroll of the California Institute of Technology have begun building models of what Dr Verlinde's ideas might mean in practice, using ideas from quantum information theory. Their approach employs bits of quantum information (so-called qubits) to stand in for the entangled particles. The result is a simple but informative analogue of spacetime.

Qubits, the quantum equivalent of classical bits (the ones and zeros on which regular computing is built), will be familiar to those who follow the field of quantum computing. They are the basis of quantum information theory. Two properties distinguish qubits from the regular sort. First, they can be placed in a state of superposition, representing both a one and a zero at the same time. Second, several qubits can become entangled. Together, these properties let quantum computers accomplish feats such as performing multiple calculations at once, or completing certain classes of calculation in a sensible amount of time, that are difficult or impossible for a regular computer.
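
As a concrete illustration of those two properties, here is a minimal sketch in Python with NumPy (not from the article; the gate matrices are the standard textbook ones) showing a single qubit in superposition, then a pair of qubits entangled into a Bell state:

    import numpy as np

    # Single-qubit basis state |0>
    zero = np.array([1, 0], dtype=complex)

    # The Hadamard gate puts a qubit into an equal superposition of 0 and 1
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    superposed = H @ zero
    print(np.abs(superposed) ** 2)   # [0.5 0.5]: equal odds of reading 0 or 1

    # A CNOT gate (control = first qubit) then entangles a second qubit,
    # producing the Bell state (|00> + |11>) / sqrt(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)
    bell = CNOT @ np.kron(superposed, zero)
    print(np.abs(bell) ** 2)         # [0.5 0. 0. 0.5]: measurements always agree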

And because of their entanglement qubits can also, according to Dr Swingle and Dr Carroll, be used as stand-ins for how reality works. More closely entangled qubits represent particles at points in spacetime that are closer together. So far, quantum computers being a work in progress, this modelling can be done only with mathematical representations of qubits. These do, though, seem to obey the equations of general relativity. That supports entropic-gravity theory's claims.

All of this modelling puts entropic gravity in pole position to replace strings as the long-sought theory of everything. But the idea that spacetime is an emergent property of the universe rather than being fundamental to it has a disturbing consequence. It blurs the nature of causality.

In the picture built by entropic gravity, spacetime is a superposition of multiple states. It is this which muddies causality. The branch of maths that best describes spacetime is a form of geometry that has four axes at right angles to each other instead of the more familiar three. The fourth represents time, so, like the position of objects, the order of events in spacetime is determined geometrically. If different geometric arrangements are superposed, as entropic gravity requires, it can therefore sometimes happen that the statements "A causes B" and "B causes A" are both true.

This is not mere speculation. In 2016 Giulia Rubino of the University of Bristol, in England, constructed an experiment involving polarised photons and prisms which achieved exactly that. This spells trouble for those who have old-fashioned notions about causality's nature.

However, Lucien Hardy of the Perimeter Institute, in Canada, has discovered a way to reformulate the laws of quantum mechanics to get around this. In his view, causality as commonly perceived is like data compression in computing: it is a concept that gives you more bang for your buck. With a little bit of information about the present, causality can infer a lot about the future, compressing the amount of information needed to capture the details of a physical system in time.

But causality, Dr Hardy thinks, may not be the only way to describe such correlations. Instead, he has invented a general method for building descriptions of the patterns in correlations from scratch. This method, which he calls the causaloid framework, tends to reproduce causality but it does not assume it, and he has used it to reformulate both quantum theory (in 2005) and general relativity (in 2016). Causaloid maths is not a theory of everything. But there is a good chance that if and when such a theory is found, causaloid principles will be needed to describe it, just as general relativity needed a geometry of four dimensions to describe spacetime.

Entropic gravity has, then, a lot of heavy-duty conceptual work to back it up. But it is not the only candidate to replace string theory. Others jostling for attention include an old competitor called loop quantum gravity, originally proposed in 1994 by Carlo Rovelli, then at the University of Pittsburgh, and Lee Smolin, of the Perimeter Institute. This, and causal dynamical triangulation, a more recent but similar idea, suggest that spacetime is not the smooth fabric asserted by general relativity, but, rather, has a structure: either elementary loops or triangles, according to which of the two theories you support.

A third option, asymptotically safe gravity, goes back still further, to 1976. It was suggested by Steven Weinberg, one of the Standard Model's chief architects. A natural way to develop a theory of quantum gravity is to add gravitons to the model. Unfortunately, this approach got nowhere, because when the interactions of these putative particles were calculated at higher energies, the maths seemed to become nonsensical. However, Weinberg, who died in July, argued that this apparent breakdown would go away (in maths speak, the calculations would be "asymptotically safe") if sufficiently powerful machines were used to do the calculating. And, with the recent advent of supercomputers of such power, it looks, from early results, as if he might have been right.

One of the most intriguing competitors of entropic gravity, though, is the amplituhedron formulation of quantum theory. This was introduced in 2013 by Nima Arkani-Hamed of the Institute for Advanced Study, in Princeton, and Jaroslav Trnka of the University of California, Davis. They have found a class of geometric structures dubbed amplituhedrons, each of which encodes the details of a possible quantum interaction. These, in turn, are facets of a master amplituhedron that encodes every possible type of physical process. It is thus possible to reformulate all of quantum theory in terms of the amplituhedron.

Most attempts at a theory of everything try to fit gravity, which Einstein describes geometrically, into quantum theory, which does not rely on geometry in this way. The amplituhedron approach does the opposite, by suggesting that quantum theory is actually deeply geometric after all. Better yet, the amplituhedron is not founded on notions of spacetime, or even statistical mechanics. Instead, these ideas emerge naturally from it. So, while the amplituhedron approach does not as yet offer a full theory of quantum gravity, it has opened up an intriguing path that may lead to one.

That space, time and even causality may be emergent rather than fundamental properties of the cosmos is a radical idea. But this is the point. General relativity and quantum mechanics, the physics revolutions of the 20th century, were viewed as profound precisely because they overthrew common sense. To accept relativity meant abandoning a universal notion of time and space. To take quantum mechanics seriously meant getting comfortable with ideas like entanglement and superposition. Embracing entropic gravity or its alternatives will require similar feats of the imagination.

No theory, though, is worth a damn without data. That, after all, is the problem with Supersymmetry. Work like Dr Rubino's points the way. But something out of a particle-physics laboratory would also be welcome. And, though their meaning is obscure, the past few months have indeed seen two experimentally induced cracks in the Standard Model.

On March 23rd a team from CERN, the organisation that runs the LHC, reported an unexpected difference in behaviour between electrons and their heavier cousins, muons. These particles differ from one another in no known properties but their masses, so the Standard Model predicts that when other particles decay into them, the two should each be produced in equal numbers. But this appears not to be true. Interim results from the LHC suggest that a type of particle called a B-meson is more likely to decay into an electron than a muon. That suggests an as-yet-undescribed fundamental force is missing from the Standard Model. Then, on April 7th, Fermilab, America's biggest particle-physics facility, announced the interim results of its own muon experiment, Muon g-2.

In the quantum world, there is no such thing as a perfect vacuum. Instead, a froth of particles constantly pops in and out of existence everywhere in spacetime. These are virtual rather than real particles; that is, they are transient fluctuations which emerge straight out of quantum uncertainty. But, although they are short-lived, during the brief periods of their existence they still have time to interact with more permanent sorts of matter. They are, for example, the source of the black-hole radiation predicted by Hawking.

The strengths of their interactions with types of matter more conventional than black holes are predicted by the Standard Model, and to test these predictions, Muon g-2 shoots muons in circles around a powerful superconducting magnetic-storage ring. The quantum froth changes the way the muons wobble, which detectors can pick up with incredible precision. The Muon g-2 experiment suggests that the interactions causing these wobbles are slightly stronger than the Standard Model predicts. If confirmed, this would mean the model is missing one or more elementary particles.

There is a slim chance that these are the absent sparticles. If so, it is the supporters of supersymmetry who will have the last laugh. But nothing points in this direction and, having failed thus far to stand their ideas up, they are keeping sensibly quiet.

Whatever the causes of these two results, they do show that there is something out there which established explanations cannot account for. Similarly unexplained anomalies were starting points for both quantum theory and relativity. It looks possible, therefore, that what has seemed one of physics's darkest periods is about to brighten into a new morning.

This article appeared in the Science & technology section of the print edition under the headline "Bye, bye, little Susy"

Aussie innovations: mealworm snack packs, in-mouth robots, drone weeding and more – Sydney Morning Herald

§ August 27th, 2021 § Filed under Quantum Computer

"We have deals with larger food manufacturers at the moment for our bulk ingredients and we've had a really good response from retail," says founder Skye Blackburn, a food scientist and edible bug evangelist. "In our new facility we'll be able to use all the technology we've been developing over the past 14 years, including applying artificial intelligence to the feeding, cleaning and monitoring side of things."

According to the CSIRO, the global market for edible insects is expected to grow to $1.4 billion by 2023. More than 2,100 insect species are currently eaten, its report says, noting that there are 14 Australian insect-based businesses. While the industry's growth is limited by the current state of consumer attitudes, Blackburn thinks that will change as people realise that dried crickets are 68 per cent protein and packed with essential micronutrients. "Everything your body needs in a tiny little package."

Solar will raise the standard of living around the world.

"Get ready for insanely cheap power as the price of renewables tumbles," says University of NSW professor Martin Green, inventor of the PERC solar cell used in about 85 per cent of the world's solar module production.

"Last year, the International Energy Agency said solar now provides the cheapest electricity ever seen, and the cost is still going down," Green says. "Australia has more rooftop solar than any other country, even not normalising for population, and the average size of the systems is going up."

Green is director of the Australian Centre for Advanced Photovoltaics, where the next generation of solar is being developed. He says more powerful home systems will charge electric cars and vice versa, with those cars providing a bank of home energy when needed. But he doesn't see each house being self-contained and off the grid.

"Storage is done most cheaply at the centralised level," he says. Green believes the revolution will happen due to economics, and raise the standard of living around the world. "Solar is the most viable way of getting a reliable electricity supply to the couple of billion people in the world who still don't have access to it."

Humans and robots will be collaborating further in the future.

Drones and other robots (or cobots, to use the term for those designed to collaborate or interact with humans) will play an ever larger role in our futures. Imagine rescue work at a collapsed building being aided by purpose-built drones, or electronic lizards capable of scaling sheer walls and slithering through tiny openings to detect survivors. The latter are currently being developed at the University of the Sunshine Coast. Or robots such as the AI-controlled drone developed by Israel's Tevel Aerobotics Technology that can identify ripe fruit and pick it, around the clock.

New Zealand's AgResearch led a three-year study into drone-based weeding, with the aim of identifying unwanted plants based on their unique chemical signatures and how they reflect light, and precisely mapping their locations using GPS. Program leader Dr Kioumars Ghamkhar has said the drone could then destroy the weeds with lasers.

The business applications of this so-called "map and zap" research are still being investigated, with more of them to be revealed this year.

It is believed phones in the future will be able to charge within a minute and last three days.

The spruikers of graphene say this one-atom-thick material is 200 times stronger than steel, harder than diamond and has extraordinary electrical conductivity. Craig Nicol, chair of the Australian Graphene Industry Association, is convinced it will change the world the way silicon did with the advent of the silicon microchip that powers mobile phones and computers. "We will likely see graphene used in electronics, filtration, ultra-sensitive sensors, lubrication and all manner of materials."

Nicol is also founder and CEO of GMG, which produces coatings that use graphene's heat transfer properties to make air conditioners run more efficiently. The Brisbane-based company is also working with the University of Queensland to bring energy-dense graphene aluminium-ion batteries to market, which they hope will one day power everything from watches to phones, and eventually cars and aircraft, while also backing up power grids. Nicol plans to debut a prototype watch-camera coin cell by the end of this year, and in a phone in 2022. He believes we'll eventually see phones that charge in less than a minute and run for three days. Others including Samsung are also working on graphene batteries, so the race is on.

The future will involve turning raw waste resources into high-value products.

The circular economy is on the way and, according to KPMG, it will add more than $200 billion and 17,000 full-time jobs to the Australian economy by 2047-48. And there's no bigger and smarter advocate of the "it's not waste, it's a resource" mantra than University of NSW Professor Veena Sahajwalla, a pioneer of micro recycling, which creates, as she puts it, "a whole new range of very sophisticated recycling solutions that really didn't exist before".

Way beyond turning aluminium cans into more aluminium cans, the future will involve turning raw waste resources such as car tyres and beer bottles into high-value products such as green steel and home furnishings. Sahajwalla says the micro factories (buildings with a handful of staff) which her team have designed, with backing from the Australian Research Council, use a range of proprietary techniques, such as thermal isolation, to unpick complex structures.

They can therefore extract manganese and zinc from dead batteries, and create filament for 3D printers from mixed plastic structures such as old laser printers. Even more impressively, they can transform fabric into ceramic tiles. "A soft material is now becoming part of a hard, durable green ceramic," Sahajwalla says. "You're combining that with waste glass and heat to create this integrated structure. That's what we do in our micro factories."

Robots will allow those in the city to visit remote communities.

Canberra-based Dentroid is working on an in-mouth robot that could allow city-bound dentists to visit remote communities. Co-founder and CEO Omar Zuaiter says the robot uses laser heads, micro cameras and other controllers to end the need for drills and needles. "They look at the tooth, analyse it and remove the decayed materials. Laser is really, really good at that." Zuaiter says as communication infrastructure improves, the system will be able to reach further into distant areas. A commercial release is hoped for in 2024.

Quantum computing is set to solve complex corporate, governmental and defence problems.

Imagine a machine that could, in almost real time, complete calculations that would take thousands of years on the fastest iMac. Commercial versions could be available this decade, with Sydney-based Silicon Quantum Computing further advanced than most.

Silicon Quantum Computing founding director Michelle Simmons says quantum computers will work by exploiting the power of quantum physics, and initially will likely solve complex corporate, governmental and defence problems such as logistics, financial analysis, software optimisation, machine learning and bioinformatics, including early disease detection and prevention.

Although few of us will use a quantum computer any time soon (you need a controlled environment for a start), the indirect results will be profound. "Radically enhanced molecular models will mean faster processes in the development of new and better drugs," says Simmons. "If you think classical computing has transformed the world, you haven't seen anything yet."


Finland’s top startups and scaleups to watch – Sifted

§ August 27th, 2021 § Filed under Quantum Computer

It's a banner year so far for investment in Finnish startups. According to Dealroom statistics, €1.2bn of VC funding has flowed into the country so far this year across 70 rounds, compared to a solid €1bn from the previous year.

This is the most VC funding that the country, known for its happiness, reindeer and saunas, has ever received.

Whilst 39 of those rounds were at pre-seed and seed stage level, there have also been a few megarounds of $100m+, notably Wolt's huge $530m raise and Aiven's $100m Series C.

Here we spotlight fourteen startups that have caught our eye because they look set for an upward trajectory.

For those looking for the big picture, read the full list of over 100 Finnish startups here.

Wolt

HQ: Helsinki

Founded: 2014

The food delivery company raised this year's biggest funding round for a Finnish company, a $530m round back in January, and the next step is likely to be an IPO.

The scaleup saw its headcount go from 700 at the beginning of 2020 to 2,200 employees at the beginning of 2021. It is now in 23 countries and 129 cities, and saw revenue triple in 2020 to $345m.

"Covid has changed our perspective on how big a business like us can be," says Miki Kuusi, Wolt's CEO and cofounder.

Wolt is expanding beyond just food delivery to groceries, electronics, flowers, clothes and more, although it has steered away from building its own dark stores, preferring to work with partners, such as Spar in Poland and Carrefour in Georgia.

Aiven

HQ: Helsinki

Founded: 2016

Founded in 2016, Aiven manages companies' open-source data infrastructure in the cloud, so that developers can focus on building applications without worrying about managing background tasks like security and maintenance. The company has some 1,000 customers, including big corporations like Comcast and Toyota, and has a workforce of around 200 people.

The company raised a $100m Series C funding round in March, giving it a valuation of around $800m and making it one of Finland's "soonicorns". Aiven says it is planning to double its headcount over the next 12 months.

Oura

HQ: Oulu

Founded: 2013

The health-tracking ring has had a blisteringly good marketing run, having won over a number of celebrity fans such as Prince Harry, cyclist Lance Armstrong and Hollywood A-lister Will Smith. It got an added boost after studies showed that the ring, which tracks biometrics like body temperature, pulse and sleep patterns, could predict the onset of Covid-19 symptoms up to three days before they showed up.

This has helped the company win big corporate clients, such as the Las Vegas Sands hotel, as well as NASCAR and Ultimate Fighting Championship.

Oura has raised a total of €140m to date, including an €85m Series C round in May.

HQ: Helsinki and Berlin

Founded: 2017

This four-year-old company has global-sized ambitions to take on the biggest US tech companies like Google, IBM and Microsoft in the field of AI-powered customer service agents.

The technology focuses specifically on customer service, which can mean anything from building chatbots to systems that can automatically respond to questions sent in via simple contact forms and short emails. It is used in the customer service centres of large companies including Finnair, Telia, Deezer and Elisa. Up to 80% of customer interactions can be automated this way, the company says.

The company raised a $20m Series A round in December, which has allowed them to grow the headcount to more than 100 staff. Although the headquarters have moved to Berlin, a substantial part of the company's development work is still done in Finland.

The next big project is a plan to expand into the US market.

Infinited Fiber

HQ: Espoo

Founded: 2016

Circular economy startup Infinited Fiber takes waste materials such as old textiles, used cardboard and even crop residues like rice and wheat straw, and uses a patented process to turn them into a textile fibre with a similar feel to cotton. In technical terms, the fibre is cellulose carbamate.

A number of fashion brands, including H&M, Patagonia and Adidas, are customers, and in July a number of these customers, notably Adidas, Bestseller and H&M, chipped into a €30m funding round for the startup.

This funding will help build a flagship factory in Finland that will turn household textile waste into a new, regenerated textile fibre, Infinna. The factory is expected to be operational in 2024 and will have the capacity to produce 30,000 metric tonnes of fibre.

Infinited Fiber is also looking to license the technology to other producers: it says any existing pulp or viscose factory can be retrofitted to produce the fibre.

IQM

HQ: Helsinki

Founded: 2018

A spin-out from Aalto University and the VTT Technical Research Centre, IQM is building quantum computers based on superconducting technology, setting itself up as a European challenger to Google, IBM and Rigetti. IQM is building Finland's first quantum computer, together with VTT, which will be operational by the end of the year. This will have just 5 qubits, far lower than the 60-70 qubit machines that Google and IBM have assembled, but IQM has plans for a 20-qubit computer by the end of next year and a 50-qubit computer by the end of 2023.

IQM has operations in Germany and recently announced the opening of a lab in Bilbao, which will focus on designing quantum software and hardware specifically to solve problems for the financial services sector. The company is one of the biggest quantum computing teams in Europe, with 50 people in Finland and a further 20 in Europe.

IQM has raised some €71m in funding to date, including a €39m Series A round at the end of 2020.

Fiksu Ruoka

HQ: Helsinki

Founded: 2016

This five-year-old food waste startup has seen its revenues grow threefold during the pandemic, from €3.7m to €12m, as consumers embraced the internet ordering of food. The startup, whose name translates as "smart food", sells food that is close to its sell-by date and about to become food waste, offering it to customers at heavily discounted prices. Unlike some of the food waste startups that rely on customers going to pick up waste food from restaurants and shops, Fiksu Ruoka offers home delivery.

In addition to food, Fiksu has also started stocking homewares and clothing. The business raised a €19m VC round in May.

Upright

HQ: Helsinki

Founded: 2017

Upright is building a new type of quantification model to calculate the net impact of companies on the environment, on the health of people and on society as a whole. It uses a neural network to assess the entire value chain surrounding a business. It is intended as a tool to help investors and consumers to make more informed decisions about the companies they back.

In June, Upright signed a partnership with Nasdaq, which will enable investors to get Upright data easily through the Nasdaq API and combine these with financial data. This is handy for investors building portfolios with impact goals.

Upright has taken a tiny amount of seed funding but mostly finances its operations with revenue from clients. Its ultimate aim is to make enough from the sale of its investor and corporate tools so that it can give the impact data to consumers and employees for free.

Yousician

HQ: Helsinki

Founded: 2010

Neither Chris Thür nor his cofounder Mikko Kaipainen were musicians or music teachers. They were just two techies who were keen to learn to play musical instruments, and that was the whole point of Yousician: a mobile app that can teach beginners how to play guitar, piano, ukulele, bass or to sing. The company started as a service focused on children's music lessons but later pivoted to a less age-specific focus.

Users can get one free lesson a day, but can pay for a premium subscription to get more lessons and access to a bigger library of songs. The company has seen strong user growth during the pandemic as many people became interested in taking up an instrument while stuck at home.

Monthly users grew from 14.5m to 20m, while subscriptions increased by 80%. The company reported revenue of $50m last year and in April, Yousician raised a €24m Series B funding round.

Aiforia

HQ: Helsinki

Founded: 2013

Aiforia is developing cloud-based deep learning software to help scientists and clinicians with image analysis. The technology can increase the speed and precision with which medical images can be analysed in fields ranging from oncology to neuroscience. The company is planning, for example, to launch tools for breast and lung cancer diagnosis later this year.

Aiforia has some 3,000 users in 50 countries and raised a €25.2m Series B funding round in June.

Flowhaven

HQ: Helsinki

Founded: 2016

Flowhaven aims to streamline the way companies manage their licensing partnerships. This is a huge market but it's still largely done manually through emails and clunky spreadsheets.

It's a problem that Flowhaven founder and CEO Kalle Törmä experienced first-hand when he worked on licensing at Rovio, the Finnish company behind the Angry Birds game. Törmä left his job to create a solution to this.

Flowhaven now has more than 100 customers using its system, including names like Nintendo and Games Workshop. The company raised a $16m Series A funding round in January and at the time reported 400% year-on-year growth. It's aiming to increase headcount to close to 100 by the end of the year.

Volare

HQ: Uusimaa

Founded: 2021

One of the newest arrivals on the Finnish start-up scene, this spinout from the VTT Technical Research Centre is focused on farming black soldier flies to create animal feed, pet food and ingredients for cosmetics. Cofounders Matti Tähtinen and Tuure Parviainen met while working on a black soldier fly farming project at VTT, while COO Jarna Hyvönen has a background in managing circular economy projects. Their idea is to take agricultural waste products and byproducts from breweries and mills and turn them into high-value, usable protein, creating a circular economy for these parts of the food industry.

Volare has so far raised a €700k seed round from Maki.vc, allowing them to build a pilot facility for the black soldier fly breeding. They have plans for a first commercial-scale facility to be ready by 2023. The focus is currently on fish feed and pet food, but Volare's intention is to also produce products for people too, once European regulations allow for black soldier fly-based protein to be used for human consumption. The EU has already ruled certain types of mealworm safe to eat.

Volare's biggest competitor is Dutch company Protix, which has also focused on farming the black soldier fly. French startup Ynsect, which raised a €304m funding round last year, focuses on mealworms.

Mealworms may be further along the food approval route, but CTO Matti Tähtinen says in the long run black soldier flies are a better proposition for the circular economy as they can be fed a far wider range of foods.

Pixieray

HQ: Espoo

Founded: 2021

Another new startup, only recently out of stealth mode, Pixieray is making active glasses that sense what the user is looking at and adjust to give the perfect focus at all times. The principle is similar to the way a mobile phone camera automatically adjusts to the focal point of the shot, and it would mean an end to people having to use varifocal lenses or switch glasses between different activities.

Other companies have tried and failed at the active glasses challenge in the past, but CEO Niko Eiden and CTO Klaus Melakari come from Varjo, the VR glasses company, and have a strong background in eyewear. One of the biggest challenges has been getting the technology and the batteries small enough to fit into the frame of a normal-sized pair of glasses, but miniaturisation is now reaching the point at which this is becoming possible.

The company so far has just a prototype but expects to start shipping a commercial product in 2023. Pixieray raised a €3.74m seed round from investors including Maki.vc in June.

Glue

HQ: Helsinki

Founded: 2018

The company began as an award-winning XR and gaming studio 17 years ago, but morphed into Glue in 2017, focusing on building VR remote collaboration tools for businesses.

Up to 30 people wearing VR headsets can work together in a virtual collaboration space, appearing as head-and-arms avatars and able to work together on documents, share presentations and videos as well as breaking out into smaller groups.

The company raised a €3.5m seed round in 2019 and hasn't raised since. However, it is now getting income from genuine paying customers. Some 100-150 big corporations, including Deutsche Telekom and Axel Springer, are using the system, although many of these relationships are still at the pilot stage.

Maija Palmer is Sifted's innovation editor. She covers deeptech and corporate innovation, and tweets from @maijapalmer.

What is quantum computing? Everything you need to know about the strange world of quantum computers – ZDNet

§ July 30th, 2021 § Filed under Quantum Computer

Quantum computing exploits the puzzling behavior that scientists have been observing for decades in nature's smallest particles (think atoms, photons or electrons). At this scale, the classical laws of physics cease to apply, and instead we shift to quantum rules.

While researchers don't understand everything about the quantum world, what they do know is that quantum particles hold immense potential, in particular to hold and process large amounts of information. Successfully bringing those particles under control in a quantum computer could trigger an explosion of compute power that would phenomenally advance innovation in many fields that require complex calculations, like drug discovery, climate modelling, financial optimization or logistics.

As Bob Sutor, chief quantum exponent at IBM, puts it: "Quantum computing is our way of emulating nature to solve extraordinarily difficult problems and make them tractable," he tells ZDNet.

Quantum computers come in various shapes and forms, but they are all built on the same principle: they host a quantum processor where quantum particles can be isolated for engineers to manipulate.

The nature of those quantum particles, as well as the method employed to control them, varies from one quantum computing approach to another. Some methods require the processor to be cooled down to freezing temperatures; others play with quantum particles using lasers. But all share the goal of finding out how to best exploit the value of quantum physics.

The systems we have been using since the 1940s in various shapes and forms (laptops, smartphones, cloud servers, supercomputers) are known as classical computers. Those are based on bits, a unit of information that powers every computation that happens in the device.

In a classical computer, each bit can take on either a value of one or zero to represent and transmit the information that is used to carry out computations. Using bits, developers can write programs, which are sets of instructions that are read and executed by the computer.

Classical computers have been indispensable tools in the last few decades, but the inflexibility of bits is limiting. As an analogy, if tasked with looking for a needle in a haystack, a classical computer would have to be programmed to look through every single piece of hay straw until it reached the needle.

There are still many large problems, therefore, that classical devices can't solve. "There are calculations that could be done on a classical system, but they might take millions of years or use more computer memory than exists in total on Earth," says Sutor. "These problems are intractable today."

At the heart of any quantum computer are qubits, also known as quantum bits, and which can loosely be compared to the bits that process information in classical computers.

Qubits, however, have very different properties to bits, because they are made of the quantum particles found in nature those same particles that have been obsessing scientists for many years.

One of the properties of quantum particles that is most useful for quantum computing is known as superposition, which allows quantum particles to exist in several states at the same time. The best way to imagine superposition is to compare it to tossing a coin: instead of being heads or tails, quantum particles are the coin while it is still spinning.

By controlling quantum particles, researchers can load them with data to create qubits, and thanks to superposition, a single qubit doesn't have to be either a one or a zero, but can be both at the same time. In other words, while a classical bit can only be heads or tails, a qubit can be, at once, heads and tails.

This means that, when asked to solve a problem, a quantum computer can use qubits to run several calculations at once to find an answer, exploring many different avenues in parallel.

So in the needle-in-a-haystack scenario above, unlike a classical machine, a quantum computer could in principle browse through all hay straws at the same time, finding the needle in a matter of seconds rather than looking for years, even centuries, before it found what it was searching for.

What's more: qubits can be physically linked together thanks to another quantum property called entanglement, meaning that with every qubit that is added to a system, the device's capabilities increase exponentially, whereas adding more bits only generates linear improvement.

Every time we use another qubit in a quantum computer, we double the amount of information and processing ability available for solving problems. So by the time we get to 275 qubits, we can compute with more pieces of information than there are atoms in the observable universe. And the compression of computing time that this could generate could have big implications in many use cases.
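
That claim is easy to check with a few lines of arithmetic (a sketch; the figure of roughly 10^80 atoms in the observable universe is the commonly cited estimate, not a number from the article):

    # Each additional qubit doubles the number of amplitudes needed to describe the state
    n_qubits = 275
    amplitudes = 2 ** n_qubits
    atoms_in_observable_universe = 10 ** 80   # common order-of-magnitude estimate

    print(amplitudes > atoms_in_observable_universe)  # True
    print(len(str(amplitudes)))                       # 83 digits: roughly 6 x 10^82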

"There are a number of cases where time is money. Being able to do things more quickly will have a material impact in business," Scott Buchholz, managing director at Deloitte Consulting, tells ZDNet.

The gains in time that researchers are anticipating as a result of quantum computing are not of the order of hours or even days. We're rather talking about potentially being capable of calculating, in just a few minutes, the answer to problems that today's most powerful supercomputers couldn't resolve in thousands of years, ranging from modelling hurricanes all the way to cracking the cryptography keys protecting the most sensitive government secrets.

And businesses have a lot to gain, too. According to recent research by Boston Consulting Group (BCG), the advances that quantum computing will enable could create value of up to $850 billion in the next 15 to 30 years, $5 to $10 billion of which will be generated in the next five years if key vendors deliver on the technology as they have promised.

Programmers write problems in the form of algorithms for classical computers to resolve, and similarly, quantum computers will carry out calculations based on quantum algorithms. Researchers have already identified that some quantum algorithms would be particularly suited to the enhanced capabilities of quantum computers.

For example, quantum systems could tackle optimization algorithms, which help identify the best solution among many feasible options, and could be applied in a wide range of scenarios ranging from supply chain administration to traffic management. ExxonMobil and IBM, for instance, are working together to find quantum algorithms that could one day manage the 50,000 merchant ships crossing the oceans each day to deliver goods, to reduce the distance and time traveled by fleets.

Quantum simulation algorithms are also expected to deliver unprecedented results, as qubits enable researchers to handle the simulation and prediction of complex interactions between molecules in larger systems, which could lead to faster breakthroughs in fields like materials science and drug discovery.

With quantum computers capable of handling and processing much larger datasets, AI and machine learning applications are set to benefit hugely, with faster training times and more capable algorithms. And researchers have also demonstrated that quantum algorithms have the potential to crack traditional cryptography keys, which for now are too mathematically difficult for classical computers to break.

To create qubits, which are the building blocks of quantum computers, scientists have to find and manipulate the smallest particles of nature, tiny parts of the universe that can be isolated and controlled in a number of different physical media. This is why there are currently many types of quantum processors being developed by a range of companies.

One of the most advanced approaches consists of using superconducting qubits, which are made of electrons, and come in the form of the familiar chandelier-like quantum computers. Both IBM and Google have developed superconducting processors.

Another approach that is gaining momentum is trapped ions, which Honeywell and IonQ are leading the way on, and in which qubits are housed in arrays of ions that are trapped in electric fields and then controlled with lasers.

Major companies like Xanadu and PsiQuantum, for their part, are investing in yet another method that relies on quantum particles of light, called photons, to encode data and create qubits. Qubits can also be created out of silicon spin qubits (which Intel is focusing on), as well as cold atoms or even diamonds.

Quantum annealing, an approach that was chosen by D-Wave, is a different category of computing altogether. It doesn't rely on the same paradigm as other quantum processors, known as the gate model. Quantum annealing processors are much easier to control and operate, which is why D-Wave has already developed devices that can manipulate thousands of qubits, where virtually every other quantum hardware company is working with about 100 qubits or less. On the other hand, the annealing approach is only suitable for a specific set of optimization problems, which limits its capabilities.
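
To make "a specific set of optimization problems" concrete: annealers expect problems phrased as a QUBO (quadratic unconstrained binary optimization), i.e. minimizing x^T Q x over vectors of 0s and 1s. Here is a minimal sketch that brute-forces a toy three-variable QUBO classically; the matrix is invented for illustration, and an annealer would be handed the same Q to minimize in hardware:

    import itertools
    import numpy as np

    # Toy QUBO matrix: diagonal entries bias individual bits,
    # off-diagonal entries penalise turning pairs of bits on together.
    Q = np.array([[-1.0,  2.0,  0.0],
                  [ 0.0, -1.0,  2.0],
                  [ 0.0,  0.0, -1.0]])

    def energy(bits):
        x = np.array(bits)
        return x @ Q @ x

    best = min(itertools.product([0, 1], repeat=3), key=energy)
    print(best, energy(best))   # (1, 0, 1) -2.0: the lowest-energy assignment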

What can you do with a quantum computer today?

Right now, with a mere 100 qubits being the state of the art, there is very little that can actually be done with quantum computers. For qubits to start carrying out meaningful calculations, they will have to be counted in the thousands, and even millions.

"While there is a tremendous amount of promise and excitement about what quantum computers can do one day, I think what they can do today is relatively underwhelming," says Buchholz.

Increasing the qubit count in gate-model processors, however, is incredibly challenging. This is because keeping the particles that make up qubits in their quantum state is difficult: a little bit like trying to keep a coin spinning without letting it fall on one side or the other, except much harder.

Keeping qubits spinning requires isolating them from any environmental disturbance that might cause them to lose their quantum state. Google and IBM, for example, do this by placing their superconducting processors in temperatures that are colder than outer space, which in turn require sophisticated cryogenic technologies that are currently near-impossible to scale up.

In addition, the instability of qubits means that they are unreliable, and still likely to cause computation errors. This has given rise to a branch of quantum computing dedicated to developing error-correction methods.

Although research is advancing at pace, therefore, quantum computers are for now stuck in what is known as the NISQ era: noisy, intermediate-scale quantum computing. The end-goal, though, is to build a fault-tolerant, universal quantum computer.

As Buchholz explains, it is hard to tell when this is likely to happen. "I would guess we are a handful of years from production use cases, but the real challenge is that this is a little like trying to predict research breakthroughs," he says. "It's hard to put a timeline on genius."

In 2019, Google claimed that its 54-qubit superconducting processor called Sycamore had achieved quantum supremacy: the point at which a quantum computer can solve a computational task that is impossible to run on a classical device in any realistic amount of time.

Google said that Sycamore had calculated, in only 200 seconds, the answer to a problem that would have taken the world's biggest supercomputers 10,000 years to complete.
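
A quick back-of-the-envelope calculation (ours, not the article's) shows the scale of the claimed speed-up:

    # 10,000 years expressed in seconds, versus Sycamore's 200 seconds
    speedup = 10_000 * 365.25 * 24 * 3600 / 200
    print(f"{speedup:.1e}")   # ~1.6e9: roughly a billion-fold speed-up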

More recently, researchers from the University of Science and Technology of China claimed a similar breakthrough, saying that their quantum processor had taken 200 seconds to achieve a task that would have taken 600 million years to complete with classical devices.

This is far from saying that either of those quantum computers is now capable of outstripping any classical computer at any task. In both cases, the devices were programmed to run very specific problems, with little usefulness aside from proving that they could compute the task significantly faster than classical systems.

Without a higher qubit count and better error correction, proving quantum supremacy for useful problems is still some way off.

Organizations that are investing in quantum resources see this as the preparation stage: their scientists are doing the groundwork to be ready for the day that a universal and fault-tolerant quantum computer is ready.

In practice, this means that they are trying to discover the quantum algorithms that are most likely to show an advantage over classical algorithms once they can be run on large-scale quantum systems. To do so, researchers typically try to prove that quantum algorithms perform comparably to classical ones on very small use cases, and theorize that as quantum hardware improves, and the size of the problem can be grown, the quantum approach will inevitably show some significant speed-ups.

For example, scientists at Japanese steel manufacturer Nippon Steel recently came up with a quantum optimization algorithm that could compete against its classical counterpart for a small problem that was run on a 10-qubit quantum computer. In principle, this means that the same algorithm equipped with thousands or millions of error-corrected qubits could eventually optimize the company's entire supply chain, complete with the management of dozens of raw materials, processes and tight deadlines, generating huge cost savings.

The work that quantum scientists are carrying out for businesses is therefore highly experimental, and so far there are fewer than 100 quantum algorithms that have been shown to compete against their classical equivalents which only points to how emergent the field still is.

With most use cases requiring a fully error-corrected quantum computer, just who will deliver one first is the question on everyone's lips in the quantum industry, and it is impossible to know the exact answer.

All quantum hardware companies are keen to stress that their approach will be the first one to crack the quantum revolution, making it even harder to discern noise from reality. "The challenge at the moment is that it's like looking at a group of toddlers in a playground and trying to figure out which one of them is going to win the Nobel Prize," says Buchholz.

"I have seen the smartest people in the field say they're not really sure which one of these is the right answer. There are more than half a dozen different competing technologies and it's still not clear which one will wind up being the best, or if there will be a best one," he continues.

In general, experts agree that the technology will not reach its full potential until after 2030. The next five years, however, may start bringing some early use cases as error correction improves and qubit counts start reaching numbers that allow for small problems to be programmed.

IBM is one of the rare companies that has committed to a specific quantum roadmap, which defines the ultimate objective of realizing a million-qubit quantum computer. In the nearer-term, Big Blue anticipates that it will release a 1,121-qubit system in 2023, which might mark the start of the first experimentations with real-world use cases.

Developing quantum hardware is a huge part of the challenge, and arguably the most significant bottleneck in the ecosystem. But even a universal fault-tolerant quantum computer would be of little use without the matching quantum software.

"Of course, none of these online facilities are much use without knowing how to 'speak' quantum," Andrew Fearnside, senior associate specializing in quantum technologies at intellectual property firm Mewburn Ellis, tells ZDNet.

Creating quantum algorithms is not as easy as taking a classical algorithm and adapting it to the quantum world. Quantum computing, rather, requires a brand-new programming paradigm that can only be run on a brand-new software stack.

Of course, some hardware providers also develop software tools, the most established of which is IBM's open-source quantum software development kit Qiskit. But on top of that, the quantum ecosystem is expanding to include companies dedicated exclusively to creating quantum software. Familiar names include Zapata, QC Ware and 1QBit, which all specialize in providing businesses with the tools to understand the language of quantum.
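
For a flavour of what that language looks like in practice, here is a minimal sketch using Qiskit to build and inspect the canonical two-qubit entangled (Bell) state; it assumes a recent Qiskit installation with the qiskit.quantum_info module available:

    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    # Hadamard then CNOT: the textbook recipe for a Bell state
    qc = QuantumCircuit(2)
    qc.h(0)       # put qubit 0 into superposition
    qc.cx(0, 1)   # entangle qubit 1 with qubit 0

    state = Statevector.from_instruction(qc)
    print(state.probabilities())   # [0.5 0. 0. 0.5]: only 00 and 11 ever appear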

And increasingly, promising partnerships are forming to bring together different parts of the ecosystem. For example, the recent alliance between Honeywell, which is building trapped-ion quantum computers, and quantum software company Cambridge Quantum Computing (CQC), has got analysts predicting that a new player could be taking a lead in the quantum race.

The complexity of building a quantum computer (think ultra-high vacuum chambers, cryogenic control systems and other exotic quantum instruments) means that the vast majority of quantum systems are currently firmly sitting in lab environments, rather than being sent out to customers' data centers.

To let users access the devices to start running their experiments, therefore, quantum companies have launched commercial quantum computing cloud services, making the technology accessible to a wider range of customers.

The four largest providers of public cloud computing services currently offer access to quantum computers on their platforms. IBM and Google have both put their own quantum processors on the cloud, while Microsoft's Azure Quantum and AWS's Braket service let customers access computers from third-party quantum hardware providers.

The jury remains out on which technology will win the race, if any at all, but one thing is for certain: the quantum computing industry is developing fast, and investors are generously funding the ecosystem. Equity investments in quantum computing nearly tripled in 2020, and according to BCG, they are set to rise even more in 2021 to reach $800 million.

Government investment is even more significant: the US has unlocked $1.2 billion for quantum information science over the next five years, while the EU announced a €1 billion ($1.20 billion) quantum flagship. The UK also recently reached the £1 billion ($1.37 billion) budget milestone for quantum technologies, and while official numbers are not known in China, the government has made no secret of its desire to aggressively compete in the quantum race.

This has caused the quantum ecosystem to flourish over the past few years, with the number of start-ups increasing from a handful in 2013 to nearly 200 in 2020. The appeal of quantum computing is also increasing among potential customers: according to analysis firm Gartner, while only 1% of companies were budgeting for quantum in 2018, 20% are expected to do so by 2023.

Although not all businesses need to be preparing themselves to keep up with quantum-ready competitors, there are some industries where quantum algorithms are expected to generate huge value, and where leading companies are already getting ready.

Goldman Sachs and JP Morgan are two examples of financial behemoths investing in quantum computing. That's because in banking, quantum optimization algorithms could give a boost to portfolio optimization, by better picking which stocks to buy and sell for maximum return.

In pharmaceuticals, where the drug discovery process is on average a $2 billion, ten-year-long undertaking that largely relies on trial and error, quantum simulation algorithms are also expected to make waves. This is also the case in materials science: companies like OTI Lumionics, for example, are exploring the use of quantum computers to design more efficient OLED displays.

Leading automotive companies including Volkswagen and BMW are also keeping a close eye on the technology, which could impact the sector in various ways, ranging from designing more efficient batteries to optimizing the supply chain, through to better management of traffic and mobility. Volkswagen, for example, pioneered the use of a quantum algorithm that optimized bus routes in real time by dodging traffic bottlenecks.

As the technology matures, however, it is unlikely that quantum computing will be limited to a select few. Rather, analysts anticipate that virtually all industries have the potential to benefit from the computational speedup that qubits will unlock.

Quantum computers are expected to be phenomenal at solving a certain class of problems, but that doesn't mean they will be a better tool than classical computers for every single application. In particular, quantum systems aren't a good fit for fundamental computations like arithmetic, or for executing commands.

"Quantum computers are great constraint optimizers, but that's not what you need to run Microsoft Excel or Office," says Buchholz. "That's what classical technology is for: for doing lots of maths, calculations and sequential operations."

In other words, there will always be a place for the way that we compute today. It is unlikely, for example, that you will be streaming a Netflix series on a quantum computer anytime soon. Rather, the two technologies will be used in conjunction, with quantum computers being called for only where they can dramatically accelerate a specific calculation.

Buchholz predicts that, as classical and quantum computing start working alongside each other, access will look like a configuration option. Data scientists currently have a choice of using CPUs or GPUs when running their workloads, and it might be that quantum processing units (QPUs) join the list at some point. It will be up to researchers to decide which configuration to choose, based on the nature of their computation.
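
In code, that configuration option might look something like the hypothetical sketch below. Everything here (the device names, the solver functions, the dispatch table) is invented for illustration; no real quantum cloud exposes exactly this interface.

# A hypothetical dispatch layer: the workload declares a target device,
# just as data scientists pick a CPU or GPU today. All names are invented.
def solve_on_cpu(problem):
    return f"classical solution to {problem!r}"

def solve_on_qpu(problem):
    return f"quantum-accelerated solution to {problem!r}"

SOLVERS = {"cpu": solve_on_cpu, "gpu": solve_on_cpu, "qpu": solve_on_qpu}

def run(problem, device="cpu"):
    # Only calculations that benefit from quantum speedup get routed to a QPU.
    return SOLVERS[device](problem)

print(run("portfolio optimization", device="qpu"))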

Although the precise way that users will access quantum computing in the future remains to be defined, one thing is certain: they are unlikely to be required to understand the fundamental laws of quantum computing in order to use the technology.

"People get confused because the way we lead into quantum computing is by talking about technical details," says Buchholz. "But you don't need to understand how your cellphone works to use it."

"People sometimes forget that when you log into a server somewhere, you have no idea what physical location the server is in or even if it exists physically at all anymore. The important question really becomes what it is going to look like to access it."

And as fascinating as qubits, superposition, entanglement and other quantum phenomena might be, for most of us this will come as welcome news.

View post:

What is quantum computing? Everything you need to know about the strange world of quantum computers - ZDNet

Read the Rest...

IBM's newest quantum computer is now up-and-running: Here's what it's going to be used for – ZDNet

§ July 30th, 2021 § Filed under Quantum Computer Comments Off on IBM's newest quantum computer is now up-and-running: Here's what it's going to be used for – ZDNet

IBM has unveiled a brand-new quantum computer in Japan, thousands of miles away from the company's quantum computation center in Poughkeepsie, New York, in another step towards bringing quantum technologies out of Big Blue's labs and directly to partners around the world.

A Quantum System One, IBM's flagship integrated superconducting quantum computer, is now available on-premises in the Kawasaki Business Incubation Center in Kawasaki City, for Japanese researchers to run their quantum experiments in fields ranging from chemistry to finance.

Most customers to date can only access IBM's System One over the cloud, by connecting to the company's quantum computation center in Poughkeepsie.

Recently, the company unveiled the very first quantum computer that was physically built outside of the computation center's data centers, when the Fraunhofer Institute in Germany acquired a System One. The system that has now been deployed to Japan is therefore IBM's second quantum computer located outside of the US.

The announcement comes as part of a long-standing relationship with Japanese organizations. In 2019, IBM and the University of Tokyo inaugurated the Japan-IBM Quantum Partnership, a national agreement inviting universities and businesses across the country to engage in quantum research. It was agreed then that a Quantum System One would eventually be installed at an IBM facility in Japan.

Building on the partnership, Big Blue and the University of Tokyo launched the Quantum Innovation Initiative Consortium last year to further bring together organizations working in the field of quantum. With this, the Japanese government has made it clear that it is keen to be at the forefront of the promising developments that quantum technologies are expected to bring about.

Leveraging some physical properties that are specific to quantum mechanics, quantum computers could one day be capable of carrying out calculations that are impossible to run on the devices that are used today, known as classical computers.

In some industries, this could have big implications, and as part of the consortium, together with IBM researchers, some Japanese companies have already identified promising use cases. Mitsubishi Chemical's research team, for example, has developed quantum algorithms capable of understanding the complex behavior of industrial chemical compounds, with the goal of improving OLED displays.

A recent research paper published by the scientists highlighted the potential of quantum computers when it comes to predicting the properties of OLED materials, which could eventually lead to more efficient displays with lower power consumption.

Similarly, researchers from Mizuho Financial Group and Mitsubishi Financial Group have been developing quantum algorithms that could speed up financial operations like Monte Carlo simulations, which could allow for optimized portfolio management thanks to better risk analysis and option pricing.

With access to IBM's Quantum System One, research in those fields is now expected to accelerate. Other industry leaders exploring quantum technologies as part of the partnership range from Sony and Toyota to Hitachi, Toshiba and JSR.

Quantum computing is still in its very early stages, and it is not yet possible to use quantum computers to perform computations that are of any value to a business. Rather, scientists are currently carrying out proofs-of-concept, by attempting to identify promising applications and testing them at a very small scale, to be prepared for the moment that the hardware is fully ready.

This is still some way off. Building and controlling the components of quantum computers is a huge challenge, which has so far been limited to the confines of specialist laboratories such as IBM's Poughkeepsie computation center.

It is significant, therefore, that IBM's Quantum System One is now mature enough to be deployed outside of the company's lab.

"Thousands of meticulously engineered components have to work together flawlessly in extreme temperatures within astonishing tolerances," said IBM in a blog post.

Back in the US, too, quantum customers are showing interest in building quantum hardware in their own facilities. The Cleveland Clinic, for example, recently invested $500 million for Big Blue to build quantum hardware on-premises.

Continued here:

IBM's newest quantum computer is now up-and-running: Here's what it's going to be used for - ZDNet

Read the Rest...

Quantum Cash and the End of Counterfeiting – IEEE Spectrum

§ July 30th, 2021 § Filed under Quantum Computer Comments Off on Quantum Cash and the End of Counterfeiting – IEEE Spectrum

Since the invention of paper money, counterfeiters have churned out fake bills. Some of their handiwork, created with high-tech inks, papers, and printing presses, is so good that it's very difficult to distinguish from the real thing. National banks combat the counterfeiters with difficult-to-copy watermarks, holograms, and other sophisticated measures. But to give money the ultimate protection, some quantum physicists are turning to the weird quirks that govern nature's fundamental particles.

At the moment, the idea of quantum money is very much on the drawing board. That hasn't stopped researchers from pondering what encryption schemes they might apply for it, or from wondering how the technologies used to create quantum states could be shrunk down to the point of fitting in your wallet, says Scott Aaronson, an MIT computer scientist who works on quantum money. "This is science fiction, but it's science fiction that doesn't violate any of the known laws of physics."

The laws that govern subatomic particles differ dramatically from those governing everyday experience. The relevant quantum law here is the no-cloning theorem, which says it is impossible to copy a quantum particle's state exactly. That's because reproducing a particle's state involves making measurements, and the measurements change the particle's overall properties. In certain cases, where you already know something about the state in question, quantum mechanics does allow you to measure one attribute of a particle. But in doing so you've made it impossible to measure the particle's other attributes.

This rule implies that if you use money that is somehow linked to a quantum particle, you could, in principle, make it impossible to copy: It would be counterfeit-proof.

The visionary physicist Stephen Wiesner came up with the idea of quantum money in 1969. He suggested that banks somehow insert a hundred or so photons, the quantum particles of light, into each banknote. He didn't have any clear idea of how to do that, nor do physicists today, but never mind. It's still an intriguing notion, because the issuing bank could then create a kind of minuscule secret watermark by polarizing the photons in a special way.

To validate the note later, the bank would check just one attribute of each photon (for example, its vertical or horizontal polarization), leaving all other attributes unmeasured. The bank could then verify the note's authenticity by checking its records for how the photons were set originally for this particular bill, which the bank could look up using the bill's printed serial number.

Thanks to the no-cloning theorem, a counterfeiter couldn't measure all the attributes of each photon to produce a copy. Nor could he just measure the one attribute that mattered for each photon, because only the bank would know which attributes those were.
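
The arithmetic behind that protection is easy to simulate. The toy model below makes idealized assumptions (perfect photons, two bases, no noise), and the photon count and helper names are ours. A forger who must guess each basis passes any single photon check with probability 3/4, so a 100-photon note survives verification with probability (3/4)^100, roughly 3 in 10 trillion.

# A toy simulation of Wiesner's scheme (idealized: perfect photons, no
# noise; the photon count and function names are invented for illustration).
import random

N = 100  # photons per banknote

def mint():
    # The bank secretly records each photon's basis and bit, keyed to the
    # bill's serial number.
    bases = [random.choice("+x") for _ in range(N)]  # '+' rectilinear, 'x' diagonal
    bits = [random.randint(0, 1) for _ in range(N)]
    return bases, bits

def measure(prep_basis, prep_bit, meas_basis):
    # Measuring in the preparation basis recovers the bit; measuring in the
    # wrong basis yields a random result and disturbs the photon.
    return prep_bit if meas_basis == prep_basis else random.randint(0, 1)

def counterfeit(bases, bits):
    # A forger must guess each basis; wrong guesses randomize the bit read.
    guesses = [random.choice("+x") for _ in range(N)]
    read = [measure(b, v, g) for b, v, g in zip(bases, bits, guesses)]
    return guesses, read

bases, bits = mint()
fake_bases, fake_bits = counterfeit(bases, bits)
# The bank re-measures the fake in the original bases: each photon passes
# with probability 3/4, so the whole note almost certainly fails.
passed = sum(measure(fb, fv, b) == v
             for fb, fv, b, v in zip(fake_bases, fake_bits, bases, bits))
print(f"fake note passes {passed} of {N} photon checks")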

But beyond the daunting engineering challenge of storing photons, or any other quantum particles, there's another basic problem with this scheme: It's a private encryption. Only the issuing bank could validate the notes. "The ideal is quantum money that anyone can verify," Aaronson says, just the way every store clerk in the United States can hold a $20 bill up to the light to look for the embedded plastic strip.

That would require some form of public encryption, and every such scheme researchers have created so far is potentially crackable. But it's still worth exploring how that might work. Verification between two people would involve some kind of black box: a machine that checks the status of a piece of quantum money and spits out only the answer "valid" or "invalid." Most of the proposed public-verification schemes are built on some sort of mathematical relationship between a bank note's quantum states and its serial number, so the verification machine would use an algorithm to check the math. This verifier, and the algorithm it follows, must be designed so that even if they were to fall into the hands of a counterfeiter, he couldn't use them to create fakes.

As fast as quantum money researchers have proposed encryption schemes, their colleagues have cracked them, but it's clear that everyone's having a great deal of fun. Most recently, Aaronson and his MIT collaborator Paul Christiano put forth a proposal [PDF] in which each banknote's serial number is linked to a large number of quantum particles, which are bound together using a quantum trick known as entanglement.

All of this is pie in the sky, of course, until engineers can create physical systems capable of retaining quantum states within money, and that will perhaps be the biggest challenge of all. Running a quantum economy would require people to hold information encoded in the polarization of photons or the spin of electrons, say, for as long as they required cash to sit in their pockets. But quantum states are notoriously fragile: They decohere and lose their quantum properties after frustratingly short intervals of time. "You'd have to prevent it from decohering in your wallet," Aaronson says.

For many researchers, that makes quantum money even more remote than useful quantum computers. "At present, it's hard to imagine having practical quantum money before having a large-scale quantum computer," says Michele Mosca of the Institute for Quantum Computing at the University of Waterloo, in Canada. And these superfast computers must also overcome the decoherence problem before they become feasible.

If engineers ever do succeed in building practical quantum computers, ones that can send information through fiber-optic networks in the form of encoded photons, quantum money might really have its day. On this quantum Internet, financial transactions would not only be secure, they would be so ephemeral that once the photons had been measured, there would be no trace of their existence. In today's age of digital cash, we have already relieved ourselves of the age-old burden of carrying around heavy metal coins or even wads of banknotes. With quantum money, our pockets and purses might finally be truly empty.

Michael Brooks, a British science journalist, holds a Ph.D. in quantum physics from the University of Sussex, which prepared him well to tackle the article "Quantum Cash and the End of Counterfeiting." He says he found the topic of quantum money "absolutely fascinating," and adds, "I just hope I get to use some in my lifetime." He is the author, most recently, of Free Radicals: The Secret Anarchy of Science (Profile Books, 2011).

Read the original:

Quantum Cash and the End of Counterfeiting - IEEE Spectrum

Read the Rest...

Will the NSA Finally Build Its Superconducting Spy Computer? – IEEE Spectrum

§ July 30th, 2021 § Filed under Quantum Computer Comments Off on Will the NSA Finally Build Its Superconducting Spy Computer? – IEEE Spectrum

Today, silicon microchips underlie every aspect of digital computing. But their dominance was never a foregone conclusion. Throughout the 1950s, electrical engineers and other researchers explored many alternatives to making digital computers.

One of them seized the imagination of the U.S. National Security Agency (NSA): a superconducting supercomputer. Such a machine would take advantage of superconducting materials that, when chilled to nearly the temperature of deep space, just a few degrees above absolute zero, exhibit no electrical resistance whatsoever. This extraordinary property held the promise of computers that could crunch numbers and crack codes faster than transistor-based systems while consuming far less power.

For six decades, from the mid-1950s to the present, the NSA has repeatedly pursued this dream, in partnership with industrial and academic researchers. Time and again, the agency sponsored significant projects to build a superconducting computer. Each time, the effort was abandoned in the face of the unrelenting pace of Moore's Law and the astonishing increase in performance and decrease in cost of silicon microchips.

Now Moore's Law is stuttering, and the world's supercomputer builders are confronting an energy crisis. Nuclear weapon simulators, cryptographers, and others want exascale supercomputers, capable of 1,000 petaflops (1 million trillion floating-point operations per second) or greater. The world's fastest known supercomputer today, China's 34-petaflop Tianhe-2, consumes some 18 megawatts of power. That's roughly the amount of electricity drawn instantaneously by 14,000 average U.S. households. Projections vary depending on the type of computer architecture used, but an exascale machine built with today's best silicon microchips could require hundreds of megawatts.

The exascale push may be superconducting computing's opening. And the Intelligence Advanced Research Projects Activity, the U.S. intelligence community's arm for high-risk R&D, is making the most of it. With new forms of superconducting logic and memory in development, IARPA has launched an ambitious program to create the fundamental building blocks of a superconducting supercomputer. In the next few years, the effort could finally show whether the technology really can beat silicon when given the chance.

Cold Calling: In the 1950s, Dudley Buck envisioned speedy, energy-efficient computers. These would be driven by his superconducting switch, the cryotron. Photo: Gjon Mili/The LIFE Picture Collection/Getty Images

The NSA's dream of superconducting supercomputers was first inspired by the electrical engineer Dudley Buck. Buck worked for the agency's immediate predecessor on an early digital computer. When he moved to MIT in 1950, he remained a military consultant, keeping the Armed Forces Security Agency, which quickly became the NSA, abreast of new computing developments in Cambridge.

Buck soon reported on his own work: a novel superconducting switch he named the cryotron. The device works by switching a material between its superconducting state, where electrons couple up and flow as a supercurrent with no resistance at all, and its normal state, where electrons flow with some resistance. A number of superconducting metallic elements and alloys reach that state when they are cooled below a critical temperature near absolute zero. Once the material becomes superconducting, a sufficiently strong magnetic field can drive the material back to its normal state.

In this, Buck saw a digital switch. He coiled a tiny control wire around a gate wire, and plunged the pair into liquid helium. When current ran through the control, the magnetic field it created pushed the superconducting gate into its normal resistive state. When the control current was turned off, the gate became superconducting again.

Buck thought miniature cryotrons could be used to fashion powerful, fast, and energy-efficient digital computers. The NSA funded work by him and engineer Albert Slade on cryotron memory circuits at the firm A.D. Little, as well as a broader project on digital cryotron circuitry at IBM. Quickly, GE, RCA, and others launched their own cryotron efforts.

Engineers continued developing cryotron circuits into the early 1960s, despite Buck's sudden and premature death in 1959. But liquid-helium temperatures made cryotrons challenging to work with, and the time required for materials to transition from a superconducting to a resistive state limited switching speeds. The NSA eventually pulled back on funding, and many researchers abandoned superconducting electronics for silicon.

Even as these efforts faded, a big change was under way. In 1962 British physicist Brian Josephson made a provocative prediction about quantum tunneling in superconductors. In typical quantum-mechanical tunneling, electrons sneak across an insulating barrier, assisted by a voltage push; the electrons' progress occurs with some resistance. But Josephson predicted that if the insulating barrier between two superconductors is thin enough, a supercurrent of paired electrons could flow across with zero resistance, as if the barrier were not there at all. This became known as the Josephson effect, and a switch based on the effect, the Josephson junction, soon followed.

Junction Exploration: 1970s-era Josephson circuitry. Image: IBM

IBM researchers developed a version of this switch in the mid-1960s. The active part of the device was a line of superconducting metal, interrupted by a thin oxide barrier cutting across it. A supercurrent would freely tunnel across the barrier, but only up to a point; if the current rose above a certain threshold, the device would saturate and unpaired electrons would trickle across the junction with some resistance. The threshold could be tuned by a magnetic field, created by running current through a nearby superconducting control line. If the device operated close to the threshold current, a small current in the control could shift the threshold and switch the gate out of its supercurrent-tunneling state. Unlike in Buck's cryotron, the materials in this device always remained superconducting, making it a much faster electronic switch.

As explored by historian Cyrus Mody, by 1973 IBM was working on building a superconducting supercomputer based on Josephson junctions. The basic building block of its circuits was a superconducting loop with Josephson junctions in it, known as a superconducting quantum interference device, or SQUID. The NSA covered a substantial fraction of the costs, and IBM expected the agency to be its first superconducting-supercomputer customer, with other government and industry buyers to follow.

IBM's superconducting supercomputer program ran for more than 10 years, at a cost of about US $250 million in today's dollars. It mainly pursued Josephson junctions made from lead alloy and lead oxide. Late in the project, engineers switched to a niobium oxide barrier, sandwiched between a lead alloy and a niobium film, an arrangement that produced more-reliable devices. But while the project made great strides, company executives were not convinced that an eventual supercomputer based on the technology could compete with the ones expected to emerge with advanced silicon microchips. In 1983, IBM shut down the program without ever finishing a Josephson-junction-based computer, super or otherwise.

Japan persisted where IBM had not. Inspired by IBM's project, Japan's industrial ministry, MITI, launched a superconducting computer effort in 1981. The research partnership, which included Fujitsu, Hitachi, and NEC, lasted for eight years and produced an actual working Josephson-junction computer, the ETL-JC1. It was a tiny, 4-bit machine, with just 1,000 bits of RAM, but it could actually run a program. In the end, however, MITI came to share IBM's opinion about the prospect of scaling up the technology, and the project was abandoned.

Critical new developments emerged outside these larger superconducting-computer programs. In 1983, Bell Telephone Laboratories researchers formed Josephson junctions out of niobium separated by thin aluminum oxide layers. The new superconducting switches were extraordinarily reliable and could be fabricated using a simplified patterning process much in the same way silicon microchips were.

On The Move: Magnetic flux ejected from a superconducting loop through a Josephson junction can take the form of tiny voltage pulses. The presence or absence of a pulse in a given period of time can be used to perform computations. Image: Hypres

Then in 1985, researchers at Moscow State University proposed [PDF] a new kind of digital superconducting logic. Originally dubbed "resistive," then renamed rapid single-flux-quantum logic, or RSFQ, it took advantage of the fact that a Josephson junction in a loop of superconducting material can emit minuscule voltage pulses. Integrated over time, these pulses take on only a quantized, integer multiple of a tiny value called the flux quantum, measured in microvolt-picoseconds.

By using such ephemeral voltage pulses, each lasting a picosecond or so, RSFQ promised to boost clock speeds to greater than 100 gigahertz. What's more, a Josephson junction in such a configuration would expend energy in the range of just a millionth of a picojoule, considerably less than that consumed by today's silicon transistors.
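
That flux quantum follows directly from two fundamental constants, and a short calculation (our own sketch; variable names are arbitrary) confirms the microvolt-picosecond scale quoted above.

# The single flux quantum is Phi_0 = h / 2e. Constants are the exact
# 2019 SI values.
h = 6.62607015e-34   # Planck constant, joule-seconds
e = 1.602176634e-19  # elementary charge, coulombs

phi0 = h / (2 * e)   # ~2.07e-15 weber, i.e. volt-seconds
print(phi0 * 1e18)   # ~2067.8 microvolt-picoseconds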

Together, Bell Labs' Josephson junctions and Moscow State University's RSFQ rekindled interest in superconducting electronics. By 1997, the U.S. had launched the Hybrid Technology Multi-Threaded (HTMT) project, which was supported by the National Science Foundation, the NSA, and other agencies. HTMT's goal was to beat conventional silicon to petaflop-level supercomputing, using RSFQ integrated circuits among other technologies.

It was an ambitious program that faced a number of challenges. The RSFQ circuits themselves limited potential computing efficiency. To achieve tremendous speed, RSFQ used resistors to provide electrical biases to the Josephson junctions in order to keep them close to the switching threshold. In experimental RSFQ circuitry with several thousand biased Josephson junctions, the static power dissipation was negligible. But in a petaflop-scale supercomputer, with possibly many billions of such devices, it would have added up to significant power consumption.

The HTMT project ended in 2000. Eight years later, a conventional silicon supercomputer, IBM's Roadrunner, was touted as the first to reach petaflop operation. It contained nearly 20,000 silicon microprocessors and consumed 2.3 megawatts.

For many researchers working on superconducting electronics, the period around 2000 marked a shift to an entirely different direction: quantum computing. This new direction was inspired by the 1994 work of mathematician Peter Shor, then at Bell Labs, which suggested that a quantum computer could be a powerful cryptanalytical tool, able to rapidly decipher encrypted communications. Soon, projects in superconducting quantum computing and superconducting digital circuitry were being sponsored by the NSA and the U.S. Defense Advanced Research Projects Agency. They were later joined by IARPA, which was created in 2006 by the Office of the Director of National Intelligence to sponsor intelligence-related R&D programs, collaborating across a community that includes the NSA, the Central Intelligence Agency, and the National Geospatial-Intelligence Agency.

Single-Flux Quantum: Current in a superconducting loop containing a Josephson junction (a nonsuperconducting barrier) generates a magnetic field with a tiny, quantized value.

Nobody knew how to build a quantum computer, of course, but lots of people had ideas. At IBM and elsewhere, engineers and scientists turned to the mainstays of superconducting electronics, SQUIDs and Josephson junctions, to craft the building blocks. A SQUID exhibits quantum effects under normal operation, and it was fairly straightforward to configure it to operate as a quantum bit, or qubit.

One of the centers of this work was the NSA's Laboratory for Physical Sciences. Built near the University of Maryland, College Park, outside the fence of NSA headquarters in Fort Meade, the laboratory is a space where the NSA and outside researchers can collaborate on work relevant to the agency's insatiable thirst for computing power.

In the early 2010s, Marc Manheimer was head of quantum computing at the laboratory. As he recently recalled in an interview, he saw an acute need for conventional digital circuits that could physically surround quantum bits in order to control them and correct errors on very short timescales. The easiest way to do this, he thought, would be with superconducting computer elements, which could operate with voltage and current levels that were similar to those of the qubit circuitry they would be controlling. Optical links could be used to connect this cooled-down, hybrid system to the outside world, and to conventional silicon computers.

At the same time, Manheimer says, he became aware of the growing power problem in high-performance silicon computing, for supercomputers as well as the large banks of servers in commercial data centers. "The closer I looked at superconducting logic," he says, "the more it became clear that it had value for supercomputing in its own right."

Manheimer proposed a new direct attack on the superconducting supercomputer. Initially, he encountered skepticism. "There's this history of failure," he says. Past pursuers of superconducting supercomputers had gotten burned, so people were very cautious. But by early 2013, he says, he had convinced IARPA to fund a multisite industrial and academic R&D program, dubbed the Cryogenic Computing Complexity (C3) program. He moved to IARPA to lead it.

The first phase of C3 (its budget is not public) calls for the creation and evaluation of superconducting logic circuits and memory systems. These will be fabricated at MIT Lincoln Laboratory, the same lab where Dudley Buck once worked.

Manheimer says one thing that helped sell his C3 idea was recent progress in the field, which is reflected in IARPA's selection of performers, publicly disclosed in December 2014.

One of those teams is led by the defense giant Northrop Grumman Corp. The company participated in the late-1990s HTMT project, which employed fairly power-hungry RSFQ logic. In 2011, Northrop Grumman's Quentin Herr reported an exciting alternative, a different form of single-flux quantum logic called reciprocal quantum logic. RQL replaces RSFQ's DC resistors with AC inductors, which bias the circuit without constantly drawing power. An RQL circuit, says Northrop Grumman team leader Marc Sherwin, consumes 1/100,000 the power of the best equivalent CMOS circuit and far less power than the equivalent RSFQ circuit.

A similarly energy-efficient logic called ERSFQ has been developed by superconducting electronics manufacturer Hypres, whose CTO, Oleg Mukhanov, is the coinventor of RSFQ. Hypres is working with IBM, which continued its fundamental superconducting device work even after canceling its Josephson-junction supercomputer project and was also chosen to work on logic for the program.

Hypres is also collaborating with a C3 team led by a Raytheon BBN Technologies laboratory that has been active in quantum computing research for several years. There, physicist Thomas Ohki and colleagues have been working on a cryogenic memory system that uses low-power superconducting logic to control, read, and write to high-density, low-power magnetoresistive RAM. This sort of memory is another change for superconducting computing. RSFQ memory cells were fairly large. Today's more compact nanomagnetic memories, originally developed to help extend Moore's Law, can also work well at low temperatures.

The world's most advanced superconducting circuitry uses devices based on niobium. Although such devices operate at temperatures of about 4 kelvins, or 4 degrees above absolute zero, Manheimer says supplying the refrigeration is now a trivial matter. That's thanks in large part to the multibillion-dollar industry based on magnetic resonance imaging machines, which rely on superconducting electromagnets and high-quality cryogenic refrigerators.

One big question has been how much the energy needed for cooling will increase a superconducting computer's energy budget. But advocates suggest it might not be much. The power drawn by commercial cryocoolers "leaves considerable room for improvement," Elie Track and Alan Kadin of the IEEE's Rebooting Computing initiative recently wrote. Even so, they say, the power dissipated in a superconducting computer is so small that it remains "100 times more efficient than a comparable silicon computer, even after taking into account the present inefficient cryocooler."

For now, C3's focus is on the fundamental components. This first phase, which will run through 2017, aims to demonstrate core components of a computer system: a set of key 64-bit logic circuits capable of running at a 10-GHz clock rate and cryogenic memory arrays with capacities up to about 250 megabytes. If this effort is successful, a second, two-year phase will integrate these components into a working cryogenic computer of as-yet-unspecified size. If that prototype is deemed promising, Manheimer estimates it should be possible to create a true superconducting supercomputer in another 5 to 10 years.

Go For Power: Performance demands power. Today's most powerful supercomputers consume multiple megawatts (red), not including cooling. Superconducting computers, cryocoolers included, are projected to dramatically drop those power requirements (blue). Source: IEEE Transactions on Applied Superconductivity, vol. 23, #1701610; Marc Manheimer

Such a system would be much smaller than CMOS-based supercomputers and require far less power. Manheimer projects that a superconducting supercomputer produced in a follow-up to C3 could run at 100 petaflops and consume 200 kilowatts, including the cryocooling. It would be 1/20 the size of Titan, currently the fastest supercomputer in the United States, but deliver more than five times the performance for 1/40 of the power.

A supercomputer with those capabilities would obviously represent a big jump. But as before, the fate of superconducting supercomputing strongly depends on what happens with silicon. While an exascale computer made from today's silicon chips may not be practical, great effort and billions of dollars are now being expended on continuing to shrink silicon transistors as well as on developing on-chip optical links and 3-D stacking. Such technologies could make a big difference, says Thomas Theis, who directs nanoelectronics research at the nonprofit Semiconductor Research Corp. In July 2015, President Barack Obama announced the National Strategic Computing Initiative and called for the creation of an exascale supercomputer. IARPA's work on alternatives to silicon is part of this initiative, but so is conventional silicon. The mid-2020s has been targeted for the first silicon-based exascale machine. If that goal is met, the arrival of a superconducting supercomputer would likely be pushed out still further.

But it's too early to count out superconducting computing just yet. Compared with the massive, continuous investment in silicon over the decades, superconducting computing has had meager and intermittent support. Yet even with this subsistence diet, physicists and engineers have produced an impressive string of advances. The support of the C3 program, along with the wider attention of the computing community, could push the technology forward significantly. If all goes well, superconducting computers might finally come in from the cold.

This article appears in the March 2016 print issue as "The NSA's Frozen Dream."

A historian of science and technology, David C. Brock recently became director of the Center for Software History at the Computer History Museum. A few years back, while looking into the history of microcircuitry, he stumbled across the work of Dudley Buck, a pioneer of speedy cryogenic logic. He wrote about Buck in our April 2014 issue. Here he explores what happened after Buck, including a new effort to build a superconducting computer. This time, he says, the draw is energy efficiency, not performance.

Excerpt from:

Will the NSA Finally Build Its Superconducting Spy Computer? - IEEE Spectrum

Read the Rest...

Is Bitcoin (BTC) Safe from Grover's Algorithm? – Yahoo Finance

§ July 30th, 2021 § Filed under Quantum Computer Comments Off on Is Bitcoin (BTC) Safe from Grover's Algorithm? – Yahoo Finance

When crypto investors discuss quantum computing, they invariably worry about its potential to undermine encryption. Quantum computers alone do not pose such a mortal threat, however. It's their capacity to exploit Shor's algorithm that makes them formidable.

That's because Shor's algorithm can factor large numbers into their prime components, and the difficulty of that factoring is the security behind asymmetric encryption.

Another quantum algorithm can potentially undermine the blockchain as well. Grover's algorithm helps facilitate quantum search capabilities, enabling users to quickly find values among billions of unstructured data points at once.

Unlike Shor's algorithm, Grover's algorithm is more of a threat to cryptographic hashing than to encryption. When cryptographic hashes are compromised, both blockchain integrity and block mining suffer.

Collision Attacks

One-way hash functions help to make a blockchain cryptographically secure. Classical computers cannot easily reverse-engineer them. They would have to find the correct arbitrary input that maps to a specific hash value.

Using Grover's algorithm, a quantum attacker could hypothetically find two inputs that produce the same hash value. This phenomenon is known as a hash collision.

By solving this search, a blockchain attacker could surreptitiously replace a valid block with a falsified one. That's because, in a Proof-of-Work system, the current block's hash can verify the authenticity of all past blocks.

This kind of attack remains a distant threat, however. Indeed, achieving a cryptographic collision is far more challenging than breaking asymmetric encryption.

Mining Threats

A somewhat easier attack to pull off using Grover's algorithm involves proof-of-work mining.

Using Grover's search algorithm, a quantum miner can mine at a much faster rate than a traditional miner. This miner could generate as much Proof-of-Work as the rest of the network combined. Consequently, the attacker could effectively take over the blockchain and force consensus on any block they selected.

A quantum miner might also use Grover's search algorithm to help facilitate the guessing of a nonce. The nonce is the number that blockchain miners are solving for in order to receive cryptocurrency. That's because Grover's algorithm provides a quadratic speedup over a classical computer (for now, ASIC-based mining remains considerably faster).

How fast is a quadratic speedup? Roughly stated, if a classical computer can solve a complex search problem in time T, Grover's algorithm will be able to solve the problem in roughly the square root of that time (√T).
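
A quick worked example shows what that buys in practice. The sketch below counts abstract search steps only, ignoring constant factors, error correction and hardware clock speeds, so it is an intuition pump rather than a benchmark.

# Quadratic speedup in round numbers: a brute-force search over a 64-bit
# space versus the ~square-root number of Grover iterations.
import math

classical_steps = 2**64                      # ~1.8e19 candidate checks
grover_steps = math.isqrt(classical_steps)   # 2**32, ~4.3e9 iterations

print(f"classical: {classical_steps:.2e} steps")
print(f"grover:    {grover_steps:.2e} steps")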

Thus, any miner who can solve the nonce faster than other miners will be able to mine the blockchain faster as well.

Grover's algorithm could also be used to speed up the generation of nonces. This capability would allow an attacker to quickly reconstruct the chain from a previously modified block, faster than the true chain grows. In the end, a savvy attacker could substitute this reconstructed chain for the true chain.

Grover's algorithm may ultimately help make Proof-of-Work obsolete. That's because there is no possible PoW system that is not susceptible to Grover speed-up: quantum actors will always have an advantage over classical ones in PoW-based blockchains, allowing them either to mine more effectively or to instigate an attack.

Proof-of-Work Weaknesses

As bitcoin matures, the weaknesses inherent within PoW become ever more evident. Miners are pitted against each other as if in a never-ending arms race. This arms race is incentivized by the ability of larger mining pools to achieve economies of scale, a cost advantage that quickly erodes the capacity of individual miners to survive.

Of course, Proof-of-Stake is not without flaws. For instance, critics assert that it favors larger stakeholders (hence the claim that it enables the rich to get richer). These critics neglect to note that PoW is amenable to the same strategy (albeit with miners).

As this arms race comes to a head, any miner with the resources to do so will use quantum computing to achieve a competitive advantage. Combined with Grover's algorithm, a quantum-based miner would outperform other miners (most likely, small- and medium-sized miners).

With access to quadratic speedup, any PoW coin will inevitably fall under the control of mega-cap institutions and governments. If so, regular investors and mid to large-cap enterprises risk getting priced out of the market. In particular, their devices will be either too expensive or prone to excessive regulation (much the same way that PGP encryption once was).

Summary

Shor's algorithm undoubtedly poses the most immediate threat to bitcoin (namely, the potential to break ECDSA, its digital signature algorithm). Grover's algorithm is a distant second in this respect.

Grover's algorithm may someday pose a formidable challenge to PoW mining, however. And it could conceivably threaten cryptographic hashing as well. Any algorithm powerful enough to reverse-engineer hash values would invariably undermine PoW itself.

Quantum Resistant Ledger (QRL) will ultimately offer protection against both.

For instance, a quantum-safe digital signature scheme named XMSS safeguards the coin from Shor's algorithm.

Likewise, the QRL team will rely on Proof-of-Stake to head off mining-based attacks using Grover's search algorithm.

As you can see, the QRL team is thoroughly preparing for a post-quantum future. Their mission is an increasingly urgent one, as quantum computing continues to advance by leaps and bounds.

2021 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.

Read the original:

Is Bitcoin (BTC) Safe from Grover's Algorithm? - Yahoo Finance

Read the Rest...

Quantum Computing Inc. to list on Nasdaq, expand Qatalyst visibility – ZDNet

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Quantum Computing Inc. to list on Nasdaq, expand Qatalyst visibility – ZDNet

Quantum Computing Inc. (QCI) will list on the Nasdaq on Thursday in a graduation from the over-the-counter market.

The move will give QCI more visibility for its flagship Qatalyst platform, which aims to deliver quantum computing without the need for complex programming, code, or quantum experts.

QCI's listing comes as the quantum computing space is heating up. IonQ will soon be public via a special purpose acquisition company (SPAC) deal. In addition, Honeywell is merging with Cambridge Quantum. QCI is pre-revenue, but is available on Amazon Web Services through its Braket quantum marketplace.

According to QCI, Qatalyst gives enterprises the ability to use quantum computing to solve supply chain, logistics, drug discovery, cybersecurity and transportation issues. QCI will trade under the QUBT ticker, which was used for its over-the-counter listing.

Here are some key points about Qatalyst:

The components of Qatalyst include APIs, services, portals and access to compute resources.

Robert Liscouski, CEO of QCI, said in a recent shareholder letter:

Much of the market continues to focus on pure quantum for quantum's sake. However, the simple reality is that delivering business value with quantum in the near term will not come from quantum alone. It can only be derived from the sophisticated combination of classical and quantum computing techniques that is enabled today with Qatalyst.

In June, QCI said it entered a 3-year agreement with Los Alamos National Laboratory to run exascale and petascale simulations.

Liscouski said the Nasdaq listing will bring more liquidity, shareholders and visibility to the company. As of Dec. 31, QCI had $15.2 million in cash, a net loss of $24.73 million and no revenue.

Original post:

Quantum Computing Inc. to list on Nasdaq, expand Qatalyst visibility - ZDNet

Read the Rest...

Quantum Computing Is Coming. What Can It Do? – Harvard Business Review

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Quantum Computing Is Coming. What Can It Do? – Harvard Business Review

Digital computing has limitations when it comes to an important category of calculation called combinatorics, in which the order of data is important to the optimal solution. These complex, iterative calculations can take even the fastest computers a long time to process. Computers and software that are predicated on the assumptions of quantum mechanics have the potential to perform combinatorics and other calculations much faster; as a result, many firms are already exploring the technology, whose known and probable applications already include cybersecurity, bio-engineering, AI, finance, and complex manufacturing.

Quantum technology is approaching the mainstream. Goldman Sachs recently announced that it could introduce quantum algorithms to price financial instruments in as soon as five years. Honeywell anticipates that quantum will form a $1 trillion industry in the decades ahead. But why are firms like Goldman taking this leap, especially with commercial quantum computers possibly years away?

To understand what's going on, it's useful to take a step back and examine what exactly it is that computers do.

Let's start with today's digital technology. At its core, the digital computer is an arithmetic machine. It made performing mathematical calculations cheap and its impact on society has been immense. Advances in both hardware and software have made possible the application of all sorts of computing to products and services. Today's cars, dishwashers, and boilers all have some kind of computer embedded in them, and that's before we even get to smartphones and the internet. Without computers we would never have reached the moon or put satellites in orbit.

These computers use binary signals (the famous 1s and 0s of code) which are measured in bits or bytes. The more complicated the code, the more processing power required and the longer the processing takes. What this means is that for all their advances, from self-driving cars to beating grandmasters at chess and Go, there remain tasks that traditional computing devices struggle with, even when the task is dispersed across millions of machines.

A particular problem they struggle with is a category of calculation called combinatorics. These calculations involve finding an arrangement of items that optimizes some goal. As the number of items grows, the number of possible arrangements grows exponentially. To find the best arrangement, today's digital computers basically have to iterate through each permutation to find an outcome and then identify which does best at achieving the goal. In many cases this can require an enormous number of calculations (think about breaking passwords, for example). The challenge of combinatorics calculations, as we'll see in a minute, applies in many important fields, from finance to pharmaceuticals. It is also a critical bottleneck in the evolution of AI.

And this is where quantum computers come in. Just as classical computers reduced the cost of arithmetic, quantum presents a similar cost reduction to calculating daunting combinatoric problems.

Quantum computers (and quantum software) are based on a completely different model of how the world works. In classical physics, an object exists in a well-defined state. In the world of quantum mechanics, objects only occur in a well-defined state after we observe them. Prior to our observation, two objects' states and how they are related are matters of probability. From a computing perspective, this means that data is recorded and stored in a different way, through non-binary qubits of information rather than binary bits, reflecting the multiplicity of states in the quantum world. This multiplicity can enable faster and lower-cost calculation for combinatoric arithmetic.
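
That multiplicity of states can be made concrete in a few lines of numpy. This is an illustrative sketch of the underlying linear algebra, not how any production quantum SDK represents state.

# A qubit is a two-component complex vector; gates are unitary matrices;
# measurement probabilities are squared amplitude magnitudes.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)        # the classical-like |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

superposed = H @ ket0     # equal superposition of |0> and |1>
probs = np.abs(superposed) ** 2
print(probs)              # [0.5 0.5]: both outcomes open until observed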

If that sounds mind-bending, it's because it is. Even particle physicists struggle to get their minds around quantum mechanics and the many extraordinary properties of the subatomic world it describes, and this is not the place to attempt a full explanation. But what we can say is that quantum mechanics does a better job of explaining many aspects of the natural world than classical physics does, and it accommodates nearly all of the theories that classical physics has produced.

Quantum translates, in the world of commercial computing, to machines and software that can, in principle, do many of the things that classical digital computers can, and in addition do one big thing classical computers can't: perform combinatorics calculations quickly. As we describe in our paper, "Commercial Applications of Quantum Computing," that's going to be a big deal in some important domains. In some cases, the importance of combinatorics is already known to be central to the domain.

As more people turn their attention to the potential of quantum computing, applications beyond quantum simulation and encryption are emerging.

The opportunity for quantum computing to solve large-scale combinatorics problems faster and cheaper has encouraged billions of dollars of investment in recent years. The biggest opportunity may be in finding more new applications that benefit from the solutions offered through quantum. As professor and entrepreneur Alan Aspuru-Guzik said, "There is a role for imagination, intuition, and adventure. Maybe it's not about how many qubits we have; maybe it's about how many hackers we have."

Continued here:

Quantum Computing Is Coming. What Can It Do? - Harvard Business Review

Read the Rest...

Startup hopes the world is ready to buy quantum processors – Ars Technica

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Startup hopes the world is ready to buy quantum processors – Ars Technica

Early in its history, computing was dominated by time-sharing systems. These systems were powerful machines (for their time, at least) that multiple users connected to in order to perform computing tasks. To an extent, quantum computing has repeated this history, with companies like Honeywell, IBM, and Rigetti making their machines available to users via cloud services. Companies pay based on the amount of time they spend executing algorithms on the hardware.

For the most part, time-sharing works out well, saving companies the expenses involved in maintaining the machine and its associated hardware, which often includes a system that chills the processor down to nearly absolute zero. But there are several customers (companies developing support hardware, academic researchers, etc.) for whom access to the actual hardware could be essential.

The fact that companies aren't shipping out processors suggests that the market isn't big enough to make production worthwhile. But a startup from the Netherlands is betting that the size of the market is about to change. On Monday, a company called QuantWare announced that it will start selling quantum processors based on transmons, superconducting loops of wire that form the basis of similar machines used by Google, IBM, and Rigetti.

Transmon-based qubits are popular because they're compatible with the standard fabrication techniques used for more traditional processors; they can also be controlled using microwave-frequency signals. Their big downside is that they operate only at temperatures that require liquid helium and specialized refrigeration hardware. These requirements complicate the hardware needed to exchange signals between the very cold processor and the room-temperature hardware that controls it.

Startup companies like D-Wave and Rigetti have set up their own fabrication facilities, but Matthijs Rijlaarsdam, one of QuantWare's founders, told Ars that his company is taking advantage of an association with TU Delft, the host of the Kavli Nanolab. This partnership lets QuantWare do the fabrication without investing in its own facility. Rijlaarsdam said the situation shouldn't be a limiting factor, since he expects that the total market likely won't exceed tens of thousands of processors over the entirety of the next decade. Production volumes don't have to scale dramatically.

The initial processor the company will be shipping contains only five transmon qubits. Although this is well below anything on offer via one of the cloud services, Rijlaarsdam told Ars that the fidelities of each qubit will be 99.9 percent, which should keep the error rate manageable. He argued that, for now, a low qubit count should be sufficient based on the types of customers QuantWare expects to attract.
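
A back-of-the-envelope estimate shows why those fidelities matter. Assuming errors are independent per operation, which real devices only approximate, the chance of an error-free run falls off quickly as circuits deepen.

# With 99.9 percent fidelity per operation, deep circuits still fail often
# (a rough model assuming independent errors per operation).
fidelity = 0.999
for ops in (10, 100, 1000):
    print(f"{ops:5d} operations: P(no error) = {fidelity**ops:.3f}")
# prints ~0.990, ~0.905, ~0.368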

These customers include universities interested in studying new ways of using the processor and companies that might be interested in developing support hardware needed to turn a chip full of transmons into a functional system. Intel, for example, has been developing transmon hardware control chips that can tolerate the low temperatures required (although the semiconductor giant can also easily make its own transmons as needed).

That last aspect (developing a chip around which others could build a platform) features heavily in the press release that QuantWare shared with Ars. The announcement makes frequent mention of the Intel 4004, an early general-purpose microprocessor that found a home in a variety of computers.

Rijlaarsdam told Ars that he expects the company to increase its qubit count by two- to four-fold each year for the next few years. That's good progress, but it will still leave the company well behind the roadmap of competitors like IBM for the foreseeable future.

Rijlaarsdam also suggested that quantum computing will reach what he called "an inflection point" before 2025. Once this point is reached, quantum computers will regularly provide answers to problems that can't be practically calculated using classical hardware. Once that point is reached, "the market will be a multibillion-dollar market," Rijlaarsdam told Ars. "It will also grow rapidly, as the availability of large quantum computers will accelerate application development."

But if that point is reached before 2025, it will arrive at a time when QuantWare's qubit count is suited for the current market, which he accurately described as "an R&D market." QuantWare's solution to the awkward timing will be to develop quantum processors specialized for specific algorithms, which can presumably be done using fewer qubits. But those won't be available at the company's launch.

Obviously, it's debatable whether there's a large market of companies anxiously awaiting the opportunity to install liquid helium dilution refrigerators in their office/lab/garage. But the reality is that there is almost certainly some market for an off-the-shelf quantum processor, one at least partly composed of other quantum computing startups.

That's not quite equivalent to the situation that greeted the Intel 4004. But it may be significant in that we seem to be getting close to the point where some of Ars' quantum-computing coverage will need to move out of the science section and over to IT, marking a clear shift in how the field is developing.


Go here to see the original:

Startup hopes the world is ready to buy quantum processors - Ars Technica


Rigetti Computing Partners with Riverlane, Astex Pharmaceuticals on Quantum Computing for Drug Discovery – HPCwire

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Rigetti Computing Partners with Riverlane, Astex Pharmaceuticals on Quantum Computing for Drug Discovery – HPCwire

LONDON and CAMBRIDGE, England, July 13, 2021 Rigetti UK announced today it will partner with Riverlane and Astex Pharmaceuticals to develop an integrated application for simulating molecular systems using Rigetti Quantum Cloud Services, paving the way for a commercial application that could transform drug discovery in pharmaceutical R&D.

"Our consortium brings together a complete quantum supply chain from hardware to end-user, allowing us to develop a tailor-made solution to address a problem of real value to the pharmaceutical sector," says Mandy Birch, SVP of Technology Partnerships at Rigetti. "This project lays the groundwork for the commercial application of Rigetti Quantum Cloud Services in the pharmaceutical industry."

The average cost of discovering a new drug and bringing it to market has tripled since 2010, reaching almost $3bn in 2018. However, soaring R&D costs have not translated into shorter times to market or higher numbers of newly approved drugs.

"We want to solve this problem by using quantum computers to speed up the process of drug discovery," says Chris Murray, SVP Discovery Technology at Astex. "Quantum computers provide a fundamentally different approach that could enable pharmaceutical companies to identify, screen, and simulate new drugs rather than using expensive, trial-and-error approaches in the laboratory."

To design more efficient drugs and shorten the time to market, researchers rely on advanced computational methods to model molecular structures and their interactions with drug targets. While classical computers are limited to modelling simple structures, quantum computers have the potential to model more complex systems that could drastically improve the drug discovery process. However, today's quantum computers remain too noisy for results to evolve past proof-of-concept studies.

"Building on previous work with Astex, our collaboration aims to overcome this technological barrier and address a real business need for the pharmaceutical sector," says Riverlane CEO Steve Brierley. The project will leverage Riverlane's algorithm expertise and existing technology for high-speed, low-latency processing on quantum computers, using Rigetti's commercially available quantum systems. The team will also develop error-mitigation software to help optimise the performance of the hardware architecture, which they expect to result in up to a threefold reduction in errors and runtime improvements of up to 40x. "This is an important first step in improving the performance of quantum computers so that they can solve commercially relevant problems," Brierley adds.

Science Minister Amanda Solloway says, "The UK has bold ambitions to be the world's first quantum-ready economy, harnessing the transformative capabilities of the technology to tackle global challenges such as climate change and disease outbreaks."

"This government-backed partnership will explore how the power of quantum could help boost drug discovery, with the aim of shortening the time it takes potentially life-saving drugs to move from lab to market, all while cementing the UK's status as a science superpower."

The 18-month feasibility study is facilitated by a grant through the Quantum Challenge at UK Research and Innovation (UKRI). Rigetti UK has previously received funding from UKRI to develop the first commercially available quantum computer in the UK. Riverlane has also received funding from UKRI to develop an operating system that makes quantum software portable across qubit technologies.

About Rigetti UK

Rigetti UK Limited is a wholly owned subsidiary of Rigetti Computing, based in Berkeley, California. Rigetti builds superconducting quantum computing systems and delivers access to them over the cloud. These systems are optimized for integration with existing computing infrastructure and tailored to support the development of practical software and applications. Learn more at rigetti.com.

About Riverlane

Riverlane builds ground-breaking software to unleash the power of quantum computers. Backed by leading venture-capital funds and the University of Cambridge, it develops software that transforms quantum computers from experimental technology into commercial products. Learn more at riverlane.com.

About Astex

Astex is a leader in innovative drug discovery and development, committed to the fight against cancer and diseases of the central nervous system. Astex is developing a proprietary pipeline of novel therapies and has a number of partnered products being developed under collaborations with leading pharmaceutical companies. Astex is a wholly owned subsidiary of Otsuka Pharmaceutical Co. Ltd., based in Tokyo, Japan.

For more information about Astex Pharmaceuticals, please visit https://astx.com For more information about Otsuka Pharmaceutical, please visit http://www.otsuka.co.jp/en/

Source: Rigetti UK

See more here:

Rigetti Computing Partners with Riverlane, Astex Pharmaceuticals on Quantum Computing for Drug Discovery - HPCwire


Quantum Computing on a Chip: Brace for the Revolution – Tom’s Hardware

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Quantum Computing on a Chip: Brace for the Revolution – Tom’s Hardware

In a moment of triumph that's being hailed as equivalent to the move from room-scale silicon technology down to desk-sized machines, quantum computing has now gone chip-scale, down from the room-scale contraptions you might have seen elsewhere, including in science fiction.

The development was spearheaded by Cambridge-based quantum specialist Riverlane's work with New York and London-based digital quantum company Seeqc. They're the first to deploy a quantum computing chip that has an integrated operating system for workflow and qubit management (qubits are the quantum counterpart of classical computing's bits, but they can be entangled with one another, correlating their quantum states, and can represent a superposition of 0 and 1 at once). The last time we achieved this level of miniaturization in a computing technology, we started the computing revolution. Now, expectations for a quantum revolution are on the table as well, and the world will have to adapt to the new reality.

The new chip ushers in scalable quantum computing, and the companies hope to scale the design by increasing surface area and qubit count. The aim is to bring qubit counts up to the millions, a far cry from the current deployed maximum, the (comparatively puny, yet still remarkably complex) 76-qubit system that enabled China to claim quantum supremacy. There are, of course, other ways to scale besides increased qubit counts: deployment of multiple chips in a single self-contained system, or through multiple inter-connectable systems, could provide easier paths to quantum coherence. And on that end, a quantum OS is paramount. Enter Deltaflow.OS.

Deltaflow.OS is a hardware- and platform-agnostic OS (think Linux, which populates everything from smartphones to IoT to supercomputers), meaning it can serve as the control mechanism for the various quantum deployment technologies currently being pursued around the globe. And even as multiple independent companies, such as Google, Microsoft, and IBM, pursue the holy grail of quantum supremacy, Riverlane's Deltaflow.OS is an open-source OS, available on GitHub, that's taking the open approach to market penetration.

And this makes sense, since the more than 50 quantum computers already built around the world all run on independently developed software. The field is still so nascent that there are no standards for deployment and control systems. An easily deployable, quantum-hardware-agnostic OS will undoubtedly accelerate development of applications that take advantage of quantum computing's strengths, which, as China's 76-qubit system showed, already enable certain workloads to be crunched millions of times faster than the fastest classical, Turing-type supercomputer could ever hope to achieve.

To achieve this, Riverlane has effectively created a layered Digital Quantum Management (DQM) SoC (System-on-Chip) that pairs classical computing capabilities with quantum mechanics. The company's diagrams show what it calls an SFQ (Single Flux Quantum) co-processor as the base layer of the design, which lets the OS expose a relatively familiar interface for developers to interact with the qubits. This layer performs digital qubit control, readout, and classical data processing functions, and serves as a platform for error correction.

There are numerous advantages to this approach, as the SFQ's resources are "(...) proximally co-located and integrated with qubit chips in a cryo-cooled environment to drastically reduce the complexity of input/output connections and maximize the benefits of fast, precise, low-noise digital control and readout, and energy-efficient classical co-processing." Essentially, some tenets of classical computing still apply: the closer the processing parts are to each other, the better they perform. This layer enables the OS to run, and sits next to an active qubit sheet that actually performs the calculations.

Quantum computing has long been the holy grail among new processing technologies in development. However, the complexity of this endeavour can't be overstated. The physics of quantum computing is essentially being written as we go, and while that is true, in a way, of many technological and innovation efforts, nowhere does it apply as much as here.

There are multiple open questions about quantum computing and its relationship to classical computing. Thanks to the efforts of Riverlane and Seeqc, the quantum computing ecosystem can now align its efforts and collectively problem-solve for the deployment and operation of quantum-computing-on-a-chip solutions.

More here:

Quantum Computing on a Chip: Brace for the Revolution - Tom's Hardware


Harvard-led physicists have taken a major step in the competition with quantum computing – Illinoisnewstoday.com

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Harvard-led physicists have taken a major step in the competition with quantum computing – Illinoisnewstoday.com

Image: Dolev Bluvstein (from left), Mikhail Lukin, and Sepehr Ebadi have developed a special type of quantum computer known as a programmable quantum simulator. Ebadi is adjusting one of the devices that make it possible.

Credits: Rose Lincoln / Harvard Staff Photographer

A team of physicists at the Harvard-MIT Center for Ultracold Atoms and other universities has developed a special type of quantum computer, known as a programmable quantum simulator, that can operate with 256 quantum bits, or qubits.

The system marks a major step toward building large-scale quantum machines that could shed light on a host of complex quantum processes and ultimately help bring real-world breakthroughs in materials science, communications technology, finance, and many other areas, overcoming research hurdles that are beyond the capabilities of today's fastest supercomputers. Qubits are the fundamental building blocks of quantum computers and the source of their enormous processing power.

"This moves the field into a new domain where no one has ever been," said Mikhail Lukin, the George Vasmer Leverett Professor of Physics at Harvard, co-director of the Harvard Quantum Initiative, and one of the senior authors of the study, published today in the journal Nature. "We are entering a completely new part of the quantum world."

According to Sepehr Ebadi, a physics student in Harvard's Graduate School of Arts and Sciences and the lead author of the study, it is the system's unprecedented combination of size and programmability that puts it at the cutting edge of the race for a quantum computer, which harnesses the mysterious properties of matter at extremely small scales to greatly advance processing power. Under the right circumstances, the increase in qubits means the system can store and process exponentially more information than the classical bits on which standard computers run.

"The number of quantum states that are possible with only 256 qubits exceeds the number of atoms in the solar system," Ebadi said, explaining the system's vast scale.
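That comparison is easy to check with rough arithmetic (the solar-system figure below is a back-of-the-envelope estimate, essentially the Sun's mass divided by the proton mass; it is not a number from the paper):

```latex
2^{256} \approx 1.2 \times 10^{77}
\qquad \text{vs.} \qquad
N_{\text{atoms}} \sim \frac{M_{\odot}}{m_p}
\approx \frac{2 \times 10^{33}\,\mathrm{g}}{1.7 \times 10^{-24}\,\mathrm{g}}
\approx 10^{57}.
```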

Already, the simulator has allowed researchers to observe several exotic quantum states of matter that had never before been realized experimentally, and to perform a quantum phase transition study so precise that it serves as the textbook example of how magnetism works at the quantum level.

These experiments provide powerful insights into the quantum physics that underlies material properties and can help show scientists how to design new materials with exotic properties.

The project uses a significantly upgraded version of a platform the researchers developed in 2017, which was capable of reaching a size of 51 qubits. That older system allowed the researchers to capture ultra-cold rubidium atoms and arrange them in a specific order using a one-dimensional array of individually focused laser beams called optical tweezers.

The new system allows the atoms to be assembled in two-dimensional arrays of optical tweezers, increasing the achievable system size from 51 to 256 qubits. The tweezers let researchers arrange the atoms in defect-free patterns and create programmable shapes, such as square, honeycomb, or triangular lattices, to engineer different interactions between the qubits.

"The workhorse of this new platform is a device called the spatial light modulator, which is used to shape an optical wavefront to produce hundreds of individually focused optical tweezer beams," Ebadi said. "These devices are essentially the same as what is used inside a computer projector to display images on a screen, but we have adapted them to be a critical component of our quantum simulator."

The initial loading of the atoms into the optical tweezers is random, so the researchers must move the atoms around to arrange them into the target geometry. They use a second set of moving optical tweezers to drag the atoms to their desired locations, eliminating the initial randomness. Lasers give the researchers complete control over the positioning of the atomic qubits and their coherent quantum manipulation.

Other senior authors of the study include Harvard professors Subir Sachdev and Markus Greiner, who worked on the project along with Massachusetts Institute of Technology professor Vladan Vuletić, and scientists from Stanford University, the University of California, Berkeley, the University of Innsbruck in Austria, the Austrian Academy of Sciences, and QuEra Computing Inc. in Boston.

"Our work is part of a really intense, high-visibility global race to build bigger and better quantum computers," said Tout Wang, a physics research associate at Harvard and one of the paper's authors. "The overall effort [beyond our own] has leading academic research institutions involved and major private-sector investment from Google, IBM, Amazon, and many others."

The researchers are currently working to improve the system by refining laser control over the qubits and making the system more programmable. They are also actively exploring how the system can be used for new applications, ranging from probing exotic forms of quantum matter to solving challenging real-world problems that can be naturally encoded in the qubits.

"This work enables a vast number of new scientific directions," Ebadi said. "We are nowhere near the limits of what can be done with these systems."


Continue reading here:

Harvard-led physicists have taken a major step in the competition with quantum computing - Illinoisnewstoday.com


Quantum computing: this is how quantum programming works using the example of random walk – Market Research Telecast

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on Quantum computing: this is how quantum programming works using the example of random walk – Market Research Telecast


See original here:

Quantum computing: this is how quantum programming works using the example of random walk - Market Research Telecast


IBM shows the advantages of a quantum computer over traditional computers – Tech News Inc

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on IBM shows the advantages of a quantum computer over traditional computers – Tech News Inc

Among the most promising applications of quantum computing, quantum machine learning is set to make waves. But how that will be achieved is still a bit of a mystery.

IBM researchers now claim to have mathematically proven that, with a quantum approach, some machine learning problems can be solved faster than they can be by conventional computers.

Machine learning is a well-established branch of artificial intelligence, and it is already used in many industries to solve different problems. This involves training an algorithm with large data sets, in order to allow the model to identify different patterns and ultimately calculate the best answer when new information is provided.

With larger data sets, a machine learning algorithm can be improved to provide more accurate answers, but this comes at a computational cost that quickly reaches the limits of traditional hardware. Thats why researchers hope that one day they will be able to harness the enormous computing power of quantum techniques to take machine learning models to the next level.

One method in particular, called the quantum kernel method, is the subject of many research papers. In this approach, a quantum computer intervenes in only one part of the overall algorithm: expanding the so-called feature space, that is, the set of properties used to characterize the data submitted to the model, such as gender or age if the system is being trained to recognize patterns about people.

To put it simply, with a quantum kernel approach, a quantum computer can distinguish between a larger number of features and thus identify patterns even in a huge database where a classical computer would see nothing but random noise.
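To make the kernel idea concrete, here is a minimal classical sketch in plain Python/NumPy, with made-up toy data; this is not IBM's algorithm. A kernel function implicitly maps points into a richer feature space, where a simple linear rule can separate classes that look inseparable in the raw coordinates. A quantum kernel method replaces the kernel function below with values estimated from overlaps of quantum states prepared on a quantum processor.

```python
# Toy kernel classification: points inside vs. outside a circle are not
# linearly separable in the plane, but an RBF kernel separates them easily.
import numpy as np

rng = np.random.default_rng(0)

X = rng.uniform(-1, 1, size=(200, 2))
y = np.where(np.sum(X**2, axis=1) < 0.5, 1.0, -1.0)

def rbf_kernel(A, B, gamma=2.0):
    """Gram matrix of the RBF kernel -- an implicit, very rich feature map."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

# Kernel ridge "classifier": solve (K + lambda*I) alpha = y,
# then predict with sign(K_test @ alpha).
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(X)), y)

X_test = rng.uniform(-1, 1, size=(50, 2))
y_test = np.where(np.sum(X_test**2, axis=1) < 0.5, 1.0, -1.0)
pred = np.sign(rbf_kernel(X_test, X) @ alpha)
print("toy accuracy:", np.mean(pred == y_test))
```

IBM's result concerns data sets engineered so that no efficient classical kernel can achieve this kind of separation, while a quantum kernel can.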


IBM's researchers set out to use this approach on a specific type of machine learning problem called classification. As the IBM team explains, the most common example of a classification problem is a computer that is given pictures of dogs and cats and is trained on this data set, the ultimate goal being for it to automatically tag all future images it receives as either a dog or a cat, producing accurate tags in the least amount of time.

Big Blue's scientists developed a new classification task and found that a quantum algorithm using the quantum kernel method was able to find the relevant features in the data for accurate labeling, while to classical computers the data set looked like random noise.

"The routine we are using is a general method that in principle can be applied to a wide range of problems," Kristan Temme, a researcher at IBM Quantum, told ZDNet. "In our research paper, we formally demonstrated that a quantum kernel estimation routine can lead to learning algorithms that, for specific problems, go beyond classical machine learning approaches."

To demonstrate the advantage of the quantum method over the classical approach, the researchers created a classification problem for which data could be generated on a classical computer, and showed that no classical algorithm could do better than random guessing at that problem.

However, when the data was mapped through a quantum feature map, the quantum algorithm was able to predict the labels accurately and quickly.

The research team concludes: "This paper can be considered an important step in the field of quantum machine learning, as it demonstrates an end-to-end quantum speed-up for a quantum kernel method implemented fault-tolerantly with realistic assumptions."


Of course, the classification task developed by IBM's scientists was specifically designed to determine whether the quantum kernel method can be useful, and it is still far from ready to be applied to any kind of large-scale business problem.

According to Kristan Temme, this is mainly due to the limited size of IBM's current quantum computers, which so far support fewer than 100 qubits. That is far from the thousands, if not millions, of qubits that scientists believe will be necessary to start creating value in the field of quantum technologies.

"At this point, we can't cite a specific use case and say this will have a direct impact," the researcher adds. "We have not yet realized the implementation of a large quantum machine learning algorithm. The size of such an algorithm is, of course, directly tied to the development of quantum hardware."

IBM's latest experiment also applies to one specific type of classification problem, and it does not mean that all machine learning problems will benefit from the use of quantum kernels.

But the results open the door to further research in this area, to see if other machine learning problems could benefit from using this method.

Much of this work is still up for debate, and the IBM team acknowledges that any new discovery in this area comes with many caveats. But until quantum hardware improves, the researchers are committed to continuing to prove the value of quantum algorithms, even if only from a mathematical point of view.


Source : ZDNet.com

Excerpt from:

IBM shows the advantages of a quantum computer over traditional computers - Tech News Inc


The Future of Data Encryption: What You Need to Know Now – FedTech Magazine

§ July 17th, 2021 § Filed under Quantum Computer Comments Off on The Future of Data Encryption: What You Need to Know Now – FedTech Magazine

Making Encryption Harder, Better, Faster and Stronger

In response, the industry is advancing encryption on several fronts. Some efforts are focused on increasing key sizes to protect against brute-force decryption. Other efforts are looking at new cryptographic algorithms. For example, the National Institute of Standards and Technology is evaluating next-generation public-key algorithms intended to be quantum safe.

The trouble is that most quantum-safe algorithms aren't efficient on classical computer architectures. To address this problem, the industry is focused on developing accelerators to speed up these algorithms on x86 platforms.

A third area of research is homomorphic encryption, an amazing concept that allows users to perform calculations on encrypted data without first decrypting it. With it, an analyst who needs to could query a database containing classified information without having to ask an analyst with a higher clearance to access the data or request that the data be declassified.
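As a concrete illustration of computing on encrypted data, here is a toy implementation of the Paillier cryptosystem, a textbook scheme that is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of their plaintexts. This is a generic illustration with deliberately insecure tiny parameters, not the technology discussed in the article; fully homomorphic schemes, which support arbitrary computation, are far more complex.

```python
# Toy Paillier encryption (Python 3.9+): Enc(a) * Enc(b) decrypts to a + b.
# Tiny hard-coded primes for illustration only -- real systems use vetted
# libraries and 2048-bit (or larger) keys.
import math
import random

p, q = 293, 433                 # toy primes; NOT secure
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)    # the exponent lambda(n)
g = n + 1                       # standard choice of generator

def L(u):
    # The Paillier "L" function: valid inputs satisfy u = 1 (mod n).
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # modular inverse of L(g^lam mod n^2)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 20, 22
ca, cb = encrypt(a), encrypt(b)
# Multiplying ciphertexts adds the underlying plaintexts (mod n):
assert decrypt((ca * cb) % n2) == a + b
print(decrypt((ca * cb) % n2))  # 42, computed without decrypting ca or cb
```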

A big advantage of homomorphic encryption is that it protects data in all its states: at rest (stored on a hard drive), in motion (transmitted across a network), and in use (while in computer memory). Another boon is that it's quantum safe, because it's based on some of the same math as quantum computing.

A downside is that homomorphic encryption performs very poorly on traditional computers, because it's not designed to work with them. The industry is collaborating to develop x86-style instructions to make these new cryptosystems operate at cloud speeds. Practical applications are still a few years away, but we're confident we'll get there.


In the interim, a new encryption capability has emerged that organizations can take advantage of right now: confidential computing. Confidential computing safeguards data while it's being acted upon in computer memory; for example, while a user is conducting analytics on a database.

Confidential computing works by having the CPU reserve a section of memory as a secure enclave, encrypting the memory in the enclave with a key unique to the CPU. Data and application code placed in the enclave can be decrypted only within that enclave, on that CPU. Even if attackers gained root access to the system, they wouldn't be able to read the data.

With the latest generation of computer processors, a two-CPU server can create a 1 terabyte enclave. That enables organizations to place an entire database or transaction server inside the enclave.

The functionality is now being extended with the ability to encrypt all of a computer's memory with minimal impact on performance. Total memory encryption uses a platform-specific encryption key that's randomly derived each time the system is booted up. When the computer is turned off, the key goes away. So even if cybercriminals stole the CPU, they wouldn't be able to access the memory.

Confidential computing transforms the way organizations approach security in the cloud, because they no longer have to implicitly trust the cloud provider. Instead, they can protect their data while it's in use, even though it's being hosted by a third party.

One major cloud provider already offers a confidential computing service to the federal government, and more will surely follow. Agencies can now build enclave-based applications to protect data in use in a dedicated cloud that meets government security and compliance requirements.

The need for strong data encryption won't go away, and the encryption challenges will only increase as quantum computing emerges over the next several years. In the meantime, innovative new encryption capabilities are delivering tighter cybersecurity to agencies today, and the industry is investing in the next generation of cryptosystems to protect government information for the next 25 years.

Originally posted here:

The Future of Data Encryption: What You Need to Know Now - FedTech Magazine


Bigger quantum computers, faster: This new idea could be the quickest route to real world apps – ZDNet

§ July 4th, 2021 § Filed under Quantum Computer Comments Off on Bigger quantum computers, faster: This new idea could be the quickest route to real world apps – ZDNet

Rigetti launched the multi-chip device with the objective of reaching 80 qubits later this year, up from the current 31 qubits supported by the company's Aspen processor.

Finding out how to pack as many high-quality qubits as possible onto a single quantum processor is a challenge that still keeps most researchers scratching their heads. But now quantum startup Rigetti Computing has come up with a radically new approach to the problem.

Instead of focusing on increasing the size of a single quantum processor, Rigetti has linked several smaller chips together to create a modular processor with a higher overall qubit count.

Describing the technology as the world's "first multi-chip quantum processor", the company launched the device with the objective of reaching 80 qubits later this year, up from the current 31 qubits supported by its Aspen processor.

SEE: Building the bionic brain (free PDF) (TechRepublic)

By that time, the new quantum system will be available for Rigetti customers to use over the firm's Quantum Cloud Services platform.

"We've developed a fundamentally new approach to scaling quantum computers," said Chad Rigetti, the founder of Rigetti Computing. "Our proprietary innovations in chip design and manufacturing have unlocked what we believe is the fastest path to building the systems needed to run practical applications and error correction."

Like IBM's and Google's, Rigetti's quantum systems are based on superconducting qubits, which are mounted in arrays on a processor, where they are coupled and controlled using microwave pulses. Each qubit is also connected to a resonator and associated wiring, which enables the system to encode, manipulate, and read out quantum information.

Qubits come with special quantum properties that are expected to lend quantum computers unprecedented computational power. But for that to happen, processors will need to pack a significant number of qubits, far more than they currently do.

For quantum computers to start generating very early value, experts anticipate that at least 1,000 qubits will be necessary; and a million qubits is often cited as the threshold for most useful applications. In contrast, the most powerful quantum processors currently support less than 100 qubits.

Scaling up the number of qubits sitting on a single processor, however, is difficult. This is mostly due to the fragility of qubits, which need to be kept in ultra-protected environments that are colder than outer space to ensure that they remain in their quantum state. More qubits on a chip, therefore, inevitably mean more potential for failure and lower manufacturing yields.

Instead, Rigetti proposes to connect several identical processors, such as those that the company is already capable of reliably manufacturing, into a large-scale quantum processor.

"This modular approach exponentially reduces manufacturing complexity and allows for accelerated, predictable scaling," said the company.

According to Rigetti, this will also enable future systems to scale in multiplicative ways, as individual chips increase their number of qubits, and new technologies enable more of these chips to be connected into larger processors.

With scale being a top priority for virtually every organization in the quantum ecosystem, Rigetti's new launch could well give the startup a competitive advantage, even in an industry crowded with tech giants the likes of Google, IBM, Microsoft and Amazon.

IBM recently unveiled a roadmap for its quantum hardware that aims to build a 1,121-qubit device for release in 2023.

SEE: Quantum computing just took on another big challenge, one that could be as tough as steel

And smaller players are now emerging, often with the goal of exploring alternatives to superconducting qubits that might enable quantum computers to grow faster. UK startup Quantum Motion, for instance, recently published the results of an experiment with qubits on silicon chips.

"There is a race to get from the tens of qubits that devices have today, to the thousands of qubits that future systems will require to solve real-world problems," said Amir Safavi-Naeini, assistant professor of applied physics at Stanford University. "Rigetti's modular approach demonstrates a very promising way of approaching these scales."

As demonstrated by Rigetti's latest announcement, new approaches, methods and technologies are constantly developing in the quantum ecosystem. It is unlikely that one clear way forward will stand out anytime soon.

Read more:

Bigger quantum computers, faster: This new idea could be the quickest route to real world apps - ZDNet


Quantum Computing Breakthrough: Unveiling Properties of New Superconductor – Analytics Insight

§ July 4th, 2021 § Filed under Quantum Computer Comments Off on Quantum Computing Breakthrough: Unveiling Properties of New Superconductor – Analytics Insight

A collaboration between the University of Minnesota's School of Physics and Astronomy and Cornell University has revealed unique properties of a superconducting metal, a breakthrough that could be put to use in quantum computing in the near future. The metal, niobium diselenide (NbSe2), can conduct electricity, transporting electrons, without any resistance.

In its 2D form, niobium diselenide has two-fold symmetry, which makes it a more resilient superconductor. Two types of superconductivity are found in the material: a conventional s-wave type, as in bulk NbSe2, and an unconventional d- or p-wave type in few-layer NbSe2. The two have nearly the same energy because of the constant interaction and competition between them. The research teams from both universities combined the results of two different experimental techniques to arrive at this discovery, and they now want to investigate the properties of NbSe2 further so that its unconventional superconducting states can be used to develop advanced quantum computers.

Superconducting metals help explore the boundary between quantum computing and traditional computing, with applications in quantum information. Qubits transform the capability of quantum computers, operating at much higher speeds than traditional bits. A qubit exists in a superposition of the two values 0 and 1 simultaneously, with amplitudes alpha and beta. Quantum computers may require on the order of 10,000 qubits to do genuinely useful work and help untangle nature's mysteries. Superconducting circuits are one way to build a solid-state qubit, alongside quantum dots and single-donor systems. Superconductors are metals in which the electrons condense into a single superfluid that can move through the metal's lattice without any resistance.
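The standard way to write that superposition, a textbook formula rather than anything specific to this research, is:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1,
```

where measuring the qubit yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.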

The discovery of 2D crystalline superconductors has opened up a plethora of methods for investigating unconventional quantum mechanics. The high-quality monolayer superconductor NbSe2 is grown by chemical vapor deposition. The growth of these superconductors depends on ultrahigh vacuum and dangling-bond-free substrates, which help reduce environment- and substrate-induced defects.

Hence, the field now awaits further discoveries of unique properties in superconducting metals that can advance quantum computing and bring breakthroughs across industries.

Excerpt from:

Quantum Computing Breakthrough: Unveiling Properties of New Superconductor - Analytics Insight


Missing Piece Discovered in the Puzzle of Optical Quantum Computing – SciTechDaily

§ July 4th, 2021 § Filed under Quantum Computer Comments Off on Missing Piece Discovered in the Puzzle of Optical Quantum Computing – SciTechDaily

Jung-Tsung Shen, associate professor in the Department of Electrical & Systems Engineering, has developed a deterministic, high-fidelity, two-bit quantum logic gate that takes advantage of a new form of light. This new logic gate is orders of magnitude more efficient than the current technology. Credit: Jung-Tsung Shen

An efficient two-bit quantum logic gate has been out of reach, until now.

Research from the McKelvey School of Engineering at Washington University in St. Louis has found a missing piece in the puzzle of optical quantum computing.

Jung-Tsung Shen, associate professor in the Preston M. Green Department of Electrical & Systems Engineering, has developed a deterministic, high-fidelity two-bit quantum logic gate that takes advantage of a new form of light. This new logic gate is orders of magnitude more efficient than the current technology.

"In the ideal case, the fidelity can be as high as 97%," Shen said.

His research was published in May 2021 in the journal Physical Review A.

The potential of quantum computers is bound up with the unusual properties of superposition (the ability of a quantum system to contain many distinct properties, or states, at the same time) and entanglement (two particles acting as if they are correlated in a non-classical manner, despite being physically removed from each other).

Where voltage determines the value of a bit (a 1 or a 0) in a classical computer, researchers often use individual electrons as qubits, the quantum equivalent. Electrons have several traits that suit them well to the task: they are easily manipulated by an electric or magnetic field, and they interact with each other. Interaction is a benefit when you need two bits to be entangled, letting the wilderness of quantum mechanics manifest.

But their propensity to interact is also a problem. Everything from stray magnetic fields to power lines can influence electrons, making them hard to truly control.

For the past two decades, however, some scientists have been trying to use photons as qubits instead of electrons. "If computers are going to have a true impact, we need to look into creating the platform using light," Shen said.

Photons have no charge, which leads to the opposite problems: they do not interact with their environment the way electrons do, but they also do not interact with each other. Engineering effective ad hoc inter-photon interactions has likewise been challenging. Or so traditional thinking went.

Less than a decade ago, scientists working on this problem discovered that, even if two photons weren't entangled as they entered a logic gate, the act of measuring them when they exited led them to behave as if they had been. The unique features of measurement are another wild manifestation of quantum mechanics.

"Quantum mechanics is not difficult, but it's full of surprises," Shen said.

The measurement discovery was groundbreaking, but not quite game-changing. That's because for every 1,000,000 photons, only one pair became entangled. Researchers have since been more successful, but, Shen said, "It's still not good enough for a computer," which has to carry out millions to billions of operations per second.

Shen was able to build a two-bit quantum logic gate with such efficiency because of the discovery of a new class of quantum photonic states: photonic dimers, photons entangled in both space and frequency. His prediction of their existence was experimentally validated in 2013, and he has since been finding applications for this new form of light.

When a single photon enters a logic gate, nothing notable happens; it goes in and comes out. But when there are two photons, "that's when we predicted the two can make a new state, photonic dimers. It turns out this new state is crucial."

High-fidelity, two-bit logic gate, designed by Jung-Tsung Shen. Credit: Jung-Tsung Shen

Mathematically, there are many ways to design a logic gate for two-bit operations. These different designs are called equivalent. The specific logic gate that Shen and his research group designed is the controlled-phase gate (or controlled-Z gate). The principal function of the controlled-phase gate is that the two photons that come out are in the negative state of the two photons that went in.

"In classical circuits, there is no minus sign," Shen said. "But in quantum computing, it turns out the minus sign exists and is crucial."
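In matrix form, the controlled-Z gate acting on the two-qubit basis states |00>, |01>, |10>, |11> is the standard textbook operator below; the minus sign Shen refers to is the lone -1 entry, which flips the sign of the |11> component only:

```latex
\mathrm{CZ} =
\begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & -1
\end{pmatrix},
\qquad
\mathrm{CZ}\,|11\rangle = -\,|11\rangle .
```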


When two independent photons (representing two optical qubits) enter the logic gate, "the design of the logic gate is such that the two photons can form a photonic dimer," Shen said. "It turns out the new quantum photonic state is crucial, as it enables the output state to have the correct sign that is essential to the optical logic operations."

Shen has been working with the University of Michigan to test his design, which is a solid-state logic gate, one that can operate under moderate conditions. So far, he says, results seem positive.

Shen says this result, while baffling to most, is clear as day to those in the know.

"It's like a puzzle," he said. "It may be complicated to do, but once it's done, just by glancing at it, you will know it's correct."

Reference: Two-photon controlled-phase gates enabled by photonic dimers by Zihao Chen, Yao Zhou, Jung-Tsung Shen, Pei-Cheng Ku and Duncan Steel, 21 May 2021, Physical Review A. DOI: 10.1103/PhysRevA.103.052610

This research was supported by the National Science Foundation, ECCS grants nos. 1608049 and 1838996. It was also supported by the 2018 NSF Quantum Leap (RAISE) Award.

The McKelvey School of Engineering at Washington University in St. Louis promotes independent inquiry and education with an emphasis on scientific excellence, innovation and collaboration without boundaries. McKelvey Engineering has top-ranked research and graduate programs across departments, particularly in biomedical engineering, environmental engineering and computing, and has one of the most selective undergraduate programs in the country. With 140 full-time faculty, 1,387 undergraduate students, 1,448 graduate students and 21,000 living alumni, we are working to solve some of society's greatest challenges; to prepare students to become leaders and innovate throughout their careers; and to be a catalyst of economic development for the St. Louis region and beyond.

Follow this link:

Missing Piece Discovered in the Puzzle of Optical Quantum Computing - SciTechDaily

