
Quantum computing now is a bit like SQL was in the late 80s: Wild and wooly and full of promise – ZDNet

§ November 20th, 2020 § Filed under Quantum Computer

Quantum computing is bright and shiny, with demonstrations by Google suggesting a kind of transcendent ability to scale beyond the heights of known problems.

But there's a real bummer in store for anyone with their head in the clouds: All that glitters is not gold, and there's a lot of hard work to be done on the way to someday computing NP-hard problems.

"ETL, if you get that wrong in this flow-based programming, if you get the data frame wrong, it's garbage in, garbage out," said Christopher Savoie, CEO and co-founder of the three-year-old startup Zapata Computing of Boston, Mass.

"There's this naive idea you're going to show up with this beautiful quantum computer, and just drop it in your data center, and everything is going to be solved; it's not going to work that way," said Savoie, in a video interview with ZDNet. "You really have to solve these basic problems."


Zapata sells a programming tool for quantum computing, called Orquestra. It can let developers invent algorithms to be run on real quantum hardware, such as Honeywell's trapped-ion computer.

But most of the work of quantum today is not writing pretty algorithms; it's just making sure data is not junk.

"Ninety-five percent of the problem is data cleaning," Savoie told ZDNet in a video interview. "There wasn't any great toolset out there, so that's why we created Orquestra to do this."

The company on Thursday announced it has received a Series B round of investment totaling $38 million from large investors that include Honeywell's venture capital outfit and returning Series A investors Comcast Ventures, Pitango, and Prelude Ventures, among others. The company has now raised $64.4 million.

Also: Honeywell introduces quantum computing as a service with subscription offering

Zapata was spun out of Harvard University in 2017 by scholars including Alán Aspuru-Guzik, who has done fundamental work on quantum. But much of what is coming up is the mundane matter of data prep and other gotchas that can be a nightmare in a bold new world of only partially understood hardware.

Things such as extract, transform, load (ETL) become maddening when prepping a quantum workload.

"We had a customer who thought they had a compute problem because they had a job that was taking a long time; it turned out, when we dug in, just parallelizing the workflow, the ETL, gave them a compute advantage," recalled Savoie.
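The fix Savoie recalls, parallelizing the transform step of an ETL workflow instead of processing records serially, can be sketched in a few lines of Python. This is a hypothetical illustration; the record layout and the `transform` step are invented for the example, not taken from Orquestra.

```python
from concurrent.futures import ThreadPoolExecutor

def transform(record):
    # Stand-in for an expensive per-record cleaning step
    # (type coercion, deduplication, normalization, ...).
    return {"id": record["id"], "value": record["value"] * 2}

def run_etl(records, workers=4):
    # Fan the transform out across a worker pool instead of
    # looping over the records one at a time.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(transform, records))

if __name__ == "__main__":
    data = [{"id": i, "value": i} for i in range(8)]
    print(run_etl(data))
```

For a CPU-bound transform, a process pool (or a distributed scheduler) would be the more realistic choice; the threaded sketch only shows the shape of the change, not production tuning.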

Such pitfalls are things, said Savoie, that companies don't know are an issue until they get ready to spend valuable time on a quantum computer and code doesn't run as expected.

"That's why we think it's critical for companies to start now," he said, even though today's noisy intermediate-scale quantum, or NISQ, machines have only a handful of qubits.

"You have to solve all these basic problems we really haven't even solved yet in classical computing," said Savoie.

The present moment in the young field of quantum sounds a bit like the early days of microcomputer-based relational databases. In fact, Savoie likes to draw an analogy to the era of the 1980s and 1990s, when the Oracle database was taking over workloads from IBM's DB2.

Also: What the Google vs. IBM debate over quantum supremacy means

"Oracle is a really good analogy," he said. "Recall when SQL wasn't even a thing, and databases had to be tuned on a per-on-premises, as-a-solution basis; how do I use a database versus storage? There weren't a lot of tools for those things, and every installment was an engagement, really," recalled Savoie.

"There are a lot of close analogies to that" with today's quantum, said Savoie. "It's enterprise, it's tough problems, it's a lot of big data, it's a lot of big compute problems, and we are the software company sitting in the middle of all that with a lot of tools that aren't there yet."

Mind you, Savoie is a big believer in quantum's potential, despite pointing out all the challenges. He has seen how technologies can get stymied, but also how they ultimately triumph. In 1998 he helped found the startup Dejima, one of the companies that became a component of Apple's Siri voice assistant. Dejima didn't produce an AI wave; it was sold to database giant Sybase.

"We invented this natural language understanding engine, but we didn't have the great SpeechWorks engine, we didn't have 3G, never mind 4G cell phones or OLED displays," he recalled. "It took ten years from 1998 till it was a product, till it was Siri. So I've seen this movie before; I've been in that movie."

But the technology of NLP did survive and is now thriving. Similarly, the basic science of quantum, like the basic science of NLP, is real and validated. "Somebody is going to be the iPhone" of quantum, he said, though along the way there may be a couple of Apple Newtons, too, he quipped.

Even an Apple Newton of quantum will be a breakthrough. "It will be solving real problems," he said.

Also: All that glitters is not quantum AI

In the meantime, handling the complexity that's cropping up now, with things like ETL, suggests there's a role for a young company that can be for quantum what Oracle was for structured query language.

"You build that out, and you have best practices, and you can become a great company, and that's what we aspire to," he said.

Zapata has fifty-eight employees and has had contract revenue since its first year of operations; revenue has doubled each year, said Savoie.


Cracking the secrets of an emerging branch of physics – MIT News

§ November 20th, 2020 § Filed under Quantum Computer

Thanh Nguyen is in the habit of breaking down barriers. Take languages, for instance: Nguyen, a third-year doctoral candidate in nuclear science and engineering (NSE), wanted to connect with other people and cultures for his work and social life, he says, so he learned Vietnamese, French, German, and Russian, and is now taking an MIT course in Mandarin. But this drive to push past obstacles really comes to the fore in his research, where Nguyen is trying to crack the secrets of a new and burgeoning branch of physics.

"My dissertation focuses on neutron scattering on topological semimetals, which were only experimentally discovered in 2015," he says. "They have very special properties, but because they are so novel, there's a lot that's unknown, and neutrons offer a unique perspective to probe their properties at a new level of clarity."

Topological materials don't fit neatly into conventional categories of substances found in everyday life. They first materialized in the 1980s, but only became practical in the mid-2000s with a deepened understanding of topology, which concerns itself with geometric objects whose properties remain the same even when the objects undergo extreme deformation. Researchers experimentally discovered topological materials even more recently, using the tools of quantum physics.

Within this domain, topological semimetals, which share qualities of both metals and semiconductors, are of special interest to Nguyen. "They offer high levels of thermal and electric conductivity, and inherent robustness, which makes them very promising for applications in microelectronics, energy conversion, and quantum computing," he says.

Intrigued by the possibilities that might emerge from such unconventional physics, Nguyen is pursuing two related but distinct areas of research: "On the one hand, I'm trying to identify and then synthesize new, robust topological semimetals, and on the other, I want to detect fundamental new physics with neutrons and further design new devices."

On a fast research track

Reaching these goals over the next few years might seem a tall order. But at MIT, Nguyen has seized every opportunity to master the specialized techniques required for conducting large-scale experiments with topological materials, and getting results. Guided by his advisor, Mingda Li, the Norman C Rasmussen Assistant Professor and director of the Quantum Matter Group within NSE, Nguyen was able to dive into significant research even before he set foot on campus.

"The summer before I joined the group, Mingda sent me on a trip to Argonne National Laboratory for a very fun experiment that used synchrotron X-ray scattering to characterize topological materials," recalls Nguyen. "Learning the techniques got me fascinated in the field, and I started to see my future."

During his first two years of graduate school, he participated in four studies, serving as a lead author in three journal papers. In one notable project, described earlier this year in Physical Review Letters, Nguyen and fellow Quantum Matter Group researchers demonstrated, through experiments conducted at three national laboratories, unexpected phenomena involving the way electrons move through a topological semimetal, tantalum phosphide (TaP).

"These materials inherently withstand perturbations such as heat and disorder, and can conduct electricity with a level of robustness," says Nguyen. "With robust properties like this, certain materials can conduct electricity better than the best metals, and in some circumstances superconductors, which is an improvement over current-generation materials."

This discovery opens the door to topological quantum computing. Current quantum computing systems, where the elemental units of calculation are qubits that perform superfast calculations, require superconducting materials that only function in extremely cold conditions. Fluctuations in heat can throw one of these systems out of whack.

The properties inherent to materials such as TaP could form the basis of future qubits, says Nguyen. He envisions synthesizing TaP and other topological semimetals, a process involving the delicate cultivation of these crystalline structures, and then characterizing their structural and excitational properties with the help of neutron and X-ray beam technology, which probe these materials at the atomic level. This would enable him to identify and deploy the right materials for specific applications.

"My goal is to create programmable artificial structured topological materials, which can directly be applied as a quantum computer," says Nguyen. "With infinitely better heat management, these quantum computing systems and devices could prove to be incredibly energy efficient."

Physics for the environment

Energy efficiency and its benefits have long concerned Nguyen. A native of Montreal, Quebec, with an aptitude for math and physics and a concern for climate change, he devoted his final year of high school to environmental studies. "I worked on a Montreal initiative to reduce heat islands in the city by creating more urban parks," he says. "Climate change mattered to me, and I wanted to make an impact."

At McGill University, he majored in physics. "I became fascinated by problems in the field, but I also felt I could eventually apply what I learned to fulfill my goals of protecting the environment," he says.

In both classes and research, Nguyen immersed himself in different domains of physics. He worked for two years in a high-energy physics lab making detectors for neutrinos, part of a much larger collaboration seeking to verify the Standard Model. In the fall of his senior year at McGill, Nguyen's interest gravitated toward condensed matter studies. "I really enjoyed the interplay between physics and chemistry in this area, and especially liked exploring questions in superconductivity, which seemed to have many important applications," he says. That spring, seeking to add useful skills to his research repertoire, he worked at Ontario's Chalk River Laboratories, where he learned to characterize materials using neutron spectroscopes and other tools.

These academic and practical experiences served to propel Nguyen toward his current course of graduate study. "Mingda Li proposed an interesting research plan, and although I didn't know much about topological materials, I knew they had recently been discovered, and I was excited to enter the field," he says.

Man with a plan

Nguyen has mapped out the remaining years of his doctoral program, and they will prove demanding. "Topological semimetals are difficult to work with," he says. "We don't yet know the optimal conditions for synthesizing them, and we need to make these crystals, which are micrometers in scale, in quantities large enough to permit testing."

With the right materials in hand, he hopes to develop a qubit structure that isn't so vulnerable to perturbations, "quickly advancing the field of quantum computing so that calculations that now take years might require just minutes or seconds," he says. "Vastly higher computational speeds could have enormous impacts on problems like climate, or health, or finance that have important ramifications for society." If his research on topological materials benefits the planet or improves how people live, says Nguyen, "I would be totally happy."


Neighbor discussion: I would like to post an obituary for my brother, Harry L…. – Patch.com

§ November 20th, 2020 § Filed under Quantum Computer

WILMETTE RESIDENT SALLY SCHOCH IS NUTS! AT 86 YEARS YOUNG, AND DURING A PANDEMIC, SHE OPENS SALLY'S NUTS AND SNACK SHOP IN THE RAVINIA DISTRICT OF HIGHLAND PARK

This year may be a bit nuts, but nothing is stopping 86-years-young Sally Schoch from fulfilling her dream. She and her daughter, Kari Guhl, are the proud owners of the newly opened Sally's Nuts and Snack Shop in the Ravinia District of Highland Park, offering their signature sweet & salty Sally's Nuts as well as a grab-n-go menu of fun, affordable simple sandwiches, salads, snacks, cheese boards, sweets, drinks, and fun inspired merchandise.

"It's never too late to chase a dream!" says Highland Park resident Kari Guhl of her mom's 15-year passion for making and perfecting nuts for her family and friends. "Everyone would tell Sal she should sell her delicious creations, and now I am so happy to see my mom's dream become a reality. As Sal says, 'I want to achieve my dream, and who knows what the future holds.'"

Sally's Nuts handcrafted sweet & salty pecans, cashews, and almonds are the perfect accompaniment to the Snack Shop's array of homemade sandwiches, salads, and cheese and meat boards. Some of the affordable, simple options on the menu include peanut butter & jelly, bologna & cheese, and cucumber or egg salad sandwiches, and don't forget the bag of carrots! There are also family-favorite salads, dressings, and sweets recipes, including the creative Junkanoos and Scrabble Mix (you have to come in for yourself to taste and learn more!).

Inspired by Kari's son, Sally's Nuts offers a full line of merch, perfect for bundling with the sensational nuts in terrific tote bags. The Merch Menu ranges from the ordinary (T-shirts, sweatshirts, knit beanies, and baseball hats) to the more unique (aprons, tea towels, sponges, and cutting boards), making a customized gift for any Sally or nut a great holiday option.

Gaining recognition as the nuttiest girls around, this dynamic mother-daughter duo can be found making the handcrafted nuts and all the yummy snacks in the back of their shop at 481 Roger Williams Avenue. They invite the public to come get a little nutty and sample some of the best nuts around!

Hours are Tuesday- Saturday 10AM- 6PM. For more information visit https://sallysnuts.com or call (847) 226-7042.

ABOUT SALLY SCHOCH

A longtime businesswoman, mother of four, and School of the Art Institute of Chicago graduate, Sally believes that you are never too old to live out your passions. Having sold her art for 63 years, Sally is ready to live out another passion of hers: creating tasty treats. Her love of all things creative, delicious, and celebratory has made Sally's Nuts possible.

Sally has been gifting her famous nuts to family and friends while fine tuning her recipe for the last 15 years. After receiving much encouragement, and the help of her daughter, Sally decided that she would start her own business and share her handcrafted nuts with the community.


Confirming simulated calculations with experiment results – Science Codex

§ November 20th, 2020 § Filed under Quantum Computer

Dr Zi Yang MENG from the Division of Physics and Astronomy, Faculty of Science, the University of Hong Kong (HKU), is pursuing a new paradigm of quantum material research that combines theory, computation and experiment in a coherent manner. Recently, he teamed up with Dr Wei LI from Beihang University, Professor Yang QI from Fudan University, Professor Weiqiang YU from Renmin University and Professor Jinsheng WEN from Nanjing University to untangle the puzzle of the Nobel Prize-winning Kosterlitz-Thouless (KT) phase.

Not long ago, Dr Meng, Dr Li and Dr Qi achieved accurate model calculations of a topological KT phase for the rare-earth magnet TmMgGaO4 (TMGO) by performing computations on the supercomputers Tianhe-1 and Tianhe-2 (see supplementary information). This time, the team overcame several conceptual and experimental difficulties and succeeded in discovering a topological KT phase and its transitions in the same rare-earth magnet via highly sensitive nuclear magnetic resonance (NMR) and magnetic susceptibility measurements, two means of detecting the magnetic response of a material. The former is more sensitive in detecting small magnetic moments, while the latter facilitates easy implementation of the experiment.

These experimental results, further explained by the team's quantum Monte Carlo computations, complete the half-century pursuit of the topological KT phase in a quantum magnetic material, a line of work that eventually led to the 2016 Nobel Prize in Physics. The research findings were recently published in the journal Nature Communications.

KT phase of TMGO is detected

Quantum materials are becoming the cornerstone for the continuous prosperity of human society, including the next-generation AI computing chips that go beyond Moore's law, the high-speed Maglev train, and the topological unit for quantum computers, etc. However, these complicated systems require modern computational techniques and advanced analysis to reveal their microscopic mechanism. Thanks to the fast development of the supercomputing platforms all over the world, scientists and engineers are now making great use of these facilities to discover better materials that benefit our society. Nevertheless, computation cannot stand alone.

In the present investigation, experimental techniques for handling extreme conditions such as low temperature, high sensitivity and strong magnetic fields were required to verify the predictions and make discoveries. This equipment and these techniques were acquired and organised coherently by the team members.

The research is inspired by the KT phase theory discovered by V Berezinskii, J Michael Kosterlitz and David J Thouless, of whom the latter two are laureates of the Nobel Prize in Physics 2016 (together with F Duncan M Haldane) for their theoretical discoveries of topological phases and phase transitions of matter. Topology is a new way of classifying and predicting the properties of materials, and is now becoming the mainstream of quantum material research and industry, with broad potential applications in quantum computers, lossless transmission of signals for information technology, etc. Back in the 1970s, Kosterlitz and Thouless predicted the existence of a topological phase, hence named the KT phase after them, in quantum magnetic materials. Although such phenomena have been found in superfluids and superconductors, the KT phase had not been realised in a bulk magnetic material until it was finally discovered in the present work.

Detecting such an interesting KT phase in a magnetic material is not easy. Usually, 3-dimensional coupling drives a magnetic material into an ordered phase rather than a topological phase at low temperature. And even if a temperature window for the KT phase exists, a highly sensitive measurement technique is required to pick up the unique fluctuation pattern of the topological phase. That is why the phase has been so enthusiastically pursued, yet its experimental discovery defied many previous attempts. After some initial failures, the team members discovered that the NMR method under in-plane magnetic fields does not disturb the low-energy electronic states, because the in-plane moment in TMGO is mostly multipolar, with little interference between the magnetic field and the intrinsic magnetic moments of the material. This consequently allows the intricate topological KT fluctuations in the phase to be detected sensitively.

As shown in Fig. 1, NMR spin-lattice relaxation rate measurements indeed revealed a KT phase sandwiched between a paramagnetic phase at higher temperature and an antiferromagnetic phase at lower temperature.

This finding establishes a stable KT phase of TMGO, which serves as a concrete example of a topological state of matter in a crystalline material and might have potential applications in future information technologies. With its unique properties of topological excitations and strong magnetic fluctuations, many interesting research directions and potential applications of topological quantum materials can be pursued from here.

Dr Meng said: "It will eventually bring benefits to society. Quantum computers, lossless transmission of signals for information technology, faster and more energy-saving high-speed trains: all these dreams could gradually come true through quantum material research."

"Our approach, combining state-of-the-art experimental techniques with unbiased quantum many-body computation schemes, enables us to directly compare experimental data to accurate numerical results and key theoretical predictions quantitatively, providing a bridge that connects theoretical, numerical and experimental studies. The new paradigm set up by the joint team will certainly lead to more profound and impactful discoveries in quantum materials," he added.

The supercomputers used in computations and simulations

The powerful supercomputers Tianhe-1 and Tianhe-2 in China, used in the computations, are among the world's fastest, having ranked No. 1 in 2010 and 2014 respectively on the TOP500 list (https://www.top500.org/). Their successor, Tianhe-3, is expected to enter service in 2021 and to be the world's first exaFLOPS-scale supercomputer. The quantum Monte Carlo and tensor-network simulations performed by the joint team ran in parallel on the Tianhe supercomputers for thousands of hours on thousands of CPUs; the same work would take more than 20 years to finish on a common PC.
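A back-of-the-envelope check makes the "20 years on a PC" claim plausible. The concrete numbers below are assumptions chosen for illustration, not figures from the article.

```python
# Illustrative arithmetic only: "thousands of CPUs" running for
# "thousands of hours" each, folded onto a single-CPU machine.
cpu_count = 2000        # "thousands of CPUs" (assumed)
hours_per_cpu = 2000    # "thousands of hours" (assumed)
total_cpu_hours = cpu_count * hours_per_cpu

hours_per_year = 24 * 365
years_on_one_cpu = total_cpu_hours / hours_per_year
print(f"{total_cpu_hours:,} CPU-hours is roughly {years_on_one_cpu:.0f} years on one CPU")
```

Even with these modest assumed inputs, the serial equivalent comes out in the hundreds of years, comfortably above the 20-year figure quoted.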


What’s Next In AI, Chips And Masks – SemiEngineering

§ November 20th, 2020 § Filed under Quantum Computer

Aki Fujimura, chief executive of D2S, sat down with Semiconductor Engineering to talk about AI and Moore's Law, lithography, and photomask technologies. What follows are excerpts of that conversation.

SE: In the eBeam Initiative's recent Luminary Survey, the participants had some interesting observations about the outlook for the photomask market. What were those observations?

Fujimura: In the last couple of years, mask revenues have been going up. Prior to that, mask revenues were fairly steady at around $3 billion per year. Recently, they have gone up beyond the $4 billion level, and they're projected to keep going up. Luminaries believe a component of this increase is because of the shift in the industry toward EUV. One question in the survey asked participants, "What business impact will COVID have on the photomask market?" Some people think it may be negative, but the majority of the people believe that it's not going to have much of an effect or it might have a positive effect. At a recent eBeam Initiative panel, the panelists commented that the reason for a positive outlook might be because of the demand picture in the semiconductor industry. The shelter-in-place and work-from-home environments are creating more need and opportunities for the electronics and semiconductor industries.

SE: How will extreme ultraviolet (EUV) lithography impact mask revenues?

Fujimura: In general, two-thirds of the participants in the survey believe that it will have a positive impact. When you go to EUV, you have fewer masks. This is because EUV brings the industry back to single patterning. 193nm immersion with multiple patterning requires more masks at advanced nodes. With EUV, you have fewer masks, but the mask cost for each EUV layer is higher.
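The tradeoff Fujimura describes (fewer masks with EUV, but a higher cost per mask) is easy to sketch numerically. Every number below is an assumption invented for the example, not a figure from the interview.

```python
# Illustrative mask-count and mask-cost arithmetic (all values assumed).
layers = 10                 # critical layers to pattern (assumed)
masks_per_layer_193i = 3    # multiple patterning: several masks per layer (assumed)
masks_per_layer_euv = 1     # EUV returns to single patterning
cost_per_193i_mask = 1.0    # relative cost units (assumed)
cost_per_euv_mask = 2.5     # each EUV mask costs more (assumed)

total_masks_193i = layers * masks_per_layer_193i
total_masks_euv = layers * masks_per_layer_euv
print("mask count:", total_masks_193i, "(193i) vs", total_masks_euv, "(EUV)")
print("mask cost :", total_masks_193i * cost_per_193i_mask,
      "(193i) vs", total_masks_euv * cost_per_euv_mask, "(EUV)")
```

Whether EUV comes out cheaper overall depends entirely on the ratio of per-mask costs to the patterning multiplier, which is why survey respondents could reasonably disagree about the revenue impact.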

SE: For decades, the IC industry has followed the Moore's Law axiom that transistor density in chips doubles every 18 to 24 months. At this cadence, chipmakers can pack more and smaller transistors on a die, but Moore's Law appears to be slowing down. What comes next?

Fujimura: The definition of Moore's Law is changing. It's no longer looking at the trends in CPU clock speeds. That's not changing much. It's scaling more by bit width than by clock speed. A lot of that has to do with thermal properties and other things. We have some theories on where we can make that better over time. On the other hand, if you look at things like massively parallel computing using GPUs, or having more CPU cores, and how quickly you can access memory or how much memory you can access if you include those things, Moore's Law is very much alive. For example, D2S supplies computing systems for the semiconductor manufacturing industry, so we are also a consumer of technology. We do heavy supercomputing, so it's important for us to understand what's happening on the computing capability side. What we see is that our ability to compute is continuing to improve at about the same rate as before. But as programmers we have to adapt how we take advantage of it. It's not like you can take the same code and it automatically scales like it did 20 years ago. You have to understand how that scaling is different at any given point in time. You have to figure out how you can take advantage of the strength of the new generation of technology and then shift your code. So it's definitely harder.

SE: What's happening with the logic roadmap?

Fujimura: We're at 5nm in terms of what people are starting to do now. They are starting to plan 3nm and 2nm. And in terms of getting to the 2nm node, people are pretty comfortable. The question is what happens beyond that. It wasn't too long ago that people were saying: "There's no way we're going to have 2nm." That's been the general pattern in the semiconductor industry. The industry is constantly re-inventing itself. It is extending things longer than people ever thought possible. For example, look how long 193nm optical lithography lasted at advanced nodes. At one time, people were waiting for EUV. There was once a lot of doom and gloom about EUV. But despite it being late, companies developed new processes and patterning schemes to extend 193nm. It takes coordination by a lot of people to make this happen.

SE: How long can we extend the current technology?

Fujimura: There's no question that there is a physical limit, but we are still good for the next 10 years.

SE: There's a lot of activity around AI and machine learning. Where do you see deep learning fitting in?

Fujimura: Deep learning is a subset of machine learning. It's the subset that's made machine learning revolutionary. The general idea of deep learning is to mimic how the brain works with a network of neurons or nodes. The programmer first determines what kind of a network to use. The programmer then trains the network by presenting it with a whole bunch of data. Often, the network is trained on labeled data. Using defect classification as an example, a human or some other program labels each picture as being a defect or not, and may also label what kind of defect it is, or even how it should be repaired. The deep learning engine iteratively optimizes the weights in the network. It automatically finds a set of weights that results in the network best mimicking the labels. Then, the network is tried on data that it wasn't trained on, to test whether the network learned as intended.
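The workflow Fujimura outlines (fit weights to labeled data, then check the model on data it was not trained on) can be sketched with a single linear neuron standing in for a deep network. This is a toy illustration with made-up data, not a description of any real defect-classification product.

```python
# Minimal train/test sketch: a one-input perceptron learns a threshold
# from labeled examples, then is evaluated on held-out data.

def predict(w, b, x):
    # The "network": one weight, one bias, a hard threshold.
    return 1 if w * x + b > 0 else 0

def train(samples, epochs=50, lr=0.1):
    # samples: list of (feature, label) pairs; classic perceptron update.
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            err = y - predict(w, b, x)   # 0 when the label is mimicked
            w += lr * err * x
            b += lr * err
    return w, b

# Labeled training data: label is 1 when the feature exceeds 5.
train_set = [(x, int(x > 5)) for x in range(10)]
w, b = train(train_set)

# Held-out data the model never saw during training.
test_set = [(2.5, 0), (7.5, 1)]
accuracy = sum(predict(w, b, x) == y for x, y in test_set) / len(test_set)
print("held-out accuracy:", accuracy)
```

A deep network replaces the single neuron with millions of weights and the hand-written update with backpropagation, but the train-then-hold-out shape of the process is the same.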

SE: What can't deep learning do?

Fujimura: Deep learning does not reason. Deep learning does pattern matching. Amazingly, it turns out that many of the world's problems are solvable purely with pattern matching. What you can do with deep learning is a set of things that you just can't do with conventional programming. I was an AI student in the early 1980s. Many of the best computer scientists in the world back then (and ever since) were already trying hard to create a chess program that could beat the chess masters. It wasn't possible until deep learning came along. Applied to semiconductor manufacturing, or any field, there are classes of problems that had not been practically possible without deep learning.

SE: Years ago, there wasn't enough compute power to make machine learning feasible. What changed?

Fujimura: The first publication describing convolutional neural networks was in 1975. The researcher, Dr. Kunihiko Fukushima, called it the neocognitron back then, but the paper basically describes deep learning. But computational capability simply wasn't sufficient. Deep learning was enabled with what I call useful waste in massive computations by cost-effective GPUs.

SE: What problems can deep learning solve?

Fujimura: Deep learning can be used for any data. For example, people use it for text-to-speech, speech-to-text, or automatic translation. Where deep learning is most evolved today is when we are talking about two-dimensional data and image processing. A GPU happens to be a good platform for deep learning because of its single-instruction, multiple-data (SIMD) processing nature. The SIMD architecture is also good at image processing, so it makes sense that it's applied in that way. So for any problem in which a human expert can look at a picture without any other background knowledge and tell something with high probability, deep learning is likely to do well.
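The pattern matching on two-dimensional data that Fujimura describes can be illustrated with a toy sliding-window match score, the core operation inside a convolutional layer. The image and kernel below are invented for the example; the relevant point is that every window position is computed independently, which is exactly what SIMD/GPU hardware parallelizes.

```python
# Slide a small kernel over an image and score how well each
# position matches the pattern (a plain cross-correlation).
def match_scores(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    scores = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        scores.append(row)
    return scores

# A 4x4 "image" with a bright 2x2 patch in the lower-right corner.
image = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [[1, 1], [1, 1]]  # the pattern we are looking for
scores = match_scores(image, kernel)
print(scores)  # the score peaks where the patch sits
```

A trained convolutional network learns many such kernels from labeled examples instead of having them written by hand, but each layer still performs this same independent-window computation.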

SE: What about machine learning in semiconductor manufacturing?

Fujimura: We have already started to see products incorporating deep learning, both in software and equipment. Any tedious, error-prone process that human operators need to perform, particularly one involving visual inspection, is a great candidate for deep learning. There are many opportunities in inspection and metrology. There are also many opportunities in software to produce more accurate results faster, to help with the turnaround-time issues in leading-edge mask shops. And there are many opportunities in correlating big data from mask shops and machine log files with machine learning for predictive maintenance.

SE: What are the challenges?

Fujimura: Deep learning is only as good as the data it is given, so caution is required in deploying it. For example, if deep learning is used to screen resumes by learning from labels provided by prior hiring practices, it learns the biases that are already built into those past practices, even if unintended. If operators tend to make a certain type of mistake in categorizing an image, deep learning that learned from data labeled by those operators' past behavior would learn to make the same mistake. If deep learning is used to identify suspected criminal behavior in images captured by cameras on the street, based on a past history of arrests, it will try its best to mimic the past behavior. If deep learning is used to identify what a social media user tends to want to see in order to maximize advertising revenues, it will learn to be extremely good at showing the user exactly what the user tends to watch, even if it is highly biased, fake or inappropriate. If misused, deep learning can accentuate and accelerate human addiction and biases. Deep learning is a powerful weapon that relies on the humans wielding it to use it carefully.

SE: Is machine learning more accurate than a human in performing pattern recognition tasks?

Fujimura: In many cases, it's found that a deep learning-based program can inference with a higher accuracy than a human, particularly when you look at it over time. A human might be able to look at a picture and recognize it with 99% accuracy. But if the same human has to look at a much larger data set, and do it eight hours a day for 200 days a year, the performance of the human is going to degrade. That's not true for a computer-based algorithm, including deep learning. The learning algorithms process vast amounts of data. They go through small sections at a time and go through every single one without skipping anything. When you take that into account, deep learning programs can be useful for these error-prone processes that are visually oriented, or can be cast into being visually oriented.

SE: The industry is working on other technologies to replicate the functions of the brain. Neuromorphic computing is one example. How realistic is this?

Fujimura: The brain is amazing. It will take a long time to create a neural network of the actual brain. There are very interesting computing models in the future. Neuromorphic is not a different computing model. It's a different architecture of how you do it. It's unclear if neuromorphic computing will necessarily create new kinds of capabilities. It does make some of them more efficient and effective.

SE: What about quantum computing?

Fujimura: The big change is quantum computing. That takes a lot of technology, money and talent. It's not an easy technology to develop. But you can bet that leading technology countries are working on it, and there is no question in my mind that it's important. Take security, for example. 256-bit encryption is nothing to a basic quantum computer. Security mechanisms would have to be significantly revamped in the world of quantum computing. Quantum computing used in the wrong way can be destructive. Staying ahead of that is a matter of national security. But quantum computing also can be very powerful in solving problems that were considered intractable. Many iterative optimization problems, including deep learning training, will see major discontinuities with quantum computing.
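Fujimura does not spell out the mechanism, but the standard analysis behind claims like this rests on two quantum algorithms: Shor's algorithm, which breaks RSA and elliptic-curve public-key schemes outright, and Grover's algorithm, which quadratically speeds up brute-force search of a symmetric key space. A back-of-the-envelope sketch of the Grover arithmetic (illustrative, not from the interview):

```python
# Grover's algorithm searches N possibilities in roughly sqrt(N) steps,
# so a 256-bit key space of 2**256 shrinks to about 2**128 quantum
# operations: still enormous, but the "effective" key length halves.
key_bits = 256
classical_work = 2 ** key_bits          # brute-force search space
grover_work = 2 ** (key_bits // 2)      # quadratic (square-root) speedup

print(classical_work == grover_work ** 2)   # the speedup is exactly quadratic
print(grover_work.bit_length() - 1)         # 128 effective bits remain
```

This is why post-quantum guidance for symmetric ciphers is typically "double the key length," whereas public-key schemes based on factoring need to be replaced entirely.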

SE: Let's move back to the photomask industry. Years ago, the mask was simple. Over time, masks have become more complex, right?

Fujimura: At 130nm, or around there, you started to see decorations on the mask. If you wanted to draw a circle on the wafer using Manhattan, or rectilinear, shapes, you actually drew a square on the mask. Eventually, it would become a circle on the wafer. However, starting at around 130nm, that square on the mask had to be written with decorations in all four corners. Then, SRAFs (sub-resolution assist features) started to appear on the mask around 90nm. There might have been some at 130nm, but mostly at 90nm. By 22nm, you couldn't find a critical-layer mask that didn't have SRAFs on it. SRAFs are features on the mask that are designed explicitly not to print on the wafer. Through an angle, SRAFs project light onto the main features that you do want to print on a wafer, enough to help augment the amount of energy that's being applied to the resist. Again, this makes the printing of the main features more resilient to manufacturing process variation.

SE: Then multiple patterning appeared around 16nm/14nm, right?

Fujimura: The feature sizes became smaller and more complex. When we reached the limit of resolution for 193i, there was no choice but to go to multiple patterning, where multiple masks print one wafer layer. You divide the features that you want on a given wafer layer and put them on different masks. This provided more space for SRAFs on each of the masks. EUV for some layers is projected to go to multiple patterning, too. It costs more to do multiple patterning, but it is a familiar and proven technique for extending lithography to smaller nodes.

SE: To pattern a photomask, mask makers use e-beam mask writer systems based on variable shaped beam (VSB) technology. Now, using thousands of tiny beams, multi-beam mask writers are in the market. How do you see this playing out?

Fujimura: Most semiconductor devices are being patterned using VSB writers for the critical layers. That's working fine. The write times are increasing. If you look at the eBeam Initiative's recent survey, the average write times are still around 8 hours. Going forward, we are moving toward more complex processes with EUV masks. Today, EUV masks are fairly simple. Rectangular writing is enough. But you need multi-beam mask writers because of the resist sensitivity. The resists are slow in order to be more accurate. We need to apply a lot of energy to make it work, and that is better with multi-beam mask writers.

SE: Whats next for EUV masks?

Fujimura: EUV masks will require SRAFs, too. They don't today at 7nm. SRAFs are necessary for smaller features. And, for 193i as well as for EUV, curvilinear masks are being considered now for improvements in wafer quality, particularly in resilience to manufacturing variation. But for EUV in particular, because of the reflective optics, curvilinear SRAFs are needed even more. Because multi-beam mask writing enables curvilinear mask shapes without a write-time penalty, the enhanced wafer quality in the same mask write time is attractive.

SE: What are the big mask challenges going forward?

Fujimura: There are still many. EUV pellicles, affordable defect-free EUV mask blanks, high-NA EUV, and actinic or e-beam-based mask inspection, both in the mask shop and in the wafer shop for requalification, are all important areas for advancement. Now, the need to adopt curvilinear mask shapes has been widely acknowledged. Data processing, including compact and lossless data representation that is fast to write and read, is an important challenge. Optical proximity correction (OPC) and inverse lithography technology (ILT), which are needed to produce these curvilinear mask shapes to maximize wafer performance, need to run fast enough to be practical.

SE: What are the challenges in developing curvilinear shapes on masks?

Fujimura: There are two issues. First, without multi-beam mask writers, producing masks with curvilinear shapes can be too expensive or may take impractically long to write. Second, controlling the mask variation is challenging. Once again, the reason you want curvilinear shapes on the mask is that wafer quality improves substantially. That is even more important for EUV than in 193nm immersion lithography. EUV masks are reflective. So there is also a 6-degree incidence angle on EUV masks. And that creates more desire to have curvilinear shapes or SRAFs. They don't print on the wafer. They are printed on the mask in order to help decrease process variation on the wafer.

SE: What about ILT?

Fujimura: ILT is an advanced form of OPC that computes the desired mask shapes in order to maximize the quality of wafer lithography. Studies have shown that ILT, in particular unconstrained curvilinear ILT, can produce the best results in terms of resilience to manufacturing variation. D2S and Micron recently presented a paper on the benefits of full-chip, curvilinear, stitchless ILT with mask-wafer co-optimization for memory applications. This approach enabled more than a 2X improvement in process windows.

SE: Will AI play a big role in mask making?

Fujimura: Yes. In particular, with deep learning, the gap between a promising prototype and a production-level inference engine is very wide. While there was quite a bit of initial excitement over deep learning, the world still has not seen very much production adoption of deep learning. A large part of this comes from the need for data. In semiconductor manufacturing, data security is extremely important. So while a given manufacturer would have plenty of data of its own kind, a vendor of any given tool, whether software or equipment, has a difficult time getting enough customer data. Even for a manufacturer, creating new data (say, a SEM picture of a defect) can be difficult and time-consuming. Yet deep learning programming is programming with data, instead of writing new code. If a deep learning programmer wants to improve the success rate of an inference engine from 92% to 95%, that programmer needs to analyze the engine to see what types of data it needs to be additionally trained on to make that improvement, then acquire many instances of that type of data, and then iterate. The only way this can be done efficiently and effectively is to have digital twins, a simulated environment that generates data instead of relying only on physical real sample data. Getting to an 80% success rate can be done with thousands of pieces of collected real data. But getting to a 95% success rate requires digital twins. It is the lack of this understanding that is preventing production deployment of deep learning in many potential areas. It is clear to me that many of the tedious and error-prone processes can benefit from deep learning. And it is also clear to me that acceleration of many computing tasks using deep learning will benefit the deployment of new software capabilities in the mask shop.


Link:

What's Next In AI, Chips And Masks - SemiEngineering


One for the haters: Twitter considers adding a dislike button – The Next Web

§ November 20th, 2020 § Filed under Quantum Computer

Over the years, there have been two missing components everyone on Twitter moans about: the notorious edit option and the dislike button. Well, it turns out we might be getting one of those in the future.

Responding to a tweet from security expert Jackie Singh, Twitter product lead Kayvon Beykpour revealed the company is exploring adding a dislike button to its platform, but it's simply not one of its most urgent priorities.

Instead, Twitter is currently concentrating its efforts on cutting the spread of inauthentic behavior, enhancing the safety of its users with better tools to curb and report harassment, and cracking down on misinformation that could have harmful effects on its users.

Anyone who actively uses Twitter already knows the company has spent a considerable amount of time on battling harassment and the spread of misinformation on its platform. Indeed, it has introduced a slew of features aimed at solving those two issues over the years.

More recently, the company shared it had labeled over 300,000 tweets for election misinformation, some of which were posted by none other than US President Donald Trump.

To be fair, Twitter has previously experimented with the idea of a dislike button, although not quite in the same way its like button works.

The company had briefly made it possible for users to report tweets they don't like, but it was impossible for other users to see a tally of the dislikes a tweet had received. It's unclear if Twitter is exploring any alternatives beyond this, but time will tell.

Until then, you'll simply have to make do with the good old ratio.

via Gizmodo


Read more:

One for the haters: Twitter considers adding a dislike button - The Next Web


Is Now the Time to Start Protecting Government Data from Quantum Hacking? – Nextgov

§ November 18th, 2020 § Filed under Quantum Computer

My previous column about the possibility of pairing artificial intelligence with quantum computing to supercharge both technologies generated a storm of feedback via Twitter and email. Quantum computing is a science that is still somewhat misunderstood, even by scientists working on it, but might one day be extremely powerful. And artificial intelligence has some scary undertones with quite a few trust issues. So I understand the reluctance that people have when considering this marriage of technologies.

Unfortunately, we don't really get a say in this. The avalanche has already started, so it's too late for all of us pebbles to vote against it. All we can do now is deal with the practical ramifications of these recent developments. The most critical right now is protecting government encryption from the possibility of quantum hacking.

Two years ago I warned that government data would soon be vulnerable to quantum hacking, whereby a quantum machine could easily shred the current AES encryption used to protect our most sensitive information. Government agencies like NIST have been working for years on developing quantum-resistant encryption schemes. But adding AI to a quantum computer might be the tipping point needed to give quantum the edge, while most of the quantum-resistant encryption protections are still being slowly developed. At least, that is what I thought.

One of the people who contacted me after my last article was Andrew Cheung, the CEO of 01 Communique Laboratory and IronCAP. They have a product available right now which can add quantum-resistant encryption to any email. Called IronCAP X, it's available for free for individual users, so anyone can start protecting their email from the threat of quantum hacking right away. In addition to downloading the program to test, I spent about an hour interviewing Cheung about how quantum-resistant encryption works, and how agencies can keep their data protection one step ahead of some of the very same quantum computers they are helping to develop.

For Cheung, the road to quantum-resistant encryption began over 10 years ago, long before anyone was seriously engineering a quantum computer. "It almost felt like we were developing a bulletproof vest before anyone had created a gun," Cheung said.

But the science of quantum-resistant encryption has actually been around for over 40 years, Cheung said. It was just never specifically called that. "People would ask how we could develop encryption that would survive hacking by a really fast computer," he said. "At first, nobody said the word quantum, but that is what we were ultimately working against."

According to Cheung, the key to creating quantum-resistant encryption is to get away from the core strength of computers in general, which is mathematics. He explained that the RSA encryption used by the government today is fundamentally based on prime number factorization, where if you multiply two prime numbers together, the result is a number that can only be broken down into those primes. Breaking the encryption involves trying to find those primes by trial and error.

So if you have a number like 21, then almost anyone can use factorization to quickly break it down and find its prime numbers, which are three and seven. If you have a number like 221, then it takes a little bit longer for a human to come up with 13 and 17 as its primes, though a computer can still do that almost instantaneously. But if you have something like a 500 digit number, then it would take a supercomputer more than a century to find its primes and break the related encryption. The fear is that quantum computers, because of the strange way they operate, could one day do that a lot more quickly.
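The paragraph's examples can be reproduced with naive trial division; the helper below (`first_prime_factor`, a name invented for this sketch) finds the smaller prime instantly for 21 and 221, while the same approach is hopeless for the 500-digit moduli real RSA uses, which is exactly the article's point.

```python
def first_prime_factor(n: int) -> int:
    """Smallest prime factor of n, by naive trial division.

    Fine for tiny examples; the work grows with sqrt(n), so a
    500-digit modulus is far beyond any classical machine.
    """
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

# The article's examples:
print(first_prime_factor(21), 21 // first_prime_factor(21))     # 3 7
print(first_prime_factor(221), 221 // first_prime_factor(221))  # 13 17
```

Shor's algorithm threatens RSA precisely because it factors such numbers in polynomial time rather than by this kind of exhaustive trial.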

To make it more difficult for quantum machines, or any other kind of fast computer, Cheung and his company developed an encryption method based on binary Goppa code. The code was named for the renowned Russian mathematician who invented it, Valerii Denisovich Goppa, and was originally intended to be used as an error-correcting code to improve the reliability of information being transmitted over noisy channels. The IronCAP program intentionally introduces errors into the information it's protecting, and then authorized users can employ a special algorithm to decrypt it, but only if they have the private key so that the numerous errors can be removed and corrected.

What makes encryption based on binary Goppa code so powerful against quantum hacking is that you can't use math to guess at where or how the errors have been induced into the protected information. Unlike encryption based on prime number factorization, there isn't a discernible pattern, and there's no way to brute-force guess at how to remove the errors. According to Cheung, a quantum machine, or any other fast system like a traditional supercomputer, can't be programmed to break the encryption because there is no system for it to use to begin its guesswork.
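The encode-with-errors/correct-with-key mechanics can be sketched with a deliberately simple stand-in. The toy below uses a 5x repetition code and injects one bit-flip per block; majority voting removes the errors. To be clear, this illustrates only the mechanics the article describes, not the security: in real McEliece/Goppa schemes the error-correcting code's structure is itself scrambled and hidden by the private key, whereas this toy code is public.

```python
import random

REPEAT = 5  # toy repetition code: each bit is stored 5 times

def encode_with_errors(bits, rng):
    """Repeat each bit REPEAT times, then flip one copy at random.

    A toy stand-in for the idea in the article: the ciphertext is a
    codeword deliberately corrupted with errors.
    """
    out = []
    for b in bits:
        block = [b] * REPEAT
        block[rng.randrange(REPEAT)] ^= 1  # inject one error per block
        out.extend(block)
    return out

def decode(noisy):
    """Correct the injected errors by majority vote within each block."""
    bits = []
    for i in range(0, len(noisy), REPEAT):
        block = noisy[i:i + REPEAT]
        bits.append(1 if sum(block) > REPEAT // 2 else 0)
    return bits

rng = random.Random(0)
message = [1, 0, 1, 1, 0]
ciphertext = encode_with_errors(message, rng)
print(decode(ciphertext) == message)  # errors removed, message recovered
```

In the real scheme, decoding without the private key means locating arbitrary errors in a seemingly random code, the hard problem that resists both classical and (as far as is known) quantum attack.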

A negative aspect to binary Goppa code encryption, and also one of the reasons why Cheung says the protection method is not more popular today, is the size of the encryption key. Whether you are encrypting a single character or a terabyte of information, the key size is going to be about 250 kilobytes, which is huge compared with the typical 4-kilobyte key size for AES encryption. Even ten years ago, that might have posed a problem for many computers and communication methods, though it seems tiny compared with file sizes today. Still, it's one of the main reasons why AES won out as the standard encryption format, Cheung says.

I downloaded the free IronCAP X application and easily integrated it into Microsoft Outlook. Using the application was extremely easy, and the encryption process itself when employing it to protect an email is almost instantaneous, even utilizing the limited power of an average desktop. And while I don't have access to a quantum computer to test its resilience against quantum hacking, I did try to extract the information using traditional methods. I can confirm that the data is just unreadable gibberish with no discernible pattern to unauthorized users.

Cheung says that binary Goppa code encryption that can resist quantum hacking can be deployed right now on the same servers and infrastructure that agencies are already using. It would just be a matter of switching things over to the new method. With quantum computers evolving and improving so rapidly these days, Cheung believes that there is little time to waste.

"Yes, making the switch in encryption methods will be a little bit of a chore," he said. "But with new developments in quantum computing coming every day, the question is whether you want to maybe deploy quantum-resistant encryption two years too early, or risk installing it two years too late."

John Breeden II is an award-winning journalist and reviewer with over 20 years of experience covering technology. He is the CEO of the Tech Writers Bureau, a group that creates technological thought leadership content for organizations of all sizes. Twitter: @LabGuys

Continued here:

Is Now the Time to Start Protecting Government Data from Quantum Hacking? - Nextgov


Inside the Competition That Will Save Bitcoin From Quantum Computers – Decrypt

§ November 18th, 2020 § Filed under Quantum Computer

Andersen Cheng's wife wanted him to take it easy after he sold his cyber-security companies for ~$200 million in 2006 at the age of 43. But he returned to the fray for one last mission: to save the world from quantum computers, whose immense power he believes threatens total social and economic collapse.

"They can hack into any cell phone, laptops, anything," he told Decrypt in a recent interview. "Even Bitcoin wallets."

For the past 14 years, Cheng, now 57, has run Post-Quantum, a British company building an encryption algorithm resistant to quantum computers. Quantum computers, still prototypes, are thousands of times faster than supercomputers and could crack all modern encryption within seconds.

It'll be about a decade until Google's quantum computer hits the shelves (Google is believed to be a frontrunner in the race to build a quantum machine). Yet Cheng said he was tipped off by anonymous friends from the British intelligence world, to whom he has sold cybersecurity software since the 80s, that quantum computers produced in secrecy by governments could crack encryption within three years.

While the timeline might be debatable, the end result is not: Unless we get in front of the problem, a quantum computer, once operational, could reveal every government's secrets, drain any bank account and overpower nuclear power stations, said Cheng. The machines could also destroy Bitcoin: a hacker could use a quantum computer to reverse-engineer your public keys to work out your private ones, then drain your Bitcoin wallet.

"It's like walking into a bank vault without drawing a gun: It's totally wide open," he said.

Cheng claims that unless we act soon, the computerized world could devolve into complete and utter financial collapse. And that's precisely what his company wants to avert.

Post-Quantum believes it has created a quantum-resistant encryption protocol that banks and governments could use to re-encrypt their files, and that blockchains could use to prevent people from hacking the network.

According to CJ Tjhai, one of the co-founders of Post-Quantum and an architect of the protocol, here's how it works. Post-Quantum's algorithm encrypts a message by padding it out with redundant data and deliberately corrupting it with random errors. The ciphertext recipient with the correct private key knows which fluff to cut and how to correct any errors.

"You add some extra data to the file, some garbage that's only meaningful to the private key holder. And you then also corrupt the file: you add errors to it, flip the bits," he said. It's a little like how archivists use artificial intelligence to restore grainy videos of WW2 dogfights.

Tjhai said that this algorithm is far more secure than today's common encryption algorithm, RSA, whose private keys are forged from the factorization of two numbers. It would take thousands of years for even the most powerful supercomputer to guess the numbers, though a quantum computer would have no problem.

Of Post-Quantum's encryption method, Tjhai said, "People can try to break this thing using quantum computers, but from what we understand now, they can do it, but it will take an extremely long time." That's because quantum computers aren't designed to be efficient at cracking these kinds of codes.

Post-Quantum's algorithm is based on an algorithm created in 1978 by Caltech professor Robert McEliece. It doesn't require a powerful computer and is pretty fast. But it's only feasible today because hard drives are larger and internet speeds are faster. RSA-2048 has a public key size of 256 bytes, while a code-based algorithm like Post-Quantum's can be a minimum of 255 kilobytes.

Tjhai said the algorithm could also protect Bitcoin. It would be trivial for someone using a quantum computer to work out the private keys to your wallet, so long as they knew the public key. "With quantum computers, we will be able to reverse that [public key] into the private key," he said.

In July 2020, the National Institute of Standards and Technology, the US agency that sets global standards for encryption protocols, announced that Post-Quantum's encryption algorithm had beaten 82 others to become one of 15 finalists in a four-year-long competition to build a quantum-resistant algorithm.

Post-Quantum's algorithm is up against three finalists from another class of cryptography: lattice-based schemes, whose algorithms crack codes by finding lines in a grid. It's expected that NIST will choose a finalist from each scheme for standardization by early 2022.

To reach the final round, Post-Quantum in February merged its submission into one created by one of the worlds foremost cryptographers, Daniel Bernstein.

Post-Quantum is the smaller fish, though Cheng said that it is by no means less able. Bernstein's work has thousands of citations and he's a professor at two leading universities; Cheng's 14-person-strong company (plus ten contractors) receives no government funding (in 2016 it raised $10.3 million in a Series A), and until the pandemic, operated from an office above a busy McDonald's adjacent to a central London train station.

Andreas Hülsing, a cryptographer from the Eindhoven University of Technology and a finalist with a digital signature submission to the NIST competition called SPHINCS+ and a public-key encryption algorithm called NTRU, told Decrypt that the NIST competition feels more cooperative than a fight to the death; Hülsing, for instance, has worked with many of his competitors and once studied under Bernstein.

"The schemes which made it to the end are actually the schemes which were around already for the last maybe 10 years, and were essentially tweaked," he said. Post-Quantum's submission is a tweak of a scheme created back in the 70s.

"There were a bunch of proposals which really tried to do a lot [of new things], and sadly, most of them actually failed," said Hülsing. The finalists, such as Post-Quantum's proposal, are well-studied; they just weren't suitable for the last generation of computers.

"You don't have many different options. They're all old schemes, which people try to optimize in a certain way," he said.

Post-Quantum's ambitions extend beyond the NIST competition. The protocol powers a forthcoming VPN and was the backbone of its short-lived quantum-secure chat app; the company removed it from the Google Play store after ISIS started using it to coordinate attacks. "Too much hassle," said Cheng.

"Don't get me wrong, we still want to make some money out of it," said Cheng, who headed JPMorgan's credit risk department in Europe back in the late 90s, saving the world from Y2K, a computer bug many feared would crash the programs holding society together on January 1, 2000, because programmers in the 60s hadn't the foresight to believe that people would still use them in the new millennium.

It sure beats retirement. "There's only so much golf you can play," he said.

Read more:

Inside the Competition That Will Save Bitcoin From Quantum Computers - Decrypt


Does Schrdinger’s Cat Think Quantum Computing Is a Sure Thing? – Walter Bradley Center for Natural and Artificial Intelligence

§ November 18th, 2020 § Filed under Quantum Computer

Some hope that a move to quantum computing (qubits instead of bits, analog instead of digital) will work wonders, including the invention of the true thinking computer. In last week's podcast, futurist George Gilder and computer engineer Robert J. Marks looked at, among other things, what's really happening with quantum computing:

(The quantum computing discussion begins at 15:04.)

Robert J. Marks: What's your take on quantum computing? It seems to me that there's been glacial progress in the technology.

George Gilder (pictured): I think quantum computing is rather like AI, in that it moves the actual problem outside the computational process and gives the illusion that it has solved the problem, but it has really just pushed the problem out. Quantum computing is analog computing, that's what it is. It's changing the primitives of the computation to quantum elements, which are presumably the substance of all matter in the universe.

Note: Quantum computing would use actual quantum elements (qubits) to compute instead of digital signals, thus taking advantage of their subatomic speed. But AI theorists have noted that this doesn't get around the halting problem (the computer doesn't actually know what it is doing). That means that a computer still wouldn't replicate human intelligence. That, in turn, is one reason that "quantum supremacy" can sound a lot like hype.

George Gilder: But still you've got to translate the symbols in the world, which in turn have to be translated from the objects in the world, into these qubits, which are quantum entities. Once you've defined all these connections and structured the data, then the problem is essentially solved by the process of defining it and inputting it into the computer. But quantum computing, again, is a very special-purpose machine, extremely special-purpose. Because everything has to be exactly structured right for it.

Robert J. Marks: Yeah, that's my point. I think that once we get quantum computing, and if it works well, we can also do quantum encryption, which quantum computing can't decode. So that's the next step. So yeah, that's fascinating stuff.

In his new book, Gaming AI (free download here), Gilder explains one of the ways quantum computing differs from digital computing:

The qubit is one of the most enigmatic tangles of matter and ghost in the entire armament of physics. Like a binary digit, it can register 0 or 1; what makes it quantum is that it can also register a nonbinary superposition of 0 and 1.

In 1989 I published a book, Microcosm, with the subtitle The Quantum Era in Economics and Technology. Microcosm made the observation that all computers are quantum machines in that they shun the mechanics of relays, cogs, and gears, and manipulate matter from the inside following quantum rules. But they translate all measurements and functions into rigorous binary logic: every bit is 1 or 0. At the time I was writing Microcosm, a few physicists were speculating about a computer that used qubits rather than bits, banishing this translation process and functioning directly in the quantum domain. (P. 39)
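Gilder's "nonbinary superposition of 0 and 1" has a compact mathematical form that can be checked numerically. As an illustrative sketch (not from the book): a qubit's state is a pair of complex amplitudes (a, b) with |a|² + |b|² = 1, and measuring it yields 0 with probability |a|² and 1 with probability |b|². A classical bit is the special case where one amplitude is exactly 1.

```python
import math

# Equal superposition of 0 and 1: both amplitudes are 1/sqrt(2).
a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)

p0 = abs(a) ** 2  # probability of measuring 0
p1 = abs(b) ** 2  # probability of measuring 1

# The amplitudes must describe a valid (normalized) quantum state.
print(math.isclose(p0 + p1, 1.0))   # True
print(round(p0, 3), round(p1, 3))   # 0.5 0.5
```

Measurement collapses the superposition to a definite 0 or 1, which is the "translation" back into binary that Gilder says qubit machines postpone until the very end of a computation.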

The quantum world impinges on computer technology whether we like it or not:

For example, today the key problem in microchips is to avoid spontaneous quantum tunneling, where electrons can find themselves on the other side of a barrier that by the laws of classical physics would have been insurmountable and impenetrable. In digital memory chips or processors, spontaneous tunneling can mean leakage and loss. In a quantum computer, though, such quantum effects may endow a portfolio of features, providing a tool or computational primitive that enables simulation of a world governed by quantum rules. (p. 40)

Quantum rules, while strange, might ensure the integrity of a connection, because entangled quantum particles respond to each other no matter how far they are separated:

A long-ago thought experiment of Einstein's showed that once any two photons (or other quantum entities) interact, they remain in each other's influence no matter how far they travel across the universe (as long as they do not interact with something else). Schrödinger christened this entanglement: The spin (or other quantum attribute) of one behaves as if it reacts to what happens to the other, even when the two are impossibly remote. (p. 40)

So, short of such an interaction, no one can change the data on their side without the change being noticed.

Underlying all this heady particle physics and quantum computing speculation is actually a philosophical shift. As Gilder puts it in Gaming AI:

John Wheeler provocatively spoke of "it from bit" and "the elementary act of observer-participancy": in short, "all things physical are information-theoretic in origin" and "this is a participatory universe." (p. 41)

Which is another way of saying that in reality information, rather than matter and energy, rules our universe.

Also discussed in last week's podcast (with links to the series and transcripts):

While the West hesitates, China is moving to blockchain. Life After Google by George Gilder, advocating blockchain, became a best seller in China and received a social sciences award. George Gilder, also the author of Gaming AI, explains why Bitcoin might not do as well as blockchain in general as a future currency.

You may also enjoy: Will quantum mechanics produce the true thinking computer? Quantum computers come with real-world problems of their own.

and

Why AI geniuses haven't created true thinking machines. The problems have been hinting at themselves all along.

Next: What's the future for carbon computing?

Follow this link:

Does Schrdinger's Cat Think Quantum Computing Is a Sure Thing? - Walter Bradley Center for Natural and Artificial Intelligence


Neural's guide to the glorious future of AI: Here's how machines become sentient – The Next Web

§ November 18th, 2020 § Filed under Quantum Computer Comments Off on Neural's guide to the glorious future of AI: Here's how machines become sentient – The Next Web

Welcome to Neural's guide to the glorious future of AI. What wonders will tomorrow's machines be capable of? How do we get from Alexa and Siri to Rosie the Robot and R2D2? In this speculative science series we'll put our optimist hats on and try to answer those questions and more. Let's start with a big one: The Singularity.

The future realization of robot lifeforms is referred to by a plethora of terms: sentience, artificial general intelligence (AGI), living machines, self-aware robots, and so forth. But the one that seems most fitting is The Singularity.

Rather than debate semantics, we're going to sweep all those little ways of saying "human-level intelligence or better" together and conflate them to mean: a machine capable of at least human-level reasoning, thought, memory, learning, and self-awareness.

Modern AI researchers and developers tend to gravitate towards the term AGI. Normally, we'd agree, because general intelligence is grounded in metrics we can understand: to qualify, an AI would have to be able to do most stuff a human can.

But there's a razor-thin margin between "as smart as" and "smarter than" when it comes to hypothetical general intelligence, and it seems likely a mind powered by supercomputers, quantum computers, or a vast network of cloud servers would have far greater sentient potential than our mushy organic ones. Thus, we'll err on the side of superintelligence for the purposes of this article.

Before we can even start to figure out what a superintelligent AI would be capable of, however, we need to determine how it's going to emerge. Let's make some quick decisions for the purposes of discussion:

So how will our future metal buddies gain the spark of consciousness? Let's get super scientific here and crank out a listicle with five separate ways AI could gain human-level intelligence and awareness:

In this first scenario, if we predict even a modest year-over-year increase in computation and error-correction abilities, it seems entirely plausible that machine intelligence could be brute-forced into existence by a quantum computer running strong algorithms in just a couple of centuries or so.

Basically, this means the incredibly potent combination of exponentially increasing power and self-replicating artificial intelligence could cook up a sort of digital, quantum, primordial soup for AI where we just toss in some parameters and let evolution take its course. We've already entered the era of quantum neural networks, so a quantum AGI doesn't seem all that far-fetched.

What if intelligence doesn't require power? Sure, our fleshy bodies need energy to stay alive and computers need electricity to run. But perhaps intelligence can exist without explicit representation. In other words: what if intelligence and consciousness can be reduced to purely mathematical concepts that become apparent only when properly executed?

A researcher by the name of Daniel Buehrer seems to think this could be possible. They wrote a fascinating research paper proposing the creation of a new form of calculus that would, effectively, allow an intelligent master algorithm to emerge from its own code.

The master algorithm idea isn't new (the legendary Pedro Domingos literally wrote the book on the concept) but what Buehrer's talking about is a different methodology. And a very cool one at that.

Here's Buehrer's take on how this hypothetical self-perpetuating calculus could unfold into explicit consciousness:

Allowing machines to modify their own model of the world and themselves may create conscious machines, where the measure of consciousness may be taken to be the number of uses of feedback loops between a class calculus's model of the world and the results of what its robots actually caused to happen in the world.

They even go on to propose that such a consciousness would be capable of having little internal thought wars to determine which actions occurring in the machine's mind's eye should be effected in the physical world. The whole paper is pretty wild; you can read more here.

This one's pretty easy to wrap your head around (pun intended). Instead of a bunch of millionaire AI developers with billion-dollar big tech research labs figuring out how to create a new species of intelligent being out of computer code, we just figure out how to create a perfect artificial brain.

Easy, right? The biggest upside here would be the potential for humans and machines to occupy the same spaces. This is clearly a recipe for augmented humans: cyborgs. Perhaps we could become immortal by transferring our own consciousnesses into non-organic brains. But the bigger picture would be the ability to develop robots and AI in the true image of humans.

If we can figure out how to make a functional replica of the human brain, including the entire neural network housed within it, all we'd need to do is keep it running and shovel the right components and algorithms into it.

Maybe conscious machines are already here. Or maybe they'll quietly show up a year or a hundred years from now, completely hidden in the background. I'm talking about cloud consciousness: the idea that a self-replicating, learning AI created solely to optimize large systems could one day gain a form of sentience that would, qualitatively, indicate superintelligence but otherwise remain unnoticed by humans.

How could this happen? Imagine if Amazon Web Services or Google Search released a cutting-edge algorithm into their respective systems a few decades from now and it created its own self-propagating solution system that, through the sheer scope of its control, became self-aware. We'd have a ghost in the machine.

Since this self-organized AI system wouldn't have been designed to interface with humans or translate its interpretations of the world it exists in into something humans can understand, it stands to reason that it could live forever as a superintelligent, self-aware, digital entity without ever alerting us to its presence.

For all we know there's a living, sentient AI chilling out in the Gmail servers just gathering data on humans (note: there almost certainly isn't, but it's a fun thought exercise).

Don't laugh. Of all the methods by which machines could hypothetically gain true intelligence, alien tech is the most likely to make it happen in our lifetimes.

Here we can make one of two assumptions: aliens will either visit us sometime in the near future (perhaps to congratulate us on achieving quantum-based interstellar communication) or we'll discover some ancient alien technology once we put humans on Mars within the next few decades. These are the basic plots of Star Trek and the Mass Effect video game series, respectively.

Here's hoping that, no matter how The Singularity comes about, it ushers in a new age of prosperity for all intelligent beings. But just in case it doesn't work out so well, we've got something that'll help you prepare for the worst. Check out these articles in Neural's Beginner's Guide to the AI Apocalypse series:

Published November 18, 2020 19:50 UTC

See the original post here:

Neural's guide to the glorious future of AI: Here's how machines become sentient - The Next Web


CCNY & partners in quantum algorithm breakthrough | The City College of New York – The City College of New York News

§ November 18th, 2020 § Filed under Quantum Computer Comments Off on CCNY & partners in quantum algorithm breakthrough | The City College of New York – The City College of New York News

Researchers led by City College of New York physicist Pouyan Ghaemi report the development of a quantum algorithm with the potential to study a class of many-electron quantum systems using quantum computers. Their paper, entitled "Creating and Manipulating a Laughlin-Type ν=1/3 Fractional Quantum Hall State on a Quantum Computer with Linear Depth Circuits," appears in the December issue of PRX Quantum, a journal of the American Physical Society.

"Quantum physics is the fundamental theory of nature, which leads to the formation of molecules and the resulting matter around us," said Ghaemi, assistant professor in CCNY's Division of Science. "It is already known that when we have a macroscopic number of quantum particles, such as electrons in a metal, which interact with each other, novel phenomena such as superconductivity emerge."

However, until now, according to Ghaemi, tools to study systems with large numbers of interacting quantum particles and their novel properties have been extremely limited.

"Our research has developed a quantum algorithm which can be used to study a class of many-electron quantum systems using quantum computers. Our algorithm opens a new venue to use the new quantum devices to study problems which are quite challenging to study using classical computers. Our results are new and motivate many follow-up studies," added Ghaemi.

On possible applications for this advancement, Ghaemi, who's also affiliated with the Graduate Center, CUNY, noted: "Quantum computers have witnessed extensive developments during the last few years. Development of new quantum algorithms, regardless of their direct application, will contribute to realizing applications of quantum computers."

"I believe the direct application of our results is to provide tools to improve quantum computing devices. Their direct real-life application would emerge when quantum computers can be used for daily life applications."

His collaborators included scientists from Western Washington University; the University of California, Santa Barbara; Google AI Quantum; and the University of Michigan, Ann Arbor.

About the City College of New York Since 1847, The City College of New York has provided a high-quality and affordable education to generations of New Yorkers in a wide variety of disciplines. CCNY embraces its position at the forefront of social change. It is ranked #1 by the Harvard-based Opportunity Insights out of 369 selective public colleges in the United States on the overall mobility index. This measure reflects both access and outcomes, representing the likelihood that a student at CCNY can move up two or more income quintiles. In addition, the Center for World University Rankings places CCNY in the top 1.8% of universities worldwide in terms of academic excellence. Labor analytics firm Emsi puts CCNY's annual economic impact on the regional economy (5 boroughs and 5 adjacent counties) at $1.9 billion and quantifies the per-dollar return on investment to students, taxpayers and society. At City College, more than 16,000 students pursue undergraduate and graduate degrees in eight schools and divisions, driven by significant funded research, creativity and scholarship. CCNY is as diverse, dynamic and visionary as New York City itself. View CCNY Media Kit.

Originally posted here:

CCNY & partners in quantum algorithm breakthrough | The City College of New York - The City College of New York News


#SpaceWatchGL Opinion: Quantum Technology and Impact of the Global Space Security – SpaceWatch.Global

§ November 18th, 2020 § Filed under Quantum Computer Comments Off on #SpaceWatchGL Opinion: Quantum Technology and Impact of the Global Space Security – SpaceWatch.Global

by Rania Toukebri

Cyberattacks are increasing exponentially over time, so improving the security of communications is crucial for guaranteeing the protection of sensitive information for states and individuals. For states, securing communications is mandatory for strategic geopolitical influence.

Most technologies have been based on classical laws of physics. Modern communication technology transfers encrypted data with complex mathematical algorithms. The complexity of these algorithms ensures that a third party cannot easily crack them. However, with stronger computing power and the increasing sophistication of hacking technologies, such methods of communication are increasingly vulnerable to interference. The world's first quantum-enabled satellite is the Chinese satellite Micius. The purpose of the mission is to investigate space-based quantum communications over a couple of years in order to create future hack-proof communication networks.

In a classical computer, each processing step is a combination of bits. A bit can be either zero or one. A qubit, the quantum bit, can be a zero and a one at the same time. So processing qubits is processing several combinations of zeroes and ones simultaneously, and the increased speed of quantum computing comes from exploiting this parallelism.
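The parallelism just described grows exponentially with register size: an n-qubit register is described by 2^n amplitudes, one per bit string. A quick back-of-envelope illustration (my arithmetic, not the article's):

```python
# Each added qubit doubles the number of simultaneous
# zero/one combinations the register represents: 2**n in total.
for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {2 ** n} combinations")
```

At 50 qubits the register already spans over 10^15 combinations, which is why even modest quantum machines are hard to simulate classically.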

According to quantum theory, subatomic particles can act as if they are in two places at once. This property is manipulated so that a particle can adopt either one of two states. If the particle is not observed, it will be in a state of superposition.

There have been successful quantum encryption experiments, but with a limitation: the messages were sent through optical fibers, where the signal is absorbed by the medium, so long distances are not feasible. Making such communications work over long distances would require quantum repeaters, devices that capture and retransmit the quantum information.

China found another solution: beaming entangled photons through the vacuum of space, where they won't be absorbed.

The Micius satellite works by firing a laser through a crystal, creating photon pairs in a state of entanglement. One half of each pair is sent to each of two separate stations on Earth.

The objective of this method is to generate communication keys encrypted with an assembly of entangled photons. The information to be transmitted is encoded with a set of random numbers generated between the transmitter and the receiver. If a hacker tries to spy on or interfere with one of the beams of entangled photons, the encryption key will be changed and become unreadable due to the observer effect of quantum theory. As a consequence, the transmitter can change the information securely.
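The observer-effect check described above can be sketched as a toy model: the two parties compare a random sample of their key bits, and an elevated mismatch rate reveals interception. This is a simplified illustration of entanglement-based key distribution in general (the sample size, threshold, and 25% disturbance figure are assumptions for the sketch), not Micius's actual protocol:

```python
import random

random.seed(42)  # deterministic run for the illustration

def eavesdropper_detected(key_a, key_b, sample_size=100, threshold=0.05):
    # Compare a random sample of positions; interception disturbs the
    # quantum states, so mismatches above the threshold abort the key.
    positions = random.sample(range(len(key_a)), sample_size)
    mismatches = sum(key_a[i] != key_b[i] for i in positions)
    return mismatches / sample_size > threshold

key = [random.randint(0, 1) for _ in range(1000)]
print(eavesdropper_detected(key, key))  # False: undisturbed channel

# A spy re-measuring the photons corrupts a sizeable fraction of the
# bits (~25% assumed here), so the check trips and the key is discarded.
tampered = [b if random.random() > 0.25 else 1 - b for b in key]
print(eavesdropper_detected(key, tampered))
```

The security argument is statistical: a clean channel produces zero sampled mismatches, while any interception pushes the mismatch rate far above the abort threshold.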

Quantum communication in military and defense will enable China to be a strong leader in military sophistication and will strengthen its geopolitical influence, thereby decreasing US authority.

China has already begun this economic and technological development, while US foreign policy is losing its dominance on the global geopolitical scene. Quantum technological development will accelerate the shift toward a multipolar power balance in international relations.

On the other hand, the USA is also conducting research on quantum technologies, but US investment remains limited compared to that of China and Europe, which makes China the leader in quantum communication. The USA recognizes the importance of this field and has started making greater technical and financial efforts. But the question remains: who will reach the frontier first?

Following its space strategy, China has in recent years invested heavily in technological development, including its pioneering space program; its aim was to achieve dominance in air and space. The Micius satellite could produce a boom in military advancement and information dominance. This space program is symptomatic of the Chinese strategy on technological development.

The first Chinese satellite was launched in 1970, after those of the USA and Russia. The strategy followed afterwards produced exponential growth in space and technological development, backed by huge financial investment made possible by exponential economic growth. Beidou (China's satellite navigation system) provides precise geolocation information for Chinese weapon systems and communication coverage for its military, a strength on both military and geopolitical fronts.

The policy is still going in that direction, with global network coverage from 35 Chinese satellites. The Chinese space program has already launched two space laboratories; its aim is the launch of a permanent crewed space station in 2022, knowing that the International Space Station will retire before 2028.

As a result, China would become the only country with a space station, making it indispensable to other countries and, in consequence, a center of power. More Chinese space missions, including robotics and AI, took place, preparing for the next generation of space technology. Quantum is the accelerator for reaching the ultimate goal of this space program and has therefore become the first priority of its technological research. By 2030, China aims to establish a network of quantum satellites supporting a quantum internet.

The network of quantum satellites (China's 2030 project) aims to increase the record distance for successful quantum entanglement between two points on Earth. Technically, the lasers used to beam the entangled photons between the stations will have to achieve a high level of precision to reach the selected targets. But the limitations are:

Rania Toukebri is a systems engineer for spacecraft, Regional Coordinator for Africa in the Space Generation Advisory Council in support of the United Nations, a space strategy consultant, and co-founder of the company HudumaPlus.

Go here to see the original:

#SpaceWatchGL Opinion: Quantum Technology and Impact of the Global Space Security - SpaceWatch.Global


A Scoville Heat Scale For Measuring The Progress Of Emerging Technologies In 2021 – Forbes

§ November 18th, 2020 § Filed under Quantum Computer Comments Off on A Scoville Heat Scale For Measuring The Progress Of Emerging Technologies In 2021 – Forbes

A Scoville Heat Scale For Emerging Technologies in 2021

A couple of years back I wrote an article in FORBES called A Scoville Heat Scale For Measuring Cybersecurity. The Scoville Scale is a measurement chart used to rate the heat of peppers or other spicy foods. For that article, I devised my own Scoville Scale-like heat characterizations of cyber threats and rated the heat on the corresponding cybersecurity impact.

As we enter a new decade of transformation, I am applying that same Scoville scale to the topic of emerging technologies. It could be surmised that all these emerging technologies are already hot on a heat scale, as they are already facilitating exponential changes in our society. True, but some areas of emerging tech are further along than others in how they will impact our lives in the coming year.

Health Technologies:


I will start my measurement activities at the hottest emerging tech measured on the Scoville heat scale. Health and medical technologies are a diverse area of tech that has been impacted by COVID-19, especially in research, development and prototyping. Healthcare technologies include everything from biotechnology, nano-delivery of therapeutics, drug discovery, telemedicine (augmented reality and virtual reality), genomics, cybernetics, bionics, wearables, robotics, and the internet of medical things. All of these component technologies are now being fused with new capabilities in machine learning/artificial intelligence algorithms for better diagnosis and treatment of patients.

Heat Scale Rating: Trinidad Scorpion Pepper. COVID-19 has pushed us to explore and bring to market new health-related technologies. We are on the way to smarter health and medical care, and this technology area is both multidimensional and very promising.

Artificial Intelligence & Machine learning (AI/ML):


The cognitive technologies AI & ML also have quite a hot measurement on the Scoville pepper scale. AI & ML are not necessarily new innovations, but they are ones that have yet to reach full potential. In 2020, both AI & ML started to flourish, and they will continue to do so throughout 2021. At their core, AI & ML are really about data integration, quality (image definition), and the collection and processing of data that allows for meaningful analytics. Applications for AI are increasing in variety and capability (especially automation) and are now being applied to almost every industry vertical, including finance, healthcare, energy, transportation, and cybersecurity. Most intriguing, but only in the earliest stages, is AI/ML neural human augmentation. Neuromorphic technologies and human/computer interfaces will extend our human brain capacities, memories and capabilities. Please see my recent FORBES article for a more in-depth analysis on the merging of human and machine:

Heat Scale Rating: Chocolate Habanero. AI & ML are certainly making a significant impact on anything and everything tech related. It's very hot, but will get hotter as we continue to aim higher for sentient capabilities in our machines. Of course, that capability may turn into a double-edged sword, and we may end up having regrets in the not-so-distant future.

The Internet of Things (IoT):


IoT refers to the general idea of things that are readable, recognizable, locatable, addressable, and/or controllable via the Internet. Essentially this connotes physical objects communicating with each other via sensors. IoT networks include everything from edge computing devices to home appliances, from wearable technology to cars. In essence, IoT represents the melding of the physical world and the digital world. According to Gartner, there are nearly 26 billion networked devices on the Internet of Things in 2020. That may actually be a conservative estimate, as more and more people are getting connected to the internet in a remote-work-oriented world. IoT is being boosted by edge computing combined with next-gen microchips and lower costs of manufacturing sensors.

Heat Scale Rating: Scotch Bonnet. IoT is still a work in progress; it is growing rapidly in size and faces a myriad of regulatory and cybersecurity challenges. Eventually it will be the backbone of smart cities. The connectivity and operational expansion of IoT infrastructures and devices will be integral to the conduct of many business and personal activities in the near future. In 2021 the IoT rollout will continue.

5G:


In 2020, advanced 5G and wireless networks started to bring benefits, including faster speeds, higher traffic capacities, lower latency, and increased reliability, to consumers and businesses. As it grows, 5G will impact commercial verticals such as retail, health, and finance by enabling processing, communications, and analytics in real time. Compared to the last generation of 4G networks, 5G is estimated to have the capability to run 100 times faster, up to 10 gigabits per second, making quick downloads of information and streaming of large-bandwidth content a breeze. Although 5G is in the initial stages of deployment, connectivity is already expanding exponentially. The industry trade group 5G Americas cited an Omdia report that counted more than 17.7 million 5G connections at the end of last year, including a 329 percent surge during the final three months of 2019. Omdia is also predicting 91 million 5G connections by the end of 2020. In 2021, the 5G rollout will continue on a larger scale.
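The quoted figures imply dramatic differences in download times. A back-of-envelope calculation, using the article's 10 Gbit/s and 100x numbers (the 5 GB file size is my assumption for illustration):

```python
def download_seconds(size_gb, rate_mbps):
    # size in gigabytes -> megabits (x 8000), divided by rate in Mbit/s
    return size_gb * 8000 / rate_mbps

print(download_seconds(5, 10000))  # a 5 GB film at 10 Gbit/s: 4.0 s
print(download_seconds(5, 100))    # same film at 4G-class 100 Mbit/s: 400.0 s
```

The 100x rate difference translates directly into a 100x difference in transfer time: seconds instead of minutes for large files.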

Heat Scale Rating: Tabasco Pepper. 5G is evolving but still has only limited deployments. Many compliance and security issues are still being worked out. No doubt that in the next few years, as 5G is implemented and upgraded, the Scoville pepper rating will become much hotter.

Quantum-computing:


Quantum computing, like AI & ML, has already arrived. IBM, Google, Intel, Honeywell, D-Wave, and several others are all in various stages of developing quantum computers. It is also a U.S. government priority. Recently, the Department of Energy announced an investment of over $1 billion in five quantum information science centers. Quantum computing works by harnessing the special properties of atoms and subatomic particles. Physicists are designing quantum computers that can calculate at amazing speeds and that would enable a whole new type of cryptography. It is predicted that quantum computers will be capable of solving certain types of problems up to 100 million times faster than conventional systems. As we get closer to a fully operational quantum computer, a new world of smart computing beckons.

Heat Scale Rating: Serrano Pepper. Quantum science is a new frontier and the physics can be complicated. Good progress is being made, especially on quantum encryption, but a fully operational quantum computer is still a few years away from fruition.

Big Data: Real-time Analytics and Predictive Analytics:


Big Data: real-time analytics and predictive analytics flourish in the world of software algorithms combined with evolving computing firmware and hardware. Data is the new gold, but much more plentiful. According to Eric Schmidt, former CEO of Google, we now produce more data every other day than we did from the inception of early civilization until the year 2003 combined. It is estimated that the amount of data stored in the world's computer systems is doubling every two years. Therefore, the challenges of organizing, processing, managing, and analyzing data have become more important than ever. Emerging big data analytics tools are helping collapse information gaps and giving businesses and governments the tools they need to uncover trends, demographics, and preferences, and solutions to a wide variety of problem sets in many industries.
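The doubling claim is easy to quantify: doubling every two years compounds to a factor of 2^(y/2) over y years. A quick check of what that implies (my arithmetic, illustrating the article's figure):

```python
def growth_factor(years, doubling_period_years=2):
    # Data-volume multiplier after `years`, doubling every
    # `doubling_period_years`.
    return 2 ** (years / doubling_period_years)

print(growth_factor(2))   # 2.0  -- one doubling
print(growth_factor(10))  # 32.0 -- five doublings in a decade
```

So under this rule of thumb, storage demands grow 32-fold every decade, which is what makes the tooling challenges in the paragraph above so pressing.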

Heat Scale Rating: Thai Pepper. Solid heat, but much room for more. Big data analytics will ultimately rely on the fusion of other technologies such as AI/ML and 5G. Fusion of emerging tech will be a growing factor in most future development and use cases. For a deeper dive, please see my FORBES article: The New Techno-Fusion: The Merging Of Technologies Impacting Our Future

Other Tech Trends:


There are really too many emerging technologies to match with the heat peppers on the Scoville Heat Scale. I have only touched upon a few of them. Others include materials science (including self-assembling materials), enabling nanotechnologies, 3D printing (photovoltaics and printed electronics), and wearables (flexible electronics). The world of augmented and virtual reality is also exciting and paradigm-changing. And, like 5G, cloud computing is a vital network backbone for increased productivity and security, moving and storing data and applications over the internet from remote servers. I would be remiss if I did not add cybersecurity as the all-encompassing blanket for emerging technologies. Cybersecurity is a critical component for most tech, whether health technologies, IoT, 5G, AI/ML, quantum, or big data, that will allow for information assurance, privacy, and resilience. No matter how you view it, 2021 will be a hot year for emerging tech and hopefully a safer, happier and more prosperous one for all.


About the author:

Chuck Brooks, President of Brooks Consulting International, is a globally recognized thought leader and evangelist for cybersecurity and emerging technologies. LinkedIn named Chuck one of "The Top 5 Tech Experts to Follow on LinkedIn." Chuck was named a 2020 top leader and influencer in "Who's Who in Cybersecurity" by Onalytica. He was named by Thomson Reuters as a Top 50 Global Influencer in Risk and Compliance, and by IFSEC as the #2 Global Cybersecurity Influencer. He was named by The Potomac Officers Club, Executive Mosaic, and GovCon as One of The Top Five Executives to Watch in GovCon Cybersecurity. Chuck is a two-time Presidential appointee who was an original member of the Department of Homeland Security. Chuck has been a featured speaker at numerous conferences and events, including presenting before the G20 country meeting on energy cybersecurity.

Chuck is on the faculty of Georgetown University, where he teaches in the Graduate Applied Intelligence and Cybersecurity programs. He is a contributor to FORBES, a Cybersecurity Expert for The Network at the Washington Post, and Visiting Editor at Homeland Security Today. He has also been a featured speaker and author on technology and cybersecurity topics for IBM, AT&T, Microsoft, General Dynamics, Xerox, Checkpoint, Cylance, and many others.

Chuck Brooks LinkedIn Profile:

Chuck Brooks on Twitter: @ChuckDBrooks

See the original post here:

A Scoville Heat Scale For Measuring The Progress Of Emerging Technologies In 2021 - Forbes


NTT's Kazuhiro Gomi says Bio Digital Twin, quantum computing the next-gen tech – Backend News

§ November 18th, 2020 § Filed under Quantum Computer Comments Off on NTT's Kazuhiro Gomi says Bio Digital Twin, quantum computing the next-gen tech – Backend News


At the recently concluded Philippine Digital Convention (PH Digicon 2020) by PLDT Enterprise, Kazuhiro Gomi, president and CEO of NTT Research, shared the fundamental research milestones coming out of its three labs: the Physics and Informatics (PHI) Lab, the Cryptography and Information Security (CIS) Lab, and the Medical and Health Informatics (MEI) Lab, which it hopes will lead to monumental tech innovations.

The three-day virtual convention drew in more than 3,000 views during the live stream broadcast of the plenary sessions and breakout sessions covering various topics.

Gomi headlined the second day with his topic "Upgrading Reality," a glimpse into breakthrough research that NTT Research is currently working on that could hasten digital transformations.


In a discussion with Cathy Yap-Yang, FVP and head of Corporate Communications at PLDT, Gomi elaborated on next-generation technologies that could potentially be game-changing: the Bio Digital Twin project in the medical field, quantum computing, and advanced cryptography.

Bio Digital Twin

The Bio Digital Twin is an initiative where a digital replica of a patient's internal systems functions first as a model for testing procedures and chemical reactions, showing possible results before actual application to the person.

"We are trying to create an electronic replica of the human body. If we are able to create something like that, the future of clinical and medical activities will be very different," Gomi said. "If we have a precise replica of your human body, you can predict what type of disease or what type of problem you might have maybe three years down the road. Or, if your doctor needs to test a new drug for you, he can do so on the digital twin."

NTT Research is a fundamental research organization in Silicon Valley that carries out advanced research for some of the world's most important and impactful technologies, including quantum computing, cryptography, information security, and medical and health informatics.

Computing power

However, to get there and make the Bio Digital Twin possible, there are hurdles from various disciplines, including the component of computing power.

Gomi explained that people believe today's computers can do everything, but in reality they might take years to solve complex problems that a quantum computer could solve in seconds.

There are different kinds of quantum computers, but all are based upon quantum physics. At NTT Research, Gomi revealed that their group is working on a quantum computer called a coherent Ising machine, which could solve combinatorial optimization problems.

"We may be able to bring those superfast machines to market, to reality, much quicker. That is what we are aiming for," he said.

Basically, the machine, using many parameters and complex optimization, finds the best solution in a matter of seconds, a task that may take months or years on conventional computers.
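The class of problem a coherent Ising machine targets, combinatorial optimization over an Ising energy function, can be illustrated classically. This is a hedged sketch: the four-spin coupling matrix and the simulated-annealing schedule below are invented for illustration, and the machine itself explores the same energy landscape with optical physics rather than code like this.

```python
import math
import random

def ising_energy(spins, J):
    """E = -sum_{i<j} J[i][j] * s_i * s_j for spins s_i in {-1, +1}."""
    n = len(spins)
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(n) for j in range(i + 1, n))

def anneal(J, steps=20000, t_start=5.0, t_end=0.01, seed=0):
    """Classical simulated annealing: propose single-spin flips and accept
    worse states with a probability that shrinks as the temperature falls."""
    rng = random.Random(seed)
    n = len(J)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, J)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = rng.randrange(n)
        spins[i] = -spins[i]
        new_energy = ising_energy(spins, J)
        if new_energy <= energy or rng.random() < math.exp((energy - new_energy) / t):
            energy = new_energy  # accept the flip
        else:
            spins[i] = -spins[i]  # reject: restore the spin
    return spins, energy

# Toy ferromagnet: positive couplings favour all spins aligned (energy -6).
J = [[0, 1, 1, 1],
     [1, 0, 1, 1],
     [1, 1, 0, 1],
     [1, 1, 1, 0]]
spins, energy = anneal(J)
print(spins, energy)  # all four spins end up aligned, energy -6
```

Real problems of the kind mentioned below (logistics, traffic control, lead optimization) are encoded the same way, as a coupling matrix J, but with thousands of spins, which is where conventional search runs out of time.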

Some examples where quantum computing may be applied include lead optimization problems, such as effects on small molecule drugs, peptide drugs, and biocatalysts, or resource optimization challenges such as logistics, traffic control, or wireless networks. Gomi also expounded on compressed sensing cases, including use in astronomical telescopes, magnetic resonance imaging (MRI), and computed tomography.

Quantum computing

Apart from quantum computing, Gomi reiterated the issues of cybersecurity and privacy. Today, encryption can address those challenges, but it will soon require a more advanced and sophisticated type of technology if we are to upgrade reality.

"From the connected world, obviously we want to exchange more data among each other, but we have to make sure that security and privacy are maintained. We have to have those things together to get the best out of a connected world," he said.

Among next-generation advanced encryptions, Gomi highlighted Attribute-Based Encryption, where various decryption keys define access control of the encrypted data. For example, depending on the user (or the type of key he or she holds), what they are allowed to view differs, as controlled by the key issuer.
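The access-control idea can be sketched without the underlying cryptography. In this toy model (the attribute names and policy format are invented for illustration; real attribute-based encryption enforces the policy mathematically, not with an if-statement), a key carries attributes and decryption succeeds only when they satisfy the ciphertext's policy:

```python
def satisfies(policy, attributes):
    """A policy here is an OR of ANDs: a list of attribute sets, satisfied
    when the key's attributes cover at least one of them."""
    return any(required <= attributes for required in policy)

def decrypt(ciphertext, key_attributes):
    """Return the plaintext only if the key's attributes satisfy the policy."""
    policy, plaintext = ciphertext
    return plaintext if satisfies(policy, key_attributes) else None

# Data encrypted under the policy (doctor AND cardiology) OR auditor.
ciphertext = ([{"doctor", "cardiology"}, {"auditor"}], "patient record")

print(decrypt(ciphertext, {"doctor", "cardiology"}))  # patient record
print(decrypt(ciphertext, {"doctor", "oncology"}))    # None
```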

He noted that "in the next couple of years, we should be able to commercialize this type of technology. We can maintain privacy while encouraging the sharing of data with this mechanism."

Gomi reiterated that we are at the stage of all kinds of digital transformations.

Digital transformation

"Those digital transformations are making our lives so much richer and business so much more interesting and efficient. I would imagine those digital transformations will continue to advance even more," he said.

However, there are limiting factors that could impede or slow down those digital transformations: energy consumption, the limits of Moore's law (we cannot expect too much more of the capacities of current electronic chips), and the issues of privacy and security. Hence, we need to address those factors.

PH Digicon 2020 is the annual convention organized by PLDT Enterprise, gathering global industry leaders to speak on the latest advancements in the digital landscape. This year's roster of speakers included tech experts and heads from Cisco, Nokia, Salesforce, and NTT Research, as well as goop CEO and multi-awarded Hollywood actress Gwyneth Paltrow, who headlined the first virtual run.

Originally posted here:

NTTs Kazuhiro Gomi says Bio Digital Twin, quantum computing the next-gen tech - Backend News

Read the Rest...

Weaponizing of the Intelligent Edge Will Dramatically Alter Speed and Scale of Future Cyberattacks, says Fortinet – Express Computer

§ November 18th, 2020 § Filed under Quantum Computer Comments Off on Weaponizing of the Intelligent Edge Will Dramatically Alter Speed and Scale of Future Cyberattacks, says Fortinet – Express Computer

Fortinet today unveiled predictions from the FortiGuard Labs global threat intelligence and research team about the threat landscape for 2021 and beyond. These predictions reveal strategies the team anticipates cybercriminals will employ in the near future, along with recommendations that will help defenders prepare to protect against these oncoming attacks.

Cyber adversaries leveraging intelligent edges, 5G-enabled devices, and advances in computing power will create a wave of new and advanced threats at unprecedented speed and scale. In addition, threat actors will continue to shift significant resources to target and exploit emerging edge environments, such as remote workers, or even new OT edge environments, rather than just targeting the core network.

For defenders, it is critical to plan ahead now by leveraging the power of artificial intelligence (AI) and machine learning (ML) to speed threat prevention, detection, and response. Actionable and integrated threat intelligence will also be important to improve an organization's ability to defend in real time as the speed of attacks continues to increase.

Highlights of the predictions:

The Intelligent Edge Is an Opportunity and a Target: Over the past few years, the traditional network perimeter has been replaced with multiple edge environments (WAN, multi-cloud, data center, remote worker, IoT, and more), each with its unique risks. One of the most significant advantages to cybercriminals in all of this is that, while all of these edges are interconnected, many organizations have sacrificed centralized visibility and unified control in favor of performance and digital transformation. As a result, cyber adversaries are looking to evolve their attacks by targeting these environments and will look to harness the speed and scale possibilities 5G will enable.

Trojans Evolve To Target the Edge: While end-users and their home resources are already targets for cybercriminals, sophisticated attackers will use these as a springboard into other things going forward. Corporate network attacks launched from a remote worker's home network, especially when usage trends are clearly understood, can be carefully coordinated so they do not raise suspicions. Eventually, advanced malware could also discover even more valuable data and trends using new EATs (Edge Access Trojans) and perform invasive activities such as intercepting requests off the local network to compromise additional systems or injecting additional attack commands.

Edge-enabled Swarm Attacks: Compromising and leveraging new 5G-enabled devices will open up opportunities for more advanced threats. Cybercriminals are making progress toward developing and deploying swarm-based attacks. These attacks leverage hijacked devices divided into subgroups, each with specialized skills. They target networks or devices as an integrated system and share intelligence in real time to refine their attack as it is happening. Swarm technologies require large amounts of processing power to enable individual swarmbots and to efficiently share information in a bot swarm. This enables them to rapidly discover, share, and correlate vulnerabilities, and then shift their attack methods to better exploit what they discover.

Social Engineering Could Get Smarter: Smart devices and other home-based systems that interact with users will no longer simply be targets for attacks, but will also be conduits for deeper attacks. Leveraging important contextual information about users, including daily routines, habits, and financial information, could make social engineering-based attacks more successful. Smarter attacks could lead to much more than turning off security systems, disabling cameras, or hijacking smart appliances; they could enable the ransoming and extortion of additional data or stealth credential attacks.

Ransoming OT Edges Could Be a New Reality: Ransomware continues to evolve, and as IT systems increasingly converge with operational technology (OT) systems, particularly critical infrastructure, there will be even more data, devices, and unfortunately, lives at risk. Extortion, defamation, and defacement are all tools of the ransomware trade already. Going forward, human lives will be at risk when field devices and sensors at the OT edge, which include critical infrastructures, increasingly become targets of cybercriminals in the field.

Innovations in Computing Performance Will Also Be Targeted: Other types of attacks that target developments in computing performance and innovation in connectivity, specifically for cybercriminal gain, are also on the horizon. These attacks will enable adversaries to cover new territory and will challenge defenders to get ahead of the cybercriminal curve.

Advanced Cryptomining: Processing power is important if cybercriminals want to scale future attacks with ML and AI capabilities. Eventually, by compromising edge devices for their processing power, cybercriminals would be able to process massive amounts of data and learn more about how and when edge devices are used. It could also enable cryptomining to be more effective. Infected PCs being hijacked for their compute resources are often identified, since CPU usage directly impacts the end-user's workstation experience. Compromising secondary devices could be much less noticeable.

Spreading Attacks from Space: The connectivity of satellite systems and overall telecommunications could be an attractive target for cybercriminals. As new communication systems scale and begin to rely more on a network of satellite-based systems, cybercriminals could target this convergence and follow in pursuit. As a result, compromising satellite base stations and then spreading that malware through satellite-based networks could give attackers the ability to potentially target millions of connected users at scale or inflict DDoS attacks that could impede vital communications.

The Quantum Computing Threat: From a cybersecurity perspective, quantum computing could create a new risk when it eventually becomes capable of challenging the effectiveness of encryption. The enormous compute power of quantum computers could render some asymmetric encryption algorithms solvable. As a result, organizations will need to prepare to shift to quantum-resistant crypto algorithms by using the principle of crypto agility, to ensure the protection of current and future information. Although the average cybercriminal does not have access to quantum computers, some nation-states will; therefore, the eventual threat will be realized if preparations are not made now to counter it by adopting crypto agility.
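Crypto agility, which the prediction recommends, is less about any one algorithm than about making algorithms swappable behind a stable interface, so a future migration to quantum-resistant schemes is a configuration change rather than a rewrite of every call site. A minimal sketch (the registry and scheme names are illustrative, not a real library API; SHA3-256 stands in here for whatever post-quantum primitive is eventually mandated):

```python
import hashlib

# Call sites depend on this registry, never on a concrete algorithm, so an
# upgrade means adding an entry and changing one default.
HASH_REGISTRY = {
    "sha256": hashlib.sha256,
    "sha3_256": hashlib.sha3_256,  # illustrative stand-in for a future scheme
}

def digest(data: bytes, scheme: str = "sha256") -> str:
    """Hash `data` under the currently configured scheme."""
    try:
        algo = HASH_REGISTRY[scheme]
    except KeyError:
        raise ValueError(f"unknown scheme: {scheme}")
    return algo(data).hexdigest()

print(digest(b"message"))              # SHA-256 hex digest
print(digest(b"message", "sha3_256"))  # SHA3-256 hex digest
```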

AI Will Be Critical To Defending Against Future Attacks: As these forward-looking attack trends gradually become reality, it will only be a matter of time before enabling resources are commoditized and available as a darknet service or as part of open-source toolkits. Therefore, it will take a careful combination of technology, people, training, and partnerships to secure against these types of attacks coming from cyber adversaries in the future.

AI Will Need To Evolve: The evolution of AI is critical for future defense against evolving attacks. AI will need to evolve to the next generation. This will include leveraging local learning nodes powered by ML as part of an integrated system similar to the human nervous system. AI-enhanced technologies that can see, anticipate, and counter attacks will need to become reality in the future because cyberattacks of the future will occur in microseconds. The primary role of humans will be to ensure that security systems have been fed enough intelligence to not only actively counter attacks but actually anticipate attacks so that they can be avoided.

Partnerships Are Vital for the Future: Organizations cannot be expected to defend against cyber adversaries on their own. They will need to know who to inform in the case of an attack so that the fingerprints can be properly shared and law enforcement can do its work. Cybersecurity vendors, threat research organizations, and other industry groups need to partner with each other for information sharing, but also with law enforcement to help dismantle adversarial infrastructures to prevent future attacks. Cybercriminals face no borders online, so the fight against cybercrime needs to go beyond borders as well. Only by working together will we turn the tide against cybercriminals.

Enabling Blue Teams: Threat actor tactics, techniques, and procedures (TTPs), researched by threat intelligence teams, such as threat actor playbooks, can be fed to AI systems to enable the detection of attack patterns. Similarly, as organizations light up heatmaps of currently active threats, intelligent systems will be able to proactively obfuscate network targets and place attractive decoys along attack paths. Eventually, organizations could respond to any counterintelligence efforts before they happen, enabling blue teams to maintain a position of superior control. This sort of training gives security team members the ability to improve their skills while locking down the network.

If you have an interesting article / experience / case study to share, please get in touch with us at [emailprotected]

Srikanth is an award winning journalist with more than 16 years of experience. In 2010 and 2013, Srikanth received the Polestar award for Excellence in IT Journalism, from the PoleStar Foundation, an independent trust established in 1998 to recognize Excellence in Business and IT Journalism.

In the past, Srikanth has led the editorial operations for InformationWeek (UBM) and Dataquest (CyberMedia). Srikanth has also been associated with Patni Computer Systems and Capgemini India, in marketing and communications roles. He can be reached at [emailprotected]

View post:

Weaponizing of the Intelligent Edge Will Dramatically Alter Speed and Scale of Future Cyberattacks, says Fortinet - Express Computer

Read the Rest...

Quantum Computing in the Cloud: Can It Live Up to the Hype? – Electronic Design

§ November 12th, 2020 § Filed under Quantum Computer Comments Off on Quantum Computing in the Cloud: Can It Live Up to the Hype? – Electronic Design

What you'll learn:

Quantum computing has earned its place on the Gartner hype cycle. Pundits have claimed that it will take over and change everything forever. The reality will likely be somewhat less dramatic, although it's fair to say that quantum computers could spell the end for conventional cryptography. Clearly, this has implications for technologies like blockchain, which are slated to support financial systems of the future.

While the Bitcoin system, for example, is calculated to keep classical mining computers busy until 2140, brute-force decryption using a quantum computer could theoretically mine every token almost instantaneously. More powerful digital ledger technologies based on quantum cryptography could level the playing field.

All of this presupposes that quantum computing will become usable and affordable on a widespread scale. As things stand, this certainly seems achievable. Serious computing players, including IBM, Honeywell, Google, and Microsoft, as well as newer specialist startups, all have active programs that are putting quantum computing in the cloud right now and inviting engagement from the wider computing community. Introduction packs and development kits are available to help new users get started.

Democratizing Access

These are important moves that will almost certainly drive further advancement as users come up with more diverse and demanding workloads and figure out ways of handling them using quantum technology. Equally important is the anticipated democratizing effect of widespread cloud access, which should bring more people from a wider variety of backgrounds into contact with quantum to understand it, use it, and influence its ongoing development.

Although it's here, quantum computing remains at a very experimental stage. In the future, commercial cloud services could provide affordable access in the same way that scientific or banking organizations can today rent cloud AI applications to do complex workloads that are billed according to the number of computer cycles used.

Hospitals, for example, are taking advantage of genome sequencing apps hosted on AI accelerators in hyperscale data centers to identify genetic disorders in newborn babies. The process costs just a few dollars and the results are back within minutes, enabling timely and potentially life-saving intervention by clinicians.

Quantum computing as a service could further transform healthcare as well as deeply affect many other fields such as materials science. Simulating a caffeine molecule, for example, is incredibly difficult to do with a classical computer, demanding the equivalent of over 100 years of processing time. A quantum computer can complete the task in seconds. Other applications that could benefit include climate analysis, transportation planning, bioinformatics, financial services, encryption, and codebreaking.

A Real Technology Roadmap

For all its power, quantum computing isn't here to kill off classical computing or turn the entire world upside down. Because quantum bits (qubits) can be in both states, 0 and 1, unlike conventional binary bits that are in one state or the other, they can store exponentially more information. However, their state when measured is determined by probability, so quantum is only suited to certain types of algorithms. Others can be handled better by classical computers.
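The probabilistic measurement behaviour described above can be simulated classically for a single qubit (a sketch only; real hardware never exposes the amplitudes directly, and the shot count and seed here are arbitrary):

```python
import random

def measure(alpha, beta, shots=100_000, seed=1):
    """For a qubit a|0> + b|1> with |a|^2 + |b|^2 = 1, each measurement
    yields 0 with probability |a|^2 and 1 with probability |b|^2."""
    p0 = abs(alpha) ** 2
    assert abs(p0 + abs(beta) ** 2 - 1.0) < 1e-9, "state must be normalised"
    rng = random.Random(seed)
    ones = sum(rng.random() >= p0 for _ in range(shots))
    return ones / shots  # empirical frequency of outcome 1

# Equal superposition: both outcomes equally likely over many shots.
print(measure(2 ** -0.5, 2 ** -0.5))  # close to 0.5
```

The catch the article describes is visible here: a single shot returns only 0 or 1, so algorithms must be designed so the useful answer survives this probabilistic readout.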

In addition, building and running a quantum computer is incredibly difficult and complex. On top of that, the challenges intensify as we try to increase the number of qubits in the system. As with any computer, more bits correspond to more processing power, so increasing the number of qubits is a key objective for quantum-computer architects.

Keeping the system stable, with a low error rate, for longer periods is another objective. One way to achieve this is by cryogenically cooling the equipment to near absolute zero to eliminate thermal noise. Furthermore, extremely pure and clean RF sources are needed. I'm excited that, at Rohde & Schwarz, we are working with our academic partners to apply our ultra-low-noise R&S SGS100A RF sources (Fig. 1) to help increase qubit count and stability.

1. Extremely pure and clean RF sources like the R&S SGS100A are needed in quantum-computing applications.

The RF source is one of the most important building blocks, as it determines the number of errors that must be corrected when reading out the quantum-computation results. A cleaner RF signal increases quantum-system stability, reducing errors due to quantum decoherence that would result in information loss.

Besides the low phase and amplitude noise requirements, multichannel solutions are essential to scale up the quantum-computing system. Moreover, as we start to consider scalability, a small form factor of the signal sources becomes even more relevant. Were combining our RF expertise with the software and system know-how of our partners in pursuit of a complete solution.

Equipment Needs

In addition, scientists are constantly looking for new material to be applied in quantum-computing chips and need equipment to help them accurately determine the exact properties. Then, once the new quantum chip is manufactured, its resonance frequencies must be measured to ensure that no undesired resonances exist. Rohde & Schwarz has developed high-performance vector network analyzers (Fig. 2) for both tasks and can assist in the debugging of the quantum-computing system itself.

2. VNAs such as the R&S ZNA help determine properties of material used in quantum computing.

Our partners are relying on us to provide various other test-and-measurement solutions to help them increase the performance and capabilities of quantum computers. IQ mixing is a crucial part of a quantum computer, for example, and our spectrum analyzers help to characterize and calibrate the IQ mixers and suppress undesired sidebands. Moreover, R&S high-speed oscilloscopes (Fig. 3) help enable precise temporal synchronization of signals in the time domain, which is needed to set up and debug quantum-computing systems.

3. High-speed oscilloscopes, for example, the R&S RTP, can be used to set up and debug quantum-computing systems.

As we work with our partners in the quantum world to improve our products for a better solution fit, at the same time were learning how to apply that knowledge to other products in our portfolio. In turn, this helps to deliver even better performing solutions.

While cloud access will enable more companies and research institutes to take part in the quantum revolution, bringing this technology into the everyday requires a lot more work on user friendliness. That involves moving away from the temperature restrictions, stabilizing quantum computers with a high number of qubits, and all for a competitive price.

Already, however, we can see that quantum has the potential to profoundly change everything it touches. No hype is needed.

Sebastian Richter is Vice President of Market Segment ICR (Industry, Components, Research & Universities) at Rohde & Schwarz.

Read more from the original source:

Quantum Computing in the Cloud: Can It Live Up to the Hype? - Electronic Design

Read the Rest...

The next 20 years: five new technologies on the horizon – MoneyWeek

§ November 12th, 2020 § Filed under Quantum Computer Comments Off on The next 20 years: five new technologies on the horizon – MoneyWeek

It's November 2040. You check your phone and see that you're due to meet a friend in New York for lunch. You get in your car and order it to drive to Heathrow. During the brief flight, you chat to your grandmother, who just celebrated her 100th birthday by running a marathon thanks to her new bionic leg. When you arrive in New York, another driverless car whisks you to the restaurant. While you wait at the bar, you scroll through the news: the World Health Organisation (WHO) says the seasonal flu has finally been eradicated. All this sounds like science fiction in 2020. But major advances in transport, medicine and quantum computing are set to revolutionise our lives.

One thing keeping carmakers' managers awake at night is the decline in the number of young adults across the developed world learning to drive. According to Britain's Department for Transport, the number of 17- to 20-year-olds with driving licences has fallen from around 50% in 1994 to just 29% in 2014. The trend is due to the rise in insurance costs and tougher tests. The good news is that in the future driving tests may become unnecessary thanks to the rise of cars that can drive without any human input.

Ben Barringer, an equity research analyst with Quilter Cheviot, thinks that fully autonomous cars are likely to take around eight to ten years to hit the mainstream. Real progress has been made in decreasing the amount of driver input needed. If automation is measured on a scale of zero to five, with five denoting total automation, an increasing number of carmakers offer level-two features such as automatic steering assistance and distance control, says Barringer.

Several companies are devoting large sums of money to designing cars that reach levels four and five. While none of these cars are commercially available yet, they are undergoing real-life testing on public roads across the world. They do occasionally make mistakes, but the number of miles they can travel without needing to be overruled by their human test-driver has been rising. Waymo (owned by Alphabet, Google's parent company) claims that its driverless cars travel an average distance of 13,000 miles per human intervention (more than the average Briton drives in a year). China's Baidu says its cars boast an average distance per human intervention of 18,000 miles.

One of the core technologies underpinning autonomous cars is Lidar, which measures the distance between two objects using a combination of lasers and sensors. Infineon Technologies (Frankfurt: IFX) manufactures several key components used in Lidar systems, and should benefit from the rise of autonomous driving. It is on a 2021 price/earnings (p/e) ratio of 22 and yields 1.1%.
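The ranging principle behind Lidar is a time-of-flight calculation: the sensor times a laser pulse's round trip and halves the light-travel distance. A minimal sketch (the 200 ns example timing is invented for illustration):

```python
C = 299_792_458.0  # speed of light, m/s

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance to a target from a pulse's round-trip time; the pulse
    travels out and back, hence the division by two."""
    return C * round_trip_seconds / 2

# A return after 200 ns puts the target roughly 30 m away.
print(round(lidar_distance(200e-9), 1))  # 30.0
```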

The way we travel longer distances also looks set to change. In the 1970s everyone expected Concorde to usher in a new age of supersonic travel. Sadly, while it was technologically advanced, it consumed a huge amount of fuel and made an extremely loud noise (which led to the US and other countries forcing it to stick to subsonic speeds as it flew through their airspace). As a result, demand suffered irreversible damage when oil prices surged in the 1970s. Concorde limped on until it finally retired from service in 2003.

Nonetheless, research over the past few decades into how supersonic travel can be made commercially viable is now starting to bear fruit. For example, Aerion Corporation claims to have developed a business jet in conjunction with General Electric that can fly as fast as Concorde while consuming much less fuel and being quiet enough to fly over land. They have already secured an order for 20 jets and hope to start selling them by 2026. Virgin Group and Japan Airlines have also agreed to buy a 75-seat passenger jet from rival Boom Supersonic.

While the possibility of supersonic travel is exciting enough, the real game-changer would be travel at a flight speed in excess of Mach 5 (five times the speed of sound and more than twice Concorde's velocity). This would cut travel times between London and New York to 90 minutes, opening up the possibility of... day trips across the Atlantic, without having to bother about the need to adapt your body to different timezones, says Dr. Adam Dissel, president of Reaction Engines, one of the many companies researching this area. Dissel predicts that we will see not only supersonic passenger jets by the end of the decade but also hypersonic ones within 20 years.

One firm heavily involved in efforts to develop both supersonic and hypersonic planes is engine maker Rolls-Royce (LSE: RR). It has a substantial equity stake in Reaction Engines and is also developing the engines for both Boom's and Virgin Galactic's supersonic jets as well. While the Covid-19-induced collapse in demand for air travel has caused its share price to plummet, a £2bn rights issue, along with government support, should enable it to ride out the crisis until demand recovers.

The idea of being able to travel from London to New York in less time than it takes to watch a football match is one thing, but at present most people are living under major restrictions on where they can go or what they do thanks to the ongoing pandemic. The good news is that the long-awaited Covid-19 vaccine may be only weeks away. According to the WHO, there are now 11 vaccines undergoing stage-three trials (the final pre-approval stage of clinical trials). These include vaccines developed in America, Europe, the UK, Russia and China, with a further 16 in stage two.

Of course, even after the Covid-19 pandemic ends there will be many diseases left to tackle. Alex Hunter of Sarasin & Partners notes that infectious diseases still account for one in every four deaths across the world. No wonder, then, that experts have calculated that every $1 spent by public bodies on vaccination programmes can yield a societal return of $44. More effective vaccines, against a wider range of diseases, with fewer side effects, could transform the world.

The way vaccines are produced is currently undergoing a revolution. Plant-based vaccines could slash the time it takes to manufacture a treatment, compared with the traditional approach using chicken eggs (finding a vaccine entails growing viruses in a cell, which they then take over, because they can't reproduce on their own). The elements of a vaccine build up much more quickly in plants than in eggs.

Meanwhile, vaccines that target the stem of a virus, rather than just its surface, mean that the vaccine will still work even if the virus mutates, making a universal flu vaccine a possibility. Finally, so-called messenger RNA (mRNA) vaccines aim to improve the efficiency of the immune system's response by getting the body to create proteins that stimulate the immune system.

Vaccines are also playing an increasing role in treating diseases such as cancer. Drug companies are putting increasing resources into immunotherapy, which aims to train the body's own immune system to turn against cancer cells. One company at the forefront of vaccines, both to prevent and treat disease, is biotechnology company Moderna (Nasdaq: MRNA). Not only is it one of the leaders in the race to bring a Covid-19 vaccine to market, but it is also working with Merck to develop personalised cancer vaccines based on mRNA technology.

Vaccines aren't the only area of medicine experiencing a revolution. Artificial limbs have become much more sophisticated over the past decade, with the industry moving away from passive lumps of metal or plastic to prostheses that anticipate and support users' movements. What's more, exoskeletons, wearable robotic devices similar to the bionic suit in the Iron Man film series, are moving from science fiction to reality. Several companies now offer exoskeletons that give stroke victims, or those partially paralysed, a chance to walk again.

At present even the best prostheses work by either predicting what moves the user will make next or measuring activity in the nerves surrounding the missing limb. This can be a big problem if the user makes unexpected movements, or if there is something preventing the signal from the brain reaching the area (such as a break in the spinal cord). However, there has been much research on creating brain-computer interfaces that allow users to communicate with the devices through their thoughts alone.

One approach to producing brain-computer interfaces is through devices that measure signals in various parts of the brain and then decode them into messages that can be sent to the artificial limb. However, another approach is to develop computer chips that can be implanted directly into the brain. In August, Neuralink, one of Elon Musk's many start-ups, demonstrated a device that had been successfully trialled in a pig. A stock to consider in the prosthetics sector is ReWalk Robotics (Nasdaq: RWLK), which offers a rigid exoskeleton for those who have damaged their spines as well as an exosuit for people undergoing therapy.

While the company is not yet profitable, it recently became an approved provider for Medicare (the public-health part of Americas insurance system), which has the potential to boost sales dramatically.

Medical advances have the potential to save lives and expand the human lifespan. However, attempts to use computers to speed up medical innovations by simulating human cells or even organs have been limited by the fact that even the fastest supercomputers struggle to carry out the complex calculations needed.

The good news is that quantum computers, which rely on the behaviour of electrons and other subatomic particles rather than the flow of electricity, may be able to help. Their quantum-mechanical properties allow them to explore an enormous number of possibilities at the same time, unlike traditional computers, which work through them one by one. The upshot is an exponential increase in computing power.
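The "exponential" claim can be made concrete: simulating n qubits on a classical machine means tracking 2^n complex amplitudes, which quickly becomes unmanageable. A minimal Python sketch (the function name is ours, purely for illustration):

```python
# Classically simulating an n-qubit register means storing 2**n complex
# amplitudes, which is why the state space grows exponentially with qubits.
def statevector_size(n_qubits):
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

for n in (10, 30, 50):
    amps = statevector_size(n)
    # complex128 = 16 bytes per amplitude
    print(f"{n} qubits -> {amps:,} amplitudes (~{amps * 16 / 1e9:,.0f} GB)")
```

At 50 qubits the amplitude table alone would need millions of gigabytes, which is why even the fastest supercomputers struggle with the simulations mentioned above.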

Drug and medical research isn't the only area that could be transformed by quantum computers. "They could eventually deliver solutions to previously intractable problems, improve efficiencies and reduce costs across a broad range of industries," says Colm Harney of Sarasin & Partners. "They could massively improve the management of transport networks, speed up development in machine learning, and lead to incredible new discoveries in the materials industry." Banks and engineering firms are also starting to explore potential applications of quantum computers.

Of course, building quantum computers that can tackle real-world problems is not going to be easy, warns Harney. Designers are still struggling with the task of keeping calculation errors to a minimum. What's more, just building a quantum computer itself won't be enough, as the industry needs to develop an entire ecosystem of quantum-specific operating systems, software, algorithms, programmers and so on. Note that even after 40 years of research and development, quantum computing hasn't yet made a tangible difference to our lives.

Nevertheless, despite these previous disappointments there are some encouraging signs that the 2020s might be the decade that quantum computing finally makes the leap from hype to reality, says Harney. Not only have Google and IBM built working quantum computers over the past 12 months, but there has been a step-change in the amount of private and public money invested in this area. This includes £600m from the UK government, $1bn from the US, €2bn from the EU and $10bn from China, all of which hope to become global leaders in the field.

Tom Weller of Evenlode Investment thinks that Microsoft (Nasdaq: MSFT) is one of the companies set to benefit from advances in quantum computing. Not only is the group developing quantum computers that can be accessed via the cloud, but it is also creating a range of applications that can exploit the extra processing power quantum computers provide. The stock is on a 2022 p/e of 27.

More here:

The next 20 years: five new technologies on the horizon - MoneyWeek


Quantum computers: This group wants to get them out of the lab and into your business – ZDNet

§ November 11th, 2020 § Filed under Quantum Computer

Five quantum computing companies, three universities and one national physical laboratory in the UK have come together in a new £10 million ($13 million) project with an ambitious goal: to spend the next three years trying to make quantum technologies work for businesses.

Called Discovery, the program is partly funded by the UK government and has been pitched as the largest industry-led quantum computing project in the country to date. The participating organizations will dedicate themselves to making quantum technologies that are commercially viable, marking a shift from academic research to implementations that are relevant to, and scalable for, businesses.

The Discovery program will focus on photonic quantum computing, which is based on the manipulation of particles of light, a branch of the field that has shown great promise but still faces large technological barriers.


On the other hand, major players like IBM and Google are both developing quantum computers based on superconducting qubits made of electrons, which are particles of matter. The superconducting qubits found in those quantum devices are notoriously unstable, and require very cold temperatures to function, meaning that it is hard to increase the size of the computer without losing control of the qubits.

Photonic quantum computers, by contrast, suffer less interference from their environment, and would be much more practical to use and scale up. The field, however, is still in its infancy. For example, engineers are still working on ways to create the single photons that photonic quantum computers need in order to function.

The companies that are a part of the Discovery program will be addressing this type of technical barrier over the next few years. They include photonics company M Squared, Oxford Ionics, ORCA Computing, Kelvin Nanotechnology and TMD Technologies.

"The Discovery project will help the UK establish itself at the forefront of commercially viable photonics-enabled quantum-computing approaches. It will enable industry to capitalize on the government's early investment into quantum technology and build on our strong academic heritage in photonics and quantum information," said Graeme Malcolm, CEO of M Squared.

Another key objective of the Discovery program will consist of developing the wider UK quantum ecosystem, by establishing commercial hardware supply and common roadmaps for the industry. This will be crucial to ensure that businesses are coordinating across the board when it comes to adopting quantum technologies.

Andrew Fearnside, senior associate specializing in quantum technologies at intellectual property firm Mewburn Ellis, told ZDNet: "We will need sources of hardware that all have the same required standards that everyone can comply with. This will enable everyone to speak the same language when building prototypes. Getting all the players to agree on a common methodology will make commercialization much easier."

Although quantum computers are yet to be used at a large commercial scale, the technology is expected to disrupt many, if not all, industries. Quantum devices could shake up artificial intelligence thanks to improved machine-learning models, solve optimization problems that are too large for classical computers to fathom, and boost the discovery of new materials thanks to unprecedented simulation capabilities.

Finance, agriculture, drug discovery, oil and gas, and transportation are only a few of the many industries awaiting the revolution that quantum technology could bring about.

The UK is now halfway through a ten-year national program designed to boost quantum technologies, which is set to represent a £1 billion ($1.3 billion) investment over its lifetime.


The Discovery project comes under the umbrella of the wider national program; and according to Fearnside, it is reflective of a gradual shift in the balance of power between industry and academia.

"The national program has done a good job of enabling discussion between blue-sky researchers in university labs and industry," said Fearnside. "Blue-sky projects have now come to a point where you can think about pressing ahead and start commercializing. There is a much stronger focus on commercial partners playing a leading role, and the balance is shifting a little bit."

Last month, the UK government announced that US-based quantum computing company Rigetti would be building the country's first commercial quantum computer in Abingdon, Oxfordshire, and that partners and customers will be able to access and operate the system over the cloud. The move was similarly hailed as a step towards the commercialization of quantum technologies in the UK.

Although Fearnside acknowledged that there are still challenges ahead for quantum computing, not the least of which are technical, he expressed confidence that the technology will be finding commercial applications within the next decade.

Bridging between academia and industry, however, will require commitment from all players. Experts have previously warned that without renewed efforts from both sides, quantum ideas might well end up stuck in the lab.

Go here to read the rest:

Quantum computers: This group wants to get them out of the lab and into your business - ZDNet


Supply Chain: The Quantum Computing Conundrum | Logistics – Supply Chain Digital – The Procurement & Supply Chain Platform

§ November 11th, 2020 § Filed under Quantum Computer

From artificial intelligence to IoT, each technology trend is driven by finding solutions to a problem, some more successfully than others. Right now, the world's technology community is focused on harnessing the exponential opportunities promised by quantum computing. While it may be some time before we see the true benefits of this emerging technology, and while nothing is certain, the possibilities are great.

What is Quantum Computing?

Capable of solving certain problems up to 100 million times faster than traditional computers, quantum computing has the potential to speed up processes on a monumental scale.

Quantum computers cost millions of dollars to produce, so it perhaps goes without saying that these computers are not yet ready for mass production and rollout. However, their powerful potential to transform real-world supply chain problems should not (and cannot) be ignored. Quantum bits (qubits) can occupy more than one state at the same time (unlike their binary counterparts), embracing nuance and complexity. Qubits can also be linked to other qubits, a process known as entanglement, which is a key hallmark separating quantum from classical computing. Entangled qubits are interdependent, analogous to the variables of a complex supply chain.
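Entanglement can be illustrated with a small statevector simulation. The following Python sketch (using NumPy only, not any particular quantum SDK) prepares two qubits in a Bell state, in which neither qubit has a definite value on its own:

```python
import numpy as np

# Single-qubit Hadamard gate and two-qubit CNOT gate as matrices,
# acting on basis states ordered |00>, |01>, |10>, |11>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, np.eye(2)) @ state          # put the first qubit in superposition
state = CNOT @ state                           # entangle the pair

# The result has equal amplitude on |00> and |11> only: measuring one
# qubit instantly fixes the outcome of the other.
print(np.round(state.real, 3))
```

The final state cannot be written as two independent single-qubit states, which is exactly the interdependence the article alludes to.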

"It is possible to adjust an interaction between these qubits so that they can sense each other. The system then naturally tries to arrange itself in such a way that it consumes as little energy as possible," says Christoph Becher, a Professor in Experimental Physics at Saarland University.

Right now, tech giants such as Microsoft, IBM and Intel continue to lead the charge when it comes to the development of quantum computers. While continuous improvement will still be required in the years to come, many tech companies are already offering access to quantum computing features.

According to Forbes contributor Paul Smith-Goodson, IBM is committed to providing clients with quantum computing breakthroughs capable of solving today's impossible problems. Jay Gambetta, Vice President, IBM Quantum, said: "With advancements across software and hardware, IBM's full-stack approach delivers the most powerful quantum systems in the industry to our users."

This is good news for multiple industries but in particular those areas of the supply chain where problems around efficiency occur.

Preventing Failure of Supply Chain Optimisation Engines

Current optimisation systems used in inventory allocation and order promising fail to meet the expectations of supply chain planners for a few reasons. Sanjeev Trehan, a member of the Enterprise Transformation Group at TATA Consultancy Services, highlighted two of the key reasons for this in a discussion around digital supply chain disruption:

Inadequate system performance capabilities lie at the heart of both planning problems. By speeding up these processes on an exponential scale, these problems are almost completely eradicated, and the process is made more efficient.

Practical Data and Inventory Applications

As manufacturers incorporate more IoT sensors into their daily operations, they harvest vast amounts of enterprise data. Quantum computing can handle these complex variables within a decision-making model with a high degree of precision. Its ability to harmonise various types of data from different sources makes it especially useful for optimising resource management and logistics within the supply chain.

Quantum computing could be applied to improve dynamic inventory allocation, as well as helping manufacturers govern their energy distribution, water usage, and network design. The precision of this technology allows for a very detailed account of the energy used on the production floor in real time, for example. Microsoft has partnered with Dubai's Electricity and Water Authority in a real-life example of using quantum for grid and utility management.

Logistics

"Quantum computing holds huge potential for the logistics area of the supply chain," says Shiraz Sidat, Operations Manager of Speedel, a Leicestershire-based B2B courier firm that works in the supply chain of a number of aerospace and manufacturing companies.

"Quantum offers real-world solutions in areas such as scheduling, planning, routing and traffic simulations. There are huge opportunities to optimise energy usage, create more sustainable travel routes and make more informed, financially savvy decisions. The sheer scale of speed-up on offer here could potentially increase sustainability while saving time and money," he adds.

TATA Consultancy Services provides a good example in support of Sidat's statement.

Let's say a company plans to ship orders using ten trucks over three possible routes. This means the company has 3^10 possibilities, or 59,049 solutions, to choose from. Any classical computer can solve this problem with little effort. Now let's assume a situation where a transport planner wants to simulate shipments using 40 trucks over the same three routes. The possibilities in this case number approximately 12 quintillion, a tough ask for a classical computer. That's where quantum computers could potentially come in.
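The arithmetic behind that example is easy to check: with three independent route choices per truck, the solution space is 3 raised to the number of trucks. A quick Python sketch (the function name is ours, purely for illustration):

```python
# Each truck independently picks one of `routes` possible routes, so the
# number of combined assignments grows as routes ** trucks.
def route_assignments(trucks, routes=3):
    return routes ** trucks

print(route_assignments(10))  # 59,049: easy to enumerate exhaustively
print(route_assignments(40))  # about 1.2 x 10^19, roughly 12 quintillion
```

Going from 10 trucks to 40 multiplies the search space by a factor of 3^30, which is why brute-force enumeration stops being an option long before the fleet gets large.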

Looking Ahead

Quantum computing has the potential to disrupt the planning landscape. Planners can run plans at the flick of a button, performing scenario simulations on the fly.

At present, the full use of quantum computers in the supply chain would be expensive and largely impractical. Another current issue is a higher rate of errors (when compared to traditional computers), a consequence of the fragility of quantum states. Experts and companies around the world are working to address and limit these errors.

As mentioned earlier in the article, many tech companies are providing aspects of quantum computing through an as-a-service model, which could well prove the most successful path for future widespread use. As-a-service quantum computing power would help enterprises access these capabilities at a fraction of the cost, in a similar way such models have helped businesses utilise simulation technology, high-performance computing and computer-aided engineering.

Alongside AI, the IoT, blockchain and automation, quantum computing is one of many digital tools likely to shape, streamline and optimise the future of the supply chain. As with all emerging technology, it requires an open mind and cautious optimism.

Go here to read the rest:

Supply Chain: The Quantum Computing Conundrum | Logistics - Supply Chain Digital - The Procurement & Supply Chain Platform


A Modem With a Tiny Mirror Cabinet Could Help Connect The Quantum Internet – ScienceAlert

§ November 11th, 2020 § Filed under Quantum Computer

Quantum physics promises huge advances not just in quantum computing but also in a quantum internet, a next-generation framework for transferring data from one place to another. Scientists have now invented technology suitable for a quantum modem that could act as a network gateway.

What makes a quantum internet superior to the regular internet you're reading this through is security: any interference with data transmitted using quantum techniques would essentially break the connection. It's as close to unhackable as you can possibly get.

As with trying to produce practical, commercial quantum computers, though, turning the quantum internet from potential to reality is taking time, which is not surprising considering the incredibly complex physics involved. A quantum modem could be a very important step forward for the technology.

"In the future, a quantum internet could be used to connect quantum computers located in different places, which would considerably increase their computing power!" says physicist Andreas Reiserer, from the Max Planck Institute in Germany.

Quantum computing is built around the idea of qubits, which unlike classical computer bits can store several states simultaneously. The new research focuses on connecting stationary qubits in a quantum computer with moving qubits travelling between these machines.

That's a tough challenge when you're dealing with information that's stored as delicately as it is with quantum physics. In this setup, light photons are used to store quantum data in transit, photons that are precisely tuned to the infrared wavelength of laser light used in today's communication systems.

That gives the new system a key advantage in that it'll work with existing fibre optic networks, which would make a quantum upgrade much more straightforward when the technology is ready to roll out.

In figuring out how to get stored qubits at rest reacting just right with moving infrared photons, the researchers determined that the element erbium and its electrons were best suited for the job. But erbium atoms aren't naturally inclined to make the necessary quantum leap between two states, so to make that possible, the static erbium atoms and the moving infrared photons are essentially locked up together until they get along.

Working out how to do this required a careful calculation of the space and conditions needed. Inside their modem, the researchers installed a miniature mirrored cabinet around a crystal made of a yttrium silicate compound. This setup was then cooled to minus 271 degrees Celsius (minus 455.8 degrees Fahrenheit).

The modem mirror cabinet. (Max Planck Institute)

The cooled crystal kept the erbium atoms stable enough to force an interaction, while the mirrors bounced the infrared photons around tens of thousands of times, essentially creating tens of thousands of chances for the necessary quantum leap to happen. The mirrors make the system 60 times faster and much more efficient than it would be otherwise, the researchers say.

Once that jump between the two states has been made, the information can be passed somewhere else. That data transfer raises a whole new set of problems to be overcome, but scientists are busy working on solutions.

As with many advances in quantum technology, it's going to take a while to get this from the lab into actual real-world systems, but it's another significant step forward. The same study could also help in building quantum processors and quantum repeaters that pass data over longer distances.

"Our system thus enables efficient interactions between light and solid-state qubits while preserving the fragile quantum properties of the latter to an unprecedented degree," write the researchers in their published paper.

The research has been published in Physical Review X.

Link:

A Modem With a Tiny Mirror Cabinet Could Help Connect The Quantum Internet - ScienceAlert

