

What is an algorithm? How computers know what to do with data – The Conversation US

§ October 16th, 2020 § Filed under Quantum Computer

The world of computing is full of buzzwords: AI, supercomputers, machine learning, the cloud, quantum computing and more. One word in particular is used throughout computing: algorithm.

In the most general sense, an algorithm is a series of instructions telling a computer how to transform a set of facts about the world into useful information. The facts are data, and the useful information is knowledge for people, instructions for machines or input for yet another algorithm. There are many common examples of algorithms, from sorting sets of numbers to finding routes through maps to displaying information on a screen.

To get a feel for the concept of algorithms, think about getting dressed in the morning. Few people give it a second thought. But how would you write down your process or tell a 5-year-old your approach? Answering these questions in a detailed way yields an algorithm.

To a computer, input is the information needed to make decisions.

When you get dressed in the morning, what information do you need? First and foremost, you need to know what clothes are available to you in your closet. Then you might consider what the temperature is, what the weather forecast is for the day, what season it is and maybe some personal preferences.

All of this can be represented in data, which is essentially simple collections of numbers or words. For example, temperature is a number, and a weather forecast might be rainy or sunshine.

Next comes the heart of an algorithm: computation. Computations involve arithmetic, decision-making and repetition.

So, how does this apply to getting dressed? You make decisions by doing some math on those input quantities. Whether you put on a jacket might depend on the temperature, and which jacket you choose might depend on the forecast. To a computer, part of our getting-dressed algorithm would look like: "if it is below 50 degrees and it is raining, then pick the rain jacket and a long-sleeved shirt to wear underneath it."

After picking your clothes, you then need to put them on. This is a key part of our algorithm. To a computer, a repetition can be expressed like: "for each piece of clothing, put it on."

Finally, the last step of an algorithm is output: expressing the answer. To a computer, output is usually more data, just like input. It allows computers to string algorithms together in complex fashions to produce more algorithms. However, output can also involve presenting information, for example putting words on a screen, producing auditory cues or some other form of communication.

So after getting dressed you step out into the world, ready for the elements and the gazes of the people around you. Maybe you even take a selfie and put it on Instagram to strut your stuff.
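The article describes the pieces in prose, but the input, decision, repetition and output steps translate directly into a short program. Here is a minimal Python sketch of the getting-dressed algorithm; the wardrobe contents, the 50-degree threshold and the item names are illustrative assumptions, not from the article.

```python
def choose_outfit(temperature_f, forecast, closet):
    """Computation: decide what to wear from the input data."""
    outfit = []
    # Decision-making: the rule from the article, expressed as code.
    if temperature_f < 50 and forecast == "rainy":
        outfit.append(closet["rain jacket"])
        outfit.append(closet["long-sleeved shirt"])
    else:
        outfit.append(closet["t-shirt"])
    return outfit

def get_dressed(outfit):
    """Repetition: for each piece of clothing, put it on."""
    for item in outfit:
        print(f"Putting on: {item}")  # output: presenting information

# Input data (assumed for illustration).
closet = {
    "rain jacket": "yellow rain jacket",
    "long-sleeved shirt": "grey long-sleeved shirt",
    "t-shirt": "white t-shirt",
}
get_dressed(choose_outfit(temperature_f=45, forecast="rainy", closet=closet))
```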

Sometimes it's too complicated to spell out a decision-making process. A special category of algorithms, machine learning algorithms, try to learn based on a set of past decision-making examples. Machine learning is commonplace for things like recommendations, predictions and looking up information.


For our getting-dressed example, a machine learning algorithm would be the equivalent of your remembering past decisions about what to wear, knowing how comfortable you feel wearing each item, and maybe which selfies got the most likes, and using that information to make better choices.
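One hedged way to picture that learning step in code: score each clothing item by how past outfits performed, then prefer the highest-scoring item. The history records, fields and scoring rule below are invented for illustration; real machine learning systems use far richer models than a simple average.

```python
# Toy "learning from examples": made-up history of past outfits.
past_outfits = [
    {"item": "rain jacket", "comfort": 4, "likes": 12},
    {"item": "rain jacket", "comfort": 5, "likes": 30},
    {"item": "t-shirt",     "comfort": 3, "likes": 8},
]

def score(item):
    """Average of (comfort + likes) across past wears of this item."""
    history = [o for o in past_outfits if o["item"] == item]
    if not history:
        return 0.0  # no experience with this item yet
    return sum(o["comfort"] + o["likes"] for o in history) / len(history)

# Pick the item that past experience says worked best.
best = max({o["item"] for o in past_outfits}, key=score)
print(best)
```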

So, an algorithm is the process a computer uses to transform input data into output data. A simple concept, and yet every piece of technology that you touch involves many algorithms. Maybe the next time you grab your phone, see a Hollywood movie or check your email, you can ponder what sort of complex set of algorithms is behind the scenes.

See the original post:

What is an algorithm? How computers know what to do with data - The Conversation US


A tiny particle collider yields new evidence for a type of ‘quasiparticles’ called anyons – Massive Science

§ October 16th, 2020 § Filed under Quantum Computer

The president has had a life-threatening infectious disease for over a week, and he and his doctors haven't been very transparent about the timeline and course of his affliction. In lieu of detailed disclosures, reporters have to piece together his condition based on the treatments he's been receiving.

Trump was started off on an experimental therapeutic, an antibody cocktail, and then advanced to another, remdesivir. The other biomolecule coursing through Donald Trump's system (and this week's headlines) is a corticosteroid called dexamethasone.

You may have heard of cytokine storms, where the body's immune response to severe COVID-19 bombards healthy cells, making the illness worse. Trump has been given dexamethasone, an immunosuppressant that doctors prescribe to temper that effect. Unlike the other experimental treatments, dexamethasone is common and somewhat easy to access. However, it is rarely administered to a patient with a case as (self-)reportedly mild as Donald Trump's. In an interview with New York Magazine's Intelligencer, the co-author of a recent study testing dexamethasone elaborates:

That lack of evidence is concerning as Trump heads into a critical point in the course of his illness. COVID-19 is known for being a bit of a roller coaster, with intermittent fevers, mysterious symptoms, and rapid declines. Abraar Karan, a physician with experience treating patients with COVID-19, told Monique Brouillette at Scientific American that some people have turned corners and left the hospital, only to come back feeling much sicker, with even worse oxygen levels and possibly other harm to the body's organs.

It is theoretically possible that the early steroid treatment may ward off a dangerous auto-inflammatory reaction. But beyond the inherent risks of immunosuppression, corticosteroids may also cause behavioral side effects in the President. Trump's cognitive and behavioral state has been a point of concern for years. Potent steroids such as dexamethasone are known to increase appetite, decrease restful sleep, and bring about heightened "maniacal" energy states.

As the nation enters the weekend, Speaker of the House Nancy Pelosi is rolling out a 25th Amendment commission, Trump is boasting a miraculous recovery with a Fox News doctor, and the rest of us continue to wait and learn how biology will run its course. For better or worse, the side effects our president experiences may prove to have historical consequences. To my knowledge, "roid rage" has never been a factor in nuclear geopolitics.

Read more here:

A tiny particle collider yields new evidence for a type of 'quasiparticles' called anyons - Massive Science


The Future of Computing: Hype, Hope, and Reality – CIOReview

§ October 16th, 2020 § Filed under Quantum Computer

Bill Reichert, Partner, Pegasus Tech Ventures

For roughly 75 years, the fundamental architecture of computers has not changed much. Certainly, the hardware has changed tremendously, and software has evolved accordingly. But the basic idea of storing instructions and data in binary code, and using on/off digital hardware to execute mathematical and logical operations, has remained roughly the same for decades.

All that is changing.

The same advances in semiconductor fabrication technology that powered Moore's Law (the exponential increase in the power of computers over the last several decades) have enabled hardware engineers to develop new architectures that promise to transform the computing landscape over the coming decades.

At the same time, software engineering is also progressing. Marc Andreessen has famously said, "Software is eating the world." What he did not make clear, though, is that virtually all the progress in computing over the past 30 years has been thanks to hardware, not software.

Heterogeneous Computing

New architectures, however, require that both software engineers and hardware engineers work together. A new class of hardware is emerging that takes advantage of what is called heterogeneous computing: multi-core chips that incorporate multiple different co-processors optimized for specialized tasks. Writing software that takes full advantage of these new chips is extremely challenging, and so companies like SambaNova Systems are developing operating systems and software compilers that optimize the application code automatically and allocate resources to compute tasks dynamically in real time as computing demands change.

AI Chips

With the emergence of deep neural network software, engineers realized that Graphics Processing Units, an architecture commercialized by Nvidia, were nicely designed for doing the massive matrix calculations required by neural network models. But GPUs are not exactly optimized for AI, and so there has been an explosion of startups seeking to develop chips that offer 10x or 100x the performance and power efficiency of GPUs. On the server side, companies like Cerebras Systems and Graphcore, and more recently SambaNova, are promising order-of-magnitude improvements. And on the edge, companies like Gyrfalcon Technology, Syntiant, and Blaize are promising even greater improvements in performance and power efficiency.


Edge Computing

The second half of the 20th century was all about moving computing from centralized mainframe computers to desktop and laptop distributed computers. With the development of a high-speed Internet, the thinking shifted, and an application could sit in the cloud and support thousands, even millions, of users. But as the Internet of Things took off and enabled data collection from literally billions of devices, moving all that data up to the cloud in order to crunch it has become a challenge. Now companies are looking to process data at the edge, at the point of collection, rather than sending it up to the cloud, thereby reducing latency and cutting bandwidth and storage costs. At its simplest level, edge computing filters out unimportant data and sends only the most important data to the cloud. For more complex tasks, such as autonomous driving, edge computing requires processing massive AI models and making very accurate judgments in milliseconds. For these tasks, the new special-purpose chips discussed above and below are fighting for design wins.

Analog Computing

As brilliant as binary code is for enabling absolutely precise calculations, the real world is analog, not digital, and many compute tasks could be done more efficiently if we could operate with analog values rather than having to digitize them. But analog computing is imprecise, and most computing problems require exact values, not approximate values. (How much money do you have in your bank account?) Some problems, like AI inference and monitoring sensor data, do not need six sigma precision to get the right answer or make the right decision. Companies like Mythic, Analog Inference, and Aspinity are incorporating analog computing architectures into their chips to make them up to 100x more efficient at solving problems involving data from our analog world.

Photonic Computing

Light has been used for digital communications and computer networks for decades, but using photons to do the math and putting photonic processors on a chip are extremely challenging. That is what several startups are trying to do. Spinning technologies out of MIT and Princeton, three companies, Lightelligence, Lightmatter, and Luminous Computing, are racing to commercialize the first photonic chip for doing AI inference at the edge.

Neuromorphic Computing

In spite of what the media portrays as the imminent cyber-apocalypse where robots rebel against their human masters and take over the world, we are a long way away from the science fiction world imagined in popular culture. The fact is that the human brain is still massively more powerful and efficient than the most powerful supercomputers on earth. But computer scientists think there is a path to create an artificial brain. The branch of artificial intelligence that uses neural network mathematical frameworks to compute information in a manner similar to the human brain is sometimes referred to as neuromorphic, because it mimics human neuro-biology. But researchers have been working on models that even more closely mimic the human brain in its design and efficiency. The brain sends signals as electrochemical spikes, not digital bytes, and the brain's roughly 86 billion neurons are interconnected in a way that is very different from transistors on a chip. Researchers at Stanford, Intel, IBM, and several startup companies, such as Rain Neuromorphics and BrainChip, are trying to develop hardware and software that uses neuromorphic principles to deliver very high-power computing on very small semiconductor chips.

Quantum Computing

Almost certainly the most radical initiative in computing is the attempt to harness the potential of quantum computing. At the subatomic level, particles of matter behave in extraordinary and wonderful ways: they can exist in more than one state simultaneously, and they can entangle with one another across a distance without any apparent physical connection. It turns out that electronic devices like transistors and diodes wouldn't even work if the universe were strictly Newtonian. If we can figure out how to control the quantum properties of light and matter the way we figured out how to use gears to make adding machines and transistors to make computers, we will be able to make quantum computers that are as superior to current supercomputers as supercomputers are to adding machines.

Some people say we are still a long way away from quantum supremacy, when quantum computers can solve problems that no classical computer can solve. But recent advances indicate that we may not be that far away from quantum advantage, when quantum computers can solve certain specialized problems faster than classical computers.

Already big players like IBM, Google, Intel, Honeywell, and Microsoft are demonstrating machines that can execute quantum algorithms, and startups like Rigetti Computing, IonQ, and PsiQuantum are joining the race, along with quantum software companies like QC Ware, Cambridge Quantum Computing, and Zapata Computing. Big corporations and governments are investing in projects that will take advantage of the power of quantum computing in chemistry, pharmaceuticals, finance, logistics, failure analysis, and artificial intelligence.

Each of these emerging technologies promises to significantly advance computing, and with these advances will come new technology leaders. The evolution of computing has given rise to multiple generations of spectacular success stories like IBM, Intel, Microsoft, Nvidia, Google, and Amazon Web Services. Most of these companies are trying to reinvent themselves to catch the next wave of computing technology, but certainly new companies will emerge in these new sectors, and some famous names will founder and go the way of the dinosaurs, like Univac, Digital Equipment, MIPS, and Silicon Graphics. Meanwhile, corporate CIOs will have to decide where to place their bets and start investing in these new technologies, if they haven't already.

The rest is here:

The Future of Computing: Hype, Hope, and Reality - CIOReview


Billionaire Investor Vinod Khosla Speaks Out On AI’s Future and the COVID-19 Economy – EnterpriseAI

§ October 16th, 2020 § Filed under Quantum Computer

Vinod Khosla, a co-founder of the former Sun Microsystems and a longtime technology entrepreneur, venture capitalist and IT sage, makes billions of dollars betting on new technologies.

Khosla shared some of his technology and investment thoughts at a recent tech conference about the future of AI in business, AI chip design and quantum computing -- and even gave some advice to AI developers and companies about how they can successfully navigate the tumultuous times of the COVID-19 pandemic. Khosla gave his remarks at an Ask Me Anything Industry Luminary Keynote at the virtual AI Hardware Summit earlier in October. The Q&A was hosted by Rene Haas, the president of Arm's IP products group, and a former executive with AI chipmaker Nvidia.

Khosla, who is ranked #353 on the Forbes 400 2020 list, has a net worth today of $2.6 billion, largely earned through his investment successes in the tech field. He founded his VC firm, Khosla Ventures, in 2004.

Here are edited segments from that 30-minute Q&A, which centered on questions asked by viewers of the virtual conference:

Rene Haas: What has been the most significant technological advancement in AI in the last year or two? And how do you anticipate it is going to change the landscape of business?

Vinod Khosla

Vinod Khosla: What's surprised me the most is bifurcation along two lines: one that argues that deep learning goes all the way, and others who argue that AGI (artificial general intelligence) requires very different kinds [of uses]. My bet is that each will be good at certain functions. Now, I don't worry about AGI. Being a philosopher, I do worry about AI and AGI being used for most valuable economic functions human beings do. That's where the big opportunity is. What surprised me most is there's been great progress in language models and algorithms. But the outsize role of hardware in building models that are much more powerful, trillions of parameters per model, and how effective they can be, has been surprising. I'm somewhat biased because we are large investors in OpenAI. On the flip side, we are large investors in companies like Vicarious, which are taking a very different approach to AGI.

Haas: Building on that a little bit, there are a lot of AI hardware startup companies. Some are well funded, some with high burn rates. When you think about competing with the software support ecosystem, like Nvidia has, how can startups really rely on the strength of their architecture alone? What are the kinds of things that you look at it in terms of guidelines for startups in this space?

Khosla: There are many different markets, you have to be clear. There is a training market in the data center. There's an inferencing market in the data center. There's a market for edge devices where the criteria are very different. And then there's this emerging area of what quantum computing might do in hardware. We can talk about any of these, but what's really interesting to me is how much innovation we are seeing. Companies like Nvidia and the big cloud providers, especially Google and others, have very strong efforts.

And probably the thing we've learned in semiconductors is that having access to process technology and process nodes that others don't, that's where the software ecosystem gives them such a large advantage. It's hard for startups to compete. Now, I could be wrong, but we've tended to avoid digital architectures, for the data center or for inferencing. We've looked at a dozen of those and chosen not to jump in, because there's bigger players with huge software and process and resource advantages. On the analog side, it's a whole different ballgame. We've invested in Analog Inference. There's been multiple analog efforts. I think some haven't addressed enough of the problem to get a large enough power advantage.

So, the bottom line for a startup is that to do better than Nvidia or one of the other larger players or cloud providers, you've got to talk about a 20X to 100X advantage in TeraOPS per watt. I think if you're not in the hundred TeraOPS per watt range, it's going to be hard to sustain a large advantage. And I see most digital efforts sort of in this one to 10 TeraOPS per watt power range. So I find the edge much more promising than the data center.
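To make Khosla's key metric concrete: TeraOPS per watt is simply throughput divided by power draw. A minimal sketch with hypothetical chip figures (none of these numbers are from the interview):

```python
# TeraOPS per watt = throughput (tera-operations/second) / power (watts).
# All chips and figures below are made up, purely for illustration.
chips = {
    "incumbent GPU":      {"teraops": 125.0, "watts": 250.0},
    "digital AI startup": {"teraops": 100.0, "watts": 75.0},
    "analog edge chip":   {"teraops": 25.0,  "watts": 0.25},
}

baseline = chips["incumbent GPU"]["teraops"] / chips["incumbent GPU"]["watts"]
for name, c in chips.items():
    tops_per_watt = c["teraops"] / c["watts"]
    print(f"{name}: {tops_per_watt:.1f} TeraOPS/W "
          f"({tops_per_watt / baseline:.0f}x baseline)")
```

On these invented numbers, only the analog chip reaches the "hundred TeraOPS per watt range" Khosla says a startup needs to sustain an advantage; the digital startup's 3x edge is the kind of gap he argues larger players can close.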

Haas: What about the difficulties of startups or companies trying to enter this field? Much of it is horizontal in nature. Do they need some kind of vertical stack or some tie into the ecosystems? Do the same challenges apply, relative to being a horizontal versus vertical business or do you think there are some different opportunities there?

Khosla: I think there will be classes of algorithms. There's clearly one class of algorithms around deep learning and things like that. The question of how architecture maps to different types of algorithms, and algorithmic approaches, is a little too early to predict, and that will determine what architectures work best.

On the edge, what's clearly going to be important is power efficiency. The really high-volume markets are under five watts and $5 and a couple of hundred TeraOPS. That's the price point I look at as differentiated enough for edge devices to do a lot of interesting things. Every speaker, every microphone, every sensor. You start to see price points that go from tens of pennies to a few dollars that go into these very high-volume devices. I think that would be a different architecture than the stuff in the data center.

In the data center, whether inferencing and training are the same architecture or the same software stack even, I still think it's open for debate. I think in inferencing, cost matters and efficiency matters. In training, especially for the really large algorithms, probably not so much. So, hard to tell.

And then there's this really surprising thing of what quantum computing will do, and what kinds of algorithms it will run. The thing we are most interested in is very specialized applications for quantum computing. We have one effort in drug discovery for quantum computing. I think material science with quantum computing is going to be interesting, possibly some financial services products based on quantum computing. So, plenty of these interesting options. I think for a while we'll see more of a bifurcation, but if I were to predict five years from now, I think we'll see more unification around the types of algorithms that do certain economic tasks well.

Rene Haas

Haas: Quantum is something that has been written about for a long time and now you're starting to see some things product-wise that are looking a bit more real. As an investor, and looking at private company opportunities around quantum, do you feel like the time is now to start investing in companies that are doing things around the hardware space in quantum? Or do you look at it and say it's still years away from being commercially viable?

Khosla: In the big company world, it's definitely time for the big companies to be investing, and they're investing heavily. But that's Microsoft, Google, IBM and others. There's also a whole slew of startups where the market and products have emerged slower. And whenever things emerge slower especially on the hardware side, the big companies have an advantage because they can catch up. Whenever it takes lots and lots of resources, then the big companies have an advantage. Autonomous driving is the one area where that's mostly true, but not completely true. We've seen some radical innovation out of startups there.

So, it depends on the pace of development of a technology or deployment. I do think the time is very ripe for quantum software applications, specialized applications, to develop. But given how complex quantum is to use, such as the interface between quantum and the regular computing world, and the full stack of software and how it runs algorithms, I think specialized algorithms will do better there.

Haas: You're obviously involved in AI chip startups. Looking at the last four years of AI chip startups, are you bullish, in general, looking back? And if so, which areas are you most excited about?

Khosla: When there's radical innovation, it's still interesting. We've seen a lot of startups, but I wouldn't say we've seen radical innovation in architectures or performance or power efficiency. And when I say power efficiency, it's really TeraOPS per watt, which is performance per watt that is really the key metric. If you see the kinds of large jumps, like 20X, 50X, 100X, then that's really interesting. Still, there's less room for it in the data center, more room for it in the edge, but every time I say something like this then some really clever person surprises me with a counter-narrative that actually is pretty compelling. So would I say I'm open for architectures? Yes. Radical changes, yes, and I think that will happen, but it's just very hard to predict today. The predictability on where things go is still low on innovation. But I always say, improbables are not unimportant. We just don't know which improbable is important. In the meantime, the traditional digital data center, even the digital edge, will probably belong to the larger players.

I do want to encourage the folks out there trying to build products. When we did the NexGen product to compete with Intel, we very quickly got to 50% market share of the under-$1,000 PC market, where we were competing on an x86 architecture with Intel. So surprises are possible, and for people who take specialized approaches in market segments, there can be very interesting innovation to be done.

Haas: How large is the economic opportunity around AI and what do you think drives it?

Khosla: I'm probably more bullish. Whether you call it AI or AGI, I think this area will be able to do most economically valuable human functions within the next decade, probably a lot sooner. It will take time, integrating into regular workflows and traditional systems and all that. But the way I look at it, if we can replace human judgment in a task, you're saving far more money than selling a chip or a computer or something. So, if you can replace a security analyst and do their job, or have one security analyst do the job of five security analysts, or have one physician do the job of five physicians, you're saving gobs of money. And then you get to share in the human labor saving, which is where the large opportunities are. That could belong to both these combination software and hardware systems. I think that opportunity is orders of magnitude larger than any estimate I've seen today.

Haas: 2020 has been a very turbulent year. What advice would you give to tech entrepreneurs who are pushing through a recession and the remarkable situation involving the COVID-19 pandemic, while trying to build a product and build a company? What advice would you give to those entrepreneurs?

Khosla: I think the best ideas survive turbulent times. I find recessions are really the times when bigger companies cut back on some of their spending. I haven't seen that happen in this particular area. That's when people with the best ideas or with passion for a particular vision, leave those companies. So, I do see very good startups during turbulent times in general. Now, one has to be just pragmatic and adapt to the times. When money's cheap, you raise lots of money. When money is not cheap or not easily available you spend less, and take more time doing some fundamental work and getting it right. Which by the way is usually a better strategy than raising lots of money.

I do think that there is lots of opportunity. I think they have to adapt to the times and be much more thoughtful, maybe even more radical in their approach. Take larger leaps, because you can take more time before you start spending the money to go to market. One of the things to keep in mind with most technologies: thinking about the technology has huge implications downstream, but takes very little money. It takes very special talent. Then there's the building of the technology. And then there's the selling, and the sales or marketing usually ends up costing the most. Now's a good time to trade off for a more compelling product and postpone some of the sales and marketing while the markets are uncertain. You can't afford to spend lots of money on that. So you have to adjust strategy as an entrepreneur, and entrepreneurs do that fairly well.

Haas: What is your own investment philosophy, particularly when it comes to tech companies, and how does your overall portfolio, reflect that philosophy?

Khosla: We like the higher-risk, higher-upside things. I find investors generally reduce risk for good reasons, but make the consequences of success relatively inconsequential. I personally prefer larger risk, which is why I like analog right now, and making the consequences of success, be it 50X or 100X, better than what's available in the digital domain. I do see plenty of those kinds of opportunities still. I am not discouraged. I'm actually quite encouraged about the opportunities in this area. But entrepreneurs usually find specialized paths to get to that first MVP product, that early traction, and then use it to broaden.

Haas: Model performance has been increasing slowly in the field of AI. Can you share your insights about that?

Khosla: In certain dimensions, I think that's true. When a technology plays out a certain way, it makes rapid progress in the beginning and then starts to peter out. Software models themselves are getting to a level of saturation. The progress on the hardware side, just scaling hardware, has been stunningly valuable, as GPT-3 shows. It may give more of an advantage to the large cloud providers, the people who can build 500,000 CPU/GPU systems. But that's not for everyday use. I think that story still needs to be told.

There are alternative approaches that still need to be discovered. I gave you the example of Vicarious, the robotics company we've invested in. Instead of needing 10 million or 100 million cats to recognize a cat, they're saying can we do it from 10 cats? So, maybe data becomes a lot less important. And what implications does that have for hardware architectures? It's very clear to me seeing the early results at Vicarious that it is entirely possible for AI systems to learn as rapidly and with as few examples as humans do, if the architecture is different than deep learning.

My bet is different approaches will be very good at different points, and we'll see that kind of specialization of architectures. A long time ago, 25 to 30 years ago, when you looked at Lego blocks, it came in large yellow, white, red, black and blue blocks. And there were three or four types of components. I think that's where software algorithms in AI may be today. Now, you couldn't build the Sydney Opera House out of Lego blocks back then, but then they got all these specialized components. The possibilities explode exponentially, so the combinations allow a lot more flexibility on what can happen, what systems can do. So, it might be we just need different types of algorithms to explore the capability of end-use systems. And that might have large implications for which hardware architectures work.

Hardware scaling may matter in some of these and clever architectures may matter in others. That's why I'm tracking what quantum computing may do for algorithms. Not just your standard quantum computing (Shor's algorithm, etc.) but real applications like drug discovery or material science. Or could you do better battery material? Those are really interesting now.

Haas: What advice do you have for first time hardware entrepreneurs, with strong architecture ideas, with really smart engineers, who don't really have a track record, and who haven't done this before -- how do you advise them to position themselves to get into this segment?

Khosla: Silicon Valley is very good at recognizing thoughtful, clever people -- they don't have to have a track record. Most successful entrepreneurs don't have track records. So, I wouldn't be afraid of that. I don't think you need a lot of management experience. Building great teams is probably the single piece of advice I give to entrepreneurs. Great and multi-dimensional teams to go after the problem, even if they haven't done it yet. Also, the cleverness of your architecture isn't as important as the end results you deliver. Can you deliver that 20X, 50X over what the traditional players will do for your market? I think people underappreciate how much of an advantage you need in your architecture to make it worthwhile to do that startup.

And one more thing. There's a whole lot of tricks, both on the models on the software side and on the hardware side. You can do hardware tricks, and there's half a dozen which are very common in hardware and half a dozen that are pretty common in software, like reducing the model size. Everybody eventually gets there. Others have fundamental, long-lasting advantages, and if you're doing the startup, focus not on the tricks that give you a 5X improvement, because others will catch up to those tricks, either on software or hardware. Instead, focus on what will be the fundamental innovations five years from now, where you'll still have an advantage.

View original post here:

Billionaire Investor Vinod Khosla Speaks Out On AI's Future and the COVID-19 Economy - EnterpriseAI


IBM: Five ways technology will shape our lives | Technology & AI | Business Chief North America – Business Chief

§ October 16th, 2020 § Filed under Quantum Computer

Capturing carbon dioxide to slow climate change and repurposing existing drugs to produce a vaccine for COVID-19 are two of the predictions from IBM's annual 5 in 5 technology report.

"Five ways technology will change our lives within five years" is the IBM Research paper which outlines how accelerating the process of discovery will result in a sustainable future.

Each year, IBM showcases how they believe technology will reshape business and society, informed by work occurring within IBM Research's global labs and industry trends.

"Today, the convergence of emerging technologies including Artificial Intelligence (AI) and quantum computing is enabling us to consider a wider range of questions once thought out of reach," states the report.

"We urgently need to design new materials to tackle pressing societal challenges addressed in the UN Sustainable Development Goals, from fostering good health and clean energy to bolstering sustainability, climate action and responsible production."

Top five predictions by IBM Research include:

Carbon dioxide conversion - Slow climate change by the capture and reuse of CO2 in the atmosphere

Antivirals - Repurpose drugs to reduce time spent on drug discovery to beat COVID-19 and future pandemics

Energy storage - Accelerated discovery of new materials for better batteries to meet global demand for electricity without raising the temperature of the Earth

Nitrogen fixation - AI and quantum computing will come up with a solution to enable nitrogen fixation to feed the world's growing population (estimated to be 10 billion by 2050)

Photoresists - Scientists will embrace a new approach to materials that lets the tech industry more quickly produce sustainable materials for semiconductors and electronic devices

Taking a closer look at the five predictions reveals the following points:

IBM predicts that it will be possible to capture and reuse carbon dioxide from the atmosphere in a bid to slow down climate change.

It is reported that climate change will lead to higher levels of CO2 by 2025 than those seen during the warmest period of the last 3.3 million years. A team of IBM researchers is creating a cloud-based knowledge base of existing methods and materials to capture CO2.

Progressing carbon capture and sequestration before it is too late requires an acceleration of the discovery process. Sophisticated AI systems and AI-guided automatic lab experiments would test large numbers of chemical reactions.

The goal over the next five years is to make CO2 capture and reuse efficient enough to scale globally so it can reduce the amount of CO2 released into the atmosphere and slow climate change.

IBM predicts medical researchers will identify new opportunities for drug repurposing which would help find a vaccine against COVID-19 and future viruses.

Scientists estimate there are more than a million viruses in nature with a potential to spread like COVID-19. It can take up to $2.6 billion and more than a decade for a new drug to reach the market.

One way to kick-start the process is to identify potential therapies from existing drugs - jumpstarting subsequent research to help enable rapid clinical trials and regulatory review.

IBM Research outlines that solutions could include a combination of AI analytics and data that could potentially help with real-world medical evidence to suggest new candidates for drug repurposing.

In the context of COVID-19, researchers used this technology with real-world evidence to suggest the use of two existing drugs. The first was approved for specific immunological and endocrine disorders and the second was one in use for treating prostate cancer.

Energy storage - Rethinking batteries

IBM predicts it will be possible to discover new materials for safer and more environmentally preferable batteries capable of supporting a renewable-based energy grid and more sustainable transportation.

Many renewable energy sources are intermittent and require storage. "The use of AI and quantum computing will result in batteries built with safer and more efficient materials for improved performance," stresses the report.

IBM predicts that it will be possible to replicate nature's ability to convert nitrogen in the atmosphere into nitrate-rich fertiliser, feeding the growing world population while reducing the environmental impact of fertilisers.

Using the accelerated discovery cycle, researchers will sift through existing knowledge about catalysts. In a few years, a quantum computer might be able to precisely simulate different nitrogen fixation catalytic processes, further augmenting our knowledge.

"We'll come up with an innovative solution to enable nitrogen fixation at a sustainable scale."

Semiconductor transistors have shrunk, giving us smaller, more powerful gadgets as more processing power is packed onto a single chip. This shrinking has been enabled by materials known as photoresists.

But with billions of phones, TVs, and cars in the world it is imperative all the chemicals, materials and processes used in their manufacture are sustainable.

IBM predicts it will be possible to advance materials manufacturing, enabling semiconductor manufacturers to improve the sustainability of their coveted products.


Go here to read the rest:

IBM: Five ways technology will shape our lives | Technology & AI | Business Chief North America - Business Chief


4 Reasons Why Now Is the Best Time to Start With Quantum Computing – Medium

§ October 14th, 2020 § Filed under Quantum Computer

Quantum computing is a rapidly developing field, with everyone trying to build the perfect hardware, find new applications for current algorithms, or even develop new algorithms. Because of that, demand for quantum programmers and researchers will increase in the near future.

Many governmental and industrial institutions have set aside substantial funds to develop quantum technologies. The Quantum Daily (TQD) estimated the current market for quantum computing to be around $235 million. This number is predicted to grow substantially to $6.25 billion by 2025.

This incredible amount of funding is leading to an increase in the number of academic, government, and industry positions. Almost all technology companies are adapting their business models for when quantum technology makes an impact.

TQD also adds that the U.S. Bureau of Labor Statistics estimates that in 2020 so far, there are around 1.4 million more software development jobs than applicants who can fill them.

In 2019, MIT published an article called "Q&A: The talent shortage in quantum computing" that addressed the different challenges the field faces right now. Afterward, it developed MIT xPRO, a group addressing the reality that students aren't the only people interested in learning about the different aspects of quantum information.

View post:

4 Reasons Why Now Is the Best Time to Start With Quantum Computing - Medium


Ten-year Forecasts for Quantum Networking Opportunities and Deployments Over the Coming Decade – WFMZ Allentown

§ October 14th, 2020 § Filed under Quantum Computer

DUBLIN, Oct. 12, 2020 /PRNewswire/ -- The "Quantum Networking: A Ten-year Forecast and Opportunity Analysis" report has been added to ResearchAndMarkets.com's offering.

This report presents detailed ten-year forecasts for quantum networking opportunities and deployments over the coming decade.

Today there is increasing talk about the Quantum Internet. This network will have the same geographical breadth of coverage as today's Internet, but where the Internet carries bits, the Quantum Internet will carry qubits, represented by quantum states. The Quantum Internet will provide a powerful platform for communications among quantum computers and other quantum devices. It will also further enable a quantum version of the Internet-of-Things. Finally, quantum networks can be the most secure networks ever built - completely invulnerable if constructed properly.

Already there are sophisticated roadmaps showing how the Quantum Internet will come to be. At the present time, however, quantum networking in the real world consists of three research programs and commercialization efforts:

- Quantum Key Distribution (QKD) adds unbreakable coding of key distribution to public-key encryption.
- Cloud/network access to quantum computers is core to the business strategies of leading quantum computer companies.
- Quantum sensor networks promise enhanced navigation and positioning, more sensitive medical imaging modalities, etc.

This report provides ten-year forecasts of all three of these sectors.

This report provides a detailed quantitative analysis of where the emerging opportunities can be found today and how they will emerge in the future.

With regard to the scope of the report, the focus is, of course, on quantum networking opportunities of all kinds. It focuses especially, however, on three areas: quantum key distribution (QKD), quantum computer networking/quantum clouds, and quantum sensor networks. The report also includes in the forecasts breakouts by all the end-user segments of this market, including military and intelligence, law enforcement, banking and financial services, and general business applications, as well as niche applications. There are also breakouts by hardware, software and services as appropriate.

In addition, there is also some discussion of the latest research into quantum networking, including the critical work on quantum repeaters. Quantum repeaters allow entanglement between quantum devices over long distances. Most experts predict repeaters will start to prototype in real-world applications in about five years, but this is far from certain.

This report will be essential reading for equipment companies, service providers, telephone companies, data center managers, cybersecurity firms, IT companies and investors of various kinds.

Key Topics Covered:

Executive Summary
E.1 Goals, Scope and Methodology of this Report
E.1.1 A Definition of Quantum Networking
E.2 Quantum Networks Today: QKD, Quantum Clouds and Quantum Networked Sensors
E.2.1 Towards the Quantum Internet: Possible Business Opportunities
E.2.2 Quantum Key Distribution
E.2.3 Quantum Computer Networks/Quantum Clouds
E.2.4 Quantum Sensor Networks
E.3 Summary of Quantum Networking Market by Type of Network
E.4 The Need for Quantum Repeaters to Realize Quantum Networking's Potential
E.5 Plan of this Report

Chapter One: Ten-year Forecast of Quantum Key Distribution
1.1 Opportunities and Drivers for Quantum Key Distribution Networks
1.1.1 QKD vs. PQC
1.1.2 Evolution of QKD
1.1.3 Technology Assessment
1.2 Ten-year Forecasts of QKD Markets
1.2.1 QKD Equipment and Services
1.2.2 A Note on Mobile QKD
1.3 Key Takeaways from this Chapter

Chapter Two: Ten-Year Forecast of Quantum Computing Clouds
2.1 Quantum Computing: State of the Art
2.2 Current State of Quantum Clouds and Networks
2.3 Commercialization of Cloud Access to Quantum Computers
2.4 Ten-Year Forecast for Cloud Access to Quantum Computers
2.4.1 Penetration of Clouds in the Quantum Computing Space
2.4.2 Revenue from Network Equipment for Quantum Computer Networks by End-User Industry
2.4.3 Revenue from Network Equipment Software by End-User Industry
2.5 Key Takeaways from this Chapter

Chapter Three: Ten-Year Forecast of Quantum Sensor Networks
3.1 The Emergence of Networked Sensors
3.1.1 The Demand for Quantum Sensors Seems to be Real
3.2 The Future of Networked Sensors
3.3 Forecasts for Networked Quantum Sensors
3.4 Five Companies that will Shape the Future of the Quantum Sensor Business: Some Speculations

Chapter Four: Towards the Quantum Internet
4.1 A Roadmap for the Quantum Internet
4.1.1 The Quantum Internet in Europe
4.1.2 The Quantum Internet in China
4.1.3 The Quantum Internet in the U.S.
4.2 Evolution of Repeater Technology: Ten-year Forecast
4.3 Evolution of the Quantum Network
4.4 About the Analyst
4.5 Acronyms and Abbreviations Used In this Report

For more information about this report visit https://www.researchandmarkets.com/r/rksyxu

About ResearchAndMarkets.com
ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com

For E.S.T. office hours call +1-917-300-0470
For U.S./CAN toll free call +1-800-526-8630
For GMT office hours call +353-1-416-8900

U.S. Fax: 646-607-1907
Fax (outside U.S.): +353-1-481-1716

Continued here:

Ten-year Forecasts for Quantum Networking Opportunities and Deployments Over the Coming Decade - WFMZ Allentown


Canadian quantum computing firms partner to spread the technology – IT World Canada

§ October 7th, 2020 § Filed under Quantum Computer

In a bid to accelerate this country's efforts in quantum computing, 24 Canadian hardware and software companies specializing in the field are launching an association this week to help their work get commercialized.

Called Quantum Industry Canada, the group says it represents Canada's most commercial-ready technologies, covering applications in quantum computing, sensing, communications, and quantum-safe cryptography.

The group includes Burnaby, B.C., manufacturer D-Wave Systems, Vancouver software developer 1Qbit, Toronto's photonic quantum computer maker Xanadu Quantum Technologies, the Canadian division of software maker Zapata Computing, Waterloo, Ont.-based ISARA, which makes quantum-safe solutions, and others.

"The quantum opportunity has been brewing for many years," association co-chair Michele Mosca of the University of Waterloo's Institute for Quantum Computing and the co-founder of two quantum startups said in an interview, explaining why the new group is starting now. "Canada's been a global leader at building up the global opportunity, the science, the workforce, and we didn't want this chance to pass. We've got over 24 innovative companies, and we wanted to work together to make these companies a commercial success globally."

"It's also important to get Canada known as a leader in quantum-related products and services," he added. "This will help assure a strong domestic quantum industry as we enter the final stages of quantum readiness."

And while quantum computing is a fundamental new tool, Mosca said, it's also important for Canadian organizations to start planning for a quantum computing future, even if the real business value isn't obvious. "We don't know exactly when you'll get the real business advantage. You want to be ready for when quantum computers can give you an advantage."

Adib Ghubril, research director at Toronto-based Info-Tech Research Group, said in an interview that the creation of such a group is needed. "When you want to foster innovation you want to gain critical mass, a certain number of people working in different disciplines. It will help motivate them, even maybe compete."

Researchers from startups and even giants like Google, Microsoft, Honeywell and IBM have been throwing billions at creating quantum computers. So are countries, especially China, but also Australia, the U.K., Germany and Switzerland. Many big-name firms are touting projects with experimental equipment, or hybrid hardware that does accelerated computations but doesn't meet the standard definition of a quantum computer.

True quantum computers may be a decade off, some suggest. Ghubril thinks we're 15 years from what he calls reliable, effective quantum computing. Still, last December IDC predicted that by 2023, one-quarter of the Fortune Global 500 will gain a competitive advantage from emerging quantum computing solutions.

Among the recent signposts:

Briefly, quantum computers apply the theory of quantum mechanics to change the world of traditional computation, in which bits are represented by zeros and ones. In a quantum computer, the basic elements, called qubits, can be a zero and a one at the same time. With their expected ability to do astonishingly fast computations, quantum computers may be able to help pharmaceutical companies create new drugs and nation-states to break encryption protecting government secrets.

Companies are taking different approaches. D-Wave uses a quantum annealing process to make machines it says are suited to solving real-world computing problems today. Xanadu uses what Mosca calls a more circuit-type computing architecture. "There's certainly the potential that some of the nearer-term technologies will offer businesses advantage, especially as they scale."

"We know the road towards a full-fledged quantum computer is long. But there are amazing milestones in that direction."

Ghubril says Canada is in the leading pack of countries working on quantum computing. "The momentum out of China is enormous," he said, but it looks like the country will focus on using quantum for telecommunications and not business solutions.

From his point of view companies are taking two approaches to quantum computers. Some, like D-Wave, are trying to use quantum ideas to optimize solving modelling problems. "The problem is not every problem is an optimization problem," he said. Other companies are trying for the Grand Poobah: the real (quantum) computer. "So the IBMs of the world are going for the gusto. They want the real deal. They want to solve the material chemistry and biosynthesis and so on. They've gone big, but by doing so they've gone slower. You can't do much on the IBM platform. You can learn a lot, but you can't do much. You can do more on a D-Wave, but you can only do one thing."

Ghubril encourages companies to dabble in the emerging technology.

"That's Info-Tech's recommendation: Just learn about it. Join a forum, open an account, try a few things. Nobody is going to gain a (financial) competitive advantage. It's a learning advantage."


Read the original post:

Canadian quantum computing firms partner to spread the technology - IT World Canada


Quantum computing: Photon startup lights up the future of computers and cryptography – ZDNet

§ October 6th, 2020 § Filed under Quantum Computer

A fast-growing UK startup is quietly making strides in the promising field of quantum photonics. Cambridge-based company Nu Quantum is building devices that can emit and detect quantum particles of light, called single photons. With a freshly secured £2.1 million ($2.71 million) seed investment, these devices could one day underpin sophisticated quantum photonic systems, for applications ranging from quantum communications to quantum computing.

The company is developing high-performance light-emitting and light-detecting components, which operate at the single-photon level and at ambient temperature, and is building a business based on the combination of quantum optics, semiconductor photonics, and information theory, spun out of the University of Cambridge after eight years of research at the Cavendish Laboratory.

"Any quantum photonic system will start with a source of single photons, and end with a detector of single photons," Carmen Palacios-Berraquero, the CEO of Nu Quantum, tells ZDNet. "These technologies are different things, but we are bringing them together as two ends of a system. Being able to controllably do that is our main focus."


As Palacios-Berraquero stresses, even generating single quantum particles of light is very technically demanding.

In fact, even the few quantum computers that exist today, which were designed by companies such as Google and IBM, rely on the quantum states of matter, rather than light. In other words, the superconducting qubits that can be found in those tech giants' devices rely on electrons, not photons.

Yet the superconducting qubits found in current quantum computers are, famously, very unstable. The devices have to operate in temperatures colder than those found in deep space to function, because thermal vibrations can cause qubits to fall from their quantum state. On top of impracticality, this also means that it is a huge challenge to scale up the number of qubits in the computer.

A photonic quantum computer could have huge advantages over its matter-based counterpart. Photons are much less prone to interact with their environment, which means they can retain their quantum state for much longer and over long distances. A photonic quantum computer could, in theory, operate at room temperature and as a result, scale up much faster.

"The whole challenge comes from creating the first quantum photon," explains Palacios-Berraquero. "Being able to emit one photon at a time is a ground-breaking achievement. In fact, it has become the Holy Grail of quantum optics."

"But I worked on generating single photons for my PhD. That's the IP I brought to the table."

Carmen Palacios-Berraquero and the Nu Quantum team just secured a £2.1 million ($2.71 million) seed investment.

Combined with improved technologies in the fields of nanoscale semi-conductor fabrication, Palacios-Berraquero and her team set off to crack the single-photon generation problem.

Nu Quantum's products come in the form of two little boxes: the first one generates the single photons that can be used to build quantum systems for various applications, and the other measures the quantum signals emitted by the first one. The technology, maintains the startup CEO, is bringing quantum one step closer to commercialization and adoption.

"Between the source and the detector of single photons, many things can happen, from the simplest to the most complex," explains Palacios-Berraquero. "The most complex one being a photonic quantum computer, in which you have thousands of photons on one side and thousands of detectors on the other. And in the middle, of course, you have gates, and entanglement, and and, and and. But that's the most complex example."

A photonic quantum computer is still a very long-term ambition for the startup CEO. A simpler application, which Nu Quantum is already working on delivering commercially with the UK's National Physical Laboratory, is quantum random number generation, a technology that can significantly boost the security of the cryptographic keys that secure data.

The keys that are currently used to encrypt the data exchanged between two parties are generated thanks to classical algorithms. Classical computing is deterministic: a given input will always produce the same output, meaning that complete randomness is fundamentally impossible. As a result, classical algorithms are predictable to an extent. In cryptography, this means that security keys can be cracked fairly easily, given sufficient computing power.
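To make "deterministic" concrete: a classical pseudorandom generator seeded with the same value reproduces the same "random" stream every time. A minimal Python sketch (an illustration of the general point, not code from the article):

    import random

    # A classical PRNG is a deterministic algorithm: the same seed
    # always produces the same "random" key material.
    def make_key(seed, bits=16):
        rng = random.Random(seed)
        return [rng.randint(0, 1) for _ in range(bits)]

    print(make_key(42))                  # the same 16 bits on every run
    print(make_key(42) == make_key(42))  # True: fully reproducible

An attacker who can guess or narrow down the seed can regenerate the key, which is the predictability the article alludes to.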

Not so much with quantum. A fundamental property of single photons is that they behave randomly: for example, if a photon is sent down a path that splits in two, there is no way of knowing in advance which of the two paths the particle will take.

The technology that Nu Quantum is developing with the National Physical Laboratory, therefore, consists of a source of single photons, two detectors, and a two-way path linking the three devices. "If we say the right detector is a 1, and the left detector is a 0, you end up with a string of numbers that's totally random," says Palacios-Berraquero. "The more random, the more unpredictable the key is, and the more secure the encryption."
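As a rough illustration of that scheme (a toy sketch of the idea, not Nu Quantum's implementation), each photon arriving at the splitting path produces one unbiased bit, and the detector clicks concatenate into a key:

    import secrets

    def simulate_qrng(n_photons):
        """Toy model of the two-detector QRNG: each single photon ends up
        at the right detector (bit 1) or the left detector (bit 0) with
        probability 1/2. The quantum coin flip is stood in for here by
        the operating system's entropy source."""
        return "".join(str(secrets.randbelow(2)) for _ in range(n_photons))

    print(simulate_qrng(32))  # a fresh, unpredictable 32-bit string every run

Unlike the seeded generator above, there is no internal state to guess: each bit comes from an independent physical event.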

Nu Quantum is now focusing on commercializing quantum random number generation, but the objective is to build up systems that are increasingly complex as the technology improves. Palacios-Berraquero expects that in four or five years, the company will be able to start focusing on the next step.

One day, she hopes, Nu Quantum's devices could be used to connect quantum devices in a quantum internet, a long-term project pursued by scientists in the US, the EU, and China, which would tap the laws of quantum mechanics to teleport quantum information from one quantum device to the next. Doing so is likely to require single photons to be generated and distributed between senders and receivers, because of the light particles' capacity to travel longer distances.

In the shorter term, the startup will be focusing on investing the seed money it has just raised. On the radar are a brand-new lab and headquarters in Cambridge, and a tripling of the team's size through a recruitment drive for scientists, product team members and business functions.

Follow this link:

Quantum computing: Photon startup lights up the future of computers and cryptography - ZDNet

Read the Rest...

12 European Companies and Research Labs Join Forces to Boost Industrial Quantum Computing Applications – HPCwire

§ October 6th, 2020 § Filed under Quantum Computer Comments Off on 12 European Companies and Research Labs Join Forces to Boost Industrial Quantum Computing Applications – HPCwire

LES CLAYES, France, Oct. 5, 2020. The NExt ApplicationS of Quantum Computing (NEASQC) project brings together a multidisciplinary consortium of academic and industry experts in Quantum Computing, High Performance Computing, Artificial Intelligence, chemistry and energy management. NEASQC aims to demonstrate that, though the millions of qubits that will guarantee fully fault-tolerant quantum computing are still far away, there are practical use cases for the NISQ (Noisy Intermediate-Scale Quantum) devices that will be available in the near future. NISQ computing can deliver significant advantages when running certain applications, thus bringing game-changing benefits to users, and particularly industrial users.

The NEASQC consortium has chosen 9 NISQ-compatible industrial and financial use-cases, and will develop new quantum software techniques to solve those use-cases with a practical quantum advantage.

"The ultimate ambition of NEASQC is to encourage European user communities to investigate NISQ quantum computing. For this purpose, the project consortium will define and make available a complete and common toolset that new industrial actors can use to start their own practical investigation and share their results," explained Cyril Allouche, Fellow, VP, Head of the Atos Quantum R&D Program at Atos, and coordinator of the NEASQC project.

NEASQC also aims to build a much-needed bridge between Quantum Computing hardware activities, particularly those of the European Quantum Flagship, and the end-user community. "Even more than in classical IT, NISQ computing demands a strong cooperation between hardware teams and software users. We expect our work in use cases will provide strong directions for the development of NISQ machines, which will be very valuable to the nascent quantum hardware industry."

The NEASQC project gathers 12 organisations from 8 European countries and is coordinated by Atos. The 4-year project has a budget of 4.67 million Euros, funded by the European Commission under the Horizon 2020 programme. It was launched on 5 October with an online kick-off meeting that virtually gathered representatives of all consortium members.

NEASQC objectives

1. Develop 9 industrial and financial use cases with a practical quantum advantage for NISQ machines.
2. Develop open source NISQ programming libraries for industrial use cases, with a view to facilitating quantum computing experimentation for new users.
3. Build and share knowledge with a strong user community dedicated to industrial NISQ applications.
4. Develop software stacks and benchmarks for the Quantum Technology Flagship.

About the NEASQC project

The NEASQC project brings together academic experts and industrial end-users to investigate and develop a new breed of Quantum-enabled applications that can take advantage of NISQ systems in the near future. NEASQC is use-case driven, addressing practical problems such as drug discovery, CO2 capture, energy management, natural language processing, breast cancer detection, probabilistic risk assessment for energy infrastructures, or hydrocarbon well optimisation. NEASQC aims to initiate an active European community around NISQ Quantum Computing by providing a common toolset that will attract new industrial users.

The NEASQC project is run by a European consortium that includes:

NEASQC is one of the projects selected within the second wave of Quantum Flagship projects and will be included in the Quantum Computing Application Area. This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 951821.

The Quantum Flagship was launched in 2018 as one of the largest and most ambitious research initiatives of the European Union. With a budget of at least €1 billion and a duration of 10 years, the flagship brings together research institutions, academia, industry, enterprises, and policy makers in a joint and collaborative initiative on an unprecedented scale. The main objective of the flagship is to consolidate and expand European scientific leadership and excellence in this research area, as well as to transfer quantum physics research from the lab to the market by means of commercial applications and disruptive technologies. With over 5,000 researchers from academia and industry involved in this initiative throughout its lifetime, it aims to create the next generation of disruptive technologies that will impact Europe's society, establishing the region as a worldwide leader in this knowledge-based industry and technology field.

See the article here:

12 European Companies and Research Labs Join Forces to Boost Industrial Quantum Computing Applications - HPCwire

Read the Rest...

SC20 Invited Speakers Tackle Challenges for the Earth, Its Inhabitants, and Our Security Using ‘More Than HPC’ – HPCwire

§ October 6th, 2020 § Filed under Quantum Computer Comments Off on SC20 Invited Speakers Tackle Challenges for the Earth, Its Inhabitants, and Our Security Using ‘More Than HPC’ – HPCwire

Oct. 5, 2020. The Invited Talks for SC20 represent the breadth, depth and future outlook of technology and its societal and scientific impact. HPC has always played a critical role in advancing breakthroughs in weather and climate research. This year's invited talks extend this further to data-driven approaches, including biodiversity, geoscience, and quantum computing. Our speakers will also touch on the responsible application of HPC and new technological developments to highlight the impact of this potent and versatile technology on a wide range of applications.

Hear these illustrious speakers during the SC20 Invited Talks, Tuesday through Thursday, November 17–19.

Lorena Barba (George Washington University) will explore the need for trustworthy computational evidence through transparency and reproducibility. With the explosion of new computational models for vital research, including on COVID-19, applications of such importance to society highlight the requirement of building trustworthy computational models. Emphasizing transparency and reproducibility has helped us build more trust in computational findings. How should we adapt our practices for reproducibility to achieve unimpeachable provenance, and reach full accountability of scientific evidence produced via computation?

Shekhar Borkar (Qualcomm Inc.) will speak on the future of computing in the so-called post-Moore's-law era. While speculation about the end of Moore's law has created some level of fear in the community, this ending may not be coming as soon as we think. This talk will revisit the historic predictions of the end, and discuss promising opportunities and innovations that may further Moore's law and continue to deliver unprecedented performance for years to come.

Dalia A. Conde (University of Southern Denmark) will offer a presentation on fighting the extinction crisis with data. With biodiversity loss identified by the World Economic Forum as one of humanity's greatest challenges, computational methods are urgently needed to secure a healthier planet. We must design and implement effective species conservation strategies, which rely on vast and disparate volumes of data, from genetics and habitat to legislation and human interaction. This talk will introduce the Species Knowledge Index initiative, which aims to map, quantify, analyze, and disseminate open information on animal species to policy makers and conservationists around the globe.

Tom Conte (Georgia Tech) will examine HPC after Moore's law. Whether Moore's law has ended, is about to end, or will never end, the slowing of the semiconductor innovation curve has left the industry looking for alternatives. Different approaches, beyond quantum or neuromorphic computing, may disrupt current algorithms and software development. This talk will preview the road ahead, and suggest some exciting new technologies on the horizon.

Marissa Giustina (Google LLC) will share the challenges and recent discoveries in the development of Google's quantum computer, from both the hardware and quantum-information perspectives. This prototype hardware holds promise as a platform for tackling problems that have been impossible to address with existing HPC systems. The talk will include recent technological developments, as well as some perspective on the future of quantum computing.

Patrick Heimbach (The University of Texas at Austin) will discuss the need for advanced computing to help solve the global ocean state estimation problem. Because of the challenge of observing the full-depth global ocean circulation in its spatial detail, numerical simulations play an essential role in quantifying patterns of climate variability and change. New methods being developed at the interface of predictive data science remain underutilized in ocean climate modeling. These methods face considerable practical hurdles in the context of HPC, but will be indispensable for advancing simulation-based contributions to real-world problems.

Simon Knowles (Graphcore) will discuss the reinvention of accelerated computing for artificial intelligence. As HPC changes in response to the needs of the growing user community, AI can harness enormous quantities of processing power even as we move towards power-limited computing. To balance these needs, the intelligence processing unit (IPU) architecture is able to capture learning processes and offer massive heterogeneous parallelism. This ground-up reinvention of accelerated computing will show considerable results for real applications.

Ronald P. Luijten (Data Motion Architecture and Consulting GmbH) will offer a presentation on the data-centric architecture of a weather and climate accelerator. Using a co-design approach, a non-von Neumann accelerator targeting weather and climate simulations was developed in tandem with the application code to optimize memory bandwidth. This also led to the filing of a patent for a novel CGRA (Coarse-Grained Reconfigurable Array) layout that reflects grid points in the physical world. The talk will include benchmarks achieved in the project, and a discussion of next steps.

Catherine (Katie) Schuman (Oak Ridge National Laboratory) will introduce us to the future of AI and HPC, in the form of neuromorphic computing and neural accelerators. These two new types of computing technologies offer significant advantages over traditional approaches, including considerably increased energy efficiency and accelerated neural network-style computing. This talk will illustrate the fundamental computing concepts involved in these new hardware developments, and highlight some initial performance results.

Compton Tucker (NASA Goddard Space Flight Center) will speak on satellite tree enumeration outside of forests at the fifty-centimeter scale. Non-forest trees, which grow isolated outside of forests and are not well documented, nevertheless play a crucial role in biodiversity, carbon storage, food resources, and shelter for humans and animals. This talk will detail the use of HPC and machine learning to enumerate isolated trees globally, to identify localized areas of degradation, and to quantify the role of isolated trees in the global carbon cycle.

Cliff Young (Google LLC) will entertain the question of whether we can build a virtuous cycle between machine learning and HPC. While machine learning draws on many HPC components, the two areas are diverging in precision and programming models. However, it may be possible to construct a positive feedback loop between them. The Tensor Processing Unit (TPU) could provide opportunities to unite these fields to solve common problems through parallelization, mixed precision, and new algorithms.

Source: Melyssa Fratkin, SC20 Communications Chair

See the article here:

SC20 Invited Speakers Tackle Challenges for the Earth, Its Inhabitants, and Our Security Using 'More Than HPC' - HPCwire

Read the Rest...

JCSU to receive a portion of IBM investment aiming to show representation matters – WCNC.com

§ October 6th, 2020 § Filed under Quantum Computer Comments Off on JCSU to receive a portion of IBM investment aiming to show representation matters – WCNC.com

Representation Matters. Now, companies are showing their commitment to increasing diversity in the tech industries by investing in HBCUs.

CHARLOTTE, N.C. Johnson C. Smith University in Charlotte is one of just a handful of historically Black colleges and universities selected to receive a portion of a $100 million investment from IBM, as part of a pledge to show that representation matters.

IBM announced its first Quantum Education and Research Initiative for HBCUs. The money will be used to enhance curriculum, create more research opportunities for faculty and staff, and carve a brighter path for students.

"We need to invest in underserved and overlooked areas because we're not competing with the person down the street, or in another city; we're competing globally for jobs," said Terik Tidwell, executive director of the Smith Tech-Innovation Center at JCSU.

Tidwell said the investment will make a difference not only in what programs they can offer but also, and perhaps more importantly, what the future could look like for students who choose STEM careers.

"This is a part of systems change and wealth-creating," Tidwell said. "For them to get a job [with a starting salary of] $80,000-100,000, it's changing their family, it's changing their descendants that are going to come after them."

Statistics show Black students are underrepresented across STEM fields.

"Only 15% of Black and Latinx students are exposed to computer science in high school," Tidwell cited. "By college, the number of Black and Latinx students declaring STEM-related majors drops to less than 10%, and less than 5% of the employee population at most STEM-related industries is Black or Latinx."

"This huge economic wave is coming and we need to make sure that we are part of it," he said.

One of the students who is excited about the investment said she believes it could open doors for those who come behind her.

"It's definitely challenging as a young Black woman because it's so easy to get overlooked," said JCSU senior Crystal Howard.

The computer science and information systems major recently accepted a position at Bank of America.

"There is so much talent at HBCUs," Howard said. "Now they're looking, and man, are we gonna take corporate America by storm. I can't wait, I can't wait, and I'm so excited to be a part of it."

Read more from the original source:

JCSU to receive a portion of IBM investment aiming to show representation matters - WCNC.com

Read the Rest...

QuSoft celebrates five years of cutting-edge quantum algorithm and software research – Centrum Wiskunde & Informatica (CWI)

§ October 6th, 2020 § Filed under Quantum Computer Comments Off on QuSoft celebrates five years of cutting-edge quantum algorithm and software research – Centrum Wiskunde & Informatica (CWI)

After five years of close collaboration within QuSoft, CWI and the University of Amsterdam sign an agreement that consolidates their cooperation.

Since 2015, QuSoft has grown into a leading research institute where over 60 scientists of the Dutch national research institute for mathematics and computer science (CWI) and the Faculty of Science of the University of Amsterdam (FNWI) work together on fundamental and multidisciplinary quantum research.

After five years, CWI director Jos Baeten and Peter van Tienderen, dean of the FNWI, are signing the agreement that continues their collaboration. "I am very glad that nine years ago, when I started as director of CWI, Harry Buhrman convinced me to invest in his idea of a new research institute for quantum software," says Baeten. "Together with the University of Amsterdam, CWI has made major investments in making QuSoft successful."

Jos Baeten (left) and Peter van Tienderen (right) signed the collaboration agreement between CWI and UvA for research institute QuSoft.

Back in 1996, CWI started research in quantum computing. It was among the first groups worldwide to pioneer this field. Currently, this research resides in CWI's Algorithms and Complexity group, headed by Prof. Harry Buhrman. Other specialised research on quantum condensed matter theory was already done by the Theoretical Physics Group of the University of Amsterdam, headed by Prof. Kareljan Schoutens. Nowadays, quantum research in Amsterdam is bundled in the QuSoft research centre, which is hosted at CWI.

Commitment: "Today's agreement underlines the fact that all parties feel the commitment and urgency to further develop quantum software," says Prof. Buhrman, co-founder and director of QuSoft, who was recently installed as a member of the Royal Netherlands Academy of Arts and Sciences (KNAW). "I'm proud to see QuSoft taking these steps. The results we have achieved up until now certainly motivate us to further explore the future that quantum computing has in store."

Leading technology: All researchers at QuSoft contribute to software and hardware solutions for quantum computation, communication and sensing. "The impact of these quantum technologies on society is widely recognised, and QuSoft has a clear role to play in their further development," says Prof. Schoutens, co-founder and co-director of QuSoft. "Quantum computers require specialised programming methods that are totally different from the software normal computers use." "At the University of Amsterdam, we are convinced that quantum technology is one of the leading technologies of the future, opening up a huge range of applications," says van Tienderen.

Lustrum: This year QuSoft celebrates its first lustrum. This coming December, activities are organised for everyone who is affiliated with the institute or wants to learn more about it. The lustrum will also highlight other research projects connected with, or initiated by, the QuSoft institute. This includes projects such as the Quantum Software Consortium, the education program Quantum Quest and QuSoft's new innovation hub Quantum.Amsterdam. If you want to celebrate this, register now at http://www.qusoft.org/lustrum.

More:

QuSoft celebrates five years of cutting-edge quantum algorithm and software research - Centrum Wiskunde & Informatica (CWI)

Read the Rest...

Google’s Billion Dollar News, Commercial Quantum Computers And More In This Week’s Top News – Analytics India Magazine

§ October 4th, 2020 § Filed under Quantum Computer Comments Off on Google’s Billion Dollar News, Commercial Quantum Computers And More In This Week’s Top News – Analytics India Magazine

The Dutch and the Finnish are doing their part in shedding the dystopian sci-fi reputation that AI usually gets. These European nations often show up at the top when it comes to initiatives that take the human aspect seriously. Now they are at it again. Amsterdam and Helsinki are making moves to ensure that transparency of AI applications is established. Not only that, but these cities want their citizens to play an active role going forward. In a more sci-fi-sounding announcement, quantum computing industry leader D-Wave opened up its tech for business applications, making it the first to do so. There is more news, thanks to Google; find out why in this week's top news, brought to you by Analytics India Magazine.

VMware and NVIDIA are coming together to offer an end-to-end enterprise platform for AI, along with a new architecture for data center, cloud and edge services that use NVIDIA's DPUs. "We are partnering with NVIDIA to bring AI to every enterprise; a true democratization of one of the most powerful technologies," said Pat Gelsinger, CEO of VMware.

The full stack of AI software available on the NVIDIA NGC hub will be integrated into VMware vSphere, VMware Cloud Foundation and VMware Tanzu. This in turn will help accelerate AI adoption across the industry and allow enterprises to deploy AI-ready infrastructure across data centers, cloud and edge.

On Thursday, Google's CEO Sundar Pichai announced that the company would be setting aside $1 billion to enable high-quality journalism. In the blog post, Pichai underlined Google's mission to organize the world's information and make it universally accessible and useful. Google's News Showcase features the editorial curation of award-winning newsrooms to give readers more insight on the stories that matter and, in the process, helps publishers develop deeper relationships with their audiences. Google has already signed partnerships for News Showcase with nearly 200 leading publications across Germany, Brazil, Argentina, Canada, the U.K. and Australia, and will soon be expanding to India, Belgium and the Netherlands.

On Tuesday, D-Wave Systems, the Canadian quantum computing company, announced the general availability of its next-gen quantum computing platform, which features new hardware, software, and tools to enable and accelerate the delivery of in-production quantum computing applications. The company stated that the platform is available in the Leap quantum cloud service and includes the Advantage quantum system, with more than 5,000 qubits and 15-way qubit connectivity. In addition, there is an expanded hybrid solver service that can run problems with up to one million variables. Together, these services enable users to scale up to real-world problems, letting businesses run real-time quantum applications for the first time.
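For a sense of what running a problem on Leap looks like in code, D-Wave's open-source Ocean SDK exposes the hybrid solver roughly as follows (a sketch based on the publicly documented Ocean API; it assumes a configured Leap account and API token, and the toy problem is my own):

    from dimod import BinaryQuadraticModel
    from dwave.system import LeapHybridSampler

    # Toy QUBO: reward picking x0 or x1, penalize picking both.
    bqm = BinaryQuadraticModel({"x0": -1.0, "x1": -1.0},
                               {("x0", "x1"): 2.0}, 0.0, "BINARY")

    # The hybrid solver accepts far larger problems than this --
    # up to one million variables, per D-Wave's announcement.
    sampler = LeapHybridSampler()
    result = sampler.sample(bqm)
    print(result.first.sample, result.first.energy)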

The PyTorch team has announced that developers can leverage its libraries on Cloud TPUs. The XLA library, said PyTorch, has reached general availability (GA) on Google Cloud and supports a broad set of entry points for developers. It has a fast-growing community of researchers from MIT, Salesforce Research, Allen AI and elsewhere who train a wide range of models accelerated with Cloud TPUs and Cloud TPU Pods.

According to PyTorch, the aim of this project was to make it as easy as possible for the PyTorch community to leverage the high-performance capabilities that Cloud TPUs offer while maintaining the dynamic PyTorch user experience. To enable this workflow, the team created PyTorch/XLA, a package that lets PyTorch connect to Cloud TPUs and use TPU cores as devices.
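In code, targeting a TPU looks close to ordinary PyTorch device placement. A minimal sketch using the torch_xla package's public entry points (assuming a TPU-enabled environment):

    import torch
    import torch_xla.core.xla_model as xm

    device = xm.xla_device()                  # acquire a TPU core as a device
    model = torch.nn.Linear(128, 10).to(device)
    x = torch.randn(64, 128, device=device)

    loss = model(x).sum()
    loss.backward()
    xm.mark_step()                            # execute the lazily built XLA graph

The mark_step call reflects the design PyTorch/XLA chose: operations are recorded lazily into an XLA graph and compiled for the TPU when the step is flushed, preserving the eager-style PyTorch experience.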

GitHub announced that its code scanning option, CodeQL, is now generally available to all developers. It scans code as it's created and surfaces actionable security reviews within pull requests and the other GitHub experiences you use every day, automating security as a part of your workflow. This helps ensure vulnerabilities never make it to production in the first place. Code scanning is powered by CodeQL, which GitHub calls the world's most powerful code analysis engine, and will enable developers to use the 2,000+ CodeQL queries created by GitHub and the community, or create custom queries to easily find and prevent new security concerns.

No two palms are alike. That's the idea behind Amazon One, a new service by the e-commerce giant that allows customers to pay with their palm. Contactless payments were all the rage this pandemic, and Amazon wants to step up its technology at one of its stores. All you need is a credit card, your mobile number, and of course, your palm. Once you're signed up, you can use your palm to enter, identify, and pay where Amazon One is available. Governments around the world have started to ease restrictions for public spaces like malls and stadiums, and services like Amazon One might see a huge rise in demand, because touching surfaces is so 2019!

On Monday, Amsterdam and Helsinki launched AI registries to detail how the respective governments use algorithms to deliver services. The AI Register is a window into the artificial intelligence systems used by these cities. Through the register, citizens can get acquainted with quick overviews of a city's artificial intelligence systems or examine more detailed information based on their own interests. They can also give feedback and thus participate in building human-centred AI.

Read the original here:

Google's Billion Dollar News, Commercial Quantum Computers And More In This Week's Top News - Analytics India Magazine

Read the Rest...

Global QC Market Projected to Grow to More Than $800 million by 2024 – HPCwire

§ October 4th, 2020 § Filed under Quantum Computer Comments Off on Global QC Market Projected to Grow to More Than $800 million by 2024 – HPCwire

The Quantum Economic Development Consortium (QED-C) and Hyperion Research are projecting that the global quantum computing (QC) market, worth an estimated $320 million in 2020, will grow at an anticipated 27% CAGR between 2020 and 2024, reaching approximately $830 million by 2024.

This estimate is based on surveys of 135 US-based quantum computing researchers, developers and suppliers across the academic, commercial and government sectors. Supplemental data and insights came from a companion effort that surveyed 115 current and potential quantum computing users in North America, Europe and the Asia/Pacific region on their expectations, schedules and budgets for the use of quantum computing in their existing and planned computational workloads.

(Keeping track of the various quantum computing organizations is becoming a challenge in itself. The Quantum Economic Development Consortium (QED-C) is a consortium of stakeholders that aims to enable and grow the U.S. quantum industry. QED-C was established with support from the National Institute of Standards and Technology (NIST) as part of the federal strategy for advancing quantum information science, as called for by the National Quantum Initiative Act enacted in 2018.)

Additional results from the study:

"Based on our study and related forecast, there is a growing, vibrant, and diverse US-based QC research, development, and commercial ecosystem that shows the promise of maturing into a viable, if not profitable and self-sustaining, industry. That said, it is too early to start picking winners and losers from either a technology or commercial perspective," said Bob Sorensen, quantum analyst for Hyperion Research.

"A key driver for commercial success could be the ability of any vendor to ease the requirements needed to integrate QC technology into a larger HPC and enterprise IT user base while still supporting advanced QC-related research for a more targeted, albeit smaller, class of end-user scientists and engineers. This sector is not for the faint of heart, but this forecast gives some sense of what is at stake here, at least for the next few years," noted Sorensen.

Source: QED-C

QED-C commissioned and collaborated with Hyperion Research to develop this market forecast to help inform decision making for QC technology developers and suppliers, national-level QC-related policy makers, potential QC users in both the advanced computing and enterprise IT marketplace, investors, and commercial QC funding organizations. This is a baseline estimate, and Hyperion Research and QED-C are looking to provide periodic updates of their QC market forecast as events, information, or decision-making requirements dictate. Contact: Celia Merzbacher, QED-C Deputy Director, [emailprotected]

More here:

Global QC Market Projected to Grow to More Than $800 million by 2024 - HPCwire

Read the Rest...

Berkeley Lab Technologies Honored With 7 R&D 100 Awards – Lawrence Berkeley National Laboratory

§ October 4th, 2020 § Filed under Quantum Computer Comments Off on Berkeley Lab Technologies Honored With 7 R&D 100 Awards – Lawrence Berkeley National Laboratory

Innovative technologies from Lawrence Berkeley National Laboratory (Berkeley Lab) to achieve higher energy efficiency in buildings, make lithium batteries safer and higher performing, and secure quantum communications were some of the inventions honored with R&D 100 Awards by R&D World magazine.

For more than 50 years, the annual R&D 100 Awards have recognized the 100 technologies of the past year deemed most innovative and disruptive by an independent panel of judges. The full list of winners, announced by R&D World's parent company WTWH Media LLC, is available at the R&D World website.

Berkeley Lab's award-winning technologies are described below.

A Tool to Accelerate Electrochemical and Solid-State Innovation

(from left) Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick (Credit: Berkeley Lab)

Berkeley Lab scientists invented a microelectrode cell to analyze and test electrochemical systems with solid electrolytes. Thanks to significant cost and performance advantages, this tool can accelerate development of critical applications such as energy storage and conversion (fuel cells, batteries, electrolyzers), carbon capture, desalination, and industrial decarbonization.

Solid electrolytes have been displacing liquid electrolytes as the focus of electrochemical innovation because of their performance, safety, and cost advantages. However, the lack of effective methods and equipment for studying solid electrolytes has hindered advancement of the technologies that employ them. This microelectrode cell meets the testing needs, and is already being used by Berkeley Lab scientists.

The development team includes Berkeley Lab researchers Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick.

Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com)

Information transmitted by MMQ-Com is impervious to security breaches. (Credit: Alexander Stibor/Berkeley Lab)

Quantum communication, cybersecurity, and quantum computing are growing global markets. But the safety of our data is in peril given the rise of quantum computers that can decode classical encryption schemes.

The Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com) technology is a fundamentally new kind of secure quantum information transmitter. It transmits messages by modulating electron matter-waves without changing the pathways of the electrons. This secure communication method is inherently impervious to any interception attempt.

A novel quantum key distribution scheme also ensures that the signal is protected from spying by other quantum devices.

The development team includes Alexander Stibor of Berkeley Lab's Molecular Foundry along with Robin Röpke and Nicole Kerker of the University of Tübingen in Germany.

Solid Lithium Battery Using Hard and Soft Solid Electrolytes

(from left) Marca Doeff, Guoying Chen, and Eongyu Yi (Credit: Berkeley Lab)

The lithium battery market is expected to grow from more than $37 billion in 2019 to more than $94 billion by 2025. However, the liquid electrolytes used in most commercial lithium-ion batteries are flammable and limit the ability to achieve higher energy densities. Safety issues continue to plague the electronics markets, as often-reported lithium battery fires and explosions result in casualties and financial losses.

In Berkeley Lab's solid lithium battery, the organic electrolytic solution is replaced by two solid electrolytes, one soft and one hard, and lithium metal is used in place of the graphite anode. In addition to eliminating battery fires, incorporation of a lithium metal anode with a capacity 10 times higher than graphite (the conventional anode material in lithium-ion batteries) provides much higher energy densities.

The technology was developed by Berkeley Lab scientists Marca Doeff, Guoying Chen, and Eongyu Yi, along with collaborators at Montana State University.

Porous Graphitic Frameworks for Sustainable High-Performance Li-Ion Batteries

High-resolution transmission electron microscopy images of the Berkeley Lab PGF cathode reveal (at left) a highly ordered honeycomb structure within the 2D plane, and (at right) layered columnar arrays stacked perpendicular to the 2D plane. (Credit: Yi Liu/Berkeley Lab)

The Porous Graphitic Frameworks (PGF) technology is a lithium-ion battery cathode that could outperform today's cathodes in sustainability and performance.

In contrast to commercial cathodes, organic PGFs pose fewer risks to the environment because they are metal-free and composed of earth-abundant, lightweight organic elements such as carbon, hydrogen, and nitrogen. The PGF production process is also more energy-efficient and eco-friendly than other cathode technologies because they are prepared in water at mild temperatures, rather than in toxic solvents at high temperatures.

PGF cathodes also display stable charge-discharge cycles with ultrahigh capacity and record-high energy density, both much higher than those of all known commercial inorganic and organic cathodes.

The development team includes Yi Liu and Xinie Li of Berkeley Lab's Molecular Foundry, as well as Hongxia Wang and Hao Chen of Stanford University.

Building Efficiency Targeting Tool for Energy Retrofits (BETTER)

The buildings sector is the largest source of primary energy consumption (40%) and ranks second after the industrial sector as a global source of direct and indirect carbon dioxide emissions from fuel combustion. According to the World Economic Forum, nearly one-half of all energy consumed by buildings could be avoided with new energy-efficient systems and equipment.

(from left) Carolyn Szum (Lead Researcher), Han Li, Chao Ding, Nan Zhou, Xu Liu (Credit: Berkeley Lab)

The Building Efficiency Targeting Tool for Energy Retrofits (BETTER) allows municipalities, building and portfolio owners and managers, and energy service providers to quickly and easily identify the most effective cost-saving and energy-efficiency measures in their buildings. With an open-source, data-driven analytical engine, BETTER uses readily available building and monthly energy data to quantify energy, cost, and greenhouse gas reduction potential, and to recommend efficiency interventions at the building and portfolio levels to capture that potential.

It is estimated that BETTER will help reduce about 165.8 megatons of carbon dioxide equivalent (MtCO2e) globally by 2030. This is equivalent to the CO2 sequestered by growing 2.7 billion tree seedlings for 10 years.
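That equivalence is easy to sanity-check; it implies roughly 60 kilograms of CO2 sequestered per seedling over the decade, in line with the commonly used equivalency factor of about 0.06 metric tons per ten-year-old seedling (a quick back-of-the-envelope check, not a figure from the article):

    mt_co2e = 165.8e6       # 165.8 megatons, in metric tons of CO2-equivalent
    seedlings = 2.7e9
    print(mt_co2e / seedlings * 1000, "kg CO2 per seedling over 10 years")
    # ~61 kg, i.e. ~0.06 metric tons per seedling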

The development team includes Berkeley Lab scientists Nan Zhou, Carolyn Szum, Han Li, Chao Ding, Xu Liu, and William Huang, along with collaborators from Johnson Controls and ICF.

AmanziATS: Modeling Environmental Systems Across Scales

Simulated surface and subsurface water from Amanzi-ATS hydrological modeling of the Copper Creek sub-catchment in the East River, Colorado watershed. (Credit: Zexuan Xu/Berkeley Lab, David Moulton/Los Alamos National Laboratory)

Scientists use computer simulations to predict the impact of wildfires on water quality, or to monitor cleanup at nuclear waste remediation sites by portraying fluid flow across Earth compartments. The Amanzi-Advanced Terrestrial Simulator (ATS) enables them to replicate or couple multiple complex and integrated physical processes controlling these flowpaths, making it possible to capture the essential physics of the problem at hand.

"Specific problems require taking an individual approach to simulations," said Sergi Molins, principal investigator at Berkeley Lab, which contributed expertise in geochemical modeling to the software's development. "Physical processes controlling how mountainous watersheds respond to disturbances such as climate- and land-use change, extreme weather, and wildfire are far different than the physical processes at play when an unexpected storm suddenly impacts groundwater contaminant levels in and around a nuclear remediation site. Amanzi-ATS allows scientists to make sense of these interactions in each individual scenario."

The code is open-source and capable of being run on systems ranging from a laptop to a supercomputer. Led by Los Alamos National Laboratory, Amanzi-ATS is jointly developed by researchers from Los Alamos National Laboratory, Oak Ridge National Laboratory, Pacific Northwest National Laboratory, and Berkeley Lab researchers including Sergi Molins, Marcus Day, Carl Steefel, and Zexuan Xu.

Institute for the Design of Advanced Energy Systems (IDAES)

The U.S. Department of Energy's (DOE's) Institute for the Design of Advanced Energy Systems (IDAES) project develops next-generation computational tools for process systems engineering (PSE) of advanced energy systems, enabling their rapid design and optimization.

IDAES Project Team (Credit: Berkeley Lab)

By providing rigorous modeling capabilities, the IDAES Modeling & Optimization Platform helps energy and process companies, technology developers, academic researchers, and DOE to design, develop, scale up, and analyze new and potential PSE technologies and processes to accelerate advances and apply them to address the nation's energy needs. The IDAES platform is also a key component in the National Alliance for Water Innovation, a $100 million, five-year DOE innovation hub led by Berkeley Lab, which will examine the critical technical barriers and research needed to radically lower the cost and energy of desalination.

Led by National Energy Technology Laboratory, IDAES is a collaboration with Sandia National Laboratories, Berkeley Lab, West Virginia University, Carnegie Mellon University, and the University of Notre Dame. The development team at Berkeley Lab includes Deb Agarwal, Oluwamayowa (Mayo) Amusat, Keith Beattie, Ludovico Bianchi, Josh Boverhof, Hamdy Elgammal, Dan Gunter, Julianne Mueller, Jangho Park, Makayla Shepherd, Karen Whitenack, and Perren Yang.

# # #

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab's facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

The rest is here:

Berkeley Lab Technologies Honored With 7 R&D 100 Awards - Lawrence Berkeley National Laboratory

Read the Rest...

IonQ claims it has built the most powerful quantum computer yet – TechCrunch

§ October 2nd, 2020 § Filed under Quantum Computer Comments Off on IonQ claims it has built the most powerful quantum computer yet – TechCrunch

Trapped-ion quantum computing startup IonQ today announced the launch of its latest quantum computer, which features what IonQ calls 32 "perfect" qubits with low gate errors.

Using IBM's preferred quantum benchmark, IonQ expects to hit a quantum volume of 4,000,000. That's a massive increase over the double-digit quantum volume numbers that IBM itself recently announced, and it's a pretty extraordinary claim on IonQ's side, as this would make its system the most powerful quantum computer yet.
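For context, quantum volume is defined as 2^n, where n is the width and depth of the largest "square" random circuit a machine can run successfully, so a quantum volume of roughly 4,000,000 corresponds to circuits about 22 qubits wide and 22 layers deep:

    import math

    quantum_volume = 4_000_000
    print(math.log2(quantum_volume))  # ~21.93, i.e. ~22-qubit, 22-layer circuits
    print(2 ** 22)                    # 4194304

By the same measure, a double-digit quantum volume of 64 corresponds to six-qubit, six-layer circuits, which is what makes IonQ's number so striking.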

The (well-funded) company has never used this metric before. Through a spokesperson, IonQ also noted that it doesn't necessarily think quantum volume is the best metric, but since the rest of the industry is using it, it decided to release this number. The company argues that its ability to achieve 99.9% fidelity between qubits has allowed it to achieve this breakthrough.

"In a single generation of hardware, we went from 11 to 32 qubits, and more importantly, improved the fidelity required to use all 32 qubits," said IonQ CEO and president Peter Chapman. "Depending on the application, customers will need somewhere between 80 and 150 very high-fidelity qubits and logic gates to see quantum advantage. Our goal is to double or more the number of qubits each year. With two new generations of hardware already in the works, companies not working with quantum now are at risk of falling behind."

Image Credits: Kai Hudek, IonQ

It's worth noting that IonQ's trapped-ion approach is quite different from IBM's superconducting qubits (or D-Wave's quantum annealing, for that matter), which makes it hard to compare raw qubit counts between vendors. The quantum volume metric is meant to make such comparisons easier, however.

"The new system we're deploying today is able to do things no other quantum computer has been able to achieve, and even more importantly, we know how to continue making these systems much more powerful moving forward," said IonQ co-founder and chief scientist Chris Monroe. "With our new IonQ system, we expect to be able to encode multiple qubits to tolerate errors, the Holy Grail for scaling quantum computers in the long haul."

Using new error correction techniques, IonQ believes that it will only need 13 qubits to create a near-perfect logical qubit.

For now, IonQ's new system will be available as a private beta, and it'll be interesting to see if its early users back up the company's claims (unsurprisingly, given their magnitude, there's a bit of skepticism within the quantum computing community). Later, the company will make the system available through partners like Amazon, with its Braket service, and the Microsoft Azure Quantum Cloud.

Image Credits: Kai Hudek, IonQ

Read more:

IonQ claims it has built the most powerful quantum computer yet - TechCrunch

Read the Rest...

University research in quantum computing assists the study of genetic diseases – University of Virginia The Cavalier Daily

§ October 2nd, 2020 § Filed under Quantum Computer Comments Off on University research in quantum computing assists the study of genetic diseases – University of Virginia The Cavalier Daily

With the human genome consisting of over 6.4 billion base pairs, quantum computing may prove to be an efficient way to process genetic data. Stephan Bekiranov, computational biologist and associate professor, has developed an algorithm that utilizes a quantum computer to study genetic diseases. The algorithm was designed to introduce efficiencies by reducing the number of calculations performed in an operation. This breakthrough opens up possibilities for researchers in the medical and genetics fields to crunch data in a faster and more efficient manner, paving the way for more medical breakthroughs.

Genetic diseases arise when there are variations in DNA sequences when compared to normal sequences. According to Wei-Min Chen, associate professor of public health sciences and genetics expert, genetic disorders can be caused by mutations in one or multiple genes or by damage to chromosomes, which can occur due to errors in cell division or exposure to toxic substances such as alcohol or drugs. Additionally, a person may be missing a chromosome or have an extra one.

The millions of structural units that make up DNA are called nucleotides. Ultimately, it is imperative that scientists study and identify the nucleotide differences in DNA in order to develop ways to treat genetic diseases. However, genetic data is vast, so computations are essential for analyzing it.

"Computer algorithms can be more computationally efficient than before," Chen said. "The genetic data are still growing exponentially, and even better computing technology is still needed."

Like Chen, Bekiranov recognizes the need for faster large-scale computing technology, as exponentially growing genetic data have revealed millions of nucleotide differences.

"Just imagine datasets where you have billions of nucleotide variations across billions of people," Bekiranov said. "If you can develop algorithms that are able to do computations over a vast data space, like a quantum computer, in principle you could introduce efficiencies in the computation."

As opposed to classical computing, where information is processed through bits, ones or zeros, quantum computers analyze information through qubits, the basic unit of quantum information.

Let us imagine a scenario where we have to process two bits. There are four possible states: 00, 01, 10 and 11. A classical computer would have to go through each possibility in turn. A quantum computer, however, can consider all four states simultaneously, with each state having an equal probability of occurring.
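A small numerical sketch of that two-bit example (my own illustration, not from the article): applying a Hadamard gate to each of two qubits puts the register into an equal superposition of all four basis states, each with probability 1/4.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    zero = np.array([1.0, 0.0])                    # the |0> state

    # Two qubits, each passed through a Hadamard:
    # state = (H|0>) tensor (H|0>)
    state = np.kron(H @ zero, H @ zero)
    print(state)        # four amplitudes, each 0.5
    print(state ** 2)   # probabilities: 0.25 each for 00, 01, 10, 11

A subsequent operation applied to this state acts on all four components at once, which is the sense in which the quantum computer considers every possibility simultaneously.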

This is the key behind the efficiency in Bekiranov's algorithm. Since genetic information is vast and often on the scale of billions of bits, the time taken to process data by his algorithm is exponentially reduced.

While a conventional computer would have to perform three billion operations in a computation over genetic data, this algorithm would take only 32 operations on a quantum computer (2^32 is roughly 4.3 billion), thus leading to an exponential gain in processing time.

Bekiranov has a doctorate in theoretical physics and has studied quantum mechanics, but 20 years ago, he transitioned into computational biology, which has now been his focus of study. He has been working on this project in quantum computing for one year.

"As part of a collaboration with a colleague, we were working on the kind of variations in the copy number of genomic segments," Bekiranov said. "Turns out, you can have little bits of your genome kind of lost. Or you can actually have even extra copies in your cells."

His colleague Mike McConnell, assistant professor in the Department of Biochemistry and Molecular Genetics, once inquired about the applications of quantum mechanics to their project. After approaching the National Institutes of Health about the applications of quantum computing in biomedical and neuroscience research, Bekiranov's team attended a series of workshops in order to gain funding for their research. With the help of his colleague Kunal Kathuria, postdoctoral research associate and scientist at the Siebner Institute of Brain Development, Bekiranov was then able to develop the algorithm that could work on genomic data.

Even with the field of quantum computing still in its budding stage, Bekiranov's and Kathuria's work in developing this algorithm is an example of how breakthrough technology can be used to advance research in the genetic and medical fields.

Bekiranov sees a broad future for the applications of quantum computing in biomedical research. From chemistry concepts such as density functional theory to biomolecule research in drug design, there are numerous applications that demonstrate the important role of quantum computing in ongoing research today.

As for now, the powers of a quantum computer can be focused on furthering scientific findings.

"We have tons and tons of large data sets in biomedical research now, and the ability to do computations efficiently is where quantum computing fits in with genetics," Bekiranov said.

Continue reading here:

University research in quantum computing assists the study of genetic diseases - University of Virginia The Cavalier Daily

Read the Rest...

NCCS’ James Simmons Recognized as Top Lustre Contributor of the Decade – HPCwire

§ September 30th, 2020 § Filed under Quantum Computer Comments Off on NCCS’ James Simmons Recognized as Top Lustre Contributor of the Decade – HPCwire

Sept. 30, 2020. The National Center for Computational Sciences' (NCCS) James Simmons was recognized by the Open Scalable File Systems, Inc. (OpenSFS) organization at the virtual user meeting on September 9 as the top Lustre contributor of the decade for his prolific contributions to the Lustre file system software, which have included more than 250,000 lines of code.

Generally used for high-performance computing (HPC) systems, Lustre has been employed in many of the Oak Ridge Leadership Computing Facility's (OLCF's) previous file systems at NCCS, including the shared file system used by the Jaguar and Titan supercomputers. It is currently used by the Gaia system owned by the National Oceanic and Atmospheric Administration and the US Air Force Weather system owned by the US Air Force, both housed at the NCCS. The OLCF is a US Department of Energy (DOE) Office of Science User Facility at DOE's Oak Ridge National Laboratory (ORNL).

Simmons' contributions to Lustre began when he came to ORNL in 2008 to work in the NCCS Technology Integration Group (TechInt). Initially, his job was to evaluate the Lustre software stack and determine where improvements could be made. But soon, he began fixing issues himself and submitting those fixes to the Lustre community.

"I can't stand broken things," Simmons said. "Being able to analyze and debug issues on our large-scale file systems is a unique opportunity, so it's nice to be able to contribute that back to the community."

OpenSFS is the nonprofit community organization dedicated to the success of the Lustre file system. OpenSFS was founded in 2010 to advance Lustre development, ensuring it remains vendor neutral, open, and free. ORNL is a founding member of OpenSFS.

Lustre is an open-source parallel file system technology that supports the requirements of leadership-class HPC resources, such as those at NCCS, and Enterprise environments worldwide, such as Amazon Web Services. Lustre scales to tens of thousands of clients and hundreds of petabytes of storage. It has demonstrated over a terabyte per second of sustained I/O bandwidth. Many of the largest and most powerful supercomputers in the world are powered by the Lustre file system, including over 60 percent of the TOP500 sites, such as the world's fastest, the Fugaku system in Japan, and the upcoming Frontier system, an exascale computing system to be housed at NCCS. Contributions to Lustre from the community, including vendors and end users like NCCS, are pushed out to the community tree in new releases.

Lustre was originally developed on the Intel x86 architecture, but Simmons spearheaded the efforts to port Lustre on the IBM POWER and ARM architectures.

"James is making Lustre less vendor- and architecture-dependent so that it will have a wider community base and more longevity," said Sarp Oral, group leader of TechInt. "James' efforts in the Lustre domain have always had a big impact on our production systems and will continue to impact our future systems."

The OLCF is slated to deliver Frontier, an exascale system that will feature Cray's new Shasta architecture and Slingshot interconnect, as soon as next year. Frontier's Orion file system will be Lustre based; therefore, any changes to Lustre will benefit work performed on Frontier. Simmons' contributions have also impacted other national laboratories and supercomputing centers, even ones outside the DOE space.

"We are providing a benefit to the entire Lustre community, which includes other DOE labs and other Lustre users entirely," Oral said. "What James has done is different than other Lustre developers. He is the sole largest contributor to Lustre outside of one of the vendors, and that is something truly noteworthy."

UT-Battelle LLC manages Oak Ridge National Laboratory for DOE's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE's Office of Science is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.

Source: Rachel Harken, Oak Ridge National Laboratory

Original post:

NCCS' James Simmons Recognized as Top Lustre Contributor of the Decade - HPCwire

Read the Rest...

Schrödinger's Web offers a sneak peek at the quantum internet – Science News

§ September 28th, 2020 § Filed under Quantum Computer Comments Off on Schrödinger's Web offers a sneak peek at the quantum internet – Science News

Schrödinger's Web, Jonathan P. Dowling, CRC Press, $40.95

When news broke last year that Google's quantum computer Sycamore had performed a calculation faster than the fastest supercomputers could (SN: 12/16/19), it was the first time many people had ever heard of a quantum computer.

Quantum computers, which harness the strange probabilities of quantum mechanics, may prove revolutionary. They have the potential to achieve an exponential speedup over their classical counterparts, at least when it comes to solving some problems. But for now, these computers are still in their infancy, useful for only a few applications, just as the first digital computers were in the 1940s. So isn't a book about the communications network that will link quantum computers, the quantum internet, more than a little ahead of itself?

Surprisingly, no. As theoretical physicist Jonathan Dowling makes clear in Schrödinger's Web, early versions of the quantum internet are here already (for example, quantum communication has been taking place between Beijing and Shanghai via fiber-optic cables since 2016), and more are coming fast. So now is the perfect time to read up.

Dowling, who helped found the U.S. government's quantum computing program in the 1990s, is the perfect guide. Armed with a seemingly endless supply of outrageous anecdotes, memorable analogies, puns and quips, he makes the thorny theoretical details of the quantum internet both entertaining and accessible.

Readers wanting to dive right in to details of the quantum internet will have to be patient. Photons are the particles that will power the quantum internet, "so we had better be sure we know what the heck they are," Dowling writes. Accordingly, the first third of the book is a historical overview of light, from Newton's 17th-century idea of light as corpuscles to experiments probing the quantum reality of photons, or particles of light, in the late 20th century. There are some small historical inaccuracies (the section on the Danish physicist Hans Christian Ørsted repeats an apocryphal tale about his serendipitous discovery of the link between electricity and magnetism) and the footnotes rely too much on Wikipedia. But Dowling accomplishes what he sets out to do: help readers develop an understanding of the quantum nature of light.

Like Dowling's 2013 book on quantum computers, Schrödinger's Killer App, Schrödinger's Web hammers home the nonintuitive truths at the heart of quantum mechanics. For example, key to the quantum internet is entanglement, that "spooky action at a distance" in which particles are linked across time and space, and measuring the properties of one particle instantly reveals the other's properties. Two photons, for instance, can be entangled so they always have the opposite polarization, or angle of oscillation.

In the future, a user in New York could entangle two photons and then send one along a fiber-optic cable to San Francisco, where it would be received by a quantum computer. Because these photons are entangled, measuring the New York photon's polarization would instantly reveal the San Francisco photon's polarization. This strange reality of entanglement is what the quantum internet exploits for neat features, such as unhackable security; any eavesdropper would mess up the delicate entanglement and be revealed.

While his previous book contains more detailed explanations of quantum mechanics, Dowling still finds amusing new analogies, such as Fuzz Lightyear, a canine that runs along a superposition, or quantum combination, of two paths into neighbors' yards. Fuzz helps explain physicist John Wheeler's delayed-choice experiment, which illustrates the uncertainty, unreality and nonlocality of the quantum world. Fuzz's path is random, the dog doesn't exist on one path until we measure him, and measuring one path seems to instantly affect which yard Fuzz enters, even if he's light-years away.

The complexities of the quantum web are saved for last, and even with Dowling's help, the details are not for the faint of heart. Readers will learn how to prepare Bell tests to check that a system of particles is entangled (SN: 8/28/15), navigate bureaucracy in the Department of Defense and send unhackable quantum communications with the dryly named BB84 and E91 protocols. Dowling also goes over some recent milestones in the development of a quantum internet, such as the 2017 quantum-secured video call between scientists in China and Austria via satellite (SN: 9/29/17).
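To give a flavor of BB84, here is a toy classical simulation of the protocol's logic (my own sketch, ignoring noise and real optics): Alice encodes random bits in randomly chosen bases, Bob measures in random bases, and the two keep only the positions where their bases matched. An eavesdropper who measures in the wrong basis randomizes the photon, and shows up as roughly a 25 percent error rate in the sifted key.

    import random

    def bb84_error_rate(n=2000, eve=False):
        alice_bits  = [random.randint(0, 1) for _ in range(n)]
        alice_bases = [random.randint(0, 1) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
        bob_bases   = [random.randint(0, 1) for _ in range(n)]

        bob_bits = []
        for bit, a_base, b_base in zip(alice_bits, alice_bases, bob_bases):
            if eve:
                e_base = random.randint(0, 1)   # Eve guesses a basis
                if e_base != a_base:
                    bit = random.randint(0, 1)  # wrong basis randomizes her result
                a_base = e_base                 # photon is re-sent in Eve's basis
            if b_base != a_base:
                bit = random.randint(0, 1)      # Bob's wrong basis randomizes too
            bob_bits.append(bit)

        # Sift: keep positions where Alice's and Bob's announced bases agree.
        sifted = [(a, b) for a, b, ab, bb in
                  zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
        return sum(a != b for a, b in sifted) / len(sifted)

    print(bb84_error_rate(eve=False))  # ~0.00: clean channel
    print(bb84_error_rate(eve=True))   # ~0.25: eavesdropping is detectable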

"Just like the classical internet, we really won't figure out what the quantum internet is useful for until it is up and running," Dowling writes, "so people can start playing around with it." Some of his prognostications seem improbable. Will people really have quantum computers on their phones and exchange entangled photons across the quantum internet?

Dowling died unexpectedly in June at age 65, before he could see this future come to fruition. Once when I interviewed him, he invoked Arthur C. Clarke's first law to justify why he thought another esteemed scientist was wrong. The first law is that "if a distinguished, elderly scientist tells you something is possible, he's very likely right," he said. "If he tells you something is impossible, he's very likely wrong."

Dowling died too soon to be considered elderly, but he was distinguished, and Schrödinger's Web lays out a powerful case for the possibility of a quantum internet.

Visit link:

Schrödinger's Web offers a sneak peek at the quantum internet - Science News

Read the Rest...
