Last month, Utah Business partnered with Foley & Lardner to host a roundtable featuring Utah’s quantum and cloud computing experts. This conversation was moderated by Dr. Taylor Sparks, director of ReUSE REU and professor of materials science and engineering at the University of Utah.
What are the latest developments in cloud computing?
Adam Frisbee | Adjunct Professor, OIS Operations & Info Systems | University of Utah
I’ve seen the rise of specialty clouds, especially platforms. AWS [Amazon Web Services], Azure and GCP [Google Cloud Platform] are traditionally infrastructure providers, but we’ve seen the rise of platform providers like Salesforce or Workday. These are cloud platforms that serve a special purpose. And that reduces the risks and costs associated with building and maintaining a lot of the infrastructure. … The edge is an exciting frontier for the cloud. I’ve read that it’s called “the fog”: not clouds in the sky anymore, but clouds among us. The fog is where we are starting to do more computing.
Troy Rydman | Chief Information Security Officer | Amazon Web Services
At AWS, we are looking at how to help our customers get out of the data center business. Our customers don’t want to be in the data center business, maintaining infrastructure. Two things we’re doing: 1) a services architecture, where you move to services and adopt them without having to spin up what’s referred to as a bare-metal system. You simply execute the service you want and utilize it. You don’t have to maintain anything with it. … And 2) very custom processor sets. We are now creating our own systems, processors and CPUs. For specific use cases, these reduce energy costs by 80 to 90 percent and increase output.
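For readers curious what “simply execute the service you want” looks like in practice, here is a minimal sketch using AWS’s boto3 SDK for Python; the function name and payload are hypothetical, and configured AWS credentials are assumed:

```python
# Minimal sketch: calling a managed (serverless) function instead of
# running your own server. Assumes AWS credentials are configured and
# that a Lambda function named "process-order" exists (hypothetical).
import json
import boto3

lambda_client = boto3.client("lambda")

response = lambda_client.invoke(
    FunctionName="process-order",          # hypothetical function name
    Payload=json.dumps({"order_id": 42}),  # hypothetical payload
)
print(json.load(response["Payload"]))      # the service's result
```

There is no server to patch, rack or scale here; the provider runs and maintains everything behind the single call.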
Tony Kanell | Senior Engineering Manager | NVIDIA
There’s a massive retargeting of workforce skill sets where, with serverless technologies, you have to think about things in a highly parallel state that you’re not used to designing for. Engineers who are used to building things in a certain way suddenly have to relearn what “good” looks like in the system and how to architect things accordingly. … We’re using data to drive these decisions as opposed to feelings. We’re making clear trade-offs. … If we sacrifice some deployability or development time, we see clear benefits in performance or scalability.
Jeremy Fillingim | Co-Founder & CTO | PassiveLogic
As the computing power available at the edge becomes more and more substantial, … it becomes really important, I think, that all the computing becomes more the same, which puts a lot of stress and emphasis on networking technology. It’s really interesting to watch the pendulum swing back and forth between cloud and edge. … We have to be able to take advantage of all of the technologies in both places.
Dr. Barclay Burns | Assistant Dean, Applied AI | Utah Valley University, Smith College of Engineering & Technology
I’ll give you a use case. I’m an advisor for Intermountain Health on a project around neurodiverse kids. … We’re building a network of parents who can support each other … [using] the best knowledge we can get from research papers and run a model on it. That has to be private, specialized and secure. We can’t ever have anything reach into a healthcare system that’s not utterly secure. We have to know exactly what words have been in there to train it. We’re able to train these models, and we’re figuring out ways in which to interpret [the research] so parents can actually engage with scientific literature and understand what’s happening with their child. They’re not becoming the doctor or the therapist, but when they meet with the doctor or the therapist, they’re more effective and more efficient. What’s going to be really essential to this is the edge. … We’ll have a version of the AI modeling on the phone that the parent can talk to.
Peter Bookman | Founder & CEO | GUARDDOG AI
The argument has been made not to fear the current version of AI because, in a nutshell, it can’t be freed. There is no free thinking. You have to compile it. Therefore, it’ll always do what it’s instructed — every single time. There is no “it breaks free” because the hardware won’t let it, and neither will the software. It can’t. But quantum can. I am thinking, subconsciously, 11 million thoughts a second. I’m aware of 40 of them; those are all in the frontal lobe. Everything we’ve done so far is frontal lobe stuff, but how do we think subconsciously? When we put that all together, we get to this: I do my best thinking when I’m not thinking.
What is quantum computing? What can the general public use quantum computing for?
Dr. Massood Tabib-Azar | USTAR Professor, ECE | University of Utah, Department of Electrical & Computer Engineering
Classical computing is based on bits: zero and one. Zero is usually represented by some voltage level, let’s say zero volts. Then one is represented by some other voltage, maybe three or four volts. What happens if you try to superimpose them on each other? Usually, zero wins; it shorts out the one. You cannot have a superposition of many bits. … That’s the power you get in quantum computing: Qubits [quantum bits] enable you to create superpositions of zeros and ones. You can add a whole bunch of quantum zeros and quantum ones. Once you have that superposition, you can process them in parallel. It gives you an exponential speed-up.
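In standard notation (a brief aside to make the speed-up claim concrete): a single qubit can hold a weighted combination of both classical values, and a register of n qubits holds a superposition over all 2^n bit strings at once.

```latex
% One qubit: a superposition of |0> and |1>
|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1
% n qubits: a superposition over all 2^n classical bit strings
|\Psi\rangle = \sum_{x \in \{0,1\}^n} c_x |x\rangle
```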
The second thing is entanglement. Entanglement enables you to take two quantum bits and create a correlation between them. The qubits themselves don’t know what state they’re in, but once you perform a measurement on one of them, the other one’s state is fixed.
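The textbook example of the correlation described here is the Bell state: neither qubit alone has a definite value, yet measuring either one immediately fixes the other.

```latex
% A Bell state: measuring one qubit fixes the other
|\Phi^+\rangle = \frac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right)
```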
Kamyar Maserrat | Senior Counsel | Foley & Lardner
Most of us would not need a quantum computer; only a really heavy computational load calls for one. IBM allows 15 minutes free on their quantum computer, and I played around with it. … If I had a special kind of machine learning model, it would probably train or execute faster, but there’s nothing really for the general population.
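As a sense of what “playing around” with a cloud quantum computer involves, here is a minimal sketch using Qiskit, IBM’s open-source quantum SDK; this version simulates locally, and running on real IBM hardware (an account is assumed, not shown) is a one-line backend swap:

```python
# Minimal sketch: a 2-qubit Bell-state circuit, simulated locally with Qiskit.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)       # put qubit 0 into an equal superposition of |0> and |1>
qc.cx(0, 1)   # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # roughly {'00': 0.5, '11': 0.5}
```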
What will it take for quantum computing to be everywhere? Will it ever be consumer-grade?
Dr. Vikram Deshpande | Associate Professor, Physics & Astronomy | University of Utah, Department of Physics & Astronomy
That moment has probably already arrived. Many may be familiar with the term quantum advantage or quantum supremacy. It basically means that if a quantum computer can do something better than the strongest, biggest classical computer in the world, you say you have quantum supremacy. … In 2019, Google had 53 quantum bits and said they had achieved quantum supremacy in a particular problem. … Then, a few months later, IBM showed that the strongest supercomputer in the world could actually best that. Now, a few years later, both Google and IBM are at about 500 quantum bits. At that point, it is beyond doubt that there is quantum advantage, quantum supremacy. It’s there. Going back to the question, what is the application here? I don’t think the application is people carrying around quantum computers. But for certain niche applications, there are already quantum computers available to people on the cloud.
Peter Bookman | Founder & CEO | GUARDDOG AI
Encryption is probably one of the most documented, well-understood [examples]. Everybody seems to understand that quantum computing will break current encryption. There is a known scientific, proven answer to that, except for one thing: Today’s quantum computing, even with that many qubits, is also known to be rather slow. … By the time we need to solve the decryption problem, the encryption problem will also be solved because the horsepower will be there by then.
Dr. Sujatha Sampath | Senior Lead Scientist | Booz Allen Hamilton
*Any views on the topics I share are my own and do not represent my current or past employers. I don’t think, and a lot of the community agrees, that we are going to have quantum desktops in our homes and offices anytime soon. That’s not what it’s going to be. The current state of quantum computing is an expensive process. Only where there is a lot of funding, from governments or huge industry players, are there even proof-of-concept devices right now. There’s a lot of development going on in academia and industry, but ubiquitous the way … a phone is? That’s not going to happen soon.
Troy Rydman | Chief Information Security Officer | Amazon Web Services
From a cloud perspective, there are more people interested because the technology is at their fingertips. Previously, if you wanted to test something, … you’d have to own a server. You’d have to rack it in the data center. You’d have to own this technology. You’d have to pay for the infrastructure. Then, you’d have to understand how to set it up and maintain it just to experiment with it. Now it’s all virtual and for pennies on the dollar.
How have you seen cloud computing change your field?
Adam Frisbee | Adjunct Professor, OIS Operations & Info Systems | University of Utah
In my classes at the University of Utah, we can do labs that are very low-cost or sometimes no-cost for enterprise-grade architectures. Ten years ago, students would never have had that. They might have had access if the university provided it, maybe at the high-performance computing center. But with the cloud, … I have students building enterprise-ready technologies. … I think it’s a really powerful thing to have the cloud. I like to say it democratizes technology.
Whit Johnson | Partner | Foley & Lardner
It’s interesting that Troy is saying the cloud has made technology ubiquitous, and we’re saying quantum computing isn’t going to be ubiquitous for a long time. If it’s on the cloud, accessible and there to understand, then I think with a little progress on decoherence, solving the energy issue and resolving some of the material challenges of the hardware, the ChatGPT moment for quantum computing could come at any moment.
Dr. Massood Tabib-Azar | USTAR Professor, ECE | University of Utah, Department of Electrical & Computer Engineering
If you look at the way sensors are evolving, parallel to quantum computing are quantum sensors that enable you to sense things that are very, very subtle: small changes in my biology or blood pressure. Those are coming. These quantum sensors are going to be used in biology, and militaries are interested in them, too. To access and analyze this data, you need very powerful computers. So, quantum sensors and quantum computers will go hand-in-hand in solving really difficult problems in biology, human health and global health.
Dr. Barclay Burns | Assistant Dean, Applied AI | Utah Valley University, Smith College of Engineering & Technology
If we start to make some statements, you have a governor, you have the Governor’s Office of Economic Opportunity, you have the World Trade Center, you have legislators who will sit in a room with this group and wrestle through these issues and start to make policy and investment decisions. They just aren’t having these opportunities. This is the kind of group that can convene this kind of conversation. … I’ve just seen how it played out with the AI policy with lawmakers and executive directors in government, higher education and industry. You need all the people at the same table thinking about this and engaging in the conversation, and people actually move.
Dr. Sujatha Sampath | Senior Lead Scientist | Booz Allen Hamilton
One of the advantages of quantum power is the way the operations are performed. A classical computer works entirely in Boolean space. Quantum computing has the advantage of mathematically mapping or scoping parallel scenarios using vectors and matrices. That is very akin to how machine learning and deep learning work. Utah has a sizable presence of deep learning and machine learning companies. That could be one selling point to influence policy.
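The analogy is quite literal (an aside to spell it out): a quantum gate is a unitary matrix applied to a state vector, structurally the same matrix-on-vector operation that dominates deep learning workloads.

```latex
% Quantum: a gate U (a unitary matrix) acts on the state vector
|\psi'\rangle = U\,|\psi\rangle
% Deep learning: a layer applies a weight matrix W to an activation vector
y = \sigma(Wx + b)
```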
Dr. Aurora Clark | Professor, Chemistry | University of Utah, Department of Chemistry
I was at an RPE workshop on quantum computing earlier this spring with 15-20 hardware companies and 10 different software companies. … It was evident there’s been a huge amount of investment already on both the hardware and the software side. Everyone is trying to find the right application. So much of it is partnerships and bringing people together so that your particular platform, and the way you’ve constructed your hardware with this particular software, will work well with this application. … It’s very crowded because there’s been so much investment at the government level; there are tons of startups. At some point, the money is going to go away, which means everyone’s going to be clawing for it. I think the partnership aspect could be a very fruitful way to approach it.
What are the implications of quantum or cloud computing being used in physical science simulations?
Jeremy Fillingim | Co-Founder & CTO | PassiveLogic
The very high-level view is that we build a model of the building and all the equipment. … You don’t get to train one model and then run it [multiple times]. The composability of our models became very important. We want to be able to reuse components. … The computing power available can become a limiting factor. We’ve taken a different approach than traditional AI. In the building space, we’re simulating physical processes. We’re basically running thermodynamics equations. The more things like that can be accelerated, the better it is for us.
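As a generic illustration of the kind of physics being solved here (the specific equations PassiveLogic runs are not detailed in the conversation), heat flow through a building element follows the classic diffusion equation, which a simulator steps forward in time:

```latex
% Heat diffusion: temperature T evolves over time t with thermal diffusivity alpha
\frac{\partial T}{\partial t} = \alpha \nabla^2 T
```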
Tony Kanell | Senior Engineering Manager | NVIDIA
Along those same lines, cloud technologies have allowed us to take all of these systems that have existed separately for so many years … and simulate them together for the first time. We had a manufacturing customer who did all of these different simulations in 10 different pieces of software, and then they built the factory. Unfortunately, the software that programmed the robot arm didn’t talk to the architecture software. The first time they turned the arm on, it went up and slammed into the gantry. That’s incredibly costly to have to go and redesign after it’s been built. When you bring all these together in the cloud and you build a digital twin that knows about everything in there, it’s physically based and you can simulate it, you can save that cost upfront.
Dr. Aurora Clark | Professor, Chemistry | University of Utah, Department of Chemistry
There are simulations I would love to do that are just not practical. If I wanted to simulate, for example, all of the chemical reactions occurring in the valley right now that are leading to the air quality, I cannot do that on a classical machine. But you could, in principle, on a quantum computer very effectively. A lot of fundamental work needs to be done toward that because that kind of information at a chemical level needs to be accounted for.
What materials or hardware limitations are currently holding quantum computing back?
Dr. Vikram Deshpande | Associate Professor, Physics & Astronomy | University of Utah, Department of Physics & Astronomy
The biggest issue holding up the scaling of qubits in quantum computers is decoherence; even temperature is an issue. … Google’s and IBM’s qubits need to be cold, but even then, they are only coherent, meaning only quantum, for a certain period of time, on the order of tens of microseconds. The second issue is the error rate. With a few hundred qubits, if you have a bunch of them giving wrong answers, that can’t work. Essentially, there is this whole idea of error correction. … You have a whole infrastructure that is trying to correct the errors resulting from a given qubit. All of this is preventing scale-up.
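For a flavor of what that correction infrastructure does (the simplest textbook scheme, not necessarily what Google or IBM deploy), the three-qubit bit-flip code spreads one logical qubit’s amplitudes across three physical qubits so that a single flipped qubit can be detected by majority and undone:

```latex
% Three-qubit bit-flip code: one logical qubit encoded in three physical qubits
\alpha|0\rangle + \beta|1\rangle \;\longmapsto\; \alpha|000\rangle + \beta|111\rangle
```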