Siemens CEO Roland Busch gave a keynote speech at CES 2024 all about using digital twins in the industrial metaverse as well as how generative AI is changing the enterprise.
Busch said he is all in on digital twins, which are virtual versions of physical entities such as factories and industrial buildings. These precursors of an industrial metaverse can make companies more productive and shave years off the development of complicated products such as new aircraft, Busch said.
With a digital twin, companies can perfect their factory designs in the digital world before they have to invest in physical infrastructure. Then they can build the real factories, outfit them with sensors, and feed the real-world data back to the digital twins. And that feedback cycle makes everything better.
And while there is a chance for some synergy between the industrial metaverse and a game-oriented entertainment metaverse, Busch sees a huge difference between the two. Digital twins require the precision of true simulation, while games and consumer metaverses can get by with good animation.
And since AI is going to transform the industrial metaverse, Siemens has teamed up with Microsoft to drive cross-industry adoption of artificial intelligence. Together they created the Siemens Industrial Copilot, a generative AI-powered assistant that aims to amplify human-machine collaboration and productivity across various sectors. Siemens is also working with Sony, which showed off a virtual reality headset aimed at designing industrial metaverse creations.
I caught up with Busch for a one-on-one interview. Here’s an edited transcript of our interview.
VentureBeat: It’s interesting to me that big enterprise companies are taking the idea of the metaverse seriously. Was there a point for you where this went from science fiction to something real?
Roland Busch: There wasn’t really a triggering point. We were working on digital twins for many years. The difference between the digital twin we’re talking about and the gaming industry is that we simulate. The games animate. Simulation is physics-based. It’s best explained–if you animate a robot, you can move it as fast as you want. It moves smoothly. Everything is fine. When you do that in a simulated digital twin and it moves too fast, it starts swaying. The physics kicks in. That makes a big difference. When we talk to our customers like BMW, they say, “I don’t need an animated robot. I need a simulated robot. I have to see how it works in the real world.” That’s a big difference.
The metaverse for the game industry–obviously the idea of higher immersion, a photorealistic experience, to blend that into what we do and make it more collaborative, more immersive, where you’re really living in this metaverse, that makes a difference. We thought, “This is a good idea. What’s the next step?” The next step in the industrial metaverse is you have a digital twin, but one that’s updated in real time at every point. The data flows not only upward, but downward, back and forth. You have a real time digital twin following what happens in the real world and bringing that to the digital world. That’s a huge opportunity to go back in time, go forward in time, and think about how you can optimize your system in a better way, finding problems you had in the past.
It needs a lot of building blocks. It needs the digital twin, physics-based, and real time capabilities. It has to be comprehensive. That means it’s not only a cell in a line, but the whole building. You need that to leverage the full power of it. That takes time. But the elements are already there. You can use it, start at a certain point, and build it up from there.
VentureBeat: How far do you have to go in making the simulation real before it becomes useful? A crude architecture in a digital world can still teach you some things, but–
Busch: What we’ve figured out–talking about “real” meaning photorealistic, including this raytracing that you already see, something almost indistinguishable between the physical world and the digital world. We’ve figured out that you don’t need that all the time. This requires a massive amount of data and processing power. It would be like shooting a very big gun at very small birds. Sometimes you need it. If you look for a full layout, you want to do that. But do you have to have that real time at any point in time? Maybe not.
That means you can choose the level of photorealistic representation and bring it to a point where you say, “This is what I need.” Having an engineer who is working with this on the shop floor–they need some kind of 3D representation, but it doesn’t have to be photorealistic. You can tune down the amount of data and processing power, or tune it up as you need it. This dynamic makes sense. What’s coming more and more–I would trade a bit of the photorealistic representation against real time functionality, which is more important. It helps you stay tuned to what happens on a shop floor.
VentureBeat: Is there an expectation or hope that you’ll get this sort of perfection loop between them? You put enough sensors in the factory and feed enough data back to the digital twin, then you can start changing the digital twin. Once that’s good, you can build out the factory.
Busch: If you look at our visuals, when you talk about combining the real and digital world, it’s a loop. It’s a figure eight lying on its side, an infinity symbol. It’s a continuous cycle. You build something in the digital world, then build it in the real world. You get feedback from the real world and the digital world. Each time you run through this cycle, you improve. In the field, if something’s not working, you can take the feedback to your design, make a change, bring it to the manufacturing line–it’s very easy, because with the digital twin you know exactly what to do. You’re optimizing much faster. We believe this is the power of building digital twins and keeping them alive. Optimizing it again and again, feeding the data back from the real world and optimizing. It’s very powerful.
VentureBeat: The metaverse for consumers and games–there was a lot of excitement about all this, but it also had a lot of backlash around that hype. Now a lot of metaverse projects are either not getting more funding, or they’re just saying, “We’re not calling it the metaverse anymore.” That doesn’t seem to have happened on the enterprise side. The enterprise side seems to have accepted that this is where the future is going.
Busch: This is a reason why we distinguish between the metaverse and the industrial metaverse. The industrial metaverse allows you to solve real world problems. There’s an intrinsic motivation behind building it and investing into it, making it better and better. You can get substantial value creation. This cuts across everything. It’s not only manufacturing. It’s also any kind of infrastructure. It’s trains, for example. Energy grids. Buildings.
The reason you have to distinguish between these two worlds is that–maybe it’s another feature you might want in a gaming world. For the real world, the industrial world, the question is whether or not you will stay competitive in the future. I believe companies that are not going all in with this level of automation and digitalization with their products, their manufacturing, and running assets, they will fall behind. Everybody who does has the advantage of the speed of development, the speed of software development going into hardware. We talk about optimizing your hardware over and over again, because you run it in this virtuous cycle. That’s a completely different driver.
The industrial metaverse is alive and happening as we speak. The elements will come together. Technology will develop. You’ll get more processing power on the shop floor. It will hit all of the industries we’re serving, including health care by the way.
VentureBeat: It seems that the industrial metaverse could lend itself to different styles of manufacturing, including one where the shop floors are constantly changing.
Busch: That’s what we see more and more. The development cycles are shorter and shorter. Demand cycles are shorter. You have to bring in new models. Looking at the car industry, for example, the development of a new engine normally takes five years. The batteries are optimized for size, how they can produce it, maybe a coating material changes–these changes are all happening so quickly that you have to adapt your cycle time to be much faster. That impacts the whole car, especially when you’re increasing the number of variations you want to produce.
It’s not only the case for car manufacturers. It’s also in food and beverage, for example. You see higher output, lower batch sizes, recipes changing more often. That’s a trend. It ties to digitalization, because if you digitize your layout, you can simulate what you want to change, shut down your line for maybe a day or two, and you know what to do. You’re not trying it out over and over again and shutting down the line for two weeks. That’s a big difference.
In the semiconductor industry it’s a bit different. They’re fighting for the last 0.1% of yield. You want to have a stable process. But even there, they have to change. As the structure sizes get smaller, the amount of money you have to pay for a lithography machine goes up. At the same time, if you don’t run that fully loaded, 24/7, without any interruption, you have a problem. That creates another requirement in the digital thread. You have to know exactly what’s happening in every detail, in all of these machines, so you can predict failures before they strike.
VentureBeat: Wal-Mart said that they’ve started building digital stores and virtual malls, but nobody wanted to go there. They would take a popular, say, home improvement game, build a store in it, and sell Wal-Mart items related to the home improvement theme. You could buy items in the game without having to leave it to check out. The purchasing stayed at the point of engagement. I wonder if there’s a point of engagement for the engineers that are the primary users of this industrial metaverse. Where do they need the metaverse to be for them?
Busch: As we move on, more and more engineers grew up playing games. They want to have the same kind of experience in their work. Similarly, when you have an experience at home with the newest computer, you don’t want to be downgraded when you go to work. This is one element that we see as a difference in what people expect from their working environment.
This brings me to AI, generative AI. I believe that technology can help you optimize your work and concentrate on what you’re really good at, rather than doing stuff–to give you one idea, programming a new application is exciting for programmers. Writing the documentation for it, nobody wants to do that. If you have a generative AI copilot that helps you program the easy stuff, so you can focus on the complicated stuff, and you don’t have to worry about the documentation because it’s written for you, I believe that keeps people tuned. They can work on what they like to do, which is solving problems and writing cool applications. Not the monotonous tasks that AI can do.
The other point is, if you think about the headset we were talking about with Sony, which is going to be launched–I had a chance to use it, one of the first models. This is amazing. You’re there. You have your gadgets in your hands. You take a camera and turn it and look at it. It’s really as if you have it in your hands. How you deal with your digital objects, this can make a big difference. I loved it.
VentureBeat: Do you sense that that’s the way products should be designed now? Does designing in the virtual world give you an advantage?
Busch: Yes, because it’s much easier. As you design something in the physical world, if you change the form factor even a little bit, it’s very nasty. In the world we’re talking about, creators can create with these new kinds of gadgets and tools. You drag and drop. You change it in real time. If you like it, it’s programmed back into your model. You don’t have to worry too much about little things. It makes the work completely different from what you normally face. What you see is what you get.
VentureBeat: How rapid a rollout do you see for your digital twins? Is this going to be in hundreds of factories?
Busch: We see it already. We have some leading customers, leading industries. These are the industries that have the highest productivity pressure. Automotive has a multidimensional transformation ahead. They really have high pressure. Their demand is extremely high. We see the same thing in the logistics companies, the fulfillment centers. They have very high pressure. Food and beverage is another one. The pressure of producing smaller lot sizes, changing recipes faster, making special offerings, that’s extremely high. Their need to adopt new technologies is very high, and they do it quite often.
The only question is speed. How quickly companies adopt this technology correlates a bit with their capabilities and with how easy a solution is to deploy. The way to implement new solutions, making it easy to deploy and easy to use, that’s another element. We’re working on lowering the entry barrier. We want to say, “This is plug and play. Don’t worry about it. Buy it, install it, run it.” We’re working on that. It will help us scale up faster.
VentureBeat: How do you feel about the openness of the industrial metaverse? Is that where you want it to be, or are there parts that you would like to see become more open? Omniverse is very interesting as a comprehensive tool, but at some point it becomes Nvidia’s closed system.
Busch: The answer is clear. Openness makes a big difference. The openness and the ecosystem behind it, which can benefit from this openness, to me this is key. It’s a bit of a different way of looking at it. The traditional industrial market, the PLC market, it’s very closed. It’s something where you say that openness is not really what you want. But this is changing as we speak, for good reason.
We believe in a very open system where you can easily integrate new technology from others. This is the reason why we’re working with our ecosystem partners intensively – BMW, Nvidia, but also smaller and medium-sized companies like DMG MORI, one of the largest machine tool manufacturers. We’re working with them on how to auto-program machine tools. That’s a big thing, because the productivity, the availability of resources–you can take that away and make it automated. Openness is key. We’re driving that.
The markets won’t allow a monopoly on GPUs, at the end of the day. It’s not going to work. Neither will they allow a monopoly on who owns the data. There’s a reason why we’re working on different solutions with different partners. We want to keep this openness.