5G is the tool that makes the other trends possible in the first place
Electromobility, the Internet of Things (IoT), Big Data, automation, artificial intelligence, autonomous driving and 5G. Which one is your favorite trend at present?
Generally speaking, many of these trends influence one another – and there are also many interfaces between them. As a commuter, for example, I'm very interested in autonomous driving. While driving, I spend a lot of unproductive time in the car – time that I could use more sensibly if the car could drive itself.
In your opinion, which megatrend has the greatest potential to change the world in the long term?
You need to take a holistic view of mobility, and of many of the other megatrends as well. One key point here is the IT infrastructure that's required, especially as regards data transmission. If I want to control and regulate my processes in real time, I'll eventually need faster transmission speeds than 5G can provide at present. Implementing the other megatrends will only become possible once data transmission takes place in the terahertz range, at some point in the future. That's why 5G is a crucial factor for data transmission.
Do you think that 5G is the trend that will have the biggest impact on all the other megatrends?
Yes – because we will need 5G to implement AI methods, Big Data and IoT. Of course, stationary analysis is possible in all of these cases, but mobile applications are highly interesting – autonomous driving, for instance, or drones. Sooner or later, we have to reach the point where these devices can communicate with one another in real time. That's why 5G is the tool that makes the other trends possible in the first place. However, Big Data is another extremely interesting topic: one way or another, we have to collect the data so that we can apply AI methods, for example. But we can't implement any of this in real time unless data can be transmitted at adequate speeds.
What do you see as the main risks arising from this trend?
The greatest challenge in this area is the issue of data security – by which I mean data misuse and the malicious manipulation of processes. To digitalize and network everything in real time – so that everyone is communicating with everyone else – we need open networks, interfaces and protocols. At the end of the day, of course, this in turn opens up the possibility of manipulative interventions.
Is that also an issue that you regard as particularly important at the university?
Naturally, the university has to participate in work on these issues, and we must generate knowledge in this field. Paderborn University is in an excellent position to do this: we focus on several subjects in this field – the main ones are optoelectronics, photonics and intelligent technical systems. We have access to wide-ranging know-how so that we can come up with answers to the fundamental questions. It's very clear that the university has to position itself in this field, because – of course – data security is also a socially driven issue.
Is this also required by your industrial clients?
It goes without saying that this is an important issue for industry, and it's being pushed accordingly. If you look at the automotive industry as a whole, and at developments in this sector, you can see that there's a lot of movement in the market. Autonomous driving and electrified drives are core trends here, and of course they are reflected one-to-one in our work. As an institute of higher education, we usually provide value-neutral responses to more fundamental questions, and we generate the relevant knowledge. But naturally, the requirements and the input originate from industry too, so economic aspects also have to be considered.
You're dealing with sensor technology and signal conditioning every day. What do you see as the major challenges for measurement technology in your everyday working routine?
We're in a university here, not an industrial business. This means that I'm working with students and academic staff a lot of the time. They are temporary employees on limited-term contracts, and they leave the university again after they graduate. The result is that the specialists on our staff are constantly changing, and we have to instruct and train them over and over again. That's why the training effort for the measurement technology we use has to be as low as possible. For the same reasons, the equipment should be robust and users should be able to operate it intuitively. Another factor is the wide variety of measurement equipment we use. Many specialist departments operate in our facilities, and their research focuses on different subjects. In our professorial department, for example, we cover virtually everything from quasistatic investigations through to crash tests – we use everything here from simple strain gage force transducers to super-fast recorders. That means we have to keep track of a vast range of equipment.
How do you set about managing such a huge volume of measurement data?
That's certainly another important point. In the future, I'd like to see a fully networked 'Test Facility 4.0' for our laboratory operation. In other words, our measurement data would be transmitted directly from the machine into the cloud. That way, we would have real-time access to all the measurement data at all times, so we could take a look at what's going on in the tests. And that brings us to the subject of Big Data: to take just one example, we carry out 100 to 200 tensile tests every week. It's difficult to keep track of the huge amount of data that they generate. Innovative algorithms offer the only way for us to identify new dependencies and relationships. This is why it's very important to manage the data properly so that you're also able to work with it.
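To make the 'Test Facility 4.0' idea more concrete, here is a minimal sketch of what such a pipeline could look like: each tensile test result is pushed to a cloud endpoint as it is produced, and a simple analysis step searches the accumulated data for dependencies. The endpoint URL, field names and the correlation check are illustrative assumptions, not the institute's actual setup.

```python
# Sketch of a 'Test Facility 4.0' data flow: the test rig uploads each
# completed tensile test to a (hypothetical) cloud endpoint, and a simple
# analysis looks for relationships across the week's 100-200 tests.

import json
import urllib.request

import numpy as np

CLOUD_ENDPOINT = "https://example-test-facility.cloud/api/tensile-tests"  # hypothetical URL


def upload_test_result(specimen_id: str, max_force_n: float, elongation_mm: float) -> None:
    """Send one tensile test result to the cloud store as JSON."""
    payload = json.dumps({
        "specimen_id": specimen_id,
        "max_force_n": max_force_n,
        "elongation_mm": elongation_mm,
    }).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        response.read()  # urlopen raises on network or HTTP errors


def find_dependencies(results: list[dict]) -> float:
    """Toy analysis step: correlation between maximum force and elongation
    across the accumulated test results."""
    forces = np.array([r["max_force_n"] for r in results])
    elongations = np.array([r["elongation_mm"] for r in results])
    return float(np.corrcoef(forces, elongations)[0, 1])
```

In practice, such a pipeline would of course need authentication, buffering for network outages and far richer metadata; the sketch only shows the basic machine-to-cloud flow the interview describes.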
Which other challenges will measurement technology face in the future?
The main issue that comes to mind is the whole area of regulation and control in real time. It's very challenging to integrate sensors into highly complex measuring systems, and to forward the measurement data to a higher-level system at the highest possible data transmission speeds. One possible solution would be to combine sensors and actuators – in other words, to integrate small decentralized computing units. That would enable us to monitor our components with intelligent control algorithms, and we could also control and regulate them at the same time.
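As an illustration of the sensor/actuator combination described here, the following sketch shows a small decentralized computing unit that closes its own control loop locally and collects data to forward upstream in batches. The first-order plant model, controller gains and sample time are illustrative assumptions, not parameters of any real test rig.

```python
# Sketch of a decentralized sensor/actuator node: a simple PI controller
# runs locally on the computing unit, and the measured values are kept in a
# buffer that could be forwarded to the higher-level system in batches.


def simulated_plant(force_n: float, command: float, dt_s: float) -> float:
    """Stand-in for the real component: force responds slowly to the actuator."""
    return force_n + (command - force_n) * dt_s / 0.05


def run_decentralized_loop(setpoint_n: float = 100.0, steps: int = 2000,
                           kp: float = 2.0, ki: float = 5.0,
                           dt_s: float = 0.001) -> list[float]:
    """PI control loop as it might run on an intelligent sensor node."""
    force_n = 0.0
    integral = 0.0
    history = []
    for _ in range(steps):
        error = setpoint_n - force_n          # local sensor reading vs. target
        integral += error * dt_s
        command = kp * error + ki * integral  # local control decision
        force_n = simulated_plant(force_n, command, dt_s)
        history.append(force_n)               # data to forward upstream in batches
    return history


if __name__ == "__main__":
    trace = run_decentralized_loop()
    print(f"final force: {trace[-1]:.1f} N of 100.0 N target")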
You also work with some products from Kistler in your laboratory. Which products do you use?
For more than ten years now, we've been working with force sensors, force transducers and charge amplifiers from Kistler. We put our trust in products from your company, especially for crash testing. We're very satisfied with them – and that applies to the service as well. That's why we're always keen to buy more products from you.
If you could just make one wish, which measurement technology product would you want? What would make your work easier?
Something that would make our work far easier would be an integral data management system that could upload all the data into the cloud and manage it in real time, during the measurement operation. It should then be possible to perform analyses and evaluations there. Sad to say, we haven't been able to implement that here as yet. For the time being, our scientific staff have to store their data decentrally and then enter it manually in a database. That situation can certainly be optimized. As you know, Kistler has added a new controller to its range that calculates the measurement uncertainty immediately, among other features. We would be very interested in that. A measuring instrument that doesn't need to be extended with large numbers of extra cards would also be ideal – an instrument where everything is integrated. A completely wireless solution would be best, so there would be no need to install any measuring cables.