The demands placed on industrial measurement technology today are higher than ever: production processes must become more efficient, more fail-safe and more flexible - all under constant pressure to cut costs and innovate. At the same time, intelligent condition monitoring is expected to prevent unplanned downtime and high maintenance costs. The need for precise, comprehensive data acquisition is also growing in research and development - test stands today often have to monitor dozens or even hundreds of parameters in real time.
What sounds simple on paper - recording, storing and analyzing data - proves to be a complex challenge in practice. The reality often looks like this: measurement data is generated decentrally, at different points in a plant or even at locations distributed worldwide. It comes from a wide variety of sources and has to be painstakingly collated, synchronized and evaluated. And the system integration and flexibility needed for location- and platform-independent access are often missing.
A key problem is that many measurement systems have grown historically: they consist of isolated solutions without a consistent architecture, which makes horizontal communication at field level just as difficult as vertical integration into enterprise systems or cloud services. This not only prevents uniform data evaluation - it also slows down the digitalization of production as a whole.
But what can a modern measurement technology solution look like that overcomes all these hurdles? Which technologies enable end-to-end data acquisition and processing - from the sensor to the cloud?
The following article explores these questions - not from the classic bottom-up perspective, but starting at the user interface. After all, it is where humans interact with the machine that ultimately determines how efficient, transparent and future-proof a measurement system really is.
The HMI - from static Windows solutions to platform-independent measurement software
In many companies, a familiar picture can still be found today: a control-station PC running an aging SCADA system on the production line or test bench. These systems have been adapted and extended over the years, but are now often difficult to maintain, inflexible and architecturally outdated. Location-independent access to process data is generally not possible with such solutions - the operator is tied to a fixed workstation.
However, requirements have changed. Users today expect more flexibility: not only do they want to design individual dashboards for their use cases, they also want to be able to access measurement data from different devices and locations - for example from a laptop in the office, a tablet on the shop floor or even a smartphone on the move.
An example of how these requirements can be implemented is the ProfiSignal 20 software platform from Delphin Technology AG. It follows a modular, platform-independent approach: the ProfiSignal 20 Go module offers extensive analysis functions for a variety of diagram types (e.g. y(t), y(x), multi-track), while ProfiSignal 20 Basic enables the creation of individually designed visualization and user interfaces - comparable to modern SCADA systems, but far more flexible in use.
One remarkable feature is its platform independence: the software runs on classic Windows systems as well as on mobile devices with iOS or Android. This enables seamless data access - from the control station to the smartphone - making process monitoring not only more convenient, but also significantly more efficient.
An interesting detail is the SCACH function ("Scan And Check"), which provides particularly easy access to measurement projects: scanning a QR code attached directly to a machine opens the associated project on the mobile device - ideal for inspection rounds, maintenance or quick spot checks.
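The article does not document what the SCACH QR code actually encodes. Purely for illustration, the sketch below assumes a simple URL-style payload (the `profisignal://` scheme, project and channel names are invented) and shows how such a payload could be parsed after scanning:

```python
from urllib.parse import urlparse, parse_qs

def parse_scach_payload(payload: str) -> dict:
    """Parse a hypothetical SCACH-style QR payload into its parts.

    The real payload format used by ProfiSignal 20 is not documented
    here; this assumes a URL scheme purely for illustration.
    """
    url = urlparse(payload)
    query = parse_qs(url.query)
    return {
        "scheme": url.scheme,                        # e.g. "profisignal" (assumed)
        "project": url.netloc or url.path.lstrip("/"),
        "channel": query.get("channel", [None])[0],  # optional channel hint
    }

# A QR code on pump station 7 might encode something like this (hypothetical):
info = parse_scach_payload("profisignal://pump-station-7?channel=vibration")
# → {'scheme': 'profisignal', 'project': 'pump-station-7', 'channel': 'vibration'}
```

The point of such a scheme is that the code on the machine carries only an identifier; the mobile app resolves it to the matching project.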
Digitalization does not stop at measurement technology. Systems such as ProfiSignal 20 show how modern visualization solutions can help to make information access and process control much more flexible - and thus better adapted to today's production and development requirements.
A flood of data without direction? Why centralized data management is the key to efficiency
Before measurement data can be analyzed, visualized or used for decision-making, it must first be reliably recorded, collated and made available. However, this is precisely where things often get stuck in practice - with noticeable consequences for time, quality and costs.
The picture is similar in many places: data is generated in different machines, plant sections or even geographically separate locations. It is read out via incompatible interfaces, often transferred manually, copied into spreadsheets and then laboriously merged. What sounds like a technical detail turns out to be a real productivity killer in everyday operations - with a high susceptibility to errors and delayed response times.
In addition, the more distributed the infrastructure, the more difficult it becomes to provide consistent, up-to-date data centrally. And when additional requirements are placed on IT security, access rights or alerting, conventional solutions quickly reach their limits.
One approach to solving these challenges is to establish consistent, centralized data management. One example is the Delphin Data Center - a scalable platform that automatically collects, synchronizes and stores data from a wide variety of sources. The software can run flexibly on a local server or directly in a cloud environment.
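What "collecting and synchronizing" distributed sources means can be sketched in a few lines. This is not the Delphin Data Center's implementation - the source names and record layout are invented - but it shows the core idea: several time-sorted measurement streams are merged into one chronological stream without loading everything into memory:

```python
import heapq
from typing import Iterable, Iterator, Tuple

# (unix timestamp, channel name, value) - a minimal record layout, assumed here
Reading = Tuple[float, str, float]

def merge_sources(*sources: Iterable[Reading]) -> Iterator[Reading]:
    """Merge several time-sorted measurement streams into one
    chronologically ordered stream, as a central data hub would.

    Each source must already be sorted by timestamp; heapq.merge
    then interleaves them lazily, stream by stream.
    """
    return heapq.merge(*sources, key=lambda r: r[0])

# Two hypothetical decentral sources, each sorted by time:
plant_a = [(1.0, "temp_a", 21.5), (3.0, "temp_a", 21.7)]
plant_b = [(2.0, "press_b", 4.1), (4.0, "press_b", 4.0)]

timeline = list(merge_sources(plant_a, plant_b))
# timestamps now interleave chronologically: 1.0, 2.0, 3.0, 4.0
```

A real platform adds persistence, buffering and clock alignment on top, but the chronological merge is the conceptual heart of it.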
By supporting common interfaces such as OPC UA, Modbus TCP and SQL, it can connect classic DAQ systems as well as other field devices, ERP systems, MES databases and cloud-based applications. This turns a heterogeneous data landscape into a standardized information base - without manual intermediate steps.
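To make one of these interfaces concrete: Modbus TCP is a simple binary protocol, and a "Read Holding Registers" request is only twelve bytes. The sketch below builds such a frame with the standard library - it constructs the wire format only and talks to no device; transaction ID, unit ID and register addresses are example values:

```python
import struct

def read_holding_registers_request(transaction_id: int, unit_id: int,
                                   start_addr: int, count: int) -> bytes:
    """Build a Modbus TCP 'Read Holding Registers' (function 0x03) request.

    Frame = MBAP header (transaction id, protocol id 0, remaining byte
    count, unit id) followed by the PDU (function code, start address,
    register quantity). All fields are big-endian.
    """
    pdu = struct.pack(">BHH", 0x03, start_addr, count)
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, len(pdu) + 1, unit_id)
    return mbap + pdu

# Request registers 0-1 from unit 1 (example values):
frame = read_holding_registers_request(1, 1, 0, 2)
# → 12 bytes: 7-byte MBAP header + 5-byte PDU
```

Sent over a plain TCP socket to port 502, this frame would prompt the device to return the two register values - which is exactly the kind of low-level exchange a central platform hides behind a uniform interface.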
All data connections are encrypted by default, ensuring reliable operation even in security-critical sectors such as energy, chemicals or pharmaceuticals. In addition, the Delphin Data Center offers central user and rights management as well as alarm management. The result is a holistic data ecosystem that is convincing not only technically but also organizationally - and makes isolated individual solutions superfluous.
Image: master1305@AdobeStock, Delphin Technology AG