[Image: Microsoft Azure IoT and OPC UA can work together to provide effective links between the private and public cloud. Source: Moxa]

… cloud monitoring the serviceability state. There is also a further aspect to this: maintaining control even when errors or failures are affecting the system. The fog reacts far faster than the cloud, so even in a catastrophic failure the motor can be stopped locally, possibly reducing secondary damage and so making maintenance more cost effective. Equally, maintenance can be made more cost effective if the cloud, monitoring the motor's information, can calculate when it is expected to fail and schedule maintenance at the most appropriate time. When designing the system we can also implement mechanisms such as ring topology and use standard protocols to automatically correct common errors and faults in the transport layers. Because such methods are defined and standardised to make the system rugged, they also allow different devices to interoperate and ensure that the rugged platform, and with it data and information integrity, is maintained.

Failure in a bad way

As always with good engineering practice, the designer will cater for most, if not every, failure condition that could be met within the system. Devices can fail and wiring can fail, but such events can be handled within the devices and their reporting facilities. Today, however, the thing that should be at the forefront of everybody's mind is cyber-security. A forced, intentional failure could be caused anywhere in a system that is open to abuse, and the designer should cater for this in the design. The level of security applied will of course increase the total cost of ownership (TCO), but offset against such a potentially harmful failure, which could go undetected for a considerable time, it is better to implement safety and security features than not.

The story begins

Everyone has recently started discussing cloud and fog computing, but in reality they have, relatively speaking, always been there. It is only now that the terms have been given meaning in system function partitioning that clarity comes to the uninitiated and helps to focus the system architecture design decisions. System design can be seen as based on a simple chain: the data obtained from the real world, the information that data forms, and the use the information is put to. The transform to information at the edge is where the fog lies. That transform actually clears the fog from the system, easing sight of the overall picture or the control needed, and allowing optimum use of resources: it lessens the bandwidth needed to move information towards the cloud and aids peer-to-peer use of the derived information at the edge. From such operations we have now formed several aspects of the system layer definition, and the time and effort saved in doing so shortens time to market.

Take, for example, a water tank fed by streams and used to irrigate farmland. It is desired to keep the tank at a specific level to ensure good pressure to the irrigation system. Function partitioning looks at which items can be controlled in the fog; items which cannot be controlled well locally are pushed to the cloud. Water purity and temperature are fed to the cloud, but tank level is monitored and controlled in the fog. It would be pointless to have an on/off control for letting water into the irrigation system; far better to have a variable opening which maintains the pressure but adjusts the efflux over time as the influx changes with the level of water in the streams. In such a system the amount of data passed to the cloud is far less than if all the pressure, level and efflux data were passed upwards; cloud processing and storage are far less, and so are cloud running costs.
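As a rough illustration of this kind of partitioning, the sketch below (in Python) shows what a fog-side loop for the tank might look like: it trims a variable outlet valve from the measured level and forwards only a periodic summary, plus the purity and temperature readings, to the cloud. The function names, gains and setpoints are placeholder assumptions for this article, not part of any particular product or standard.

import time

# Placeholder I/O; a real fog node would talk to its sensors, valve
# actuator and cloud connection (for example over OPC UA or MQTT) here.
def read_level_m():
    return 3.8              # tank level in metres (stub value)

def read_purity():
    return 0.97             # stub water purity reading

def read_temperature_c():
    return 14.2             # stub water temperature

def set_valve_opening(fraction):
    pass                    # 0.0 = closed, 1.0 = fully open

def publish_to_cloud(summary):
    pass                    # stub for the upward-facing publish call

SETPOINT_M    = 4.0         # level that gives good irrigation pressure
GAIN          = 0.2         # valve change per metre of level error
CONTROL_EVERY = 1.0         # local control period, seconds
REPORT_EVERY  = 60.0        # cloud summary period, seconds

def fog_control_loop():
    opening, last_report = 0.5, 0.0
    while True:
        level = read_level_m()
        # Variable (not on/off) outlet control: let more water out to the
        # irrigation as the streams fill the tank above the setpoint, and
        # throttle back as the level falls, holding the pressure steady.
        error = level - SETPOINT_M
        opening = min(1.0, max(0.0, opening + GAIN * error))
        set_valve_opening(opening)

        # Only a periodic summary goes upward; the per-second readings stay
        # in the fog, keeping bandwidth and cloud running costs down.
        now = time.time()
        if now - last_report >= REPORT_EVERY:
            publish_to_cloud({"level_m": level, "valve": opening,
                              "purity": read_purity(),
                              "temperature_c": read_temperature_c()})
            last_report = now
        time.sleep(CONTROL_EVERY)

if __name__ == "__main__":
    fog_control_loop()      # runs indefinitely, as a control loop would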
Can data be intelligent?

We started this journey asking whether data can be intelligent. In most ways the answer has to be no, since to exhibit intelligence some processing has to be involved. Intelligence in all the guises understood today would seem to require an understanding of the end-to-end needs, but artificial intelligence is built from many conjoined disciplines, not least of which is system operation using elements that behave rather like a neuron. A data point then becomes the data itself, some self-imposed limits, feedback of the amortised (smoothed) data point, and its output. Effectively we now appear to be on the cusp of data becoming intelligent in its own right with very little processing. Add to this a data point becoming an information point, where information is passed through a similar 'neuron': as discussed, the raw data is diluted, but the information now gives a better overview and allows wiser system usage and control.
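One possible reading of such a neuron-like data point is sketched below: a value that keeps a smoothed (amortised) copy of itself as feedback, applies its own limits, and outputs a small piece of information, a state rather than the raw stream, to whatever consumes it. The class, thresholds and example readings are illustrative assumptions rather than a defined mechanism.

class SmartDataPoint:
    """A data point carrying its own limits and smoothed feedback."""

    def __init__(self, low, high, smoothing=0.1):
        self.low, self.high = low, high   # self-imposed limits
        self.smoothing = smoothing
        self.amortised = None             # feedback: smoothed history

    def update(self, raw):
        # Fold the new raw reading into the amortised value.
        if self.amortised is None:
            self.amortised = raw
        else:
            self.amortised += self.smoothing * (raw - self.amortised)
        # Output information, not data: a state derived from the
        # smoothed value and the point's own limits.
        if self.amortised < self.low:
            return "low"
        if self.amortised > self.high:
            return "high"
        return "normal"

# Example: a motor temperature point that reports only its state.
temperature = SmartDataPoint(low=10.0, high=80.0)
for reading in (72.0, 75.5, 83.0, 85.2, 86.1):
    print(reading, "->", temperature.update(reading))

Strung together, points like this are one way a stream of raw samples becomes the diluted but more useful information described above.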
One of the better points of all this somewhat metaphysical understanding is that, with the power of quite small devices today, fog computing and cloud computing are no longer strictly partitioned by processing power. More than that, it leaves the control centre with newly acquired capability at quite minimal cost. Very powerful computers are no longer needed, nor are high levels of human intervention or even monitoring. Look at the vehicle industry today: cars order their own spare parts to be replaced at the next managed servicing period, as well as driving themselves. Intelligent data? Yes. Possibly in a premature state today, but it is definitely present.

Alan Harris, Field Application Engineer, Moxa.
