What is on the list of trends for 2020?

06/12/2019
Data centre trends for 2020 from Rittal

Growing volumes of data, a secure European cloud (data control), the rapid upgrading of data centres and rising energy consumption are the IT and data centre trends Rittal has identified for 2020. The use of OCP (Open Compute Project) technology and of heat recovery, for example, offers solutions to these current challenges.


According to the market researchers at IDC (International Data Corporation), humans and machines could already be generating 175 zettabytes of data by 2025. If this amount of data were stored on conventional DVDs, it would fill 23 stacks of discs, each reaching up to the moon. This mean annual data growth rate of 27 percent is also placing increasing pressure on IT infrastructure.
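A quick back-of-the-envelope check reproduces the 27 percent figure as a compound annual growth rate; the roughly 33 zettabyte baseline for 2018 is an assumption drawn from IDC's earlier forecasts, not a figure stated in this article:

```python
# Rough sanity check on the ~27% annual data growth rate.
# Assumption (not stated above): IDC's baseline of roughly 33 ZB in 2018.
zb_2018 = 33.0     # zettabytes generated in 2018 (assumed IDC estimate)
zb_2025 = 175.0    # zettabytes forecast for 2025 (cited above)
years = 2025 - 2018

cagr = (zb_2025 / zb_2018) ** (1 / years) - 1
print(f"Implied annual growth rate: {cagr:.1%}")  # ~26.9%
```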

Since hardly any company can afford to expand its own data storage by almost a third every year, IT managers are increasingly relying on IT services from the cloud. The trend towards the cloud has long been evident in Germany: a survey published in the summer of 2019 by the Bitkom ICT industry association, together with KPMG, showed that three out of four companies are already using cloud solutions.

However, businesses using cloud solutions from third-party providers do lose some control over their corporate data. The US Cloud Act (Clarifying Lawful Overseas Use of Data), for example, allows US authorities to access data stored in the cloud even if local laws at the storage location prohibit this.

“Businesses will only be successful in the future if they keep pace with full digital transformation and integration. Companies will use their data more and more to provide added value – increasingly in real time – for example in the production environment,” says Dr Karl-Ulrich Köhler, CEO of Rittal International. “Retaining control over data is becoming a critical success factor for international competitiveness,” he adds.

Trend #1: Data control
The self-determined handling of data is thus becoming a key competitive factor for companies. This applies to every industry in which data security is a top priority and where the analysis of this data is decisive for business success. Examples are the healthcare, mobility, banking or manufacturing industries. Companies are now faced with the questions of how to process their data securely and efficiently, and whether to modernise their own data centre, invest in edge infrastructures or use the cloud.

The major European “Gaia-X” digital project, an initiative of the German Federal Ministry for Economic Affairs and Energy (BMWi), is set to start in 2020. The aim is to develop a European cloud for the secure digitalisation and networking of industry, one that will also form the basis for new artificial intelligence (AI) applications. In this context, the Fraunhofer-Gesellschaft has drawn up the “International Data Spaces” initiative: a virtual data room in which companies can exchange data securely, and which also ensures interoperability – the compatibility of companies’ own solutions with established (cloud) platforms.

This means that geographically distributed, smaller data centres with open cloud stacks could create a new class of industrial applications that perform initial data analysis at the point where the data arises and use the cloud for downstream analysis. One solution in this context is ONCITE: this turnkey (plug-and-produce) edge-cloud data centre stores and processes data directly where it arises, enabling companies to retain control over their data when networking along the entire supply chain.

Trend #2: Standardisation in data centres with OCP
The rapid upgrading of existing data centres is becoming increasingly important for companies as the volume of data needing to be processed continues to grow. Essential requirements for this growth are standardised technology, cost-efficient operation and a high level of infrastructure scalability. OCP (Open Compute Project) technology, with its central direct-current distribution in the IT rack, is becoming an interesting alternative for more and more CIOs, because DC components open up new potential for cost optimisation. For instance, all the IT components in a rack can be powered centrally by an n+1 set of power supplies. Cooling also becomes more efficient, since fewer individual power packs are present. At the same time, the high degree of standardisation of OCP components simplifies both maintenance and spare-parts management. The mean efficiency gain is around five percent of the total power drawn.
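The arithmetic behind a saving of that order can be sketched as follows; the per-unit conversion efficiencies below are illustrative assumptions (the article itself gives only the roughly five percent aggregate figure):

```python
# Illustrative only: why centralised DC distribution can save ~5% of
# total power. The efficiency values are assumptions for this sketch,
# not figures from the article.
it_load_w = 20 * 500          # 20 servers at 500 W each (assumed)
eta_distributed = 0.90        # assumed efficiency of per-server AC PSUs
eta_central = 0.95            # assumed efficiency of a central n+1 shelf

input_distributed = it_load_w / eta_distributed
input_central = it_load_w / eta_central
saving = (input_distributed - input_central) / input_distributed
print(f"Power saving from central conversion: {saving:.1%}")  # ~5.3%
```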

Rittal expects OCP to establish itself further in the data centre as an integrated system platform in 2020. New OCP products for rack cooling, power supply and monitoring will enable rapid expansion with DC components. Furthermore, new products will support the conventional concept of a central emergency power supply, in which the power supply is safeguarded by a central UPS. As a result, it will no longer be necessary to protect every single OCP rack with a UPS based on lithium-ion batteries, which considerably reduces the fire load in the OCP data centre.

Trend #3: Heat recovery and direct CPU cooling
Data centres release huge amounts of energy into the environment in the form of waste heat. As power density in the data centre grows, so does the amount of heat that could potentially be put to other uses. So far, however, using this waste heat has proven too expensive, for example because consumers for the heat are rarely found in the direct vicinity of the site. In addition, the waste heat generated by air-based IT cooling systems, at around 40 degrees Celsius, is at too low a temperature to be used economically.

In high-performance computing (HPC) in particular, IT racks generate high thermal loads, often in excess of 50 kW. For HPC, direct processor cooling with water is significantly more efficient than air cooling, and it makes return temperatures of 60 to 65 degrees Celsius available. At these temperatures it is possible, for instance, to heat domestic hot water, drive heat pumps or feed heat into a district heating network. However, CIOs should be aware that even with direct CPU water cooling, only around 80 percent of the waste heat can be drawn from an IT rack; air-based IT cooling is still needed for the remaining 20 percent.
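Using the figures quoted above, the heat budget for a single rack works out as follows (a minimal sketch, assuming exactly the 50 kW load and 80 percent capture rate mentioned):

```python
# Heat budget for a direct water-cooled HPC rack, using the figures
# quoted above (50 kW thermal load, ~80% captured by the water loop).
rack_load_kw = 50.0
water_capture_share = 0.80    # share extractable via direct CPU cooling

recoverable_kw = rack_load_kw * water_capture_share   # usable at 60-65 degC
residual_kw = rack_load_kw - recoverable_kw           # still needs air cooling
print(f"Recoverable waste heat: {recoverable_kw:.0f} kW")   # 40 kW
print(f"Residual air-cooling load: {residual_kw:.0f} kW")   # 10 kW
```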

At the German Government’s 2019 Digital Summit, heat recovery was discussed in the relevant working group, which identified a pressing need for action. Rittal therefore expects that in 2020 significantly more CIOs will address the question of how the previously unused waste heat from the data centre can be used economically.

Trend #4: Integration of multi-cloud environments
Businesses need to be assured that they can run their cloud applications on commonly used platforms and in any country. This calls for a multi-cloud strategy. From management’s point of view, this is a strategic decision based on the knowledge that its own organisation will develop into a fully digitised business.

An excellent user experience, for example, is ensured by minimising delays through appropriately located availability zones. Companies choose one or more availability zones worldwide for their services, depending on their business requirements. Strict data-protection requirements can be met, for instance, by a specialised local provider in the target market concerned. A vendor-open multi-cloud strategy allows exactly that: combining the functional density and scalability of the hyperscalers with the data security of local, specialised providers such as Innovo Cloud.

At the push of a button, via a single dashboard, with one contact person and one invoice, and at the very moment the business decision is made – this is what makes multi-cloud strategies one of the megatrends of the coming years. The economy will take further steps towards digital transformation and accelerate its continuous integration (CI) and continuous delivery (CD) pipelines with cloud-native technologies – applications designed and developed for the cloud-computing architecture. Automating the integration and delivery processes will then enable rapid, reliable and repeatable software deployment.

#PAuto @Rittal @EURACTIV @PresseBox

 


Bob Lally – Piezoelectric sensing technology pioneer.

27/03/2018

Molly Bakewell Chamberlin, president of Embassy Global LLC, pays a touching tribute to an important instrument pioneer and innovator. She acknowledges the help of Jim Lally, retired chairman of PCB Group, in preparing this eulogy.

Bob Lally (1924-2018)

During my earliest days in the sensors industry, at PCB Piezotronics (PCB), I can still remember the excitement that accompanied publication of my first technical article. It was a primer on piezoelectric sensing technology, which ran some 15 years ago in the print edition of Sensors. About a month later, I recall receiving a package at PCB containing both a copy of my article and a congratulatory letter. The article was covered in a sea of post-it notes filled with new insights and explanatory diagrams. I recall marveling at the sheer kindness of anyone taking such time and interest in the work. I sent an immediate thank-you, then received yet another encouraging response. From that time onward, nearly every time I published an article, another friendly envelope would arrive. I’d look forward to them, and to the opportunities for learning and growth they offered.

As I’d soon come to know, those envelopes were sent by none other than PCB founder Bob Lally, who passed away last month at the age of 93. For me, Bob was my PCB pen pal who, along with his brother Jim, helped me develop a real appreciation for piezoelectric sensing technology. They made it fun. I also had the privilege of learning quite a bit about this kind, brilliantly complex and insightful person who was so helpful to me. To the sensors industry, Bob’s technical contributions were legendary. What is less known about Bob, however, is his equally remarkable history: first as a decorated veteran of World War II, and later as an innovator in STEM.

After graduating from high school in 1942, Bob entered military service, as part of the United States Army which helped liberate mainland Europe during World War II. His service was recognised with two Bronze Stars for bravery. When the hostilities ended, Bob returned home, and was able to benefit from a special U.S. government program which funded the university education of military veterans and their families. This benefit allowed Bob to attend the University of Illinois at Urbana-Champaign, where he earned both Bachelor of Science and Master of Science degrees in Mechanical Engineering with a minor in Mathematics. He graduated with high honors, as University co-salutatorian, in 1950. Bob also later continued this commitment to lifelong learning via studies at both Purdue and the State University of New York at Buffalo (NY USA).

Bob’s first engineering job upon graduation was as a guidance and control engineer at Bell Aircraft Corp. (Bell) in Buffalo (NY USA), a position in which he would serve for four years. He worked in flight test control systems R&D for experimental aircraft, glide bombs and guided missiles, and also supervised the inertial guidance group. It was through his work at Bell that Bob first learned about the application of piezoelectric sensing technology to the dynamic measurement of physical parameters, such as vibration, pressure and force. That technology was first developed by Bob’s colleague Walter P. Kistler, the Swiss-born physicist who had successfully integrated piezoelectric technology into Bell’s rocket guidance and positioning systems.

Original PCB Piezotronics facility in the family home of Jim Lally, ca 1967. Bob Lally, centre background, is operating a DuMont oscilloscope in the Test department.
Jim Lally, left foreground, leads the Sales department.

In 1955, Bob and some of his Bell colleagues decided to form what was the original Kistler Instrument Company. That company sought to further commercialize piezoelectric sensing technologies for an expanded array of applications and markets, beyond the aerospace umbrella. In addition to his role as co-founder, Bob remained at the original Kistler Instrument Company for 11 years, serving as VP of Marketing, while continuing his roles in engineering, production, testing, and sales. Upon learning that the company was being sold to a firm out of Washington State, Bob decided to form PCB Piezotronics. Established in 1967, PCB specialized in the development and application of integrated electronics within piezoelectric sensors for the dynamic measurement of vibration, pressure, force and acceleration. The original PCB facility had rather humble beginnings, with all sales, marketing, R&D and operations running from the basement of Jim Lally’s family home.

IR-100 Award plaque, presented to Bob Lally, 1983.

It was also in this timeframe that Bob became world-renowned for his capability to successfully integrate piezoelectric sensing technology into mechanical devices, setting a new industry standard for test and measurement. He was awarded multiple U.S. patents for these innovations, including the modally-tuned piezoelectric impact hammer, pendulum hammer calibrator, and gravimetric calibrator, all for the modal impact testing of machines and structures. The modally tuned impulse excitation hammer was further recognized with a prestigious IR-100 award, as one of the top 100 industry technical achievements of 1983.

Bob was also renowned for his successful commercialization of a two-wire accelerometer with built-in electronics, a concept marketed by PCB as integrated circuit piezoelectric, or ICP. Bob’s 1967 paper for the International Society of Automation (ISA), “Application of Integrated Circuits to Piezoelectric Transducers”, was among the first formally published technical explanations of this concept. As Bob detailed there, the built-in electronics made the sensors lower cost, easier to use and more compatible with industrial environments. Subsequent widespread industry adoption of these accelerometers created new markets for PCB, such as industrial machinery health monitoring, and formed a major cornerstone of the company’s success. In 2016, PCB was acquired by MTS Systems Corporation; it employs more than 1,000 people worldwide, with piezoelectric sensing technologies still among its core offerings.

Beyond Bob’s many R&D accomplishments, he is known for his invaluable contributions to the establishment of industry standards and best practices, as a member of the technical standards committees of the Society of Automotive Engineers (SAE), Society for Experimental Mechanics (SEM), and Industrial Electronics Society (IES), among others. Bob also served on the ISA Recommended Practices Committee for Piezoelectric Pressure Transducers and Microphones, as well as the ASA Standards Committee for Piezoelectric Accelerometer Calibration. Many of the standards that Bob helped to develop, as part of these committees, remain relevant today.

Upon retirement, Bob remained committed to the education and training of the next generation of sensors industry professionals. He often gave tutorials and donated instrumentation for student use. Bob later continued that work as an adjunct professor at the University of Cincinnati. In the mid-2000s, he began to develop an innovative series of Science, Technology, Engineering, and Math (STEM) educational models. Each was designed to provide a greater understanding of various sensing technologies, their principles of operation, and “real life” illustrations of practical applications.

STEM sensing model, with adjustable pendulums, by Bob Lally.

Among Bob’s final works was a unique STEM model consisting of three adjustable connected pendulums. That model was used to illustrate the concept of energy flex transference and the influence of physical structural modifications on structural behavior. Bob continued his mentoring and STEM work nearly right up until his passing. He did so with unwavering dedication and enthusiasm, despite being left permanently disabled from his combat injuries.

In addition to co-founding two of the most successful sensor manufacturers in history and his many R&D accomplishments, Bob’s generosity of spirit shall remain an important part of his legacy. I, like many, remain truly grateful for the selfless and meaningful contributions of Bob Lally to my early professional development, particularly in my technical article work. It is an honour to tell his story.

• He is survived by his son, Patrick (Kathi) Lally of Orchard Park, New York; his grandson, Joshua Lally; his surviving siblings, Jim, MaryAnn (Wilson), and Patricia; and his many nieces, nephews, friends and colleagues.

• Special thanks to Jim, Kathi and Patrick Lally for their support and contributions to this article.

• All pictures used here are by kind courtesy of the Lally family.

No escape even for agrochemicals!

28/09/2017
This article discusses key points that are covered in depth in the IDTechEx report, “Agricultural Robots and Drones 2017-2027: Technologies, Markets, Players”, by Dr Khasha Ghaffarzadeh and Dr Harry Zervos.

New robotics is already quietly transforming many aspects of agriculture, and the agrochemicals business is no exception. Here, intelligent and autonomous robots can enable ultra-precision agriculture, potentially changing the nature of the agrochemicals business: bulk commodity chemical suppliers will be transformed into speciality chemical companies, while many will have to reinvent themselves, learning to view data and artificial intelligence (AI) as a strategic part of their overall crop-protection offerings.

Computer vision
Computer vision is already commercially used in agriculture. In one use case, simple row-following algorithms are employed, enabling a tractor-pulled implement to automatically adjust its position. This relieves the pressure on the driver to maintain an ultra-accurate driving path when weeding to avoid inadvertent damage to the crops.

Computer vision technology is, however, already evolving past this primitive stage. Implements are now being equipped with full computer systems, enabling them to image small areas, detect the presence of plants, and distinguish between crop and weed. The system can then instruct the implement to take a site-specific precision action – for example, eliminating the weed. In the future, the system has the potential to recognise different crop and weed types, enabling it to take further targeted precision action.
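A minimal sketch of the first stage of such a pipeline is shown below, using the classic excess-green (ExG) vegetation index to segment plants from soil. This is a generic textbook baseline, not the method of any particular vendor, and the threshold value is an arbitrary assumption:

```python
# Minimal plant-detection baseline of the kind described above:
# segment green vegetation with the excess-green (ExG) index, then
# return candidate plant regions for a site-specific action.
import cv2
import numpy as np

def detect_plants(bgr_image: np.ndarray, exg_threshold: float = 0.1):
    """Return contours of likely plants in a BGR field image."""
    img = bgr_image.astype(np.float32) / 255.0
    b, g, r = cv2.split(img)
    exg = 2 * g - r - b                      # excess-green index
    mask = (exg > exg_threshold).astype(np.uint8) * 255
    # Remove small speckle before extracting plant regions.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return contours

# Usage sketch: each detected region would then be passed to a
# crop-vs-weed classifier before the implement takes precision action.
# plants = detect_plants(cv2.imread("field_row.jpg"))
```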

This technology is already commercial, although at a small scale and only for specific crops. The implements are still very much custom built, assembled and ruggedised for agriculture by the start-ups themselves. This situation will continue until the market is proven, forcing the developers to be both hardware and software specialists. Furthermore, the implements are not yet fully reliable and easy to operate, and upfront machine costs are high, leading developers to favour a robotics-as-a-service business model.

Nonetheless, the direction of travel is clear: data will increasingly take on a more prominent (strategic) role in agriculture. This is because the latest image processing techniques, based on deep learning, feed on large datasets to train themselves. Indeed, a time-consuming challenge in applying deep learning techniques to agriculture is in assembling large-scale sets of tagged data as training fodder for the algorithms. The industry needs its equivalents of image databases used for facial recognition and developed with the help of internet images and crowd-sourced manual labelling.

In the not-too-distant future, a series of image-processing algorithms will emerge, each focused on a particular set of crop or weed types. In time, these capabilities will inevitably expand, allowing the algorithms to become applicable to a wider set of circumstances. In parallel, and in tandem with more accumulated data (not just images but other indicators such as NDVI too), algorithms will offer more insight into the status of different plants, laying the foundation for ultra-precision farming on an individual-plant basis.

Agriculture is a challenging environment for image processing. Seasons, light and soil conditions change, whilst the plants themselves change shape as they progress through their stages of growth. Nonetheless, the accuracy threshold that algorithms in agriculture must meet is lower than in many other applications, such as general autonomous driving. This is because an erroneous recognition will, at worst, result in the elimination of a few healthy crops, not in fatalities. This of course matters economically, but it is not a safety-critical issue and is thus not a showstopper.

This lower threshold is important because achieving ever-higher levels of accuracy becomes increasingly challenging: after an initial substantial gain, algorithms enter a diminishing-returns phase in which much more data is needed for each small accuracy gain. Consequently, algorithms can be commercially rolled out in agriculture far sooner – on orders of magnitude less data and with lower accuracy – than in many other applications.
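The effect can be illustrated with the empirical power-law learning curves often used to model such scaling; the exponent below is an arbitrary assumption chosen purely for illustration:

```python
# Illustration of the diminishing-returns phase: under a power-law
# learning curve (a common empirical model; the exponent here is an
# assumption), each halving of the error rate demands far more than
# double the training data.
def error_rate(n_samples: float, scale: float = 1.0, alpha: float = 0.35) -> float:
    """Empirical power-law error model: error ~ scale * n^(-alpha)."""
    return scale * n_samples ** -alpha

for n in (1_000, 10_000, 100_000, 1_000_000):
    print(f"{n:>9,} labelled images -> error ~ {error_rate(n):.3f}")
```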

Navigational autonomy
Agriculture is already a leading adopter of autonomous mobility technology. Autosteer and autoguide technologies, based on outdoor RTK GPS localisation, are already well established. The technology is, however, already moving towards full level-5 autonomy. The initial versions are likely to retain the cab, enabling the farmer/driver to stay in charge and ready to intervene during critical tasks such as harvesting. Unmanned, cab-less versions will also emerge once the technology’s reliability is proven and users begin to define staying in charge as remote fleet supervision.

The evolution towards full unmanned autonomy has major implications. As we have discussed in previous articles, it may give rise to fleets of small, slow, lightweight agricultural robots (agrobots). These fleets today have limited autonomous navigational capability and suffer from limited productivity, both in individual and fleet forms. This will however ultimately change as designs/components become standardized and as the cost of autonomous mobility hardware inevitably goes down a steep learning curve.

Agrobots of the future
Now the silhouette of the agrobots of the future may be seen: small, intelligent, autonomous mobile robots taking precise action on an individual-plant basis. These robots can be connected to the cloud to share learning and data, and to receive updates en masse. They can be modular, enabling the introduction of different sensor/actuator units as required. They will never individually be as productive as today’s powerful farm vehicles, but they can be in fleet form if hardware costs are lowered and the fleet-size-to-supervisor ratio is increased.

What this may mean for the agrochemicals business is also emerging. First, data and AI will become an indispensable part of the general field of crop protection, of which agrochemical supply will become only a subset, albeit still a major one. This will mandate a major rethinking of the chemical companies’ business model and skillsets. Second, non-selective blockbuster agrochemicals (together with engineered herbicide resistant seeds) may lose their total dominance. This is because the robots will apply a custom action for each plant, potentially requiring many specialized selective chemicals.

These changes will not happen overnight. The current approach is highly productive, particularly over large areas, and off-patent generic chemicals will drive costs down further. Today’s robots are low-lying, constraining them to short crops, and achieving precision spraying with high-clearance ‘highboy’ machines will be a mechanical and control engineering challenge. But these changes will come, diffusing into general use step by step and plant by plant. True, this is a long-term game, but playing it cannot be kicked into the long grass for long.

@IDTechEx #Robotics #Agriculture #PAuto

New editor in chief for automation journal

17/10/2011

Dr. Ahmad B. Rad is to be the new editor-in-chief of ISA Transactions, beginning January 2012, when he succeeds Dr. R. Russell Rhinehart.

Dr Ahmad Rad

ISA Transactions is a journal of advances and state-of-the-art in the science and engineering of measurement and automation, of value to leading-edge industrial practitioners and applied researchers.

“Dr. Ahmad Rad has been one of the key participants in managing reviews, soliciting manuscripts, and seeking scientific excellence balanced with practice,” said Dr. R. Russell Rhinehart, ISA Transactions’ outgoing editor-in-chief. “During Dr. Rad’s 10 years as an Associate Editor, he energetically and creatively shaped the direction and continuous improvement of ISA Transactions. I am sure that his leadership will continue, and that ISA Transactions – The Journal of Automation will rise to even greater international impact.”

Dr. Rad received the B.Sc. degree in engineering from Abadan Institute of Technology, Abadan, Iran, the M.Sc. degree in control engineering from the University of Bradford, Bradford, U.K., and the Ph.D. degree in control engineering from the University of Sussex, Brighton, U.K., in 1977, 1986 and 1988, respectively. He is currently a professor at the School of Engineering Science at Simon Fraser University, Canada.

Prior to this appointment, Rad served as chair of Robotics and Mechatronics at the Faculty of Engineering and Industrial Sciences of Swinburne University of Technology, Melbourne, Australia, and as professor of Electrical Engineering at the Department of Electrical Engineering of the Hong Kong Polytechnic University, Hong Kong. He has also worked as a control and instrumentation engineer in the oil industry for seven years from 1977 to 1984.

Dr. Rad’s research interests include autonomous systems, robotic systems, intelligent process control, time-delay system identification and adaptive control. He has served as a member of the editorial board and as an associate editor of ISA Transactions since 1999.

“We are very pleased that such a distinguished scholar as Dr. Ahmad Rad has agreed to fill the formidable space being left by Dr. Russ Rhinehart after 14 years at the editorial helm of ISA Transactions,” said Eoin Ó Riain, vice president of the ISA Publications Department. “Dr. Rhinehart is a hard act to follow, but we know from his experience of working with Russ on ISA Transactions that Dr. Rad has a keen grasp of the journal, of how it has developed under Dr. Rhinehart’s editorial leadership, and of the correct mix between practice and theory which has made it the success it is. We look forward to seeing Dr. Rad’s own insights and the further development of ISA Transactions in the years of his editorship.”


Regulations drive automation in W&WW in Europe

20/05/2010

The European water and wastewater (W&WW) sector has been offering sustainable opportunities for the automation and control solutions (ACS) market mainly due to supportive legislation. Stringent regulations and intervention from private participants have paved the way for prospective growth opportunities for ACS in the European W&WW sector. Additionally, investments from the developing economies of the Central and Eastern European (CEE) regions contribute to building more automated plants in Europe.

A glass of water (gloine uisce)

New analysis from Frost & Sullivan, Strategic Analysis of Automation and Control Solutions in European Water and Wastewater Sector, finds that the market earned revenues of $623.3 million (€500+ million) in 2009, and estimates that this will reach $825.5 million (€670+ million) in 2016. The markets covered in this research service, by product type, are programmable logic controllers (PLC), supervisory control and data acquisition (SCADA), distributed control systems (DCS), human-machine interfaces (HMI), manufacturing execution systems (MES) and industrial asset management (IAM).
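For reference, the growth rate implied by those two revenue figures can be computed directly (a simple check on the numbers quoted above):

```python
# Implied growth rate behind the Frost & Sullivan forecast quoted above.
rev_2009 = 623.3   # $ million, 2009 revenues
rev_2016 = 825.5   # $ million, 2016 forecast
years = 2016 - 2009

cagr = (rev_2016 / rev_2009) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")  # ~4.1%
```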

Much of Europe has poor networking with regard to water distribution and wastewater treatment. EU directives such as the Water Framework Directive, the Drinking Water Directive and the Urban Wastewater Treatment Directive are the main political drivers that have catalysed strong opportunities for treatment techniques such as membrane bioreactors (MBR), ozone and ultraviolet (UV).

“The European Union’s directives are the major driver for the growth of automation and control systems across the water and wastewater sector, mandating the European countries to comply,” says Frost & Sullivan Research Analyst Katarzyna Owczarczyk. “The primary focus of the regulations is to enhance the water and wastewater infrastructure; incorporating ACS helps to fully automate the plant, continually track production processes, and effectively control and maintain plant operations.”

The accession countries feel the impact of EU regulations most, owing to the timeframe within which they must transpose the directives into action. Regions such as Eastern Europe, but also others such as Iberia, parts of Italy and Benelux, are undertaking large-scale implementation of water treatment plants. Significantly, the European Union is funding the new member states from the CEE region to improve their W&WW infrastructure through the cohesion fund mechanism, leading to a plethora of design, build and operate (DBO) opportunities. All these initiatives are likely to spur growth of the European ACS market for the W&WW industry.

However, a key challenge faced by ACS manufacturers is the need to provide systems that seamlessly integrate with the existing plant infrastructure.

“End users are conservative when it comes to revamping existing systems to incorporate the latest automation and control solutions,” explains Owczarczyk. “Integration issues, along with the cost involved in revamping, make end users reluctant, and consequently either prevent or delay the implementation of newer automation and control solutions.”

“Manufacturers should reassure customers about their products’ compatibility and urge them to adapt to the new systems,” she concludes. “Besides providing automation systems that are compatible with the existing plant set-up, suppliers can also help customers retain the cost and engineering investments from the earlier set-up.”