Data privacy!


It’s been another busy year for hackers. According to the Central Statistics Office, nearly one in five (18%) Irish businesses experienced ICT-related incidents; 87% of these resulted in the unavailability of ICT services, and 41% resulted in the destruction, corruption or disclosure of data.

Noel O’Grady, the author of this piece, is head of Sungard Availability Services Ireland and has over 20 years of experience working with leading technology firms, including HP, Vodafone and Dell, in providing critical production and recovery services to enterprise-level organisations.

Last year saw a number of high-profile security incidents making the headlines. In April, 3,600 accounts belonging to former customers of Ulster Bank were compromised, resulting in some customers’ personal details being released. In July, the Football Association of Ireland confirmed that malware was discovered on its payroll server following an attempted hack on IT systems.

Entering a new decade, digital technologies will continue to permeate every aspect of modern life, and the security of IT systems will come under increasing scrutiny. This will be driven by two major consequences of today’s hyper-connected world. Firstly, the sheer number of systems and devices which have now become digitalised has vastly expanded the cybersecurity threat landscape, potentially multiplying vulnerabilities or points of entry for hackers. Simultaneously, consumers and businesses alike demand constant availability in the products and services they use, reducing the tolerance for periods of downtime.

As a result, the security of data is nothing less than a global issue, on a par with national security, economic stability and even the physical security of citizens. It is with this in mind that Data Privacy Day, a global initiative which aims to raise awareness of the fundamental role that cybersecurity plays, is observed today (28th January 2020).

One of the most important developments in the field of data privacy was the establishment of the General Data Protection Regulation (GDPR) in May 2018. Nearly two years on, it’s timely to review how the new regulatory environment has succeeded in achieving its goals, especially given that almost one in three European businesses are still not compliant.

Data Privacy Day 2020

GDPR works by penalising organisations with inadequate data protection through sizeable fines. While this has established an ethical framework from which European organisations can set out strategies for protecting personal data, one issue that is still often overlooked is the effect of an IT outage, which prevents a business from keeping its services running. While a server or an organisation’s infrastructure is down, data is at risk of exposure, and the company therefore risks falling out of compliance. IT and business teams need to locate and close any vulnerabilities in IT systems or business processes, and switch over to disaster recovery arrangements if they believe data has been corrupted.

This is especially pertinent in Ireland, where, according to a spokesperson for the Department of Business, Enterprise and Innovation (DoBEI), “Data centre presence…raises our visibility internationally as a technology-rich, innovative economy.” A strategic European hub for many multi-national technology giants, Ireland is currently home to 54 data centres, with another 10 under construction and planning permission for a further 31. While this growth in Ireland’s data centre market is a huge advantage for the national economy, Irish businesses must also tread with caution as they shoulder the responsibility for the security and availability of the countless mission-critical applications and processes which rely on them.

An organisation’s speed and effectiveness of response will be greatly improved if it has at its fingertips the results of a Data Protection Impact Assessment (DPIA) that details all the personal data that an organisation collects, processes and stores, categorised by level of sensitivity. Data Privacy Day is a great opportunity to expose unknown risks that organisations face, but moving forward, it is vital that business leaders embed privacy into every operation. This is the only sustainable way to ensure compliance on an ongoing basis.

#Cybersecurity @SungardASUK @brands2life

Flood monitoring.

Monitoring is an essential component of natural flooding management, helping to define appropriate measures, measure their success, keep stakeholders informed, identify mistakes, raise alarms when necessary, inform adaptive management and help guide future research.

Great Fen showing Holme Fen woods top left and new ponds and meres in April

Flooding is a natural process, but it endangers lives and causes heavy economic loss. Furthermore, flood risk is expected to increase with climate change and increased urbanisation, so a heavy responsibility lies with those that allocate funding and formulate flood management strategy. In the following article, Nigel Grimsley from OTT Hydromet explains how the success of such plans (both the design and implementation) depend on the accuracy and reliability of the monitoring data upon which they rely.

Climate projections for Britain suggest that rainfall will increase in winter and decrease in summer, and that individual rainfall events may increase in intensity, especially in winter. These projections point to an increased risk of flooding.

Emphasising the urgent need for action on flood risk, (British) Environment Agency chairwoman Emma Howard Boyd has said that on current trends global temperature could rise by between 2°C and 4°C by 2100, and that some communities may even need to move because of the risk of floods. Launching a consultation on the agency’s flood strategy, she said: “We can’t win a war against water by building away climate change with infinitely high flood defences.”

In response, Mike Childs, head of science at Friends of the Earth, said: “Smarter adaptation and resilience building – including natural flood management measures like tree-planting – is undeniably important but the focus must first and foremost be on slashing emissions so that we can avoid the worst consequences of climate chaos in the first place.”

Historically, floodplains have been exploited for agricultural and urban development, which has increased the exposure of people, property and other infrastructure to floods. Flood risk management therefore focused on measures to protect communities and industry in affected areas. However, flood risk is now addressed on a wider catchment scale so that initiatives in one part of a catchment do not have negative effects further downstream. This catchment-based approach is embodied within the EU Floods Directive 2007/60/EC, and in recent years, those responsible for flood management have increasingly looked for solutions that employ techniques which work with natural hydrological and morphological processes, features and characteristics to manage the sources and pathways of flood waters. These techniques are known as natural flood management (NFM) and include the restoration, enhancement and alteration of natural features but exclude traditional flood defence engineering that effectively disrupts these natural processes.

NFM seeks to create efficiency and sustainability in the way the environment is managed by recognising that when land and water are managed together at the catchment scale it is possible to generate whole catchment improvements with multiple benefits.

Almost all NFM techniques aim to slow the flow of water and, whilst closely connected, they can be broadly categorised as infiltration, conveyance and storage measures.

Land use changes such as set-aside, switching arable to grassland or restricted hillside cropping, can improve infiltration and increase water retention. In addition, direct drilling, ‘no-till’ techniques and cross slope ploughing can have a similar effect. These land use techniques are designed to reduce the soil compaction which increases run-off. Livestock practices such as lower stocking rates and shorter grazing seasons can also help. Field drainage can be designed to increase storage and reduce impermeability, which is also aided by low ground pressure vehicles. The planting of shrubs and trees also helps infiltration and retention by generating a demand for soil moisture, so that soils have a greater capacity to absorb water. Plants also help to bind soil particles, resulting in less erosion – the cause of fertility loss and sedimentation in streams and rivers.

Ditches and moorland grips can be blocked to reduce conveyance, and river profiles can be restored to slow the flow. In the past, peats and bogs have been drained to increase cropping areas, but this damages peatlands and reduces their capacity to retain water and store carbon. The restoration of peatland therefore relies on techniques to restore moisture levels. Pumping and drainage regimes can be modified, and landowners can create strategically positioned hedges, shelter belts and buffer strips to reduce water conveyance.

Rivers can be reconnected with restored floodplains and river re-profiling, leaky dams, channel works and riparian improvements can all contribute to improved storage capability. In urban areas permeable surfaces and underground storage can be implemented, and washlands and retention ponds can be created in all areas. As mentioned above, the re-wetting of peatland and bogs helps to increase storage capacity.

Many of the effects of NFM might be achieved with the re-introduction of beavers, which build dams that reduce peak flows, create pools and saturate soil above their dams. The dams also help to remove pollutants such as phosphates. Beavers do not eat fish, instead preferring aquatic plants, grasses and shrubs during the summer and woody plants in winter. Beavers are now being introduced in a number of areas in trials to determine their value in the implementation of NFM. One of the key benefits offered by beavers is their ability to quickly repair and rebuild dams that are damaged during extreme weather. However, whilst the potential benefits of beavers are well known, several groups have expressed concern with the prospect of their widespread introduction. For example, farmers and landowners may find increased areas of waterlogged land due to blocked drainage channels. In addition, dams present a threat to migratory fish such as salmon and sea trout.

Beavers are native to Britain and used to be widespread, but they were hunted to extinction during the 17th century. However, other non-native species such as signal crayfish can have a detrimental effect on flood protection because they burrow into river banks causing erosion, bank collapse and sediment pollution. Signal crayfish are bigger, grow faster, reproduce more quickly and tolerate a wider range of conditions than the native white-clawed crayfish. Signal crayfish are also voracious predators, feeding on fish, frogs, invertebrates and plants, and as such can create significant negative ecological effects.

NFM benefits
NFM provides protection for smaller flood events, reduces peak flooding and delays the arrival of the flood peak downstream. However, it does not mitigate the risk from extreme flood events. Effective flood management strategy therefore tends to combine NFM with hard engineering measures. Nevertheless, NFM generally provides a broader spectrum of other benefits.

The creation of new woodlands and wetlands produces biodiverse habitats with greater flood storage capacity. They also enable more species to move between habitats. NFM measures that reduce soil erosion, run-off and sedimentation also help to improve water quality and thereby improve habitats. In particular, these measures reduce nutrient and sediment loading lower down the catchment, two issues which can have dramatic effects on water quality and amenity.

Land use and land management measures help to reduce the loss of topsoil and nutrients. This improves agricultural productivity and lowers the cost of fertilizers. Furthermore, a wide range of grants are available for NFM measures, such as the creation of green spaces and floodplains, to make them more financially attractive to farmers and landowners.

Many NFM measures help in the fight against climate change. For example, wetlands and woodlands are effective at storing carbon and removing carbon dioxide from the atmosphere. Measures that reduce surface run off and soil erosion, such as contour cultivation, can also reduce carbon loss from soil.

Monitoring NFM
Given the wide range of potential NFM benefits outlined above, the number and type of parameters to be monitored are likely to be equally diverse. Baseline data is essential if the impacts of implemented measures are to be assessed, but it may not always be obtainable. For example, it may only be possible to collect one season of data prior to a five-year project. However, it may be possible to secure baseline data from other parties. In all instances, data should of course be accurate, reliable, relevant and comparable.

Monitoring data should be used to inform the design of NFMs. For example, a detailed understanding of the ecology, geomorphology, hydrology and meteorology of the entire catchment will help to ensure that the correct measures are chosen. These measures should be selected in partnership with all stakeholders, and ongoing monitoring should provide visibility of the effects of NFM measures. Typically stakeholders will include funders, project partners, local communities, landowners, regulators and local authorities.

Since NFM measures are designed to benefit an entire catchment, it is important that monitoring is also catchment-wide. However, this is likely to be a large area so there will be financial implications, particularly for work that is labour-intensive. Consequently, it will be necessary to prioritise monitoring tasks and to deploy remote, automatic technology wherever it is cost-effective.

OTT ecoN with wiper

OTT ecoN Sensor

Clearly, key parameters such as rainfall, groundwater level, river level and surface water quality should be monitored continuously in multiple locations if the benefits of NFM are to be measured effectively. It is fortunate, therefore, that all of these measurements can be taken continuously, 24/7, by instruments that can be left to monitor in remote locations without a requirement for frequent visits to calibrate, service or change power supplies. As a business, OTT Hydromet has focused on the development of this capability for many years, developing sensors that are sufficiently rugged to operate in potentially aggressive environments, data loggers with enormous capacity but very low power requirements, and advanced communications technologies so that field data can be instantly viewed by all stakeholders.

Recent developments in data management have led to the development of web-enabled data management solutions such as Hydromet Cloud, which, via a website and App, delivers the backend infrastructure to receive, decode, process, display and store measurement data from nearly any remote hydromet monitoring station or sensor via a cloud-based data hosting platform. As a consequence, alarms can be raised automatically, which facilitates integration with hard engineering flood control measures. Hydromet Cloud also provides access to both current and historic measurement data, enabling stakeholders to view the status of an entire catchment on one screen.

Holme Fen – a monitoring lesson from the 1850s

Holme Fen post

Surrounded by prime agricultural land to the south of Peterborough (Cambridgeshire, GB), the fens originally contained many shallow lakes, of which Whittlesey Mere was the largest, covering around 750 hectares in the summer and around twice that in the winter. Fed by the River Nene, the mere was very shallow and was the last of the ‘great meres’ to be drained and thereby converted to cultivatable land.

Led by William Wells, a group of local landowners funded and arranged the drainage project, which involved the development of a newly invented steam powered centrifugal pump which was capable of raising over 100 tons of water per minute by 2 or 3 feet. A new main drain was constructed to take water to the Wash. Conscious of the likely shrinking effect of drainage on the peaty soil, Wells instigated the burial of a measurement post, which was anchored in the Oxford Clay bedrock and cut off at the soil surface. In 1851 the original timber post was replaced by a cast iron column which is believed to have come from the Crystal Palace in London.
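The quoted pump performance can be sanity-checked with a back-of-envelope hydraulic power calculation. The sketch below assumes long tons (a guess, given the era) and uses only the figures quoted above; it is illustrative, not an engineering analysis of the actual machine.

```python
def pump_power_w(tons_per_min: float, lift_ft: float) -> float:
    """Hydraulic power (W) needed to raise water at the given rate and lift."""
    TON_KG = 1016.0   # one long ton in kilograms (assumption)
    FOOT_M = 0.3048   # feet to metres
    G = 9.81          # gravitational acceleration, m/s^2
    mass_per_s = tons_per_min * TON_KG / 60.0
    return mass_per_s * G * lift_ft * FOOT_M

# 100 tons of water per minute, lifted 2 or 3 feet, as quoted
for lift in (2, 3):
    kw = pump_power_w(100, lift) / 1000
    print(f"lift of {lift} ft -> ~{kw:.0f} kW of hydraulic power")
```

On these assumptions the pump delivered roughly 10 to 15 kW of useful lifting power, a substantial output for a single steam engine of the 1850s.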

By installing a measurement post, Wells demonstrated remarkable foresight. As the drainage proceeded, the ground level sank considerably; by 1.44 metres in the first 12 years, and by about 3 metres in the first 40 years. Today, around 4 metres of the post is showing above ground, recording the ground subsidence since 1852. The ground level at Holme Post is now 2.75 metres below sea level – the lowest land point in Great Britain.
Several complications have arisen as a result of the drainage. Firstly, there has been a huge impact on local ecology and biodiversity with the loss of a large area of wetland. Also, as the ground level subsided, it became less sustainable to pump water up into the main drain.

Holme Fen is now a National Nature Reserve, managed by Natural England, as is the nearby Woodwalton Fen. They are both part of the Great Fen Project, an exciting habitat restoration project, involving several partners, including the local Wildlife Trust, Natural England and the Environment Agency. At Woodwalton, the more frequent extreme weather events that occur because of climate change result in flooding that spills into the reserve. In the past, this was a good example of NFM as the reserve provided a buffer for excess floodwater. However, Great Fen Monitoring and Research Officer Henry Stanier says: “Floodwater increasingly contains high levels of nutrients and silt which can harm the reserve’s ecology, so a holistic, future-proof strategy for the area is necessary.”

Applauding the farsightedness of William Wells, Henry says: “As a conservationist I am often called in to set up monitoring after ecological recovery has begun, rather than during or even before harm has taken place. At the Wildlife Trust, we are therefore following the example provided by Wells, and have a network of monitoring wells in place so that we can monitor the effects of any future changes in land management.

“For example, we are setting up a grant funded project to identify the most appropriate crops for this area; now and in the future, and we are working with OTT to develop a monitoring strategy that will integrate well monitoring with the measurement of nutrients such as phosphate and nitrate in surface waters.”

Monitoring provides an opportunity to measure the effects of initiatives and mitigation measures. It also enables the identification of trends so that timely measures can be undertaken before challenges become problems, and problems become catastrophes.

Monitoring is an essential component of NFM, helping to define appropriate measures, measure their success, keep stakeholders informed, identify mistakes, raise alarms when necessary, inform adaptive management and help guide future research.

#Environment @OTTHydromet @EnvAgency @friends_earth

It all began with the War of the Currents…


Today, people greatly appreciate having electrical energy available at the flip of a switch, seemingly at any time and for any occasion. But where does electricity actually come from? The answer most people would give you is: “from the wall socket, of course”. So does this automatically settle the question of security of supply? More on this later.

If we compare the history of electric current with the 75 years of the history of Camille Bauer Metrawatt AG, it is easy to see how they were interlinked at certain times in the course of their development. Why is that?

It all began with the War of the Currents – an economic dispute about a technical standard

It was around 1890 when the so-called War of the Currents started in the USA. At that time, the question was whether the direct current favoured by Thomas Alva Edison (1847-1931) or the alternating current promoted by Nikola Tesla (1856-1943) and financially supported by George Westinghouse (1846-1914) was the more suitable technology for supplying the United States of America with electrical energy over large areas and constructing power grids. Given Westinghouse’s market dominance at that time compared to Edison General Electric (called General Electric from 1890 on), it soon became clear that the alternating voltage promoted by Nikola Tesla was rapidly gaining the upper hand, not least because its approximately 25% lower transmission losses weighed unquestionably in its favour. Soon afterwards came the breakthrough for alternating voltage as the means of transmitting electrical energy. Initially, the main target application was electric lighting, which had been spurred on by the invention of the incandescent lamp by Edison. The reasons for this were logical: Westinghouse was initially a lighting manufacturing company and wanted to secure as great a market share as possible.
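The physics behind the transmission-loss argument can be sketched in a few lines: for a fixed delivered power, line current falls in proportion to transmission voltage, and resistive line loss falls with the square of the current. The voltage and resistance figures below are invented purely for illustration; they are not historical data.

```python
def line_loss_w(delivered_w: float, volts: float, resistance_ohm: float) -> float:
    """Resistive loss I^2 * R in a line, with current I = P / V."""
    current = delivered_w / volts
    return current ** 2 * resistance_ohm

P, R = 1_000_000.0, 0.5   # 1 MW delivered over a line of 0.5 ohm (assumed)
for v in (2_000, 20_000):
    print(f"{v:>6} V: line loss {line_loss_w(P, v, R)/1000:.2f} kW")
```

Raising the voltage tenfold cuts the resistive loss a hundredfold, which is why alternating current, easily transformed to high voltages, won out for long-distance transmission.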

As developments continued, it is no surprise that already by 1891, in Germany for example, the first long-distance transmission of electrical energy was put into operation, over a distance of more than 170 km from Lauffen am Neckar to Frankfurt am Main. It was a technological breakthrough using three-phase current technology. However, this was by no means the end of the story for direct current. Not least because of digitalisation, electromobility, decentralised energy supplies and the like, DC voltage has experienced a full-blown renaissance and is now treated almost as a brand-new topic.

The Camille Bauer story.
The foundation of the Camille Bauer company dates back to 1900, immediately after the War of the Currents just described, at a time when electricity was rapidly gaining in importance. At the turn of the century, the Camille Bauer company, named after its founder Camille Bauer-Judlin, began importing measuring instruments for the trendy new phenomenon called “electricity” into Switzerland for sale to the local market. Some years later, in 1906, Dr. Siegfried Guggenheimer (1875-1938), formerly a research scientist for Wilhelm Conrad Röntgen (1845-1923), who in 1901 had become the first winner of the Nobel Prize for Physics, founded what was a start-up company in Nuremberg, Germany, trading under his own name. The company was engaged in the production and sale of electrical measuring instruments. However, due to pressure from the Nazis because Dr. Guggenheimer was of Jewish descent, he had to rename the company in 1933, creating Metrawatt AG.

Four technological segments.

In 1919, a man by the name of Paul Gossen entered the picture. He was so dissatisfied with his employment with Dr. Guggenheimer that he founded his own company in Erlangen, near Nuremberg, and for decades the two rivals were continuously in fierce competition with one another. In 1944, towards the end of the Second World War, Camille Bauer could see that its importing business had virtually come to a standstill. All the factories of its suppliers, which were mainly in Germany (for example Hartmann & Braun, Voigt & Haeffner, Lahmeyer, etc.), had been converted to supplying materials for the war. At this point, a decision had to be made quickly. Camille Bauer’s original trading company, located in Basel (CH), undertook a courageous transformation. In order to survive, it turned itself into a manufacturing company. As a first step, the recently formed manufacturing company Matter, Patocchi & Co. AG in Wohlen (CH) was taken over, in order to get the business up and running quickly with the necessary operating resources at its disposal. Thus the Swiss manufacturing base in Wohlen in the canton of Aargau was born.

The story does not end there. In 1979, Camille Bauer was taken over by Röchling, a family-owned company in Mannheim, Germany. At that time, Röchling wanted to quit the iron and steel business and enter the field of I&C technology. Later, in 1993, Gossen in Erlangen and Metrawatt in Nuremberg were reunited in a single company, after Röchling became owner of the Gossen holding company as a result of the acquisition of the Bergmann Group from Siemens in 1989, and Metrawatt was acquired from ABB in 1992. At the same time, Camille Bauer’s German sales operation in Frankfurt-Dreieich also became a part of the company. Today the companies operate globally and successfully under the umbrella brand of GMC-I (Gossen Metrawatt Camille-Bauer-Instruments).

A new era.
The physics of electric current have not changed over the course of time. However, business conditions have changed drastically, especially over the last 5-10 years. Catch phrases such as electricity free market, collective self-consumption, renewable energy sources, PV, wind power, climate targets, reduction of CO2 emissions, e-mobility, battery storage, Tesla, smart meters, digitalization, cyber security, network quality, etc. are all areas of interest for both people and companies. And last but not least, with today’s protest demonstrations, climate change has become a political issue. We will have to see what results from this. At the very least, the catch phrases mentioned above are perfect for developing scenarios for electricity supply security. And it really is the case that the traditional electricity infrastructure, which is often as old as Camille Bauer Metrawatt itself, was not designed for the new types of energy behaviour, either those on the consumer side or the decentralised feed-in side. As a result, it is ever more important to have increasing numbers of intelligent systems which need to work from basic data obtained from precise measurements in order to avoid outages, blackouts and resulting damage.

The overall diversity of these new clusters of topics has prompted Camille Bauer Metrawatt AG to once more face the challenges with courage and above all to do so in an innovative and productive way. In this spirit, Camille Bauer Metrawatt AG develops, produces and distributes its product range globally in 4 technological segments.

These are:
(1) Measurement & Display,
(2) Power Quality,
(3) Control & Monitoring,
(4) Software, Systems and Solutions.

Through its expert staff, modern tools and external partners, Camille Bauer Metrawatt is able, for example, to analyse power quality and detect power quality problems. In addition, the Camille Bauer Metrawatt Academy, founded in 2019, focuses on knowledge transfer by experienced lecturers, with the latest and most important topics as its main priority. Furthermore, we keep in very close contact with customers, authorities, associations, specialist committees, educational institutions, practice-oriented experts and the scientific community in order to continually provide the requisite solutions to the market and interested parties.

#Camille_Bauer_Metrawatt #PAuto @irishpwrprocess

The most viewed Stories in 2019.

  • MCAA President Teresa Sebring has certified the election of officers and directors of the Measurement, Control & Automation Association…
  • The VP869 high performance 6U OpenVPX FPGA processing board has been announced by Abaco Systems. Featuring two Xilinx® UltraScale+™ FPGAs …
  • Mr. Uwe Gräff has been appointed to the Board of New Technologies & Quality at the Harting Technology Group. He follows Dr. Frank Brode…
  • ISA demonstrates its undoubted strength again in providing stunning seminars allied with top class training built on member experience. Ne…
  • GE-IP Third Annual Executive Forum on delivering operational excellence. GE Intelligent Platforms recently hosted its third annual execut…
  • Leading monitoring and analysis solution makes improving SQL Server performance easier than ever SolutionsPT has announced a new partners…
  • The International Society of Automation (ISA) has welcomed Paul Gruhn, PE, CFSE, and ISA Life Fellow, as its 2019 Society President. Pa…
  • exida, LLC announced that it has assessed the Honeywell ISA100 Wireless™ Device Manager model WDMY version R320 and certified that it meets…
  • Anglia has continued to make big strides towards full RoHS 3* compliance over six months before the deadline for meeting the new provisions…
  • The emergence of radar has been an important advance in the level measurement field. Radar represents a cost effective, accurate solution that is immune to density and other process fluid changes….

#PAuto #TandM

Most viewed stories in 2018

What is on the list of trends for 2020?

Data centre trends for 2020 from Rittal

Growing volumes of data, a secure European cloud (data control), rapid upgrades of data centres and rising energy consumption are the IT/data centre trends for Rittal in 2020. For example, the use of OCP (Open Compute Project) technology and heat recovery offers solutions for the challenges of the present.

According to the market researchers at IDC (International Data Corporation), humans and machines could already be generating 175 zettabytes of data by 2025. If this amount of data were stored on conventional DVDs, it would mean 23 stacks of data discs, each of them reaching up to the moon. The mean 27 percent annual rate of data growth is also placing increasing pressure on the IT infrastructure.
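The compound effect of that growth rate is easy to underestimate. The short sketch below shows how quickly a data volume doubles at 27% annual growth; the starting volume is arbitrary, since only the rate matters.

```python
# How long does data take to double at the ~27% annual growth rate
# quoted by IDC? Starting volume is arbitrary (say, 1 PB).
GROWTH = 0.27

volume, years = 1.0, 0
while volume < 2.0:
    volume *= 1 + GROWTH
    years += 1

print(f"At 27% a year, data volume roughly doubles every {years} years")
# -> roughly doubles every 3 years
```

In other words, an organisation on this curve must find a home for twice as much data about every three years, which explains the pressure on in-house infrastructure described below.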

Since there is hardly any company that can afford to increase its own data storage by almost a third every year, IT managers are increasingly relying on IT services from the cloud. The trend towards the cloud has long been established in Germany: a survey published in the summer of 2019 by the Bitkom ICT industry association together with KPMG showed that three out of four companies are already using cloud solutions.

However, businesses using cloud solutions from third-party providers do lose some control over their corporate data. That is why, for example, the US Cloud Act (Clarifying Lawful Overseas Use of Data) allows US authorities to access data stored in the cloud, even if local laws at the location where the data is stored prohibit this.

“Future success in business will be sustainable if companies keep pace with full digital transformation and integration. Companies will use their data more and more to provide added value – increasingly in real time – for example in the production environment,” says Dr Karl-Ulrich Köhler, CEO of Rittal International. “Retaining control over data is becoming a critical success factor for international competitiveness,” he adds.

Trend #1: Data control
The self-determined handling of data is thus becoming a key competitive factor for companies. This applies to every industry in which data security is a top priority and where the analysis of this data is decisive for business success. Examples are the healthcare, mobility, banking or manufacturing industries. Companies are now faced with the questions of how to process their data securely and efficiently, and whether to modernise their own data centre, invest in edge infrastructures or use the cloud.

The major European “Gaia-X” digital project, an initiative of the German Federal Ministry for Economics and Energy (BMWi), is set to start in 2020. The aim is to develop a European cloud for the secure digitalisation and networking of industry that will also form the basis for using new artificial intelligence (AI) applications. The Fraunhofer Gesellschaft has drawn up the “International Data Spaces” initiative in this context. This virtual data room allows companies to exchange data securely. Interoperability, i.e. the compatibility of companies’ own solutions with established (cloud) platforms, is also provided.

This means that geographically widespread, smaller data centres with open cloud stacks might be able to create a new class of industrial applications that perform initial data analysis at the point where the data is created and use the cloud for downstream analysis. One solution in this context is ONCITE. This turnkey (plug-and-produce) edge cloud data centre stores and processes data directly where it arises, enabling companies to retain control over their data when networking along the entire supply chain.

Trend #2: Standardisation in data centres with OCP
The rapid upgrade of existing data centres is becoming increasingly important for companies, as the volume of data needing to be processed continues to grow. Essential requirements for this growth are standardised technology, cost-efficient operation and a high level of infrastructure scalability. OCP (Open Compute Project) technology, with its central direct-current distribution in the IT rack, is becoming an interesting alternative for more and more CIOs. This is because DC components open up new potential for cost optimisation. For instance, all the IT components in a rack can be powered centrally by n+1 power supplies. Cooling also becomes more efficient, since fewer individual power supply units are present. At the same time, the high degree of standardisation of OCP components simplifies both maintenance and spare-parts management. The mean efficiency gain is around five percent of the total power drawn.
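
To illustrate where a gain of that order can come from, the sketch below compares the grid power drawn by many distributed server power supplies with that of a single central DC power shelf. The efficiency figures (90 % versus 95 %) and the 10 kW rack load are assumptions for illustration only, not OCP or vendor specifications.

```python
# Illustrative comparison: per-server AC power supplies versus a central
# OCP-style DC power shelf. All numbers are assumed example values.

def input_power(it_load_w, psu_efficiency):
    """Grid power needed to deliver it_load_w to the IT equipment."""
    return it_load_w / psu_efficiency

rack_it_load_w = 10_000                            # 10 kW of IT load (assumed)

per_server = input_power(rack_it_load_w, 0.90)     # many small PSUs, ~90 % each
central_dc = input_power(rack_it_load_w, 0.95)     # central n+1 shelf, ~95 %

saving_w = per_server - central_dc
saving_pct = 100 * saving_w / per_server           # roughly 5 % of input power

print(f"Per-server PSUs draw:   {per_server:.0f} W")
print(f"Central DC shelf draws: {central_dc:.0f} W")
print(f"Saving: {saving_w:.0f} W ({saving_pct:.1f} % of input power)")
```

With these assumed efficiencies the saving works out at just over five percent, consistent with the figure quoted above.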

Rittal expects that OCP will establish itself further in the data centre as an integrated system platform in 2020. New OCP products for rack cooling, power supply or monitoring will enable rapid expansion with DC components. Furthermore, new products will support the conventional concept of a central emergency power supply, where the power supply is safeguarded by a central UPS. As a result, it will no longer be necessary to protect every single OCP rack with a UPS based on lithium-ion batteries. The advantage: the fire load in the OCP data centre is reduced considerably.

Trend #3: Heat recovery and direct CPU cooling
Data centres release huge amounts of energy into the environment in the form of waste heat. As the power density in the data centre grows, so too does the amount of heat that could potentially be used for other purposes. So far, however, the use of waste heat has proven too expensive, for example because heat consumers are rarely found in the direct vicinity of the site. In addition, at around 40 degrees Celsius, the waste heat generated by air-based IT cooling systems is at too low a temperature to be used economically.

In the area of high-performance computing (HPC) in particular, IT racks generate high thermal loads, often in excess of 50 kW. For HPC, direct processor cooling with water is significantly more efficient than air cooling, yielding return temperatures of 60 to 65 degrees Celsius. At these temperatures it is possible, for instance, to heat domestic hot water, drive heat pumps or feed heat into a district heating network. However, CIOs should be aware that only about 80 percent of the waste heat can be drawn from an IT rack, even with direct CPU water cooling; the rack still requires conventional IT cooling for the remaining 20 percent.
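
The 80/20 split above translates into a simple heat-recovery budget. The sketch below works through the arithmetic for a single rack; the 50 kW figure is the example load mentioned in the text.

```python
# Rough heat-recovery budget for a water-cooled HPC rack, following the
# 80/20 split described above. The rack load is an example value.

rack_power_kw = 50.0              # thermal load of one HPC rack
water_fraction = 0.80             # share recoverable via direct CPU water cooling
air_fraction = 1.0 - water_fraction

recoverable_kw = rack_power_kw * water_fraction   # usable at ~60-65 degC
residual_air_kw = rack_power_kw * air_fraction    # still needs air-based cooling

print(f"Recoverable via water loop: {recoverable_kw:.0f} kW")
print(f"Remaining air-cooled load:  {residual_air_kw:.0f} kW")
```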

At the German Government’s 2019 Digital Summit, heat recovery was discussed in the relevant working group, which identified a strong need for action. For this reason, Rittal expects that in 2020 significantly more CIOs will address the question of how the previously unused waste heat from the data centre can be used economically.

Trend #4: Integration of multi-cloud environments
Businesses need to be assured that they can run their cloud applications on commonly used platforms and in any country. This calls for a multi-cloud strategy. From management’s point of view, it is a strategic decision founded on the knowledge that the organisation will develop into a fully digitised business.

For example, an excellent user experience is ensured by minimising delays through appropriate availability zones on site. Companies choose one or more availability zones worldwide for their services, depending on their business requirements, while strict data protection requirements can be met by a specialised local provider in the target market concerned. A vendor-neutral multi-cloud strategy allows exactly that: combining the functional density and scalability of the hyperscalers with the data security of local, specialised providers such as Innovo Cloud. Provisioning at the push of a button, from a single dashboard, with one contact person and one invoice, in the very second the business decision is made – this is what makes multi-cloud strategies one of the megatrends of the coming years. The economy will take further steps towards digital transformation and accelerate its continuous integration (CI) and continuous delivery (CD) pipelines with cloud-native technologies – applications designed and developed for the cloud computing architecture. Automating the integration and delivery processes will then enable rapid, reliable and repeatable deployment of software.

#PAuto @Rittal @EURACTIV @PresseBox


Checking organic carbon content.


Methods for checking water quality are an incredibly important part of the many processes involved in ensuring we have access to safe drinking water. However, as contaminants can come from many different sources, finding a general solution for contaminant identification and removal can be difficult.

Purification processes for water treatment include removal of undesirable chemicals, bacteria, solid waste and gases and can be very costly. Utility companies in England and Wales invested £2.1 billion (€2.44b) between 2013 and 2014 into infrastructure and assorted costs to ensure safe drinking water.1

One of the most widely used measures for assessing whether water is safe for consumption is the analysis of the total organic carbon (TOC) content. Organic carbon content is a measure of how much carbon is found in the water as part of organic compounds, as opposed to inorganic sources such as carbon dioxide and carbonic acid salts.2 It has been a popular approach since the 1970s, both for assessing drinking water and for checking that wastewater has been sufficiently purified.

The proportion of organic carbon in water is a good proxy for water quality, as high organic carbon levels indicate either a high level of organisms in the water or contamination by organic compounds such as herbicides and insecticides. High levels of microorganisms can arise for a variety of reasons but are often a sign of contamination from a wastewater source.

Testing TOC
Water therefore needs to be continually monitored for signs of change in TOC content to check it is safe for consumption. While many countries do not specifically regulate TOC levels, the concentrations of specific volatile organic compounds are covered by legislation, and recommended TOC levels are 0.05 mg/l or less.3

There are a variety of approaches to testing water for organic carbon. One is to measure the entire carbon content (organic and inorganic) and then subtract the carbon dioxide detected (as it is considered inorganic carbon) along with any other carbon from inorganic sources. Another is to use chemical oxidation, or high temperature, so that all the organic compounds in the sample are oxidised to carbon dioxide; measuring the carbon dioxide levels then acts as a proxy for the TOC concentration.
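
The first (differencing) method boils down to a single subtraction. A minimal sketch, using hypothetical concentrations in mg/l:

```python
# Sketch of the differencing method: TOC = total carbon (TC) minus
# inorganic carbon (IC). The input values are hypothetical, in mg/l.

def toc_by_difference(total_carbon_mg_l, inorganic_carbon_mg_l):
    """Estimate total organic carbon by subtracting inorganic carbon
    from the total carbon measurement."""
    toc = total_carbon_mg_l - inorganic_carbon_mg_l
    if toc < 0:
        # Inorganic carbon can never exceed total carbon; a negative
        # result indicates a measurement or calibration problem.
        raise ValueError("inorganic carbon exceeds total carbon")
    return toc

print(f"{toc_by_difference(4.2, 3.9):.2f} mg/l TOC")  # -> 0.30 mg/l TOC
```

Note that when TC and IC are both large and their difference small, as in this example, the subtraction amplifies the relative measurement error, which is one reason the oxidation method is often preferred for low-TOC samples.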

For wastewater plants, being able to perform online, real-time analysis of water content is key, and measurements must be sensitive and accurate enough to pick up small changes in even low concentrations of chemical species. British legislation also makes it an offence to supply drinking water which does not adhere to the regulations4, and several water suppliers have recently been fined over a hundred million pounds for discharges of contaminated water.5

Vigilant Monitors
One of the advantages of using carbon dioxide levels as a proxy for TOC content is that carbon dioxide absorbs infrared light very strongly. This means that nondispersive infrared (NDIR) detectors provide a very sensitive way of detecting even trace amounts of carbon dioxide.

Edinburgh Sensors are one of the world leaders in NDIR sensor production and offer a range of NDIR-based gas detectors suitable for TOC measurements of water.6 Of these, the Gascard NG is an excellent device for easy, quick and reliable quantification of carbon dioxide levels.7

Gascard NG
The Gascard NG is well suited to continual carbon dioxide monitoring for several reasons. First, the device can detect a wide range of carbon dioxide concentrations, from 0 to 5000 ppm, maintaining ±2 % accuracy over the full detection range. This gives the sensor the sensitivity required to check that TOC levels are sufficiently low for safe drinking water, while also allowing it to operate under conditions where TOC levels may be very high, for example in the wastewater purification process.
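
In a monitoring loop, the stated range and accuracy feed directly into any alarm logic. The sketch below is a hypothetical worst-case check, not device firmware: the range and ±2 % figure come from the text, while the alarm threshold is an assumed plant-specific setting.

```python
# Hypothetical alarm check for a CO2 reading used as a TOC proxy.
# The 0-5000 ppm range and +/-2 % accuracy figures come from the text;
# the alarm threshold is an assumed, plant-specific value.

FULL_SCALE_PPM = 5000
ACCURACY = 0.02                   # +/-2 % of reading

def co2_alarm(reading_ppm, threshold_ppm):
    """Return True if the reading could plausibly exceed the threshold,
    allowing for the sensor's stated accuracy band (worst case)."""
    if not 0 <= reading_ppm <= FULL_SCALE_PPM:
        raise ValueError("reading outside the sensor's detection range")
    worst_case = reading_ppm * (1 + ACCURACY)
    return worst_case > threshold_ppm

print(co2_alarm(1950, 2000))   # worst case ~1989 ppm, below threshold
print(co2_alarm(1970, 2000))   # worst case ~2009 ppm, above threshold
```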

The Gascard NG can come with built-in true RS232 communications for both control and data logging, so it can be used to monitor carbon dioxide levels constantly as well as being integrated into feedback systems – for example in water purification, to change the treatment approach if the TOC content gets too high. UK legislation also requires some level of record-keeping for water quality, which can be automated in a straightforward way with the Gascard.4
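
The record-keeping side of such a setup is straightforward to automate. The sketch below parses a stream of text readings and appends valid values to a CSV log. The "CO2 &lt;ppm&gt;" line format is a hypothetical placeholder, not the Gascard NG's actual serial protocol; in practice the lines would arrive over the RS232 port (for example via the pyserial library), and here a string buffer stands in for that link.

```python
# Sketch of automated logging of serial-style readings to CSV.
# The line format is hypothetical; consult the device manual for the
# real RS232 protocol.

import csv
import io

def parse_reading(line):
    """Parse a hypothetical 'CO2 <ppm>' line into a float, or None."""
    parts = line.split()
    if len(parts) == 2 and parts[0] == "CO2":
        try:
            return float(parts[1])
        except ValueError:
            return None
    return None

def log_stream(stream, writer):
    """Write every valid reading to the CSV writer; return the values."""
    values = []
    for raw in stream:
        value = parse_reading(raw.strip())
        if value is not None:
            writer.writerow([value])
            values.append(value)
    return values

# Simulated link traffic standing in for the RS232 stream:
fake_link = io.StringIO("CO2 412.5\ngarbage\nCO2 418.0\n")
out = io.StringIO()
print(log_stream(fake_link, csv.writer(out)))  # -> [412.5, 418.0]
```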

The Gascard NG is capable of self-correcting measurements over a range of humidity conditions (0 – 95 %), and readings can be pressure-corrected by on-board electronics between 800 and 1150 mbar. Temperature compensation between 0 and 45 °C is also featured, ensuring reliable measurements over a wide range of environmental conditions.

Designed to be robust, maintenance-free and fail-safe, the Gascard NG also comes with several customisable options. The expansion port can be used for small graphical display modules for in-situ readings, and TCP/IP communications can be included if communication over standard networks is needed. In conjunction with Edinburgh Sensors’ expertise and pre- and post-sales support, this means that the Gascard NG can easily be integrated into existing TOC measurement systems to ensure fast and accurate monitoring at all times.


  1. Water and Treated Water (2019)
  2. Volk, C., Wood, L., Johnson, B., Robinson, J., Zhu, H. W., & Kaplan, L. (2002). Monitoring dissolved organic carbon in surface and drinking waters. Journal of Environmental Monitoring, 4(1), 43–47.
  3. DEFRA (2019)
  4. Water Legislation (2019)
  5. Water Companies Watchdog (2019)
  6. Edinburgh Sensors (2019)
  7. Gascard NG (2019)
#PAuto @Edinst

Wireless lifting!


New wireless control technologies are being adopted by manufacturers and users of cranes and other lifting equipment. Tony Ingham of Sensor Technology Ltd explains the advantages and looks at how the field is developing.


Trade Ship Harbored at Port of San Pedro, California, U.S.A.

Most of us find it odd to look back five or ten years to when our home computers were tethered to the wall by a cable. Somehow we just accepted that the cable restricted the mobility of the device and lived with that limitation. Now, of course, it is completely normal to pull a mobile phone out of a pocket and dial up the internet to look up obscure information, book tickets or connect with a computer many miles away.

In the industrial and commercial arenas, wireless technology has revolutionised many practices. Logistics companies now routinely track the progress of every single parcel in their charge; field engineers collect data from and send instructions to remote facilities such as water pumping stations; customer accounts can be updated in real time, etc.

However, there is another aspect to wireless technology that is less obvious to the general public, but which engineers and technicians are really coming to appreciate. This may best be described as ‘local wirelessness’, or wireless LANs (local area networks). Basically, these remove the need to install and maintain wiring for control equipment fitted to machinery such as cranes, hoists, lifts and elevators.

Control technology is essential to many modern industries, as it is the only practical way to ensure reliable operation and high efficiency.

Handling products through a busy working environment – whether a container port, manufacturing plant, warehouse, logistics centre or retail outlet – involves making sure that materials handling is rapid, accurate and safe. This requires a control system that can handle huge amounts of data in real time, can safely operate heavy-duty machinery and, if necessary, withstand extremes of climate and environment. Further, as many operations now run 24×7, breakdowns and other stoppages are likely to have immediate consequences, meaning control equipment has to be robust and reliable.

However, the basic principles of a control system are relatively simple. The rotation of the drive shafts in the various cranes and other machinery can be used to collect load and movement data on each item being moved. Each turn of the shaft progresses the equipment’s operation forward or backward by a small but consistent amount, and if you can also measure the torque (rotational strain) in the shaft, you can calculate the weight of the load being transferred.
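
The torque-to-weight relationship is simple mechanics: the force on the rope equals the torque divided by the drum radius, and dividing by gravitational acceleration gives the mass. The sketch below works an example with assumed illustrative numbers (it ignores gearing and friction, which a real system would calibrate for).

```python
# Worked example: recovering a lifted load's mass from measured shaft
# torque. Drum radius and torque values are assumed for illustration;
# gearing and friction losses are ignored for simplicity.

G = 9.81                              # gravitational acceleration, m/s^2

def load_mass_kg(torque_nm, drum_radius_m):
    """Mass hanging from a winch drum, inferred from shaft torque."""
    force_n = torque_nm / drum_radius_m   # F = tau / r
    return force_n / G                    # m = F / g

# A 2943 N.m torque on a 0.30 m radius drum corresponds to a 1-tonne load:
print(f"{load_mass_kg(2943.0, 0.30):.0f} kg")  # -> 1000 kg
```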

This raw data stream can be used to easily calculate operational information such as the amount of goods or product moved, the time to completion of each operation, and the destination of each load. It is also equally easy to convert this operational data into commercial information and safety reports that include cumulative operating hours, total load lifted and other statistics.

In the past, taking measurements from drive shafts was difficult, but TorqSense, from Sensor Technology, provides a perfect solution. Previously, it was necessary to install sensors in difficult-to-access parts of industrial machinery and wire them back into the communication network that connected with a computer for collecting and interpreting the data. And once installed, the wiring had to be protected from damage and replaced if it failed.

However, TorqSense gets around this by using radio transmissions instead of wiring. Further, old-fashioned torque sensors tended to be delicate because they needed a fragile slip ring to prevent the turning drive shaft from pulling the wiring out of place, whereas TorqSense uses a wireless radio-frequency pick-up head that needs no physical contact with the rotating shaft.

A practical attraction of TorqSense is that its wirelessness makes it ultra-simple to install and robust in use. Furthermore, it is largely unaffected by harsh operating environments, electromagnetic interference and the like. It is equally at home measuring coal on a conveyor, weighing and counting containers on a dockside crane, or in any other lifting application.

TorqSense is proving popular with an increasing number of users across many fields – not only in lifting, but also in robotics, chemical mixing and automotive applications – almost anywhere that uses machinery with rotating drive shafts.

Sensor Technology has also developed a complementary range of sensors which measure the straight-line equivalent of torque. Called LoadSense, this too uses a wireless radio-frequency pick-up to collect data signals from the sensing head and transmit them wirelessly to a receiving computer for analysis.

It is notable that both TorqSense and LoadSense can be used in fully wireless mode, but can equally be fitted into conventional cabled systems, so they are easy to retrofit into existing control systems.

It is also interesting to know that LoadSense was actually developed at the behest of a particular customer, a helicopter pilot who wanted real-time and exact information about the weight of loads he was carrying from an underslung cargo hook. The issue for him was that he would invalidate his helicopter’s certificate of airworthiness if he drilled a hole in the fuselage for a cable, so he had to have a wireless solution. This application also required very robust hardware that could withstand heat and cold, extreme movements and shock loads and be unaffected by motor noise, radio interference etc – all characteristics that translate well into fields such as lifting, conveying and cranage.

TorqSense, LoadSense and wireless data transfer and communications are making an increasing contribution to the development of materials handling technologies. While they are ‘invisible’ to the casual observer, they have the capacity to revolutionise many aspects of lifting operations and to drive efficiency, reliability and safety to new levels.

#PAuto @sensortech