Sensors for Mars.

02/07/2020
International collaboration takes Vaisala and the Finnish Meteorological Institute (FMI) to Mars onboard NASA’s Mars 2020 Perseverance rover. The rover is scheduled for launch on July 30, 2020. Vaisala’s sensor technology combined with FMI’s measurement instrumentation will be used to obtain accurate and reliable pressure and humidity data from the surface of the red planet.

The Finnish Meteorological Institute (FMI) is among the scientific partners providing measurement equipment for the new Perseverance rover, expected to launch in July and land on Mars in February 2021. The pressure and humidity measurement devices developed by the FMI are based on Vaisala’s world-renowned sensor technology and are similar to, but more advanced than, the ones sent to Mars on the Curiosity rover in 2012.

Is there anybody out there?
Join this live webcast to hear more!
Welcome to learn about space-proof technology, how it works, what it does, why it’s important, and why measurements play a key role in space research. You’ll hear examples and stories from our experts and from a special guest speaker, who will share his own experiences of and insights into space.
• Date: July 20, 2020
• Time: 15.30-16.30 EEST – 14.30-15.30 CEST – 08.30-09.30 EDT
• Place: Virtual event – Sign up here
The event is organized by Vaisala and the Finnish Meteorological Institute. It will be held in English and it is free of charge. Live subtitles in Finnish will be available.
Learn more about space-proof technology before the event here and follow the discussion on social media using #spacetechFI.

The new mission equipment complements the Curiosity rover. While working on Mars, the Curiosity and Perseverance rovers will form a small-scale observation network. This network is only a first step towards the more extensive observation network planned for Mars in the future.

International and scientific collaboration aims to gather knowledge of the Martian atmosphere and other environmental conditions
The Mars 2020 mission is part of NASA’s Mars Exploration Program. In order to obtain data from the surface of the Red Planet, NASA selected trusted partners to provide measurement instruments for installation on the Mars rover. A Spanish-led European consortium provides the rover with the Mars Environmental Dynamics Analyzer (MEDA), a set of sensors that measures temperature, wind speed and direction, pressure, relative humidity, and the amount and size of dust particles.

As part of the consortium, FMI delivers instrumentation to MEDA for humidity and pressure measurements based on Vaisala’s top quality sensors.

“Mars, as well as Venus, the other sister planet of Earth, is a particularly important area of atmospheric investigations due to its similarities to Earth. Studying Mars helps us also better understand the behavior of Earth’s atmosphere”, comments Maria Genzer, Head of Planetary Research and Space Technology group at FMI.

The harsh and demanding conditions on Mars call for sensor technology that delivers accurate and reliable data without maintenance or repair.

“We are honored that Vaisala’s core sensor technologies have been selected to provide accurate and reliable measurement data on Mars. In line with our mission to enable observations for a better world, we are excited to be part of this collaboration. Hopefully the measurement technology will provide tools for finding answers to the most pressing challenges of our time, such as climate change,” says Liisa Åström, Vice President, Products and Systems of Vaisala.

Same technology, different planet – utilizing Vaisala core technologies for accuracy and long-term stability
In the extreme conditions of the Martian atmosphere, NASA will be able to obtain accurate readings of pressure and humidity levels with Vaisala’s HUMICAP® and BAROCAP® sensors. The sensors’ long-term stability and accuracy, as well as their ability to tolerate dust, chemicals, and harsh environmental conditions, make them suitable for very demanding measurement needs, including in space. The same technology is used in numerous industrial and environmental applications such as weather stations, radiosondes, greenhouses, and data centers.

Barocap Wafer

The humidity measurement device MEDA HS, developed by FMI for Perseverance, utilizes standard Vaisala HUMICAP® humidity sensors. HUMICAP® is a capacitive thin-film polymer sensor consisting of a substrate on which a thin film of polymer is deposited between two conductive electrodes. The humidity sensor onboard is a new-generation sensor with superior performance even in the low-pressure conditions expected on the Red Planet.

In addition to humidity measurements, FMI has developed a device for pressure measurement, MEDA PS, which uses customized Vaisala BAROCAP® pressure sensors optimized to operate in the Martian climate. BAROCAP® is a silicon-based micromechanical pressure sensor that offers reliable performance in a wide variety of applications, from meteorology to pressure-sensitive industrial equipment in the semiconductor industry and laboratory pressure standard measurements. Combining two powerful technologies – single-crystal silicon material and capacitive measurement – BAROCAP® sensors feature low hysteresis combined with excellent accuracy and long-term stability, both essential for measurements in space.
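
Both MEDA HS and MEDA PS rely on the same underlying principle: a capacitance that changes with the quantity being measured and is converted to an engineering value through a calibration curve. The sketch below is purely illustrative – the function and polynomial coefficients are invented, and real HUMICAP® and BAROCAP® calibrations are proprietary and temperature-compensated.

```python
# Illustrative only: converting a capacitive sensor reading to an engineering
# value via an assumed polynomial calibration. Coefficients are invented;
# they are NOT Vaisala's actual HUMICAP/BAROCAP calibration data.

def capacitance_to_value(c_pf, coeffs):
    """Evaluate value = sum(a_i * C^i) for a capacitance reading in picofarads."""
    return sum(a * c_pf ** i for i, a in enumerate(coeffs))

RH_COEFFS = (-120.0, 3.1, -0.002)   # hypothetical: capacitance (pF) -> relative humidity (%)
P_COEFFS = (-84.0, 1.75)            # hypothetical: capacitance (pF) -> pressure (hPa)

print(f"RH ~ {capacitance_to_value(47.5, RH_COEFFS):.1f} %RH")
print(f"Pressure ~ {capacitance_to_value(52.0, P_COEFFS):.2f} hPa")
```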

“Our sensor technologies are used widely in demanding everyday measurement environments here on Earth. And why not – if they work on Mars, they will work anywhere,” Åström concludes.

@VaisalaGroup @FMIspace @NASAPersevere #Metrology #Finland


The Arms Race Between Cybercriminals and Cybersecurity

28/05/2020

The number of devices connected to the internet is expected to reach 50 billion worldwide by the end of 2030, posing dangerous risks to people, businesses, and critical systems. To illustrate the divide between cyberattacks on these devices and business preparedness, Sectigo, a leading provider of automated digital identity management and web security solutions, today released its Evolution of IoT Attacks study.

The study report and associated infographic chronicle the progression, variety, and growing sophistication of many of the most infamous vulnerabilities and attacks on connected devices, as well as the emerging defenses used by organizations to fight them.

Sectigo has categorized IoT attacks into three eras: 

The Era of Exploration
Beginning in 2005, cybercriminals started to explore the potential to cause lasting damage to critical infrastructure, and even life. Security defenses at the time were rudimentary, with organizations unaware of the value the IoT could have for hostile actors.

The Era of Exploitation
From 2011 to 2018, cybercriminals actively exploited the lucrative and damaging potential of attacking the IoT, expanding their attacks to more targets with increased severity. However, they found organizations more prepared to withstand the onslaught. White hat hackers exposed potential IoT vulnerabilities to help shore up defenses before attacks occurred in the wild. Meanwhile, as organizations fortified their defenses, cybercriminals found more ways to monetize their attacks through crypto mining, ad-click fraud, ransomware, and spam email campaigns.

The Era of Protection
By 2019, enterprises and other organizations had become increasingly capable of countering these attacks. Just recently, governments have begun enacting regulations to protect IoT assets, and businesses and manufacturers are heeding the warnings. In fact, according to the recent 451 Research Enterprise IoT Budgets and Outlook report, organizations are investing more than half of their IoT budgets – 51% – to implement security controls in devices, using security frameworks and unified solutions with strong technologies that work together to provide multiple layers of protection.

“As we move into this decade, protecting the vast Internet of Things has never been more critical for our safety and business continuity,” said Alan Grau, VP of IoT/Embedded Solutions at Sectigo. “Cybercriminals are retooling and honing their techniques to keep striking at vulnerable targets. Yes, businesses and governments are making laudable efforts to protect all things connected, but we are only at the beginning of the Era of Protection and should assume that these efforts will be met by hackers doubling down on their efforts.”

IoT security must start on the factory floor with manufacturers and continue throughout the device’s lifecycle. Power grids, highways, data security, and more depend on organizations adopting ever-evolving, cutting-edge security technologies in order to withstand attacks.

#PAuto #IoT @SectigoHQ


Flood monitoring.

27/01/2020
Monitoring is an essential component of natural flooding management, helping to define appropriate measures, measure their success, keep stakeholders informed, identify mistakes, raise alarms when necessary, inform adaptive management and help guide future research.

Great Fen showing Holme Fen woods top left and new ponds and meres in April

Flooding is a natural process, but it endangers lives and causes heavy economic loss. Furthermore, flood risk is expected to increase with climate change and increased urbanisation, so a heavy responsibility lies with those that allocate funding and formulate flood management strategy. In the following article, Nigel Grimsley from OTT Hydromet explains how the success of such plans (both their design and implementation) depends on the accuracy and reliability of the monitoring data upon which they rely.

Climate projections for Britain suggest that rainfall will increase in winter and decrease in summer, and that individual rainfall events may increase in intensity, especially in winter. These projections point to an increased risk of flooding.

Emphasising the urgent need for action on flood risk, (British) Environment Agency chairwoman Emma Howard Boyd has said that on current trends global temperature could rise by between 2 °C and 4 °C by 2100, and that some communities may even need to move because of the risk of floods. Launching a consultation on the agency’s flood strategy, she said: “We can’t win a war against water by building away climate change with infinitely high flood defences.”

In response, Mike Childs, head of science at Friends of the Earth, said: “Smarter adaptation and resilience building – including natural flood management measures like tree-planting – is undeniably important but the focus must first and foremost be on slashing emissions so that we can avoid the worst consequences of climate chaos in the first place.”

Historically, floodplains have been exploited for agricultural and urban development, which has increased the exposure of people, property and other infrastructure to floods. Flood risk management therefore focused on measures to protect communities and industry in affected areas. However, flood risk is now addressed on a wider catchment scale so that initiatives in one part of a catchment do not have negative effects further downstream. This catchment based approach is embodied within the EU Floods Directive 2007/60/EC, and in recent years, those responsible for flood management have increasingly looked for solutions that employ techniques which work with natural hydrological and morphological processes, features and characteristics to manage the sources and pathways of flood waters. These techniques are known as natural flood management (NFM) and include the restoration, enhancement and alteration of natural features but exclude traditional flood defence engineering that effectively disrupts these natural processes.

NFM seeks to create efficiency and sustainability in the way the environment is managed by recognising that when land and water are managed together at the catchment scale it is possible to generate whole catchment improvements with multiple benefits.

Almost all NFM techniques aim to slow the flow of water and, whilst closely connected, can be broadly categorised as infiltration, conveyance and storage measures.

Infiltration
Land use changes such as set-aside, switching arable to grassland or restricted hillside cropping, can improve infiltration and increase water retention. In addition, direct drilling, ‘no-till’ techniques and cross slope ploughing can have a similar effect. These land use techniques are designed to reduce the soil compaction which increases run-off. Livestock practices such as lower stocking rates and shorter grazing seasons can also help. Field drainage can be designed to increase storage and reduce impermeability, which is also aided by low ground pressure vehicles. The planting of shrubs and trees also helps infiltration and retention by generating a demand for soil moisture, so that soils have a greater capacity to absorb water. Plants also help to bind soil particles, resulting in less erosion – the cause of fertility loss and sedimentation in streams and rivers.

Conveyance
Ditches and moorland grips can be blocked to reduce conveyance, and river profiles can be restored to slow the flow. In the past, peats and bogs have been drained to increase cropping areas, but this damages peatlands and reduces their capacity to retain water and store carbon. The restoration of peatland therefore relies on techniques to restore moisture levels. Pumping and drainage regimes can be modified, and landowners can create strategically positioned hedges, shelter belts and buffer strips to reduce water conveyance.

Storage
Rivers can be reconnected with restored floodplains and river re-profiling, leaky dams, channel works and riparian improvements can all contribute to improved storage capability. In urban areas permeable surfaces and underground storage can be implemented, and washlands and retention ponds can be created in all areas. As mentioned above, the re-wetting of peatland and bogs helps to increase storage capacity.

Many of the effects of NFM might be achieved with the re-introduction of beavers, which build dams that reduce peak flows, create pools and saturate soil above their dams. The dams also help to remove pollutants such as phosphates. Beavers do not eat fish, instead preferring aquatic plants, grasses and shrubs during the summer and woody plants in winter. Beavers are now being introduced in a number of areas in trials to determine their value in the implementation of NFM. One of the key benefits offered by beavers is their ability to quickly repair and rebuild dams that are damaged during extreme weather. However, whilst the potential benefits of beavers are well known, several groups have expressed concern with the prospect of their widespread introduction. For example, farmers and landowners may find increased areas of waterlogged land due to blocked drainage channels. In addition, dams present a threat to migratory fish such as salmon and sea trout.

Beavers are native to Britain and used to be widespread, but they were hunted to extinction during the 17th century. However, other non-native species such as signal crayfish can have a detrimental effect on flood protection because they burrow into river banks causing erosion, bank collapse and sediment pollution. Signal crayfish are bigger, grow faster, reproduce more quickly and tolerate a wider range of conditions than the native white-clawed crayfish. Signal crayfish are also voracious predators, feeding on fish, frogs, invertebrates and plants, and as such can create significant negative ecological effects.

NFM benefits
NFM provides protection for smaller flood events, reduces peak flooding and delays the arrival of the flood peak downstream. However, it does not mitigate the risk from extreme flood events. Effective flood management strategy therefore tends to combine NFM with hard engineering measures. Nevertheless, NFM generally provides a broader spectrum of other benefits.

The creation of new woodlands and wetlands produces biodiverse habitats with greater flood storage capacity. They also enable more species to move between habitats. NFM measures that reduce soil erosion, run-off and sedimentation also help to improve water quality and thereby habitats. In particular, these measures reduce nutrient and sediment loading lower in the catchment; two issues which can have dramatic effects on water quality and amenity.

Land use and land management measures help to reduce the loss of topsoil and nutrients. This improves agricultural productivity and lowers the cost of fertilizers. Furthermore, a wide range of grants are available for NFM measures, such as the creation of green spaces and floodplains, to make them more financially attractive to farmers and landowners.

Many NFM measures help in the fight against climate change. For example, wetlands and woodlands are effective at storing carbon and removing carbon dioxide from the atmosphere. Measures that reduce surface run off and soil erosion, such as contour cultivation, can also reduce carbon loss from soil.

Monitoring NFM
Given the wide range of potential NFM benefits outlined above, the number and type of parameters to be monitored are likely to be equally diverse. Baseline data is essential if the impacts of implemented measures are to be assessed, but this may not always be deliverable. For example, it may only be possible to collect one season of data prior to a five year project. However, it may be possible to secure baseline data from other parties. In all instances data should of course be accurate, reliable, relevant and comparable.

Monitoring data should be used to inform the design of NFMs. For example, a detailed understanding of the ecology, geomorphology, hydrology and meteorology of the entire catchment will help to ensure that the correct measures are chosen. These measures should be selected in partnership with all stakeholders, and ongoing monitoring should provide visibility of the effects of NFM measures. Typically stakeholders will include funders, project partners, local communities, landowners, regulators and local authorities.

Since NFM measures are designed to benefit an entire catchment, it is important that monitoring is also catchment-wide. However, this is likely to be a large area so there will be financial implications, particularly for work that is labour-intensive. Consequently, it will be necessary to prioritise monitoring tasks and to deploy remote, automatic technology wherever it is cost-effective.

OTT ecoN with wiper

OTT ecoN Sensor

Clearly, key parameters such as rainfall, groundwater level, river level and surface water quality should be monitored continuously in multiple locations if the benefits of NFM are to be measured effectively. It is fortunate therefore that all of these measurements can be taken continuously 24/7 by instruments that can be left to monitor in remote locations without a requirement for frequent visits to calibrate, service or change power supplies. As a business OTT Hydromet has been focused on the development of this capability for many years, developing sensors that are sufficiently rugged to operate in potentially aggressive environments, data loggers with enormous capacity but with very low power requirement, and advanced communications technologies so that field data can be instantly viewed by all stakeholders.

Recent developments in data management have led to the development of web-enabled data management solutions such as Hydromet Cloud, which, via a website and App, delivers the backend infrastructure to receive, decode, process, display and store measurement data from nearly any remote hydromet monitoring station or sensor via a cloud-based data hosting platform. As a consequence, alarms can be raised automatically, which facilitates integration with hard engineering flood control measures. Hydromet Cloud also provides access to both current and historic measurement data, enabling stakeholders to view the status of an entire catchment on one screen.
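
As a simple illustration of the automatic alarming described above – this is not the Hydromet Cloud API, and the station names, thresholds and readings below are invented – incoming level readings can be screened against per-station alarm thresholds roughly like this:

```python
# Minimal sketch of threshold-based alarming on incoming telemetry.
# Station names, thresholds and readings are invented for illustration;
# a real deployment would obtain them from the cloud data platform.

ALARM_LEVELS_M = {"River gauge A": 2.80, "River gauge B": 1.50}  # hypothetical thresholds

def check_alarms(latest_readings):
    """Return (station, level) pairs whose level exceeds the alarm threshold."""
    return [(station, level) for station, level in latest_readings.items()
            if station in ALARM_LEVELS_M and level > ALARM_LEVELS_M[station]]

readings = {"River gauge A": 2.95, "River gauge B": 1.10}
for station, level in check_alarms(readings):
    print(f"ALARM: {station} at {level:.2f} m exceeds its threshold")
```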

Holme Fen – a monitoring lesson from the 1850s


Holme Fen post

Surrounded by prime agricultural land to the south of Peterborough (Cambridgeshire, GB), the fens originally contained many shallow lakes, of which Whittlesey Mere was the largest, covering around 750 hectares in the summer and around twice that in the winter. Fed by the River Nene, the mere was very shallow and was the last of the ‘great meres’ to be drained and thereby converted to cultivatable land.

Led by William Wells, a group of local landowners funded and arranged the drainage project, which involved the development of a newly invented steam-powered centrifugal pump capable of raising over 100 tons of water per minute by 2 or 3 feet. A new main drain was constructed to take water to the Wash. Conscious of the likely shrinking effect of drainage on the peaty soil, Wells instigated the burial of a measurement post, which was anchored in the Oxford Clay bedrock and cut off at the soil surface. In 1851 the original timber post was replaced by a cast iron column which is believed to have come from the Crystal Palace in London.

By installing a measurement post, Wells demonstrated remarkable foresight. As the drainage proceeded, the ground level sank considerably; by 1.44 metres in the first 12 years, and by about 3 metres in the first 40 years. Today, around 4 metres of the post is showing above ground, recording the ground subsidence since 1852. The ground level at Holme Post is now 2.75 metres below sea level – the lowest land point in Great Britain.
Several complications have arisen as a result of the drainage. Firstly, there has been a huge impact on local ecology and biodiversity with the loss of a large area of wetland. Also, as the ground level subsided, it became less sustainable to pump water up into the main drain.

Holme Fen is now a National Nature Reserve, managed by Natural England, as is the nearby Woodwalton Fen. They are both part of the Great Fen Project, an exciting habitat restoration project, involving several partners, including the local Wildlife Trust, Natural England and the Environment Agency. At Woodwalton, the more frequent extreme weather events that occur because of climate change result in flooding that spills into the reserve. In the past, this was a good example of NFM as the reserve provided a buffer for excess floodwater. However, Great Fen Monitoring and Research Officer Henry Stanier says: “Floodwater increasingly contains high levels of nutrients and silt which can harm the reserve’s ecology, so a holistic, future-proof strategy for the area is necessary.”

Applauding the farsightedness of William Wells, Henry says: “As a conservationist I am often called in to set up monitoring after ecological recovery has begun, rather than during or even before harm has taken place. At the Wildlife Trust, we are therefore following the example provided by Wells, and have a network of monitoring wells in place so that we can monitor the effects of any future changes in land management.

“For example, we are setting up a grant funded project to identify the most appropriate crops for this area; now and in the future, and we are working with OTT to develop a monitoring strategy that will integrate well monitoring with the measurement of nutrients such as phosphate and nitrate in surface waters.”

Summary
Monitoring provides an opportunity to measure the effects of initiatives and mitigation measures. It also enables the identification of trends so that timely measures can be undertaken before challenges become problems, and problems become catastrophes.

Monitoring is an essential component of NFM, helping to define appropriate measures, measure their success, keep stakeholders informed, identify mistakes, raise alarms when necessary, inform adaptive management and help guide future research.

#Environment @OTTHydromet @EnvAgency @friends_earth


What is on the list of trends for 2020?

06/12/2019
Data centre trends for 2020 from Rittal

Growing volumes of data, a secure European cloud (data control), rapid upgrades of data centres and rising energy consumption are the IT/data centre trends for Rittal in 2020. For example, the use of OCP (Open Compute Project) technology and heat recovery offers solutions for the challenges of the present.


According to the market researchers at IDC (International Data Corporation), humans and machines could already be generating 175 zettabytes of data by 2025. If this amount of data were stored on conventional DVDs, it would mean 23 stacks of data discs, each of them reaching up to the moon. The mean 27 percent annual rate of data growth is also placing increasing pressure on the IT infrastructure.
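
As a rough sanity check of those figures – a back-of-the-envelope sketch that assumes a starting volume of about 33 zettabytes in 2018, the figure IDC has quoted for that year – 27 percent compound growth does indeed land near 175 zettabytes by 2025:

```python
# Back-of-the-envelope check: ~27% annual growth from an assumed 33 ZB in 2018.
start_zb = 33.0        # assumed 2018 data volume (IDC's commonly quoted figure)
growth = 0.27          # mean annual growth rate from the article
years = 2025 - 2018
print(f"Projected 2025 volume: {start_zb * (1 + growth) ** years:.0f} ZB")  # ~176 ZB
```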

Since there is hardly any company that can afford to increase its own data storage by almost a third every year, IT managers are increasingly relying on IT services from the cloud. The trend towards the cloud has long been evident in Germany: a survey published in the summer of 2019 by the Bitkom ICT industry association together with KPMG showed that three out of four companies are already using cloud solutions.

However, businesses using cloud solutions from third-party providers do lose some control over their corporate data. The US Cloud Act (Clarifying Lawful Overseas Use of Data), for example, allows US authorities to access data stored in the cloud even if local laws at the location where the data is stored prohibit this.

“Businesses will only be successful in the future if they keep pace with full digital transformation and integration. Companies will use their data more and more to provide added value – increasingly in real time – for example in the production environment,” says Dr Karl-Ulrich Köhler, CEO of Rittal International. “Retaining control over data is becoming a critical success factor for international competitiveness,” he adds.

Trend #1: Data control
The self-determined handling of data is thus becoming a key competitive factor for companies. This applies to every industry in which data security is a top priority and where the analysis of this data is decisive for business success. Examples are the healthcare, mobility, banking or manufacturing industries. Companies are now faced with the questions of how to process their data securely and efficiently, and whether to modernise their own data centre, invest in edge infrastructures or use the cloud.

The major European “Gaia-X” digital project, an initiative of the German Federal Ministry for Economics and Energy (BMWi), is set to start in 2020. The aim is to develop a European cloud for the secure digitalisation and networking of industry that will also form the basis for the use of new artificial intelligence (AI) applications. The Fraunhofer Gesellschaft has drawn up the “International Data Spaces” initiative in this context. This virtual data room allows companies to exchange data securely, and also ensures the compatibility (interoperability) of their own solutions with established cloud platforms.

This means that geographically widespread, smaller data centres with open cloud stacks might be able to create a new class of industrial applications that perform initial data analysis at the point where the data is created and use the cloud for downstream analysis. One solution in this context is ONCITE. This turnkey (plug-and-produce) edge cloud data centre stores and processes data directly where it arises, enabling companies to retain control over their data when networking along the entire supply chain.

Trend #2: Standardisation in data centres with OCP
The rapid upgrade of existing data centres is becoming increasingly important for companies as the volume of data needing to be processed continues to grow. Essential requirements for this growth are standardised technology, cost-efficient operation and a high level of infrastructure scalability. OCP (Open Compute Project) technology, with its central direct-current distribution in the IT rack, is becoming an interesting alternative for more and more CIOs. This is because DC components open up new potential for cost optimisation. For instance, all the IT components can be powered centrally with n+1 power supplies per rack. This also makes cooling more efficient, since fewer power supply units are present. At the same time, the high degree of standardisation of OCP components simplifies both maintenance and spare parts management. The mean efficiency gain is around five percent of the total current.
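
To make the n+1 idea concrete – a sketch with invented numbers, not an OCP specification – the rack needs enough centrally mounted power modules to carry its load plus one redundant module:

```python
import math

# Illustrative n+1 sizing of a central power shelf; the rack load and
# per-module rating below are invented example values.
def n_plus_one_modules(rack_load_kw, module_rating_kw):
    """Smallest number of modules that carries the load with one spare."""
    return math.ceil(rack_load_kw / module_rating_kw) + 1

print(n_plus_one_modules(rack_load_kw=12.0, module_rating_kw=3.3))  # 4 needed + 1 spare = 5
```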

Rittal expects that OCP will establish itself further in the data centre as an integrated system platform in 2020. New OCP products for rack cooling, power supply or monitoring will enable rapid expansion with DC components. Furthermore, new products will support the conventional concept of a central emergency power supply, where the power supply is safeguarded by a central UPS. As a result, it will no longer be necessary to protect every single OCP rack with a UPS based on lithium-ion batteries. The advantage: the fire load in the OCP data centre is reduced considerably.

Trend #3: Heat recovery and direct CPU cooling
Data centres release huge amounts of energy into the environment in the form of waste heat. As the power density in the data centre grows, so too does the amount of heat that could potentially be used for other purposes. So far, however, the use of waste heat has proven too expensive, because, for example, consumers for the heat are rarely found in the direct vicinity of the site. In addition, the waste heat generated by air-based IT cooling systems, at around 40 degrees Celsius, is simply too low in temperature to be used economically.

In the area of high-performance computing (HPC) in particular, IT racks generate high thermal loads, often in excess of 50 kW. For HPC, direct processor cooling with water is significantly more efficient than air cooling, and return temperatures of 60 to 65 degrees Celsius become available. At these temperatures it is possible, for instance, to heat domestic hot water, drive heat pumps or feed heat into a district heating network. However, CIOs should be aware that only about 80 percent of the waste heat can be drawn from an IT rack, even with direct CPU water cooling; air-based IT cooling is still needed for the remaining 20 percent.
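
A quick worked example using the figures quoted above (simple arithmetic only):

```python
# Worked example with the figures from the text: a 50 kW HPC rack where
# direct CPU water cooling can recover about 80% of the waste heat.
rack_load_kw = 50.0
recoverable_fraction = 0.80

recoverable_kw = rack_load_kw * recoverable_fraction   # usable at 60-65 degC return
residual_kw = rack_load_kw - recoverable_kw            # still needs conventional IT cooling

print(f"Recoverable via the water loop: {recoverable_kw:.0f} kW")  # 40 kW
print(f"Still handled by air cooling:   {residual_kw:.0f} kW")     # 10 kW
```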

At the German Government’s 2019 Digital Summit, the topic of heat recovery was discussed in the working group concerned, which identified a high need for action. For this reason, Rittal assumes that by 2020, significantly more CIOs will be involved in the issue of how the previously unused waste heat from the data centre can be used economically.

Trend #4: Integration of multi-cloud environments
Businesses need to be assured that they can run their cloud applications on commonly used platforms and in any country. This calls for a multi-cloud strategy. From management’s point of view, this is a strategic decision based on the knowledge that its own organisation will develop into a fully digitised business.

For example, an excellent user experience is ensured by minimising delays through appropriately located availability zones. This means that companies choose one or more availability zones worldwide for their services, depending on their business requirements. Strict data protection requirements can be met, for example, by a specialised local provider in the target market concerned. A vendor-open multi-cloud strategy allows exactly that: combining the functional density and scalability of hyperscalers with the data security of local and specialised providers such as Innovo Cloud. Provisioning at the push of a button, on a single dashboard, with one contact person and one invoice, in the very second the business decision is made – this is what is making multi-cloud strategies one of the megatrends of the coming years. The economy will take further steps towards digital transformation and accelerate its own continuous integration (CI) and continuous delivery (CD) pipelines with cloud-native technologies – applications designed and developed for the cloud computing architecture. Automating the integration and delivery processes will then enable the rapid, reliable and repeatable deployment of software.

#PAuto @Rittal @EURACTIV @PresseBox

 


Survey: The World Market for Flowmeters.

01/05/2019

Flow Research has completed one of the most comprehensive studies ever published of the worldwide flowmeter market. The study is called Volume X: The World Market for Flowmeters, 7th Edition. This all-in-one study gives the 2018 market size and market shares, together with forecasts through 2023, for all flowmeter types used in process environments.

World Market for Flow-meters, 7th Edition

Included in the study:
• Market size in 2018 of the worldwide market by flowmeter type, in dollars and units
• Market size forecasts by flowmeter type through 2023, in dollars and units
• Market shares for each flowmeter type in 2018
• Both worldwide and regional market size and forecast data
• Growth factors and market trends for all flowmeter technologies
• A technology description and analysis for each flowmeter type, including major competitive strengths and weaknesses
• Company profiles with product information for easy comparison

In addition to Volume X, they are publishing a companion volume called Module A: Strategies, Industries, and Applications. This popular volume gives segmentation by industry and application for each flow technology. It also discusses the underlying growth factors and driving forces behind the different industries and applications. The strategies section shows you how to gain an edge in a competitive market.

Over the past nine months, over 500 questionnaires have been sent to all known suppliers of every flow technology. The questionnaire asked for geographic distribution and revenue information, along with industries, applications, fluid types, and projected growth from the base year of 2018. The results have been analysed and a complete study written of the worldwide flowmeter market. This is the 7th edition of the study, which was first published in 2003. At that time, the worldwide flowmeter market was worth $3.1 billion. The market has grown substantially since then, and we’ve been there to document its growth every step of the way.

The Flowmeter Market in 2018 Highlighted
2018 was a very exciting year in the flowmeter market, and one of recovery. Some technologies appear to have experienced unprecedented growth, especially the magnetic flowmeter market. Our new Volume X captures this growth and makes it easy to determine which flowmeter types benefited most from the recovering market in 2018. Volume X captures the growth in all the flow technologies, including Coriolis, ultrasonic, positive displacement, turbine, variable area, and other types. 2018 is likely to be remembered as a year of exceptional growth.

Volume X makes it easy to compare market size, market shares, and growth rates for all types of flowmeters. We use the bottom-up method of determining market size for each flowmeter type, and then put these numbers together for the worldwide picture. This is the only way to get a reliable picture of the world flowmeter market and to find out how the different flowmeter types compare with each other. For complete details, go to http://www.flowvolumex.com. We have been able to gather this information because we began researching these markets nine months ago.


Soil carbon flux research!

21/11/2018
Measuring soil carbon flux gives an insight into the health of forest ecosystems and provides feedback on the effects of global warming. This article, from Edinburgh Instruments, outlines how soil CO2 efflux is determined and the applications of soil carbon flux research.

Soil is an important part of the Earth’s carbon cycle.
Pic: pixabay.com/Picography

The Earth’s carbon cycle maintains a steady balance of carbon in the atmosphere that supports plant and animal life. In recent years, concern about the increasing level of CO2 in the atmosphere, indicating a problem in the Earth’s carbon cycle, has become a prominent global issue.1,2

As a part of a stable carbon cycle, carbon is exchanged between carbon pools including the atmosphere, the ocean, the land and living things in a process known as carbon flux. Carbon exchange typically takes place as a result of a variety of natural processes including respiration, photosynthesis, and decomposition.

Since the industrial age, humans have begun to contribute to carbon exchange through activities such as fuel burning and chemical processes, which are believed to be responsible for increasing atmospheric CO2 concentrations and rising global temperatures.1-3

Soil carbon flux provides feedback on environmental conditions
Soil is a vital aspect of the Earth’s carbon cycle, containing almost three times more carbon than the Earth’s atmosphere. Carbon is present in soil as ‘solid organic carbon’ including decomposing plant and animal matter. Over time, microbial decomposition of the organic components of soil releases carbon into the atmosphere as CO2.4,5

The amount of carbon present in soil affects soil fertility, plant growth, microbial activity, and water quality. Studying the carbon flux of soil gives an insight into an ecosystem as a whole and specific information about microbial activity and plant growth.4-6

Soil carbon flux can also help us to understand and predict the effects of global warming. As global temperatures increase, it is expected that microbial activity will also increase, resulting in faster plant decomposition and increased CO2 efflux into the atmosphere.5,6

Measuring soil CO2 efflux
Determining soil-surface CO2 efflux can be challenging. Researchers commonly employ a chamber combined with CO2 concentration measurements to determine CO2 efflux. A variety of chambers have been designed for such research, some of which are commercially available.7-10

Closed-chamber systems typically pump air through a gas analyzer, which measures CO2 concentration, before returning the air to the chamber. Soil CO2 efflux is then estimated from the rate of increase of CO2 concentration in the chamber.

Open chambers pump ambient air into the chamber and measure the change in CO2 concentration between the air entering the chamber and the air leaving the chamber to determine the soil CO2 efflux.

Of the two chamber types, open chambers are considered more accurate. Closed chambers tend to underestimate CO2 efflux as increased CO2 concentrations in the chamber cause less CO2 to diffuse out of the soil while the chamber is in place.10,11
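
To make the closed-chamber estimate concrete, the sketch below computes efflux from the rate of increase of CO2 concentration using the ideal gas law. It is a minimal illustration: the chamber dimensions and readings are invented, and real systems apply further corrections (for pressure, humidity and chamber mixing).

```python
import numpy as np

# Minimal sketch: closed-chamber soil CO2 efflux from the rate of CO2 increase.
# Chamber geometry and the concentration readings are invented for illustration.
def soil_co2_efflux(times_s, co2_ppm, volume_m3, area_m2,
                    pressure_pa=101325.0, temp_k=298.15):
    """Efflux in micromol CO2 m^-2 s^-1 from a linear fit of concentration vs time."""
    R = 8.314                                         # J mol^-1 K^-1
    slope_ppm_s = np.polyfit(times_s, co2_ppm, 1)[0]  # dC/dt in ppm s^-1
    air_mol_m3 = pressure_pa / (R * temp_k)           # molar density of air in the chamber
    return slope_ppm_s * air_mol_m3 * volume_m3 / area_m2

t = np.arange(0, 181, 30)                                        # 3-minute enclosure period
c = np.array([410, 416, 423, 429, 436, 442, 449], dtype=float)   # ppm readings every 30 s
flux = soil_co2_efflux(t, c, volume_m3=0.012, area_m2=0.06)
print(f"Estimated efflux ~ {flux:.2f} umol m^-2 s^-1")
```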

Often, CO2 concentrations in chambers are measured periodically and then extrapolated to give an estimation of CO2 efflux. This method can be inaccurate because CO2 efflux can vary significantly between measurements with changes in environmental conditions.

A further limitation of using chambers for CO2 efflux measurements is that chambers typically only provide measurements in one location, while CO2 efflux has been found to vary widely even in relatively homogeneous environments. The overall result is CO2 efflux data with limited temporal and spatial resolution that does not reflect the environmental situation as a whole.10,12,13

Naishen Liang

A group of researchers from the National Institute for Environmental Studies (Japan) led by Naishen Liang has designed an automated multi-chamber system for measuring soil-surface CO2 efflux.

As CO2 concentrations are measured automatically using an infrared gas sensor, CO2 efflux can be determined accurately throughout the experiment. The improved temporal resolution, combined with increased spatial detail resulting from the use of multiple chambers, gives a better overview of how CO2 efflux varies with time, location, and environmental conditions within an ecosystem.10

Liang and his team have applied this method to gather information about a range of forest ecosystems. Their automated chambers have been used in a variety of forest locations, combined with heat lamps, to provide high-resolution, long-term data about the effects of warming on microbial activity and CO2 efflux.

Liang’s research has shown that soil temperatures have a significant effect on CO2 efflux in a wide range of forest environments, information that is vital for understanding how global warming will affect forest ecosystems and the Earth’s carbon cycle as a whole.14-17

All chamber systems for determining CO2 efflux rely on accurate CO2 concentration analysis. Infrared gas analyzers are the most widely used method of instrumentation for determining CO2 concentrations in soil CO2 efflux measurement chambers.8,10,18

Infrared gas sensors, such as the Gascard sensors from Edinburgh Sensors, are well suited to providing CO2 concentration measurements in soil chambers, and are the sensors of choice used by Liang and his team.

The Gascard sensors are robust, low-maintenance, and easy to use compared with other sensors. They provide rapid, easy-to-interpret results and can be supplied either as complete boxed sensors (the Boxed Gascard) or as individual sensors (the Gascard NG) for easy integration into automated chambers.19,20


Notes, References and further reading
1. ‘The Carbon Cycle’
2. ‘Global Carbon Cycle and Climate Change’ — Kondratyev KY, Krapivin VF, Varotsos CA, Springer Science & Business Media, 2003.
3. ‘Land Use and the Carbon Cycle: Advances in Integrated Science, Management, and Policy’ — Brown DG, Robinson DT, French NHF, Reed BC, Cambridge University Press, 2013.
4. ‘Soil organic matter and soil function – Review of the literature and underlying data’ — Murphy BW, Department of Environment and Energy, 2014
5. ‘The whole-soil carbon flux in response to warming’ — Hicks Pries CE, Castanha C, Porras RC, Torn MS, Science, 2017.
6. ‘Temperature-associated increases in the global soil respiration record’ — Bond-Lamberty B, Thomson A, Nature, 2010.
7. ‘Measuring Emissions from Soil and Water’ — Matson PA, Harriss RC, Blackwell Scientific Publications, 1995.
8. ‘Minimize artifacts and biases in chamber-based measurements of soil respiration’ — Davidson EA, Savage K, Verchot LV, Navarro R, Agricultural and Forest Meteorology, 2002.
9. ‘Methods of Soil Analysis: Part 1. Physical Methods, 3rd Edition’ — Dane JH, Topp GC, Soil Science Society of America, 2002.
10. ‘A multichannel automated chamber system for continuous measurement of forest soil CO2 efflux’ — Liang N, Inoue G, Fujinuma Y, Tree Physiology, 2003.
11. ‘A comparison of six methods for measuring soil-surface carbon dioxide fluxes’ — Norman JM, Kucharik CJ, Gower ST, Baldocchi DD, Grill PM, Rayment M, Savage K, Striegl RG, Journal of Geophysical Research, 1997.
12. ‘An automated chamber system for measuring soil respiration’ — McGinn SM, Akinremi OO, McLean HDJ, Ellert B, Canadian Journal of Soil Science, 1998.
13. ‘Temporal and spatial variation of soil CO2 efflux in a Canadian boreal forest’ — Rayment MB, Jarvis PG, Soil Biology & Biochemistry, 2000.
14. ‘High-resolution data on the impact of warming on soil CO2 efflux from an Asian monsoon forest’ — Liang N, Teramoto M, Takagi M, Zeng J, Scientific Data, 2017.
15. ‘Long‐Term Stimulatory Warming Effect on Soil Heterotrophic Respiration in a Cool‐Temperate Broad‐Leaved Deciduous Forest in Northern Japan’ —Teramoto M, Liang N, Ishida S, Zeng J, Journal of Geophysical Research: Biogeoscience, 2018.
16. ‘Sustained large stimulation of soil heterotrophic respiration rate and its temperature sensitivity by soil warming in a cool-temperate forested peatland’ — Aguilos M, Takagi K, Liang N, Watanabe Y, Teramoto M, Goto S, Takahashi Y, Mukai H, Sasa K, Tellus Series B : Chemical and Physical Meteorology, 2013.
17. ‘Liang Automatic Chamber (LAC) Network’
18. ‘Interpreting, measuring, and modeling soil respiration’ — Ryan MG, Law BE, Biogeochemistry, 2005.
19. ‘Boxed Gascard’
20. ‘Gascard NG’

@edinsensors #Environment #NIESJp

Power take-off torque monitoring.

07/08/2018
AIM – Precisely & Quickly Monitor Power Take-Off Torque on A Wave Energy Converter

Challenge
As part of a project funded by Wave Energy Scotland, 4c Engineering needed to test various configurations of the SeaPower Platform, a Wave Energy Converter (WEC), to determine their effects on power capture. To do this they needed a reliable and accurate way of measuring power take-off (PTO) torque, as well as the forces, positions and pressures exerted by the waves on the SeaPower Platform.

Why? Establishing the most efficient design, with the highest wave power generation, will make the platform a more cost-effective form of wave energy.

The SeaPower Platform extracts energy from deep water ocean waves by reacting to long prevailing wavelengths in high resource sites.

Solution – Accurate DTD-P Parallel Shaft Reaction Torque Transducer
“We chose the DTD-P torque transducer for its high accuracy and compact size which we needed for tank testing the SeaPower Platform,” explains Andy Hall, Director at 4c Engineering.

  • Designed for In-Line Static or Semi-Rotary Torque Measurement
  • Capacities: 0-10Nm to 0-10kNm
  • High Accuracy – Ideal for Calibration, Development and Testing Applications
  • Accuracy: <±0.15% / Full Scale Output
  • Customised Capacities, Shaft and Configuration Options Available
  • IP67 Waterproof and IP68 Fully Submersible Versions Available

Complete Torque Monitoring System
Applied Measurements Ltd provided 4c Engineering with a DTD-P 100Nm parallel shaft torque sensor fitted with an ICA4H miniature load cell amplifier, calibrated to UKAS traceable standards and sealed to IP68 to allow complete submersion. This complete torque measuring system enabled their engineers to reliably and accurately monitor the torque applied by the WEC as it responded to waves in the test tank.

Save on Installation Time
The DTD-P torque transducer has keyed parallel-shaft connections for in-line static or semi-rotary torque measurement in capacities from 0-10Nm up to 0-10kNm (custom capacities readily available). This version was fitted with a flying lead; however, versions with an integral bayonet-lock military connector are also available, promising simple and easy connection.

Guaranteed High Accuracy
The DTD-P torque transducer is highly accurate to better than ±0.15% (typically ±0.05%) of the full scale output, making it ideal for this high precision development and testing application. Additional applications include the testing of electrical motors, hydraulic pumps, automotive transmissions, steering systems and aircraft actuators.

Need a Specific Design?
The DTD-P torque transducer can be customised with bespoke shaft, configuration options and capacities (to 50kNm+) specific to your application. For 4c Engineering we customised the design of the DTD-P torque transducer to IP68 submersible for continuous use underwater to 1m, which was essential for use in the wave test tank.

High Stability, Fast Response, ICA4H Miniature Load Cell Amplifier
“The high speed, reliability and clean output of the ICA4H miniature amplifier enabled the data to be analysed immediately after each test.” says Andy Hall.

  • Very Compact 19mm Diameter
  • Low Current Consumption
  • High Speed 1kHz Bandwidth (max.)
  • 4-20mA (3-wire) Output (10 to 30Vdc supply)

Very Compact
To deliver a conditioned load cell output signal, we supplied the DTD-P torque transducer with an ICA4H high-performance miniature load cell amplifier. The ICA miniature load cell amplifiers are very compact at only 19mm in diameter, allowing them to fit inside the body of most load cells. In this application the ICA4H was supplied in a gel-filled, IP68 immersion-protected compact enclosure (see image above) along with 10 metres of cable, making it suitable for this underwater application.

With High Speed Response
The engineers at 4c Engineering needed to have a quick and reliable way to process the power take-off torque data from the tests, to determine the power capture and effects of the control settings before running the next test. The ICA4H miniature load cell amplifier was chosen not only for its high stability and compact size but also for its 1000Hz fast response.
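
The conversion itself is straightforward. The sketch below assumes the 4-20 mA output is mapped linearly onto the 0-100 Nm range of the transducer used here – a typical arrangement, though the exact scaling is not stated in the article – and shows the worst-case error implied by the ±0.15% full-scale accuracy figure.

```python
# Sketch: convert the ICA4H 4-20 mA signal to torque, assuming a linear mapping
# onto the 0-100 Nm range of this DTD-P (assumed scaling, not a stated spec).
FS_TORQUE_NM = 100.0              # capacity of the sensor supplied to 4c Engineering
I_MIN_MA, I_MAX_MA = 4.0, 20.0    # 4-20 mA output range
ACCURACY_OF_FS = 0.0015           # <±0.15% of full-scale output

def current_to_torque(current_ma):
    """Linear 4-20 mA -> 0-100 Nm conversion."""
    return (current_ma - I_MIN_MA) / (I_MAX_MA - I_MIN_MA) * FS_TORQUE_NM

reading_ma = 11.2                               # example amplifier reading
torque_nm = current_to_torque(reading_ma)       # -> 45.0 Nm
worst_case_nm = ACCURACY_OF_FS * FS_TORQUE_NM   # -> +/-0.15 Nm
print(f"Torque ~ {torque_nm:.1f} Nm (worst-case error +/-{worst_case_nm:.2f} Nm)")
```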


A sea platform off the Galway (Ireland) Coast – not far from the Read-out offices.

@AppMeas @4c_Eng #power #PAuto


Keep making the tablets!

08/05/2018
This article shows how costly manufacturing production-line downtime in the pharmaceutical industry can be reduced by ensuring predictive maintenance of tablet-making machinery using Harting’s MICA industrial computing platform.

Introduction
Harting recently challenged postgraduate students from the Centre for Doctoral Training in Embedded Intelligence at Loughborough University to investigate practical application solutions where MICA – the company’s innovative open platform based ruggedised industrial edge computing device – could be applied to the benefit of manufacturing. Simple seamless integration within existing established production processes was the target, based on the concept of machine predictive maintenance.

The key objective was to achieve immediate productivity improvements and return on investment (RoI), thus satisfying the increasing trend for Integrated Industry 4.0 implementation on the factory floor. One such proposal was suggested for volume manufacturers in the pharmaceutical industry: in particular, those companies manufacturing tablets using automated presses and punch tools.

Data from these machines can be collected using passive UHF RFID “on metal” transponders which can be retrofitted to existing tablet press machines and mounted on the actual press-die/punch tools. The RFID read and write tags can record the pressing process, i.e. the number of operations performed by a particular press die, plus any other critical operating sensor-monitored conditions. The system can then review that data against expected normal end-of-life projected limits set for that die.

Such data can be managed and processed through Harting’s MICA edge computing device, which can then automatically alert the machine operator that maintenance needs to take place to replace a particular die-set before it creates a catastrophic tool failure condition and breakdown in the production line – which unfortunately is still quite a common occurrence.
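
A minimal sketch of the counting and alerting logic described above is shown below. The die IDs, life limits and warning threshold are invented for illustration; this is not Harting’s actual MICA container code.

```python
# Sketch of press-die end-of-life tracking from RFID read events.
# Die IDs and life limits are invented; a real system would read them from
# the on-tool RFID tags and the plant's maintenance records.

DIE_LIFE_LIMIT = {"DIE-A17": 2_000_000, "DIE-B03": 1_500_000}   # max pressings per die
press_counts = {die: 0 for die in DIE_LIFE_LIMIT}

def record_pressing(die_id, warn_fraction=0.95):
    """Count one pressing for a die and alert when it nears its life limit."""
    press_counts[die_id] += 1
    limit = DIE_LIFE_LIMIT[die_id]
    if press_counts[die_id] >= warn_fraction * limit:
        print(f"MAINTENANCE ALERT: {die_id} has reached "
              f"{press_counts[die_id]:,} of {limit:,} pressings - schedule a die change")

press_counts["DIE-A17"] = 1_899_999       # simulate an almost worn die
record_pressing("DIE-A17")                # triggers the alert at 95% of its limit
```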

Open system software
MICA is easy to use, with a touch-optimised interface for end users and administrators implemented entirely in HTML5 and JavaScript. It provides an open system software environment that allows developers from both the production and IT worlds to quickly implement and customise projects without any special tools. Applications are executed in their own Linux-based containers, which contain all the necessary libraries and drivers. This means that package dependencies and incompatibilities are eliminated. In addition, such containers run in individual “sandboxes” which isolate and secure different applications from one another with their own separate log-in and IP addresses. As a result, there should be no concerns over data security when MICA is allowed access to a higher-level production ERP network.

MICA is already offered with a number of containers such as Java, Python C/C++, OPC-UA, databases and web toolkits, all available on free download via the HARTING web site. As a result, users should be able to download links to the operating software system compatible with an existing machine, enabling full 2-way communication with the MICA device. Relaying such manufacturing information, which can comprise many gigabytes of data in the course of a day, directly to the ERP would normally overwhelm both the network and the ERP. With the MICA, this data stream is buffered directly onto the machine and can be reduced to just essential business-critical data using proven tools from the IT world.

The resultant improvements in productivity include:

– Less downtime reduces the amount of money lost during unforeseen maintenance of damaged punch tools.
– Individual punch identification will help in removing a specific punch, once it has reached its pre-set operational frequency working limit.
– A digital log of each punch and the number of tablets that it has produced is recorded. This provides vital information for GMP (Good Manufacturing Practice) regulators such as the MHRA (Medicines & Healthcare products Regulatory Agency) or the FDA (Food & Drug Administration).

A further benefit is that MICA is very compact, with DIN rail mounting fixing options that allow it to be easily accommodated inside a machine’s main control cabinet.

@HARTING #PAuto #Pharma @CDT_EI

Researchers investigate ultra-low Mediterranean nutrient levels.

25/04/2018

Researchers at Haifa University’s Marine Biological Station in Israel are exploiting the ultra-low detection limits of advanced laboratory equipment to measure extremely low nutrient concentrations in marine water.

H.Nativ – Morris Kahn Marine Research Station

The University’s Prof. M. D. Krom says: “We work in the Eastern Mediterranean which has the lowest regional concentration of dissolved nutrients anywhere in the global ocean. We therefore utilize an automated segmented flow analyzer from SEAL Analytical, which has been specially adapted to accommodate ultra-low measurements.”

The SEAL AutoAnalyzer 3 (AA3) is a 4 channel system, measuring Phosphate with a long flow cell which has a detection limit of 2 nM. Ammonia is measured using a JASCO fluorometer with a similar ultra-low detection limit, and Silicate, which has a higher concentration, is measured using SEAL’s high resolution colorimetric technology.

The measurement data are being used to determine the seasonal nutrient cycling in the system, which will then be used to help understand the nature of the food web and the effects of global environmental and climate change.

Low nutrient levels in the Mediterranean
The eastern Mediterranean Sea (EMS) has an almost unique water circulation. The surface waters (0-200m) flow into the Mediterranean through the Straits of Gibraltar and from there into the EMS at the Straits of Sicily. As the water flows towards the east it becomes increasingly saline and hence denser. When it reaches the coast of Turkey in winter it also cools and then flows back out of the Mediterranean under the surface waters to Sicily, and then eventually through the Straits of Gibraltar to the North Atlantic. This outflowing layer exists between 200m and 500m depth.

Phytoplankton grow in the surface waters (0-200m) because that is the only layer with sufficient light. This layer receives nutrients from the adjacent land, from rivers and wastewater discharges, and also from aerosols in the atmosphere. These nutrients are utilized by the plankton as they photosynthesize. When the plants die (or are eaten) their remains drop into the lower layer and are jetted out of the EMS. Because the water flows are so fast (it takes just 8 years for the entire surface layers of the EMS to be replaced), these nutrient rich intermediate waters rapidly expel nutrients from the basin. The result is very low nutrient concentrations and very low numbers of phytoplankton – some of the lowest values anywhere in the world. Prof. Krom says: “The maximum levels of nutrients measured in the EMS are 250 nM phosphate, 6 uM nitrate and 6-12 uM silicate. Ammonia is often in the low nanomolar range. By contrast, in the North Atlantic, values are 1000 nM phosphate, 16 uM nitrate and 20 uM silicate, and the levels in the North Pacific are even higher.”

The value of data
The low levels of plankton caused by low nutrient levels result in a low biomass of fish. Nevertheless, coastal areas generally support more fish than offshore waters, so the research will seek to quantify and understand the nutrient cycle in the coastal regions, which is poorly understood at present. “We plan to develop understandings which will inform stakeholders such as government. For example, there is a discussion about the potential for fish farms off the Israeli coast, so our work will enable science-based decisions regarding the quantity of fish that the system can support.”

To date, three data sets have been taken from the EMS, and the first publishable paper is in the process of being prepared.

Choosing the right analyzer
Prof. Krom says that his first ‘real’ job was working for the (then) Water Research Centre at Medmenham in Britain, where he was involved in the development of chemical applications for the Technicon AA-II autoanalyzers, which included going on secondment to Technicon for several months. SEAL Analytical now owns and manufactures the AutoAnalyzer brand of continuous segmented flow analyzers, so his career has been connected with autoanalyzers for decades. For example, he is Professor (Emeritus) at the University of Leeds (GB), where, again, he worked with SEAL autoanalyzers. An AA3 instrument was employed at Leeds in a project to investigate the nature of atmospheric acid processing of mineral dusts in supplying bioavailable phosphorus to the oceans.

Explaining the reasoning behind the purchase of a new AA3 at Haifa University, Prof. Krom says: “During a research cruise, it is necessary to analyse samples within a day to avoid changes in concentration due to preservation procedures.

“Typically we analyse 50-80 samples per day, so it is useful to be able to analyze large numbers of samples automatically. However, the main reasons for choosing the SEAL AA3 were the precision, accuracy and low limits of detection that it provides.”

Commenting on this application for SEAL’s analyzers, company President Stuart Smith says: “Many of our customers analyze nutrient levels in freshwater and marine water samples, where high levels of nutrients are a concern because of increasing levels of algal blooms and eutrophication. However, Prof. Krom’s work is very interesting because, in contrast, he is looking at extremely low levels, so it is very gratifying that our instruments are able to operate at both ends of the nutrient concentration spectrum.”

Bibliography
• Powley, H.R., Krom, M.D., and Van Cappellen, P. (2017) Understanding the unique biogeochemistry of the Mediterranean Sea: Insights from a coupled phosphorus and nitrogen model. Global Biogeochemical Cycles, 31, 1010-1031. DOI: 10.1002/2017GB005648.

• Stockdale, A., Krom, M.D., Mortimer, R.J.G., Benning, L.G., Carslaw, K.S., Herbert, R.J., Shi, Z., Myriokefalitakis, S., Kanakidou, M., and Nenes, A. (2016) Understanding the nature of atmospheric acid processing of mineral dusts in supplying bioavailable phosphorus to the oceans. PNAS, 113(51).

#SealAnal #Marine @_Enviro_News

Connecting, communicating and creating in the Netherlands.

14/03/2018

The Netherlands is the country where the Rhine enters the sea, a country which has physically built itself out of the inhospitable North Sea. Often called Holland – which is the name of one (actually two) of its provinces – it is, even more confusingly for the English-speaking world, inhabited by the Dutch-speaking Dutch. If you really want to know more about Holl... er, sorry, the Netherlands, watch the video at the bottom of this piece.

Although not officially the capital of the Netherlands (that is Amsterdam), The Hague is the seat of Government and the official residence of the King. It was selected by the Emerson User Group as the venue for their Europe, Middle East & Africa assembly, referred to as #EMrex on Twitter. These assemblies – can we say celebrations? – occur every two years. The last was held in Brussels, the capital of the neighbouring Kingdom of the Belgians and of the European Union. An account of the happenings there can be found in our posting “All change at Brussel Centraal” (18/4/2016).

Lots of pictures from the event!

The size of this event was in marked contrast to the Brussels meeting, which had been overshadowed by the terrible terrorist attacks in that city only three weeks earlier and the transport difficulties they caused. This time there were over one thousand six hundred delegates filling the huge hall of the Hague Convention Centre.

Another difference, referred to in many of the discussions both formal and informal, was the pair of great uncertainties affecting all businesses and industries: the possibility of a trade war with the USA under its current administration and, nearer home, the aftermath of the Brexit decision – the exit of Britain from the largest economic bloc on the planet. Many developments have been put on the long finger pending clarification of these issues.

Mary Peterson welcomes delegates

Why are we here?
This event continued in the vein of previous meetings, with the emphasis continuing to move towards a perhaps more philosophical and certainly more holistic view of how the automation sector can help industry. This was made clear in the introductory welcome by Novartis’s Mary Peterson, Chair of the User Group, when she posed the question, “Why are we here?”

“This is a conference for users, by users,” she said. It is a place to discuss users’ practical experiences, continue professional development, and learn about best practice, proven solutions and technology roadmaps. But above all it presented an opportunity to connect with industry leaders, other users and, of course, Emerson experts.

For other or more detailed information on happenings and offerings revealed at this event, see:
News Releases

and on Twitter #EmrEx

The emphasis is on the totality of services and packages, not on individual boxes. Emerson’s European President Roel Van Doren was next to address the assembly. We should know our plant, he said, but be unafraid to draw on outside expertise and knowledge to keep it fit for purpose: monitor the plant constantly, analyse what is required and then act. This means seeing how the latest advances might improve production, and it means harnessing the “new technologies.” In passing he drew our attention to the fact that Emerson had been recognised earlier this year as ‘Industrial IoT Company of the Year’ by IoT Breakthrough.

The path is digital
A very striking presentation was given by Dirk Reineld, Senior VP Indirect Procurement with BASF. He brought us to the top of Rome’s Via della Conciliazione on 19th April 2005. We saw the huge crowd looking towards the central balcony as the election of a new pope was announced. He then moved forward to 13th March 2013, the same place, but what a difference in such a short time: this time it seemed that everybody had a mobile phone held up to photograph the announcement of the election of Francis, and all we could see was a sea of little screens. He used this to emphasize a point: “We are underestimating what is happening and its speed.” This is not helped by a natural conservatism among plant engineers. Change is happening and we either embrace it or get left behind. It is becoming more and more clear that in front of us “the path is digital!” He presented some useful examples of digitalisation and collaboration at BASF.

PRESENTATIONS

Registered delegates have access to slides from the main presentation programme. These slides are available for download via the Emerson Exchange 365 community (EE365).

Emerson Exchange 365 is separate from the Emerson Exchange website that presenters and delegates used before Exchange in The Hague. So, to verify your attendance at this year’s conference, you must provide the email address you used to register for Exchange in The Hague. If you are not already a member of EE365 you will be required to join.

To access the presentations, visit The Hague 2018 and follow the prompts. The first prompt will ask you to join or sign in.

Something in this particular EmrEx emphasised how fast things are moving and caught out those unprepared for the change. Among some of the press people and others there was disappointment that there was no printed programme as in previous years. This correspondent is used to going away into a corner, combing through the printed agenda and selecting the most relevant sessions to attend. This year it was all available online through the “Emerson Exchange Web App,” heralded as “a great preshow planning tool.” All we had to do was enter a link into the web browser on our phones and away we went. Yes, this is certainly the way to go, and although I am inclined to be adventurous in using social media and the like, I and some (if not many) others found this a step too far, too early. It was not made clear beforehand that a printed version of the programme would not be available, and the first hour of a conference is not the best time to make oneself au fait with a new app.

Having said that, while many of the journos took notes using pencil and paper, they were not averse to taking photos of the presentation slides, so they could not be said to qualify as complete luddites!

Terrific progress but…

Rewards of efficiency
This event was held at the same time as CERAWeek 2018, in which Emerson was an important participant. Some Emerson executives thus made the transatlantic journey to make presentations. One of those was Mike Train, Emerson’s Executive President, who delivered his talk with no apparent ill effects. In effect he was asking a question: “Just how effective is progress?” Yes, we HAVE made phenomenal progress in the last 30 years. “Modern automation has made plants more efficient, reliable and safer, but the ‘Efficiency Era’ is reaching diminishing returns… Productivity seems to be stagnating while the workforce is stretched.”

He postulated five essential competencies for digital transformation.

  1. Automated workflows: Eliminate repetitive tasks and streamline standard operations.
  2. Decision support: Leverage analytics and embedded expertise.
  3. Mobility: Secure on-demand access to information and expertise.
  4. Change management: Accelerate the adoption of operational best practices.
  5. Workforce upskilling: Enable workers to acquire knowledge and experience faster.

Making the future!

Making the future
The next speaker was Roberta Pacciani, C&P Manager Integrated Gas and Upstream Technology with Shell, and also President of the Women’s Network at Shell Netherlands. She spoke on leveraging the best available talent to solve future challenges. I suppose we would once have classified this as a feminist talk, but of course it isn’t: as the presenter said, it is not so much a feminist issue as a people issue. “Closing the gender gap in engineering and technology makes the future.” This was a useful presentation (and, in this correspondent’s experience, an unusual one) and hopefully it will help change perceptions and preconceptions in STEM and in our own particular sector.

As part of EmrEx there is an exhibition and demonstration area, where delegates may see innovative technologies applied to their plant environment. They can meet with experts on topics such as getting their assets IIoT-ready, using a Digital Twin to increase performance, and exploring options to prepare their plant for the future. As a guide – printed as well as online – Emerson produced a Metro-style map.
Using this we could embark on a journey through products, services and solutions where Emerson, together with their partners, could help solve operational and project challenges.

One of the most popular exhibits was the digital workforce experience. Here we visited a plant and were transported magically back to former times to see just how different plant management is now, particularly with the help of wireless and digitalisation.

It happened!

One of the good things about this sort of event is the opportunity to meet, for the first time in person, friends made through social media. Sometimes one does not know they are attending unless they tweet something. Thus I realised that an Emerson engineer was present and went looking for him in the expo area. So it was that Aaron Crews from Austin (TX, US) and I met for the first time after knowing each other through Twitter and Facebook for a frightening ten years. Another of these virtual friends, Jim Cahill, says, “It hasn’t happened without a picture!” So here is that picture.

The following morning there was a series of automation forums dedicated to various sectors. The Life Sciences Forum was particularly well attended. Ireland is of course a leader in this sector and we hope to have a specific item on this in the near future; Emerson have invested heavily in national support services there, as we reported recently.

Each evening there were social events which provided further opportunities for networking. One of these was a visit to the iconic Louwman Transport Museum, home to possibly the largest collection of road vehicles anywhere, from sedan chairs through the earliest motor cars up to the sleekest modern examples, all contained in a beautiful building. The display was very effectively presented and one didn’t have to be a petrol-head – and believe me there were some among the attendance – to appreciate it.

It is impossible to report an event like this fully in detail. One can of course follow it on Twitter as it happens, and copies of many of the presentations and videos of some of the sessions will be available on the website.

The Emerson User Group Exchange – Americas will continue “spurring innovation” in San Antonio (TX USA) from 1st to 5th October 2018. It looks exciting too.

We promised at the top of this blog an exposé of the country often called Holland in English –


So now you know!

@EMR_Automation #Emrex #Pauto