“Breaking new ground with a world first for level detection technology” #PAuto


VEGA has re-engineered vibrating level switch technology, bringing it up to date and developing a sensor for extreme situations: the vibrating level switch VEGASWING 66. Why these new sensors ‘feel comfortable’ at temperatures up to 450 °C and pressures up to 160 bar is explained to us in an interview with Holger Sack, Head of Product Management at VEGA.

Holger Sack is Head of Product Management at VEGA Grieshaber KG in Schiltach. After completing his engineering studies, he moved from Berlin to the Black Forest in 1991 to collaborate in the development of sensors for non-contact level measurement at VEGA. In 2000 he became product manager for radar, and since 2009 he has been head of product management. His areas of responsibility, shared by his team of 11 product managers, are supporting product development and overseeing the worldwide marketing of VEGA’s instrumentation.
The term ‘limit switch’ appears rather infrequently in technical texts. Even my Google search was not very successful for that term. Why?

Holger Sack: Today, the old, ‘tried and tested’ analogue limit switches with potentiometer adjustment etc. are gradually being replaced by digital technology. I think that’s why these devices are now more commonly called ‘point level sensors.’ Even in the English language, you’ll more often hear people talking about ‘point level measurement’, i.e. detection of a substance at a specific point. I suppose ‘point level’ has prevailed over ‘limit level’ for that reason. We’ve found that point level measurement is indeed very closely related to level measurement.

That is very likely the reason why ‘level’ is used in some descriptions as a synonym for ‘point level.’ To what extent are the two technologies different, and what are their similarities?
Holger Sack: Level measurement is used to describe continuous measurement of a changing level, whereas point level is used to indicate a discrete condition, i.e. the existence of the level at a certain point. To clarify, ‘Level’ means the continuous measurement of contents from empty to full, with the values being output in percent, volume or other units of contents. ‘Point level’, on the other hand, means a discrete on/off signal given when a product has reached a significant level in a vessel.
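The distinction can be illustrated with a short sketch (a hypothetical example, not VEGA firmware): a continuous level trace in percent is reduced to the discrete on/off signal a point level sensor would report, with a little hysteresis so the output does not chatter around the switch point. The threshold values are illustrative assumptions.

```python
def point_level_signal(levels, switch_on=80.0, switch_off=75.0):
    """Reduce a continuous level trace (percent) to a discrete
    point-level signal with hysteresis.

    switch_on / switch_off are illustrative thresholds, not values
    taken from any real instrument.
    """
    state = False
    signal = []
    for level in levels:
        if not state and level >= switch_on:
            state = True          # product has reached the switch point
        elif state and level <= switch_off:
            state = False         # level has dropped clear again
        signal.append(state)
    return signal

# A rising and falling level toggles the output just twice:
trace = [70, 78, 81, 83, 79, 76, 74, 70]
print(point_level_signal(trace))
# [False, False, True, True, True, True, False, False]
```

The gap between the on and off thresholds is what keeps a level hovering near the switch point from producing a rapid stream of spurious switching commands.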

Continuous or discrete – can the two areas of applications be clearly separated from one another?

Holger Sack: No, they can’t. You can find both level and point level sensors everywhere. Even combined in one application, depending on the customer requirement. Usually, when both sensor types are installed, it is to increase safety – for example in the chemical industry.

Does the simpler technology behind point level switches mean that they are cheaper than continuous level sensors?

Holger Sack: Roughly speaking, yes, because continuous level sensors are more complex in structure and sometimes also in terms of technology. But this need not always be the case. You see, with regard to the process, point level sensors have to meet the same requirements as continuous level sensors. Level switches just carry out a ‘simpler evaluation’ of level data. That’s why the price of a level switch, or point level sensor, is usually somewhat lower than that of a continuous level sensor.

In point level detection, a switching command starts or stops the filling equipment. How do you monitor the process, i.e. ensure that the sensing element and the electronics are working properly?

Holger Sack: On the one hand, the individual sensor has to be considered, and on the other, the entire measurement chain. In modern sensors, microprocessor technology enables numerous functions that monitor the electronics as well as the sensing element during operation. A high percentage of faults in the electronics and in the sensing element, such as build-up or corrosion, can be detected and reported. Looking at the entire measurement chain, we see that information on tank contents is forwarded to the control system or special actuators through cables or bus systems. These systems are responsible for ensuring that valves, pumps, etc. operate at the right time. Until now, all devices in a system were analysed individually from a safety-engineering standpoint. But this has changed. Today, engineers look at the entire measurement chain, that is, from the sensor to the transmission of measured values to the actuating components (valves, pumps, etc.). This ultimately ensures that the switch-off mechanism in its entirety works, preventing overfilling or dry running of pumps.

When it comes to safety, a lot has been done in recent years. Has level switch technology also been made better, safer?

Holger Sack: As I said earlier, the basic technology is very old. People built level switches long before they started building continuous level measuring instruments. For that reason there are still a lot of old but proven technologies in service, such as floats or paddle switches. The capacitive measuring principle is also a very old, tried-and-true measuring method, albeit with a few limitations when compared to the vibration principle. Next to microwave/radar, vibration is currently the most universal measuring principle that we offer. We at VEGA have been focusing on electronic measuring systems for years, because they have significant advantages in terms of maintenance and, by extension, life-cycle costs. That’s why, although they are a little more expensive to produce and to buy, we believe that this extra outlay pays off over a service life of 15 years or more.

Does this mean that your new vibrating level switch VEGASWING 66 is just old, well-known technology in a new guise?

Holger Sack: No, not in this regard. Here we are breaking new ground with our new, patented technology. The backstory is that this instrument can be used in temperature and pressure ranges where previously only a few technologies could be deployed, and certainly not the tuning fork measuring principle. The basis of this technology is a tuning fork that is electrically excited and made to vibrate over a range of a few micrometres. Until now, it was not possible to use vibration technology at temperatures above 250 °C. With our VEGASWING 66, applications up to 450 °C are now possible; not only that, it is also capable of temperatures as low as -195 °C. We are the only company that can offer this technology for use at such temperatures and at pressures from -1 to 160 bar.

Vegaswing 66

How did you make it possible to use the switch in the extreme temperatures and pressures commonly found in the process industry?

Holger Sack: For one thing, we replaced the previously used piezoelectric drive with a special solenoid that we developed ourselves. This solenoid now drives the tuning fork and is able to withstand the high temperatures. Another point is that we now use ceramic materials and have made the electrical connections so secure that they operate reliably even at 450 °C. And last but not least, we achieved the high pressure resistance through the mechanical stability of robust materials that withstand pressures up to 160 bar.

For point level detection, the user can choose between different measuring principles. How can the customer be sure he gets the right one, the one he really needs for his application?

Holger Sack: It is always important that the customer tells us in advance what his requirements and application conditions are, because with this information we can recommend the right measuring principle. However, we are confronted again and again with new challenges that test the limits of our technology – because our customers are not standing still, they’re continuously developing their processes. In most cases it’s about new combinations of pressure, temperature and chemicals. That’s why we’re constantly developing and improving our products and measuring principles, optimising them to meet the latest process requirements.

Current problems are, for example, difficult product properties or foaming. To what extent do these factors influence the quality of the measuring results?

Holger Sack: On VEGASWING 66, for example, we can detect build-up, and we can also detect whether the tuning fork is corroded or broken. This is possible both through the measuring principle itself and through monitoring of the natural resonant frequency. Build-up changes the amplitude of the oscillation, which allows us to use the available processor technology to electronically evaluate this change and notify the customer that a problem exists. Such ‘anticipatory’ diagnostic capabilities are being demanded by customers more and more.
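A minimal sketch of how such diagnostics might work (the reference frequency, tolerances and classification rules below are illustrative assumptions, not VEGA’s actual algorithm): build-up adds mass and lowers the resonant frequency, corrosion removes mass and raises it, and heavy build-up or a viscous product damps the oscillation amplitude.

```python
def fork_diagnosis(freq_hz, amplitude, ref_freq_hz=1200.0,
                   freq_tol=0.05, amp_tol=0.30):
    """Classify a vibrating fork's condition from its resonant
    frequency and normalised oscillation amplitude (1.0 = nominal).

    The reference frequency and tolerances are illustrative only;
    a real instrument derives them from the sensor's calibration.
    """
    if amplitude == 0:
        return "fault: fork not oscillating (broken or shorted)"
    freq_shift = (freq_hz - ref_freq_hz) / ref_freq_hz
    if freq_shift < -freq_tol:
        return "warning: frequency low (possible build-up)"    # added mass
    if freq_shift > freq_tol:
        return "warning: frequency high (possible corrosion)"  # lost mass
    if amplitude < (1.0 - amp_tol):
        return "warning: amplitude damped (build-up or viscous product)"
    return "ok"

print(fork_diagnosis(1195.0, 0.95))  # prints "ok"
print(fork_diagnosis(1100.0, 0.90))  # frequency low -> build-up warning
```

The same comparison against a stored reference is what lets the electronics report a problem before the switching function itself fails.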

In addition to diagnostic capabilities and safety, users nowadays are calling for the simplest possible instrument handling. Do your level switches also follow the motto of the plics platform, ‘simpler is better’?

Holger Sack: Absolutely. The whole idea of the plics® concept is to make level and pressure measurement as simple as possible for each and every customer. The customer doesn’t have to be an instrument engineer to be able to use our instruments. His job is to control the process and keep it running smoothly – by asking questions about his process and its requirements, we are able to recommend the instrument that is best suited for the application. VEGASWING 66 is also designed according to the modular instrument platform plics. This means that the customer can combine different components as required. But plics is more than that: it’s a system designed to make handling easier for the customer throughout the entire life cycle of the instrument.

Another important point is delivery time. 80 percent of our products are despatched within two to four working days – previously it took 6 to 10 weeks. Installation and setup are also greatly simplified by the modular system because, if the customer already knows how to operate one VEGA instrument, he can operate all the others just as well. Installation, adjustment and connection are completely standardised; this applies to 80 percent of our instruments. If servicing is required, our employees look after customers personally and make sure the job gets done quickly and without any fuss – because every servicing event also provides us with valuable feedback on the product and a chance to improve it and ourselves.

Phones, tablets and sensors!


Product innovation and competitive pricing are key factors for success

The rapid proliferation of smartphones and tablets has caused the global magnetic sensors market to boom. This growth curve will continue as declining price points make magnetic sensors affordable for mass-market penetration.

New analysis from Frost & Sullivan, Analysis of the Global Magnetic Sensors Market, finds that the market earned revenues of €1.20 billion (US$1.66b) in 2012 and estimates this to reach €2.56 billion (US$3.51b) in 2019. The study also outlines the prospective areas of growth globally, by end user and application.

Analysis of the Global Magnetic Sensors Market is part of the Sensors & Instrumentation Growth Partnership Service program. Frost & Sullivan’s related research services include: Global Temperature Sensors and Transmitters Market, Sensors in Laptop, Tablet, and Smartphone Global Markets, Global Sensor Outlook 2013, and Chinese Markets for Sensor Contract Manufacturing and Exports. All research services included in subscriptions provide detailed market opportunities and industry trends evaluated following extensive interviews with market participants.

With navigation emerging as a must-have feature in smartphones and cellular handsets, the need for digital compasses has spurred the global magnetic sensors market.

“Moreover, the mandatory fitment of magnetic sensors in vehicles due to regulations from the automotive sector has fuelled sale volumes,” said Frost & Sullivan Measurement and Instrumentation Senior Industry Analyst V Sankaranarayanan. “Opportunities for magnetic sensor vendors in the automotive industry will continue to open up as new applications emerge.”

However, this immense potential may not translate into equivalent revenues as intense competition leads to pricing pressures. Manufacturers are reducing prices to penetrate markets such as consumer electronics, where a compelling price point is crucial for success. Profits are also affected as automotive manufacturers apply downward price pressure on sensor manufacturers.

To counter these challenges, innovative product differentiation strategies are necessary. Products must move from competing purely on price to non-price factors, especially as customers look for unique solutions with tangible benefits.

“Manufacturers are investing in technological advancements and the widening of magnetic sensor application areas,” noted Sankaranarayanan. “Magnetic sensors have already evolved greatly in terms of sensitivity, size, packaging and flexibility. Miniaturisation, in particular, will be a common trend in the global magnetic sensors market.”

Demand from Asia-Pacific and Latin America is likely to push technology and product development further.

The Gasmet story …from an academic idea to a global business

Dr Petri Jaakkola

In this article, Dr Petri Jaakkola, founder and Chairman of Gasmet Technologies Oy explains how an idea shared by a group of researchers at the University of Oulu in Finland, during the early 1970s, developed into one of the world’s leading gas monitoring instrumentation manufacturers.

Gasmet Technologies is a leading manufacturer of gas monitoring instruments and systems for industrial and environmental applications including Continuous Emissions Monitoring (Power Plants, Waste Incinerators, Cement Kilns etc.), Stack Testing / Comparison Measurements, Process Control, Industrial Hygiene / Indoor Air Quality, Engine Exhaust Gas, Semiconductor Manufacturing, Emergency / First Response Measurement (HAZMAT), Greenhouse Gas Monitoring, Carbon Capture, Fire Testing Emissions Monitoring and Research.
Gasmet FTIR gas analyzers can perform simultaneous measurement of both organic and inorganic compounds in even the most demanding applications including hot, wet and corrosive gas streams. Concentrations of up to 50 different compounds can be measured within seconds.

Early history – technology development
From the outset, the Finnish researchers realised that the prospect of being able to measure multiple gases simultaneously was very exciting, but at the time it was difficult to identify the best market for the technology to address. Initially, the high-resolution FTIR technology developed in the University was only used for laboratory research applications. Environmental legislation was only in its formative stages, but conscious of a growing desire to protect air quality, the researchers felt that the long-term prospects for the technology were sound.

Using FTIR (Fourier Transform InfraRed) analysis, the researchers had shown that it was possible to collect a complete infrared spectrum (a measurement of the infrared light absorbed by molecules inside the sample gas cell) for gases, and from this it was possible to generate both qualitative and quantitative data for the measured gas.
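The quantitative step rests on the Beer-Lambert law, A = ε·c·L: absorbance is proportional to concentration times path length. A toy sketch under stated simplifying assumptions (two gases, two wavenumbers, made-up absorption coefficients – a real FTIR analyser fits full reference spectra across thousands of wavenumbers by least squares):

```python
def solve_two_gas(absorbances, coeffs, path_length_cm):
    """Solve Beer-Lambert (A = eps * c * L) for two gas
    concentrations from absorbances at two wavenumbers.

    absorbances: (A1, A2) measured at two wavenumbers
    coeffs: 2x2 matrix, coeffs[i][j] = absorption coefficient of
            gas j at wavenumber i (hypothetical values below)
    Returns (c_gas1, c_gas2) via a direct 2x2 linear solve.
    """
    (a11, a12), (a21, a22) = coeffs
    A1, A2 = absorbances
    det = a11 * a22 - a12 * a21
    c1 = (A1 * a22 - A2 * a12) / (det * path_length_cm)
    c2 = (a11 * A2 - a21 * A1) / (det * path_length_cm)
    return c1, c2

# Hypothetical coefficients for two gases at two wavenumbers:
eps = [[0.50, 0.05],   # wavenumber 1: gas 1 absorbs strongly
       [0.02, 0.40]]   # wavenumber 2: gas 2 absorbs strongly
c1, c2 = solve_two_gas((1.1, 0.84), eps, path_length_cm=1.0)
# c1 ≈ 2.0, c2 ≈ 2.0 (by construction of the example absorbances)
```

Extending the same idea to many wavenumbers and many reference spectra is what lets a single instrument quantify dozens of compounds from one measured spectrum.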

The first commercial partner was the company Scanoptics Oy, which worked on a high-resolution FTIR spectrometer for the laboratory market and succeeded in selling two units to Finnish universities in the early 1980s. Since this business approach did not prove successful, the company changed its focus to developing a low-resolution FTIR spectrometer for industrial applications, producing the initial prototype in the late 1980s.

It is important to note that FTIR was also being developed elsewhere for use in the laboratory for qualitative analysis of liquids and solids. Crucially, if Scanoptics had chosen this market in the early years, the technology would have been unlikely to be suitable for emissions monitoring (which became the main market for Gasmet’s FTIR) because sufficient emphasis would not have been given to the ruggedness of the technology to withstand demanding industrial conditions.

However, since 1959 in Finland, it has been compulsory to include a shelter in any building over a certain size. Naturally, the quality of air inside these shelters is a primary concern, so Finntemet Oy, a Finnish manufacturer of shelters and blast resistant and gastight doors, was interested in gas analysis, and in 1990 they acquired Scanoptics’ FTIR business. This soon led to the creation of Temet Instruments Oy, which at that stage consisted of just four scientists; one working on electronics, one on the optical components, one on the mechanical design and one on software. At that time, there were no commercial or business development staff, so the focus was solely on the development of the technology.

The first Gasmet FTIR gas analyser was sold in 1993 – its serial number was #9301, and it was used successfully until around 2005. Long-term reliable service was a key objective for the Temet FTIR instruments and for this reason most of the company’s early years were spent improving the technology. This naturally created financial pressure, but the investment proved worthwhile as new environmental legislation created a heavy demand for emissions monitoring equipment.

The growth of FTIR
In the United States, the 1990 amendments to the Clean Air Act addressed toxic air pollution and established a national permits program for stationary sources, and increased enforcement authority. As a result, the USA was initially the largest market for emissions monitoring equipment. However, in the early days, the reputation of FTIR was tarnished by companies that tried to adapt high-resolution laboratory FTIR for emissions monitoring applications. As a result, for early adopters, it was necessary for the Temet FTIR to be installed in customers’ processes so that they could see the flexibility and reliability for themselves.

International performance certification schemes have been extremely influential in building confidence in FTIR and in the growth of the Gasmet business. However, certification by organisations such as TÜV in Germany and MCERTS in the UK, is a costly and time-consuming process. Nevertheless, environmental regulations around the world have increasingly required monitoring equipment to be certified and this has been a great benefit to Gasmet.

In addition to a reliance on the development of environmental legislation, it was also necessary to overcome market resistance to multicomponent analysers. In the 1990s process managers with a regulatory requirement to monitor a small number of specific gases preferred to buy individual analysers for each gas and if the number of gases was relatively low, this was generally less costly than a multi-component FTIR analyser. However, the operators of FTIR analysers were able to demonstrate a number of important advantages. Firstly, they were able to measure an almost limitless number of other chemical compounds that helped them to better manage their processes. Secondly, as legislation developed, it became necessary to measure new compounds for compliance purposes, and this was simple and cost-free for the users of FTIR, whereas those with single-gas analysers needed to purchase new hardware.

Many factors affect the choice of analyser, but the regulatory requirement is of course the most significant. A coal-fired power station, for example, may only be required to monitor SO2, NOx and CO emissions, whereas a municipal waste incineration plant will have to monitor other parameters such as organic compounds, HCl, HF, dioxins and heavy metals.

FTIR is ideal for process operators that need to:
1) analyse multiple components,
2) analyse hot/wet gas, or
3) analyse any gas in complicated gas mixtures.

Management Buyout
In the early 2000s, demand for the company’s products grew very quickly requiring high levels of investment. This led to a management buyout in 2005 which resulted in the formation of Gasmet Technologies Oy. By that time, all of the company’s research, development, manufacturing and head office operations had been brought into one facility in Helsinki. Sales and service activities were undertaken by a global network of distributors and subsidiary offices in Hong Kong (2005), Canada (2009), and in Germany (2013).

The strategy of keeping all major functions in one facility was given a high level of priority because it enabled Gasmet to control the quality of key components such as the FTIR’s interferometer and also because it ensured that a high level of expertise was available before and after sales.

New Technologies
FTIR remains the most important analytical technology employed by Gasmet. However, until 2004 all of the analysers functioned by extracting a representative sample from a process or emission stream. In that year, the world’s first in-situ FTIR gas analyser was launched.

Portable Ambient FTIR Analyser

Later, in 2008, Gasmet launched the world’s first portable ambient FTIR analyser, the DX-4030, capable of analysing large numbers of compounds simultaneously. This instrument brought Gasmet into many new monitoring applications; helping users to identify and measure almost any gas. At the time of writing (July 2013) the latest version, the DX-4040 remains unique and has been employed in occupational safety, incident response, Hazmat, chemical spill and fire investigations, shipping container testing, anaesthetic gas detection, greenhouse gas research and many others.

In 2013, Gasmet launched a new continuous mercury monitor (CMM). Coinciding with a new global treaty to reduce mercury emissions, the new CMM could not have been better timed. Mercury is recognised as a chemical of global concern due to its long-range transport in the atmosphere, its persistence in the environment, its toxicity, its ability to bio-accumulate in ecosystems and its significant negative effect on human health. The CMM employs cold vapour atomic fluorescence (CVAF) to deliver very low detection limits at a significantly lower cost than other comparable mercury monitoring instruments.

Why has Gasmet been successful in the global market?
Gasmet has diversified significantly since its inception, which has helped to reduce business risk. However, emissions monitoring remains the largest application, and it presents challenging conditions for high-tech instruments: they are expected to run all day, every day, so the technology must be robust and very reliable. Typically, Gasmet’s analysers last from ten to fifteen years, significantly out-performing the computers that are employed to run them.

Every Gasmet customer is different, which means that almost every analyser is unique. Therefore, Gasmet and its distributors have invested heavily in technical support so that customers receive bespoke monitoring systems, tailored to meet their specific needs. The company’s philosophy is to sell solutions, not just instruments.

By manufacturing in-house Gasmet maintains a firm control not just over the quality of the products, but also the costs because there are no extra margins for outsourced sub-contract manufacturers. The quality of the products is further reinforced by the investment that has been made in performance verification/certification.

Finally, given that environmental legislation was only in its very formative stages when the fundamental research was performed at the University of Oulu, it is astonishing that two decades later Temet was able to build a business around it, and that the researchers’ foresight would lead to the creation of one of the world’s leading manufacturers of environmental monitoring instrumentation.

It is fortunate that the workers in Scanoptics chose to develop an analyser that would be ideal for continuous emissions monitoring. However, the definition of luck is: ‘preparation meeting opportunity’ and that is what happened with Gasmet’s FTIR and global environmental legislation.

2013: a ‘perfect storm’ in air quality!

500,000 people die prematurely in the EU from “Black Carbon”

As many countries fail to meet air quality targets and large numbers of premature deaths still result from air pollution, Jim Mills, a particulate monitoring specialist and Managing Director of Air Monitors, explains why 2013 will be a pivotal year.

London, one of the great cities of the world, recently marked the 60th anniversary of the 1952 Great Smog – “the worst air pollution event in the history of the United Kingdom”. This article is therefore timely, but demonstrates that, whilst there has been an improvement in air quality, there is still a way to go before the problem may be said to be solved.

Jim Mills

In November 2011 Janez Potočnik, European Commissioner for the Environment, expressed his determination to make 2013 the ‘Year of Air’. He acknowledged that there has been substantial improvement in air quality in recent decades, but in the light of the environmental and climate issues surrounding air quality and the large number of premature deaths resulting from air pollution, he said: “The challenge for all of us is to address the shortcomings of existing regulations in a decisive and co-ordinated way. This will require the goodwill of policy-makers at all levels – European, national, regional and local – as well as other stakeholders such as the automotive and oil industries.” (See also: Many Europeans still exposed to harmful air pollutants – 24/9/2012)

The European Environment Agency’s 2011 report on air quality reflects air quality improvements for a number of key parameters, with concentrations of sulphur dioxide and carbon monoxide falling by about half in the decade ending in 2009. However, the report also shows that in 2008, levels of nitrogen oxide, ozone and particulate matter rose, fuelling concerns about overall air quality, especially in urban areas.

According to the Commission, some 500,000 people die prematurely in the EU-27, mainly due to exposure to high levels of fine particulate matter (atmospheric microparticles or ‘dust’ with a diameter of less than 2.5 micrometres), which originates from residential heating, transport (diesel cars and trucks, ships and planes), agriculture, industrial processes and power production.

Particulate pollution continues to be a major problem, despite the considerable progress that has been made in the reduction of larger particulates such as PM10. This is due, in no small part, to the standard monitoring methodologies that have been adopted: particulates are generally monitored as the PM10 or PM2.5 fraction, whereas it is widely acknowledged that the finer particles (< 1 micron) are able to penetrate deeper into the lungs and are responsible for the most severe health effects.

Black Carbon
A further problem associated with tiny particles is their ability to act as “sponges”, carrying small amounts of toxic species such as PAHs and dioxins which are adsorbed onto Black Carbon particles and transported deep into the body. PM10 and PM2.5 measurements provide a total figure for everything with mass in the sample and thereby assume that all particles are of equal significance. In reality this is not the case, because some of the particles are benign from a human health perspective, or are not anthropogenic and so are of less interest from an air quality management perspective. (See also the article “Black Carbon”, 14/10/2011.)

It is fortunate that the fine particles of most interest – Black Carbon from the combustion of fossil fuels – can be measured with an Aethalometer, which employs an optical method to measure only those fine particles which are black. Importantly, an Aethalometer can provide a real-time readout of the mass concentration of ‘Black’ or ‘Elemental’ carbon aerosol particles in the air, which means that live data can be used to manage the main contributor of urban Black Carbon: road traffic. However, while the importance of Black Carbon is becoming widely acknowledged, air quality monitoring standards need to be adapted so that Black Carbon monitoring is included in all national ambient monitoring stations.
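The underlying relation can be sketched roughly as follows (a simplified form that ignores the filter-loading and scattering corrections a real Aethalometer applies; all numbers below are illustrative assumptions): the Black Carbon mass collected on the instrument's filter spot is proportional to the increase in optical attenuation of that spot, and dividing by the volume of air sampled in the interval gives a concentration.

```python
def bc_concentration(atn_start, atn_end, spot_area_m2, sigma_atn_m2_per_g,
                     flow_m3_per_min, minutes):
    """Estimate Black Carbon mass concentration from the increase in
    optical attenuation of a filter spot (simplified Aethalometer
    relation, ignoring filter-loading and scattering corrections).

    ATN is defined as 100 * ln(I0/I); sigma_atn is the mass
    attenuation cross-section (an illustrative value is used below).
    Returns concentration in micrograms per cubic metre.
    """
    delta_atn = atn_end - atn_start
    bc_mass_g = spot_area_m2 * (delta_atn / 100.0) / sigma_atn_m2_per_g
    air_volume_m3 = flow_m3_per_min * minutes
    return bc_mass_g / air_volume_m3 * 1e6   # g/m3 -> ug/m3

# Hypothetical 5-minute interval at 5 L/min through a 0.5 cm^2 spot:
conc = bc_concentration(atn_start=10.0, atn_end=10.8,
                        spot_area_m2=0.5e-4, sigma_atn_m2_per_g=16.6,
                        flow_m3_per_min=0.005, minutes=5.0)
# conc ≈ 0.96 ug/m3, a plausible urban Black Carbon level
```

Because each short interval yields a fresh concentration value, the output is effectively a live trace – which is what makes the technique useful for managing traffic-related pollution in real time.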

A further consideration with Black Carbon is its role in climate change because, after carbon dioxide, it is believed to be the second largest contributor to current global warming. Black Carbon increases global warming by absorbing sunlight, darkening snow and influencing the formation of clouds; its effects are most noticeable at the poles, on glaciers and in mountainous regions – all of which are exhibiting the greatest effects of climate change.

Black Carbon stays in the atmosphere for a relatively short period of time – from days to weeks, before falling to ground as a result of dry deposition or precipitation. This is an important consideration in global strategy to combat climate change because CO2 stays in the atmosphere for many decades, so emissions reductions will take a long time to have an effect, whereas efforts to reduce Black Carbon could have a much faster beneficial impact on global warming.

In June 2011, a UN Environment Programme (UNEP) study estimated that ‘near-term’ global warming could be quickly reduced by 0.5 degrees Celsius by a reduction in Black Carbon emissions and that this would have an even greater benefit in the Arctic where it could reduce warming by 0.7 degrees.

Global action
The importance of short-lived pollutants was recognised by the eight richest industrialised nations, which agreed at the recent G8 meeting in Maryland, USA, to take emissions reduction measures for short-lived climate pollutants, including Black Carbon, methane, ground-level ozone and hydrofluorocarbons.

The Camp David Declaration of May 2012 included the following:
‘Recognizing the impact of short-lived climate pollutants on near-term climate change, agricultural productivity, and human health, we support, as a means of promoting increased ambition and complementary to other CO2 and GHG emission reduction efforts, comprehensive actions to reduce these pollutants, which, according to UNEP and others, account for over thirty percent of near-term global warming as well as 2 million premature deaths a year. Therefore, we agree to join the Climate and Clean Air Coalition to Reduce Short-lived Climate Pollutants.’

British air quality
In common with most of Europe, the air quality improvements of recent decades have stalled and Britain and others are failing to meet many domestic and European air quality targets. The main parameters of concern are Nitrogen Oxides (from vehicles and electricity generation), Ozone (formed by a reaction between nitrogen oxides and organic gases) and Particulates (from combustion sources).

In 1952 over 4,000 Londoners (above the ‘normal’ mortality rate) are believed to have died as a result of the Great Smog and this led to the introduction of the Clean Air Act of 1956. However, in 1992, the Department of Health set up a Committee on the Medical Effects of Air Pollutants (COMEAP) which concluded that up to 24,000 deaths were still being ‘brought forward’ in the UK in 1995/1996 due to the effects of air pollution.

More recently, the British Parliament’s Environmental Audit Committee published a report which highlights the country’s poor performance on air quality, and the subject has become a hotly debated issue.

Public awareness
Local Authorities monitor ambient air quality and publish data for the benefit of the public. However, as a result of public sector cutbacks, environmental health professionals are focused on maintaining performance whilst implementing cost savings. Nevertheless, there are a number of opportunities for improvement. Air quality needs to be higher on the political agenda, and this can only be achieved if more people are aware of the problems, so we have to find ways to make it simpler for people to access easy-to-understand data. Happily, technology has advanced considerably and it will soon be possible to supplement existing monitoring networks with smaller, low-cost ambient monitoring stations.

The main advantage of this technology, known as ‘AQMesh’, is that it can be positioned almost anywhere, improving spatial coverage and enhancing models. This will help ensure that readings are more representative of the air that people breathe. The AQMesh ‘pods’ are completely wireless, using battery power and GPRS communications to transmit data for the main air-polluting gases to ‘the cloud’, where sophisticated data management will generate highly accurate readings as well as monitor hardware performance.

Traditional ambient monitoring stations have often been criticised because their physical location may limit their ability to provide representative data, so the ability to site low cost AQMesh pods in multiple locations, close to vehicles, passengers and pedestrians, will be a tremendous advantage.

‘What gets monitored, gets managed’
Marcus Pattison, one of the organisers of AQE 2013 (the air quality and emissions monitoring event) is a firm believer in the critical importance of monitoring for driving improvements. He says “The air quality progress that we have seen in recent decades has largely resulted from our ability to set targets and monitor our performance against them and this is why the Environment Agency, local authorities and the Source Testing Association are the driving forces behind events such as AQE 2013.”

AQE 2013 will be the world’s largest event to focus specifically on air quality monitoring, and the seventh in this series of specialist air monitoring events, which were previously known as ‘MCERTS’. Taking place at the International Centre in Telford, England, on 13th and 14th March 2013, the focus, as with previous events, will be on testing and monitoring, but ambient air quality will be addressed in addition to environmental emissions and workplace exposure. As a result, the theme of the first day of the conference will be ambient air quality monitoring and that of the second day will be industrial emissions to air.

In common with the previous MCERTS events, AQE 2013 will also include 70 free walk-in/walk-out workshops and an exhibition featuring over 70 of the world’s leading organisations in air quality and emissions monitoring products and services.

Marcus Pattison is obviously delighted that Janez Potočnik has designated 2013 as the ‘Year of Air’ because “It creates a ‘perfect storm’ of activity in air quality; the new Industrial Emissions Directive is now in place, public awareness of air quality issues is growing and the Commissioner’s work will help to ensure progress, so I believe that AQE 2013 will be a timely event, making a powerful contribution to 2013 becoming the Year of Air.”

Remote-control boat speeds reservoir surveys


As the regulatory requirement, in Britain and elsewhere, to assess reservoirs and lakes expands to include smaller bodies of water, HR Wallingford has developed a remote control boat which is able to collect hydrometric data quickly, simply, safely and accurately.

ARC Boat

The ARC-Boat employs a sophisticated SonTek M9 Acoustic Doppler Profiler (ADP®), a five-beam depth-sounding device that scans the reservoir bed as the boat is guided across the water’s surface. Recorded data are analysed by SonTek HydroSurveyor software to produce accurate depth measurements in addition to 3-D maps of the entire water body. With a small amount of post-processing in GIS or 3D CAD, an accurate water volume can be determined.
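The volume calculation in that final post-processing step is conceptually simple: sum depth times cell area over a gridded bathymetric surface. Below is a minimal sketch, for illustration only (it assumes a regular grid of depth soundings in metres with a uniform cell size, and is not the actual GIS or SonTek workflow):

```python
# Estimate reservoir volume from a regular grid of depth soundings.
# A simplified illustration of the GIS post-processing step, not the
# SonTek software: depths are in metres, grid cells are square, and
# dry cells (zero or negative depth) contribute nothing.

def reservoir_volume(depths, cell_size_m):
    """Sum of depth x cell area over all wet cells, in cubic metres."""
    cell_area = cell_size_m ** 2
    return cell_area * sum(d for row in depths for d in row if d > 0)

# Example: a 3 x 3 grid of soundings at 10 m grid spacing.
grid = [
    [0.0, 1.2, 0.5],
    [1.5, 4.0, 2.1],
    [0.8, 2.2, 0.0],
]
print(reservoir_volume(grid, 10.0))  # total volume in cubic metres (~1230)
```

In practice the GIS step would first interpolate the scattered soundings onto such a grid; the summation itself is as simple as shown.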

Craig Goff, a reservoir Supervising Panel Engineer and dam specialist at HR Wallingford, has used the ARC-Boat in a trial project to assess five reservoirs and says: “This new method offers tremendous advantages over traditional manned boat techniques because it is faster, safer, more environmentally friendly and involves fewer staff and resources. All of this combines to mean that it saves a great deal of time and money. This is particularly important because the Flood and Water Management Act 2010 will necessitate the volumetric assessment of many water bodies that have previously been below the threshold and therefore outside of the ambit of the Reservoirs Act 1975.”

Reservoir regulations
As a result of residential and industrial development in recent decades, the levels of risk associated with many British reservoirs have changed, and the British Flood and Water Management Act 2010 has amended their Reservoirs Act 1975 to bring a more risk-based approach to reservoir regulation. The 2010 Act seeks to achieve this by:

1. reducing the capacity at which a reservoir will be regulated from 25,000m³ to 10,000m³
2. requiring all Undertakers with reservoirs over 10,000m³ to register their reservoirs with the Environment Agency
3. ensuring that only those reservoirs assessed as high risk are subject to full regulation

The reservoir sections of the 2010 Act are dependent upon the development of secondary legislation, which is likely to specify the reservoir capacity above which water bodies will be regulated. However, irrespective of the content of this secondary legislation, the Flood and Water Management Act 2010 has clearly generated an urgent need for reservoir assessment, and the application of the ARC-Boat for reservoir bathymetry is therefore propitious.
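For illustration, the practical effect of the lowered threshold can be expressed as a simple rule. The sketch below is a deliberate simplification with assumed names; under the 2010 Act, full regulation also depends on a high-risk designation, which this sketch ignores:

```python
# Illustrative check against the amended registration threshold.
# The 2010 Act lowers the regulated capacity from 25,000 m3 to 10,000 m3;
# in reality full regulation also depends on a high-risk designation,
# which this simplified sketch ignores.

OLD_THRESHOLD_M3 = 25_000   # Reservoirs Act 1975
NEW_THRESHOLD_M3 = 10_000   # Flood and Water Management Act 2010

def newly_in_scope(capacity_m3):
    """True for reservoirs captured by the 2010 Act but not the 1975 Act."""
    return NEW_THRESHOLD_M3 < capacity_m3 <= OLD_THRESHOLD_M3

print(newly_in_scope(18_000))  # True  - must now be registered
print(newly_in_scope(30_000))  # False - was already regulated
```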

The ARC-Boat has been designed with a V-shaped hull to give optimal manoeuvrability and minimal air entrainment beneath the ADP, ensuring high-quality data collection. The robust and reliable design, including grab handles fitted to the upper deck, means that the boat can be launched from the most difficult locations, and a unique detachable bow means that the ARC-Boat can easily be transported in an average-sized car.

SonTek M9

The SonTek M9 is a nine-beam acoustic Doppler profiler that uses five beams at any one moment for depth measurements across a wide footprint on the water bed. This means that the time spent ‘driving’ the boat is minimised in comparison with single-beam instruments. Importantly, the M9 is able to operate in depths ranging from 15cm to over 40m.

The boat employs industry standard remote control with a minimum range in excess of 200m and Bluetooth communications provide data transmission to an onshore laptop.

Data Management
HydroSurveyor™ is a system designed to collect bathymetric, water column velocity profile, and acoustic bottom tracking data as part of a hydrographic survey. The two key components of the system are the HydroSurveyor™ Acoustic Doppler Profiler (ADP®) platform, and the powerful, yet user-friendly, data collection software.

With the HydroSurveyor™ platform, SonTek is able to offer an exclusive 5-beam depth-sounding device with built-in navigation, full water column velocity (current) profiling, full compensation for speed of sound (with the CastAway-CTD), and an integrated positioning solution.
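Sound-speed compensation matters because an echo sounder converts two-way acoustic travel time into depth, and the speed of sound varies with water temperature and salinity. The sketch below illustrates the principle using Medwin’s simplified sound-speed formula (an assumption for illustration; the CastAway-CTD measures the actual sound-speed profile rather than relying on such an approximation):

```python
# Depth from acoustic two-way travel time, with the speed of sound
# estimated from water properties via Medwin's simplified formula.
# Illustrative only: the HydroSurveyor/CastAway-CTD workflow measures
# the sound-speed profile directly instead of approximating it.

def sound_speed(temp_c, salinity_ppt, depth_m):
    """Medwin's approximation for the speed of sound in water (m/s)."""
    t, s, z = temp_c, salinity_ppt, depth_m
    return (1449.2 + 4.6 * t - 0.055 * t**2 + 0.00029 * t**3
            + (1.34 - 0.010 * t) * (s - 35.0) + 0.016 * z)

def depth_from_echo(travel_time_s, temp_c, salinity_ppt=0.0):
    """Depth = sound speed x half the two-way travel time."""
    return sound_speed(temp_c, salinity_ppt, 0.0) * travel_time_s / 2.0

# The same 27 ms echo reads as a different depth in 15 C and 25 C water:
print(depth_from_echo(0.027, 15.0))  # about 19.8 m
print(depth_from_echo(0.027, 25.0))  # about 20.2 m
```

Without compensation, the few-percent variation in sound speed translates directly into a few-percent depth error, which matters when volumes are computed from the whole surface.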

HydroSurveyor real-time data shot

Trial Results
Craig Goff is extremely pleased with the results of the initial trials on five reservoirs in southern England. He says: “The M9 performed very well, running from 8am to 4.30pm each day on a single set of batteries. We were able to conduct the surveys much faster than has ever been possible before, without the health and safety risks of putting staff over water and the environmental risks of diesel powered larger survey boats. Most importantly, however, we were able to produce high quality accurate data for a modest price and our client was very pleased with the results.”

Applications for the ARC-Boat
In addition to the smaller reservoirs that will have to be surveyed, larger reservoirs will be able to take advantage of the new technology to assist in operations such as the creation of sedimentation models. These models inform strategies to prevent capacity depletion and to extend the lives of reservoirs through flushing, excavation, dredging etc. Similarly, ARC-Boat surveys can be employed around submerged hydropower or draw off pipe intakes to assess sedimentation levels – a vitally important role because sediment can seriously damage turbines, or influence operation of scour pipes or water supply draw off pipes from reservoirs.

As a result of the Flood and Water Management Act 2010, the owners of small reservoirs will need to prove whether their water bodies are affected by the amended Reservoirs Act 1975 by determining an accurate volume figure for their reservoirs. Typically, this will include landowners, farmers and organisations such as the National Trust. However, the development of the ARC-Boat with the M9 and the latest HydroSurveyor™ software means that such work is now faster, safer and significantly lower in cost. This is good news for the owners of smaller reservoirs, for whom any survey cost is a new cost.



Rocket science! FTIR analysis in space!

FTIR gas analysis in the testing of satellite launch systems and on board satellites

Europe’s leading space technology company, Astrium, has employed a sophisticated portable FTIR gas analyser as part of a test programme for satellite launch systems. The analyser, a ‘Gasmet DX4030’, was supplied by instrumentation specialist Quantitech.

Astrium Propulsion Test & Launch Services Manager, Greg Richardson, says: “Many of the satellites that we design, build and launch are worth millions of Euros, so our test methods have to be extremely rigorous.

“The DX4030 was chosen because of its ability to provide highly accurate results for almost any gas. However, its intuitive software, compact size and portability were significant considerations because we use the technology at a number of our locations around the world.”

One of the tests that are performed on the propulsion systems is to check the integrity of the chambers that contain rocket fuel. To achieve this, the tanks are filled with a simulant (often isopropyl alcohol and demineralised water) and exposed to launch simulation conditions – pressure, heat, vibration etc. The simulant is then removed and the DX4030 is used to check for contamination or leaks.

LISA Pathfinder

The DX4030 was first utilised in the testing of the LISA Pathfinder, a project for which Astrium was selected by the European Space Agency to build and launch a spacecraft packed with radical instrumentation and technology. It will pave the way for LISA (Laser Interferometer Space Antenna), the world’s first space-based gravitational wave detector, which will open a new window on the Universe by measuring gravitational waves generated by exotic objects such as collapsing binary star systems and massive black holes. In doing so, the project will test a phenomenon predicted by Einstein’s General Theory of Relativity in 1916.

Star census
The analyser has also been used in testing the Gaia satellite which will conduct a census of a thousand million stars in our Galaxy, monitoring each of its target stars about 70 times over a five-year period. Gaia will precisely chart their positions, distances, movements, and changes in brightness. It is expected to discover hundreds of thousands of new celestial objects, such as extra-solar planets and failed stars called brown dwarfs. Gaia should also observe hundreds of thousands of asteroids within our own solar system.

DX 4040 with PDA

The DX4030 employs FTIR gas detection technology, obtaining infrared spectra by first collecting an ‘interferogram’ of the sample signal with an interferometer; a Fourier transform of this interferogram then yields a spectrum in which all infrared frequencies are measured simultaneously. This means that data are collected for the required parameters in addition to spectra for almost all others.
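The link between interferogram and spectrum is a Fourier transform. The following toy example with synthetic data (not the instrument’s actual processing chain, which also involves apodization and phase correction) shows the principle, recovering two spectral lines from an interferogram:

```python
# Toy FTIR illustration: the recorded interferogram is (ideally) the
# Fourier transform of the source spectrum, so transforming it back
# recovers that spectrum. Real instrument processing also involves
# apodization and phase correction, which are omitted here.
import numpy as np

n = 1024
x = np.arange(n)  # optical path difference, in arbitrary sample units

# Interferogram of a synthetic source containing two spectral lines:
interferogram = (np.cos(2 * np.pi * 50 * x / n)
                 + 0.5 * np.cos(2 * np.pi * 120 * x / n))

spectrum = np.abs(np.fft.rfft(interferogram))
peaks = np.argsort(spectrum)[-2:]  # indices of the two strongest bins
print(sorted(int(p) for p in peaks))  # -> [50, 120]
```

Because the transform captures the whole band at once, every absorbing species leaves its signature in the same measurement, which is why spectra for ‘almost all others’ come for free.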

Sample identification is possible because chemical functional groups absorb light at specific frequencies. As a result, the DX4030 can measure any gas, with the exception of noble (or inert) gases, homonuclear diatomic gases (e.g., N2, Cl2, H2, F2, etc.) and H2S (detection limit too high).

Commenting on the work at Astrium, Quantitech’s Dr Andrew Hobson said: “This has to be one of the more unusual applications for the DX4030. It is more commonly used for chemical spill, security and forensic investigations, and for occupational health, anaesthetic gas monitoring and research. The same technology is also employed to monitor industrial processes and gaseous emissions. However, Astrium’s work clearly demonstrates the flexibility of the device and we are delighted to have been involved.”

Photoelectric sensors collapse and rebound!

Market for photoelectric sensors faces increasing demand for smart sensors and price pressure

The market for photoelectric sensors experienced a collapse and a dramatic rebound over the last three years, and is now back to the development pattern seen in the past. As the market is strongly dependent on the investment climate, the situation has recently worsened, but ARC still expects rather positive development in the coming years.

The market for smart sensing in the area of photoelectric sensors refers to all sensors that expand the traditional capabilities of fixed measuring and switching.

“Photoelectric sensors have long been in a position in which there was simply no alternative, but now ultrasonic sensors as well as low-end vision sensors are targeting the same applications. While photoelectric sensors are still price competitive, they also add value for end users with more functions,” says ARC Analyst Florian Güldner, the principal author of ARC’s Photoelectric Sensors Worldwide Outlook.

Dependency on Investment Cycles Challenges Suppliers
The demand for sensing is increasing with overall demand for sensors rising faster than for industrial automation in general.  Still, the investment climate overshadows technological effects and trends from the plant floor.  Growth in photoelectric sensors is directly linked to the business cycle.

Automation demand is often supported by the spare parts business, modernization projects, and longer project lead times, but sensor suppliers cannot count on these dynamics. In contrast, the relatively high share of sales through distributors amplifies the effects of investment cycles, as distributors empty or fill their stocks at the beginning of a development.

Flexibility and the willingness to diversify are key characteristics of successful sensor suppliers.  Many are adopting a more solution-oriented business model targeted at measurement, quality control, or safety applications.

Is Smart Sensing a Benefit for the End User?
Smart sensors make the lives of machine builders easier as they reduce the number of suppliers and parts, and help to reduce engineering time. Smart sensing includes photoelectric sensors that can self-adjust to the environment, be configured remotely, house more than one sensing technology (light barrier, diffuse, etc.), be used for both measuring and switching, and help to detect errors such as a dirty lens or a broken cable.

With all these capabilities, sensors can help to make machines more flexible, shorten changeover times, and minimize planned and unplanned downtime. Whether or not photoelectric sensors are smart is only important if there is the right connection to the rest of the automation architecture. Here, IO-Link offers a good solution. However, the acceptance and awareness of IO-Link is still limited. With technologies like IO-Link, photoelectric sensors are becoming part of the automation hierarchy.

Will Asia Help Suppliers from Established Economies?
Asia is the fastest growing region in our study. However, some trends are dampening the impact on sensor suppliers: locally produced machinery is often simple and requires only limited numbers of simple, cheap switches. Local producers that serve a large portion of the market, especially in China, are forced to focus on high-volume, small-margin sensors.

Sensor suppliers that focus on direct sales sometimes have difficulty in the Chinese market, which is primarily served by distributors, and thus need to re-think their go-to-market strategy. This can also affect their way of doing business, which is typically solution-focused and based on close customer relationships. Such companies benefit from global end-user customers that also produce in China, but have also adapted to the new market.

Profiles for the major suppliers for this market are included in this report.  Each profile reviews the company’s business, products, and services as it applies to this market.  Suppliers profiled include Balluff, Banner Engineering, Baumer Electric, Contrinex, Keyence, Leuze, OMRON, Pepperl+Fuchs, Rockwell Automation/Allen Bradley, Schneider Electric, SICK, Sunx, Yamatake.

Asia to be the new magnet for sensor manufacturers!

Sensors to benefit from shift toward better automation

New and expanding applications, coupled with the shift to enhanced automation processes and controls, are restoring growth to the global market for proximity and displacement sensors, which suffered negative growth rates in 2009.

New analysis from Frost & Sullivan, Analysis of the Proximity and Displacement Sensors Markets, finds that the market earned revenues of $2,427.5 million (ca €1950m) in 2011 and estimates this to reach $3,048.1 million (ca €2450m) in 2018. The research covers inductive, photoelectric, capacitive, magnetic, ultrasonic and LVDT sensors.
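Those two figures imply a compound annual growth rate of roughly 3.3 percent over the seven-year span, as a quick check with the standard CAGR formula shows:

```python
# Implied compound annual growth rate from the Frost & Sullivan revenue
# figures: $2,427.5 million in 2011 growing to $3,048.1 million in 2018.

def cagr(start_value, end_value, years):
    """CAGR = (end / start) ** (1 / years) - 1."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

rate = cagr(2427.5, 3048.1, 2018 - 2011)
print(f"{rate:.1%}")  # about 3.3% per year
```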

“The need for better automation is expected to allow for the conversion from older and less sophisticated controls to state-of-the-art automation,” notes Frost & Sullivan Senior Industry Analyst V. Sankaranarayanan. “As a result, the number and range of sensors used in equipment is increasing.”

Due to the rising sophistication in manufacturing processes, end-users are demanding more functionality from proximity sensors. Advanced network technologies (CompoNet and IO-Link) and diagnostic capabilities are some of the technical advancements that are also anticipated to boost market prospects.

Growth in mature markets such as Western Europe and North America is expected to be slow. The potential for further growth is limited, as most industrial processes are already using proximity and displacement sensors. In contrast, Asia is becoming progressively more important due to surging production and automation.

“Production in emerging economies, such as China and India, is becoming increasingly automated,” explains Sankaranarayanan. “Robust economic growth in these regions is expected to fuel the demand for proximity and displacement sensors.”

Proximity sensors find application in almost every industry (due to the importance of feedback), underlining the widespread consumer demand for them. In addition to China and India, growth opportunities are also surfacing in other smaller Asian countries that have embarked on a path of economic development.

Keys to success will be emerging network technologies, solutions instead of products, regional growth markets and a successful distribution strategy.

“It is important to offer more than just a sensor; market participants will have to focus on providing complete solutions,” concludes Sankaranarayanan. “Price pressures will continue to pose a challenge, so vendors will need to constantly advance on the technological front.”

Rising demand and competition drives proximity sensor market


After the devastating recession in 2009, the market for proximity sensors recovered quickly in 2010 and marginally exceeded pre-crisis levels in 2011. In 2012, we see a slowdown in growth, but positive momentum will continue to dominate market developments from 2013 onwards. Overall, ARC expects a CAGR of around 8 percent during the forecast horizon.

The market for proximity sensors is strongly connected to the business cycle and the overall performance of automation markets. During the last few years, most products have commoditized and reached a mature state; the only exception is ultrasonic sensors, which still offer the potential to differentiate technically from competitors, and whose markets are growing fast despite falling prices. For inductive and capacitive sensors, prices have nearly bottomed out.

“The proximity sensor market is mature, highly competitive, and hosts a large number of suppliers.  This has created a market that appears settled, but actually has a lot of movement going on beneath the surface.  This especially includes brand labeling and partnering agreements,” according to Analyst Florian Güldner, the principal author of ARC’s “Proximity Sensors Worldwide Outlook”.

IO-Link Further on the Rise
Smart sensing is a market that we expect to grow at an above average rate during the forecast horizon.  The technology enables new applications and enhanced performance in existing applications.  Proximity sensors are normally not equipped with a microcontroller for signal processing simply for cost reasons.  We talk about smart proximity sensors if:

  • A sensor communicates more than its measured variable
  • A sensor has built-in intelligence to self-adjust to the environment or the detected object
  • A sensor can communicate with the controller or other devices to receive parameters
  • A sensor is enabled for band-sensing

The definition includes all devices using IO-Link.  The additional intelligence also adds complexity and cost, and ARC sees ultrasonic, photoelectric, and capacitive sensors as the first target markets for smart sensors.  IO-Link has a strong value proposition for end users, sensor manufacturers, and also machine builders.  “We see this technology growing much stronger during the forecast horizon,” according to Florian Güldner.

Ultrasonic Sensing Grows Faster than Market Average
Ultrasonic sensing is a fledgling market in the discrete sensing sector. The technical challenges, combined with relatively high R&D costs, have kept many suppliers of low-cost products from entering the market.

Compared to other discrete sensor markets, technological advancements are still possible. As the technology becomes more reliable and accurate, many applications which previously relied on photoelectric and capacitive technologies now use ultrasonic sensors. This can lead to increased competition with capacitive and photoelectric sensors. The high growth rates, in turn, will certainly attract new market participants. In general, extensive brand labeling is one of the key characteristics of the ultrasonic sensor market.

Technology transfer is more important than innovation.

Innovation, we are told, is the key to a successful manufacturing economy. Invent and patent something and you will make money on every one made – if it is mass produced, you get an ongoing income. Tony Ingham of Sensor Technology Ltd looks at his experience over two decades and tries to draw some general principles that may help his own and other countries in their ambition to rebuild the manufacturing base.

Tony Ingham

We are all familiar with the idea of the man who had a brilliant idea and got it to market – James Dyson and his revolutionary vacuum cleaner; Steve Jobs, who dropped out of college, tinkered with some electronics and set up Apple; Percy Shaw, the man behind the cat’s eye. And most of us are bright enough to know that there is an awful lot of hard work between having the idea and reaping generous rewards.

Nevertheless, the principle is sound, and of far greater significance than the inventor’s eventual personal wealth is the fact that whole companies and industries can be developed and sustained. These will employ many people and make an integral contribution to the national economy.

Sensor Technology is a company steeped in innovation. A large percentage of our staff are scientists, engineers and researchers who have pushed the boundaries of knowledge and technology. Over the years they have created many practical technologies that we have developed into successful commercial products, mainly in the field of sensing and measurement.

Of course there have been some failures along the way too. In fact, if you were to count up, probably more failures than successes. But the important thing is that the successes can be capitalised upon to more than cover the cost of the failures.

This is pretty much the standard model for innovation that economists talk about (it is also the model that politicians and pundits abbreviate to ‘one good idea and you are made for life’).

But my feeling is that this misses out one critically important point, namely ‘technology transfer’: introducing the invention to ever more sectors and possible new users (perhaps tweaking it or repackaging it to meet specific needs).

Popular examples of this include space technology (something developed for NASA transfers to military aircraft, then commercial ones, then other non-aerospace applications) and Formula 1. But it also occurs in more mundane fields, for instance automation technologies such as VSDs, PLCs and HMIs are increasingly built into consumer products.

My experience at Sensor Technology is probably typical of the more usual type of technology transfer. The simple truth is that we have invented relatively few new technologies, but we have applied each to many different areas, sometimes by our own endeavours, sometimes by licensing or selling the technology. Also it is sometimes a ‘secondary or supporting’ technology that can be transferred, as illustrated by our TorqSense and LoadSense products. TorqSense is very clever; it uses surface acoustic wave sensors to measure torque in rotating shafts and we have applied this to more and more fields.

However, to make TorqSense viable we had to develop a secondary technology, a non-contact way of collecting the data stream, so that we didn’t have to use slip rings. So we combined radio transmissions with piezo technology and got our solution.

In another area, we were talking to a helicopter pilot about an entirely different project when he mentioned that, because he had wired a cargo load gauge into his cockpit, he was going to have to get his craft’s airworthiness certificate expensively renewed. We asked whether a radio link between cargo hook and gauge would mean he hadn’t modified his aircraft, and assured him that we could develop just such a solution!

That was the light bulb moment and naturally there was a year or so of hard graft to get to a commercialisable product. But now a few years down the line we have a nice new business selling intelligent cargo hooks to helicopter operators around the world.

The important point is that if governments want to remodel a national economy to make engineering design and manufacturing stronger, then they need to encourage technology transfer as much as innovation.