What is on the list of trends for 2020?

06/12/2019
Data centre trends for 2020 from Rittal

Growing volumes of data, a secure European cloud (data control), rapid upgrades of data centres and rising energy consumption are the IT/data centre trends for Rittal in 2020. For example, the use of OCP (Open Compute Project) technology and heat recovery offers solutions for the challenges of the present.


According to the market researchers at IDC (International Data Corporation), humans and machines could already be generating 175 zettabytes of data by 2025. If this amount of data were stored on conventional DVDs, it would mean 23 stacks of data discs, each of them reaching up to the moon. The mean 27 percent annual rate of data growth is also placing increasing pressure on the IT infrastructure.

Since there is hardly any company that can afford to increase its own data storage by almost a third every year, IT managers are increasingly relying on IT services from the cloud. The trend towards the cloud has long been evident in Germany: a survey published in the summer of 2019 by the Bitkom ICT industry association together with KPMG showed that three out of four companies are already using cloud solutions.

However, businesses using cloud solutions from third-party providers do lose some control over their corporate data. The US Cloud Act (Clarifying Lawful Overseas Use of Data), for example, allows US authorities to access data stored in the cloud even if local laws at the storage location prohibit this.

“Business success will only be sustainable if companies keep pace with full digital transformation and integration. Companies will use their data more and more to provide added value – increasingly in real time – for example in the production environment,” says Dr Karl-Ulrich Köhler, CEO of Rittal International. “Retaining control over data is becoming a critical success factor for international competitiveness,” he adds.

Trend #1: Data control
The self-determined handling of data is thus becoming a key competitive factor for companies. This applies to every industry in which data security is a top priority and where the analysis of this data is decisive for business success. Examples are the healthcare, mobility, banking or manufacturing industries. Companies are now faced with the questions of how to process their data securely and efficiently, and whether to modernise their own data centre, invest in edge infrastructures or use the cloud.

The major European “Gaia-X” digital project, an initiative of the German Federal Ministry for Economics and Energy (BMWi), is set to start in 2020. The aim is to develop a European cloud for the secure digitalisation and networking of industry that will also form the basis for new artificial intelligence (AI) applications. In this context, the Fraunhofer Gesellschaft has drawn up the “International Data Spaces” initiative, a virtual data room that allows companies to exchange data securely and also ensures the interoperability of their own solutions with established (cloud) platforms.

This means that geographically widespread, smaller data centres with open cloud stacks might be able to create a new class of industrial applications that perform initial data analysis at the point where the data is created and use the cloud for downstream analysis. One solution in this context is ONCITE. This turnkey (plug-and-produce) edge cloud data centre stores and processes data directly where it arises, enabling companies to retain control over their data when networking along the entire supply chain.
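The edge-plus-cloud pattern described here can be sketched in a few lines. The function name and summary fields below are illustrative assumptions, not part of ONCITE: the point is only that initial analysis happens where the data arises, and just a compact result travels downstream.

```python
from statistics import mean

def summarise_batch(readings):
    """Initial analysis at the edge: reduce raw readings to key figures
    so only a compact summary travels downstream to the cloud."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "min": min(readings),
        "max": max(readings),
    }

# 1,000 raw sensor readings (assumed values) shrink to a four-field summary.
raw = [20.0 + (i % 10) * 0.1 for i in range(1000)]
summary = summarise_batch(raw)
```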

Trend #2: Standardisation in data centres with OCP
The rapid upgrade of existing data centres is becoming increasingly important for companies, as the volume of data needing to be processed continues to grow. Essential requirements for this growth are standardised technology, cost-efficient operation and a high level of infrastructure scalability. OCP (Open Compute Project) technology, with its central direct current distribution in the IT rack, is becoming an interesting alternative for more and more CIOs, because DC components open up new potential for cost optimisation. For instance, all the IT components can be powered centrally with n+1 power supplies per rack. Cooling also becomes more efficient, since fewer power supply units are present. At the same time, the high degree of standardisation of OCP components simplifies both maintenance and spare parts management. The mean efficiency gain is around five percent of the total power.
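As a back-of-envelope sketch of what the quoted five percent mean gain amounts to over a year of continuous operation (the 10 kW rack load is an assumed illustrative figure, not an OCP specification):

```python
# Back-of-envelope: what a ~5 % distribution-efficiency gain means for one
# rack. The 10 kW rack load is an assumed figure; the five percent is the
# mean gain quoted for OCP's central DC distribution.
rack_load_kw = 10.0
efficiency_gain = 0.05
hours_per_year = 24 * 365          # 8760 h of continuous operation

saved_kwh_per_year = rack_load_kw * efficiency_gain * hours_per_year
```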

Rittal expects that OCP will establish itself further in the data centre as an integrated system platform in 2020. New OCP products for rack cooling, power supply or monitoring will enable rapid expansion with DC components. Furthermore, new products will support the conventional concept of a central emergency power supply, where the power supply is safeguarded by a central UPS. As a result, it will no longer be necessary to protect every single OCP rack with a UPS based on lithium-ion batteries. The advantage: the fire load in the OCP data centre is reduced considerably.

Trend #3: Heat recovery and direct CPU cooling
Data centres release huge amounts of energy into the environment in the form of waste heat. As the power density in the data centre grows, so do the amounts of heat that could potentially be put to other uses. So far, however, using this waste heat has proven too expensive, for example because heat consumers are rarely located in the direct vicinity of the site. In addition, the waste heat from air-based IT cooling systems, at around 40 degrees Celsius, is at too low a temperature to be used economically.

In the area of high-performance computing (HPC) in particular, IT racks generate high thermal loads, often in excess of 50 kW. For HPC, direct processor cooling with water is significantly more efficient than air cooling, making return temperatures of 60 to 65 degrees Celsius available. At these temperatures it is possible, for instance, to heat domestic hot water, drive heat pumps or feed heat into a district heating network. However, CIOs should be aware that even with direct CPU water cooling, only about 80 percent of the waste heat can be drawn from an IT rack; the rack still needs conventional IT cooling for the remaining 20 percent.
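A quick worked example of these figures for a 50 kW rack:

```python
# Worked example of the figures above: a 50 kW HPC rack where direct CPU
# water cooling draws off about 80 % of the waste heat at 60-65 degrees C.
rack_load_kw = 50.0
recoverable_fraction = 0.80

water_cooled_kw = rack_load_kw * recoverable_fraction  # usable for heat recovery
air_cooled_kw = rack_load_kw - water_cooled_kw         # still needs IT cooling
```

So a 50 kW rack yields roughly 40 kW of recoverable heat, while about 10 kW must still be handled by conventional cooling.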

At the German Government’s 2019 Digital Summit, the topic of heat recovery was discussed in the working group concerned, which identified a high need for action. For this reason, Rittal assumes that by 2020, significantly more CIOs will be involved in the issue of how the previously unused waste heat from the data centre can be used economically.

Trend #4: Integration of multi-cloud environments
Businesses need to be assured that they can run their cloud applications on commonly used platforms and in any country. This calls for a multi-cloud strategy. From management’s point of view, this is a strategic decision based on the knowledge that their own organisation will develop into a fully digitised business.

For example, an excellent user experience is ensured by minimising latency through appropriately located availability zones. Companies choose one or more availability zones worldwide for their services, depending on their business requirements, while strict data protection requirements are met, for example, by a specialised local provider in the target market concerned. A vendor-open multi-cloud strategy allows exactly that: combining the functional depth and scalability of the hyperscalers with the data security of local and specialised providers such as Innovo Cloud. Cloud resources at the push of a button, on one dashboard, with one contact person and one invoice, available the moment the business decision is made – this is what is making multi-cloud strategies one of the megatrends of the coming years. The economy will take further steps towards digital transformation and accelerate its continuous integration (CI) and continuous delivery (CD) pipelines with cloud-native technologies – applications designed and developed for cloud computing architectures. Automating the integration and delivery processes then enables the rapid, reliable and repeatable deployment of software.

#PAuto @Rittal @EURACTIV @PresseBox

 


Checking organic carbon content.

25/11/2019

Methods for checking water quality are an incredibly important part of the many processes involved in ensuring we have access to safe drinking water. However, as contaminants can come from many different sources, finding a general solution for contaminant identification and removal can be difficult.

Purification processes for water treatment include the removal of undesirable chemicals, bacteria, solid waste and gases, and can be very costly. Utility companies in England and Wales invested £2.1 billion (€2.44b) in infrastructure and associated costs between 2013 and 2014 to ensure safe drinking water.1

One of the most widely used measures for assessing whether water is safe for consumption is the analysis of the total organic carbon (TOC) content. Organic carbon content is a measure of how much carbon is found in the water as part of organic compounds, as opposed to inorganic sources such as carbon dioxide and carbonic acid salts.2 It has been a popular approach since the 1970s, both for assessing drinking water and for checking that wastewater has been sufficiently purified.

The proportion of organic carbon in water is a good proxy for water quality, as high organic carbon levels indicate either a high level of organisms in the water or contamination by organic compounds such as herbicides and insecticides. High levels of microorganisms can arise for a variety of reasons but are often a sign of contamination from a wastewater source.

Testing TOC
Water therefore needs to be continually monitored for signs of change in the TOC content to check it is safe for consumption. While many countries do not specifically regulate TOC levels, the concentrations of specific volatile organic compounds are covered by legislation, and recommended TOC levels are 0.05 mg/l or less.3

There are a variety of approaches for testing water for organic carbon. One approach is to measure the entire carbon content (organic and inorganic) and then subtract any carbon dioxide detected (as it is considered inorganic carbon) and any other carbon from inorganic sources. Another is to use chemical oxidation or even high temperature so that all the organic compounds in the sample will be oxidized to carbon dioxide, and measuring the carbon dioxide levels, therefore, acts as a proxy for the TOC concentration.
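The two approaches can be expressed as a small sketch (values assumed, in mg/l of carbon):

```python
def toc_by_difference(total_carbon, inorganic_carbon):
    """Approach 1: TOC = total carbon minus inorganic carbon
    (carbon dioxide, carbonate salts)."""
    return total_carbon - inorganic_carbon

def toc_by_oxidation(co2_carbon_after_oxidation):
    """Approach 2: after full oxidation every organic compound has become
    CO2, so the measured CO2-derived carbon is itself the proxy for TOC."""
    return co2_carbon_after_oxidation

toc_a = toc_by_difference(5.2, 3.1)   # both readings in mg/l of carbon
toc_b = toc_by_oxidation(2.1)
```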

For wastewater plants, being able to perform online, real-time analysis of water content is key, and measurements must be sensitive and accurate enough to pick up small changes in even low concentrations of chemical species. British legislation also makes it an offence to supply drinking water which does not adhere to legislation4, with several water suppliers having been fined over a hundred million pounds for recent discharges of contaminated waters.5

Vigilant Monitors
One of the advantages of using carbon dioxide levels as a proxy for TOC content is that carbon dioxide absorbs infrared light very strongly. This means that nondispersive infrared (NDIR) detectors provide a very sensitive way of detecting even trace amounts of carbon dioxide.

Edinburgh Sensors are one of the world leaders in NDIR sensor production and offer a range of NDIR-based gas detectors suitable for TOC measurements of water.6 Of these, for easy, quick and reliable TOC measurements, the Gascard NG is an excellent device for quantifying carbon dioxide levels.7

Gascard NG
The Gascard NG is well-suited to continual carbon dioxide monitoring for several reasons. First, the device is capable of detecting a wide range of carbon dioxide concentrations, from 0 to 5000 ppm, maintaining ±2% accuracy over the full detection range. This sensitivity is important for checking that TOC levels are low enough for safe drinking water, but the device is also capable of operating under conditions where TOC levels may be very high, for example in the wastewater purification process.

As it can come with built-in true RS232 communications for both control and data logging, the Gascard NG can be used to constantly monitor carbon dioxide levels as well as be integrated into feedback systems, such as for water purification, to change treatment approaches if the TOC content gets too high. UK legislation also requires some level of record-keeping for water quality levels, which can also be automated in a straightforward way with the Gascard.4
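As a hedged sketch of such automated record-keeping: the text frame `CO2:<ppm>` below is invented for illustration (the real Gascard NG framing is defined by Edinburgh Sensors), and only the parse-and-log pattern is meant to carry over. In a live system the line itself would come from the RS232 port via a serial library.

```python
import csv
import io
from datetime import datetime, timezone

def parse_reading(line):
    """Extract a CO2 concentration in ppm from a hypothetical 'CO2:<ppm>'
    text frame (the real Gascard NG framing is device-documented)."""
    label, value = line.strip().split(":")
    if label != "CO2":
        raise ValueError("unexpected frame: " + line)
    return float(value)

def log_reading(writer, ppm):
    """Append a timestamped row, the kind of record that water-quality
    record-keeping rules expect to be retained."""
    writer.writerow([datetime.now(timezone.utc).isoformat(), ppm])

# With pyserial the frame would be read as, e.g.,
#   serial.Serial("/dev/ttyUSB0", 9600).readline().decode()
buf = io.StringIO()
log_reading(csv.writer(buf), parse_reading("CO2:412.5\r\n"))
```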

The Gascard NG is capable of self-correcting measurements over a range of humidity conditions (0 – 95 %), and readings can be pressure-corrected by on-board electronics between 800 and 1150 mbar. Temperature compensation between 0 and 45 °C is also featured, to ensure reliable measurements over a wide range of environmental conditions.

Designed to be robust, maintenance-free and fail-safe, the Gascard NG also comes with several customisable options. The expansion port can be used for small graphical display modules for in-situ readings, and TCP/IP communications can be included if communications over standard networks are needed. In conjunction with Edinburgh Sensors’ expertise and pre- and post-sales support, this means that the Gascard NG can easily be integrated into existing TOC measurement systems to ensure fast and accurate monitoring at all times.

NOTES

  1. Water and Treated Water (2019)
  2. Volk, C., Wood, L., Johnson, B., Robinson, J., Zhu, H. W., & Kaplan, L. (2002). Monitoring dissolved organic carbon in surface and drinking waters. Journal of Environmental Monitoring, 4(1), 43–47.
  3. DEFRA (2019)
  4. Water Legislation (2019)
  5. Water Companies Watchdog (2019)
  6. Edinburgh Sensors (2019)
  7. Gascard NG (2019)
#PAuto @Edinst

Wireless lifting!

22/11/2019

New wireless control technologies are being adopted by manufacturers and users of cranes and other lifting equipment. Tony Ingham of Sensor Technology Ltd explains the advantages and looks at how the field is developing.



Most of us find it odd to look back five or ten years to when our home computers were tethered to the wall by a cable. Somehow we just accepted that the cable restricted the mobility of the device and lived with that limitation. Now, of course, it is completely normal to pull a mobile phone out of your pocket and dial up the internet so that we can look up obscure information, book tickets or connect with a computer many miles away.

In the industrial and commercial arenas, wireless technology has revolutionised many practices. Logistics companies now routinely track the progress of every single parcel in their charge; field engineers collect data from and send instructions to remote facilities such as water pumping stations; customer accounts can be updated in real time, etc.
However, there is another aspect to wireless technology that is less obvious to the general public, but which engineers and technicians are really coming to appreciate. This may best be described as ‘local wirelessness’, or wireless LANs (local area networks). Basically, these remove the need to install and maintain wiring for control equipment fitted to machinery such as cranes, hoists, lifts and elevators.

Control technology is essential to many modern industries, as it is the only practical way to ensure reliable operation and high efficiency.

Handling products through a busy working environment, whether it is a container port, manufacturing plant, warehouse, logistics centre or retail outlet, involves making sure that materials handling is rapid, accurate and safe. This requires a control system that can handle huge amounts of data in real time, can safely operate heavy-duty machinery and, if necessary, withstand extremes of climate and environment. Further, as many operations now run 24×7, breakdowns and other stoppages are likely to have immediate consequences, meaning control equipment has to be robust and reliable.

However, the basic principles of a control system are relatively simple. The rotation of drive shafts in the various cranes and other machinery can be used to collect load and movement data on each item being moved. Each turn of the shaft will progress the equipment’s operation forward or backward a small but consistent amount, and if you can also measure the torque (rotational strain) in the shaft, you can calculate the weight of the load being transferred.
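The load calculation described above can be sketched as follows: torque on a winch drum of known radius gives line tension, and tension divided by gravitational acceleration gives the mass. The drum radius and torque figures are assumed purely for illustration.

```python
# Sketch of the weight calculation: torque -> line tension -> load mass.
# The 0.5 m drum radius and 4905 N*m torque are assumed example figures.
G = 9.81  # gravitational acceleration, m/s^2

def load_mass_kg(torque_nm, drum_radius_m):
    """m = (torque / radius) / g."""
    tension_n = torque_nm / drum_radius_m
    return tension_n / G

mass = load_mass_kg(4905.0, 0.5)   # roughly a 1000 kg load
```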

This raw data stream can be used to easily calculate operational information such as the amount of goods or product moved, the time to completion of each operation, and the destination of each load. It is also equally easy to convert this operational data into commercial information and safety reports that include cumulative operating hours, total load lifted and other statistics.

In the past, taking measurements from drive shafts has been difficult, but TorqSense, from Sensor Technology, provides a perfect solution. Previously, it was necessary to install sensors in difficult-to-access parts of industrial machinery and wire them back into the communication network that connected with a computer for collecting and interpreting the data. And once installed, the wiring had to be protected from damage and replaced if it failed.

However, TorqSense gets around this by using radio transmissions instead of wiring. Further, old-fashioned torque sensors tended to be delicate because they needed a fragile slip ring to prevent the turning drive shaft from pulling the wiring out of place, whereas TorqSense uses a wireless radio-frequency pick-up head that does not need physical contact with the rotating shaft.

A practical attraction of TorqSense is that its wirelessness makes it ultra-simple to install and robust in use. Furthermore it is largely unaffected by harsh operating environments, electromagnetic interference, etc. It is equally at home measuring coal on a conveyor, working on a dockside crane weighing and counting containers or in any other lifting application.

TorqSense is proving popular with an increasing number of users across many fields – not only in lifting, but also in robotics, chemical mixing and automotive applications – almost anywhere that uses machinery with rotating drive shafts.

Sensor Technology has also developed a complementary range of sensors, which measure the straight-line equivalent of torque. Called LoadSense, this too uses a wireless radio-frequency pick-up to collect data signals from the sensing head and transmit them wirelessly to a receiving computer for analysis.

It is notable that both TorqSense and LoadSense can be used in a fully wireless mode but can equally be fitted into conventional cabled systems, so they are easy to retrofit into existing control systems.

It is also interesting to know that LoadSense was actually developed at the behest of a particular customer, a helicopter pilot who wanted real-time and exact information about the weight of loads he was carrying from an underslung cargo hook. The issue for him was that he would invalidate his helicopter’s certificate of airworthiness if he drilled a hole in the fuselage for a cable, so he had to have a wireless solution. This application also required very robust hardware that could withstand heat and cold, extreme movements and shock loads and be unaffected by motor noise, radio interference etc – all characteristics that translate well into fields such as lifting, conveying and cranage.
TorqSense, LoadSense, wireless data transfer and communications are making an increasing contribution to the development of materials handling technologies. While they are ‘invisible’ to the casual observer, they have the capacity to revolutionise many aspects of lifting operations and to drive efficiency, reliability and safety to new levels.

#PAuto @sensortech

Unique data acquisition system.

01/11/2019

The modular design of the new KiDAQ from Kistler Instruments makes unlimited channels and distributed systems a reality whatever the application: industrial, laboratory, permanent installation or mobile. Engineers now have everything needed to complete any measuring task: a single integrated system that can be flexibly expanded at any time with additional measurement modules and units.

The new data acquisition system can be configured to suit any application, with a choice of portable and 19-inch rack housings that can accommodate up to 13 measurement modules each, and DIN rail modules for industrial installations with any number of measurement modules and other system components. This flexibility allows users to configure a data acquisition system that best meets their current needs. Using components from Kistler’s hardware, software and sensor portfolios means that, as needs change, the original system can be adapted and extended by reconfiguring the existing modules and adding others as needed. Nothing is made redundant, ensuring a low cost of ownership without compromising system performance and capability.

Reliable information about the measurement uncertainty
A key characteristic of the new KiDAQ data acquisition system is KiXact technology, which automatically calculates the measurement uncertainty. Backed by know-how across the entire measuring chain, this technology makes a reliable statement of measurement uncertainty possible and eliminates the time and effort needed to calculate the uncertainty of the whole system manually.

KiDAQ provides data transparency across the entire measuring chain. As Kistler qualifies the measuring components and calibrates them for each application, the exact specifications of their properties are available. The engineer is provided with precise details about their measurements and reliable information about measuring accuracy.
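KiXact's exact model is Kistler's own, but the standard way to combine independent uncertainty contributions along a measuring chain (sensor, cabling, signal conditioning, digitisation) is the root-sum-of-squares rule from the GUM. A minimal sketch, with assumed contribution figures:

```python
from math import sqrt

def combined_uncertainty(contributions):
    """Root-sum-of-squares of independent uncertainty contributions,
    the standard GUM rule: u_total = sqrt(u1^2 + u2^2 + ...)."""
    return sqrt(sum(u * u for u in contributions))

# Assumed contributions, in % of reading: sensor 0.3, signal conditioning 0.4.
u_total = combined_uncertainty([0.3, 0.4])   # roughly 0.5 % of reading
```

Note that the combined figure is dominated by the largest single contribution, which is why qualifying every component in the chain matters.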

KiConnect technology: Nerve centre of the KiDAQ system
Drawing on smart KiConnect technology, users can put together various Kistler measuring components and, if necessary, add selected third-party devices to the system. The integrated Precision Time Protocol (PTP) ensures exact time synchronization between the devices in the local network. Based on standard protocols (TCP/IP), measurements can be configured and performed anywhere.

KiStudio Lab software: The entire measuring task at a glance
The web-based KiStudio Lab software enables measuring tasks and projects to be easily processed and managed. The user interface is developed according to the latest usability criteria and designed so that it can be operated intuitively and efficiently by both occasional and experienced users. Via the software, engineers can access all measuring data and results from current and older projects at any time. The raw data are easily and securely stored in a central repository and can be exported in various formats and analysed offline. A stored measurement setup can be recalled at any time for further measurement tasks, saving users the time required for complex test setup and allowing measurements to be performed immediately the next time.

Kistler KiDAQ – The total solution
KiDAQ is an innovative, modular data acquisition system which enables the user to create a test setup, perform measurements and achieve reliable results almost instantly. Kistler’s expertise and experience covers the entire measuring chain, from the sensor, to signal conditioning, and on to the software. A KiDAQ data acquisition system can be expanded at any time with measurement modules and measurement units. Third-party sensors can also be integrated into the system. The unique software provides step-by-step guidance throughout the system setup and provides valuable insight into the entire measuring chain.

#Kistler #PAuto #TandM

Non-contact technology simplifies torque monitoring and aids efficiency.

17/10/2019
Monitoring torque in a drive shaft is one of the best ways of assessing the performance of plant and machinery. However, because drive shafts rotate, hard-wiring a sensor into place usually requires the use of a delicate slip ring. An alternative solution is to use a non-contact radio frequency detector to monitor ‘Surface Acoustic Waves’ (SAWs), as Mark Ingham of Sensor Technology Ltd explains.

Torque imparts a small degree of twist into a driven shaft, which will distort SAW devices (small quartz combs) affixed to the shaft. This deformation causes a change in the resonant frequency of the combs, which can be measured via a non-contact radio frequency (RF) pick-up mounted close to the shaft. The pick-up emits an RF signal towards the shaft which is reflected back by the combs with its frequency changed in proportion to the distortion of the combs.
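The read-out chain just described (strain shifts the comb's resonant frequency, and a linear calibration maps the shift back to torque) can be sketched as follows. The reference frequency and sensitivity are assumed round numbers, not TorqSense specifications.

```python
# Illustrative read-out: the comb's resonant frequency shifts with shaft
# strain, so torque follows from a linear calibration. F0 and the
# sensitivity are assumed round numbers, not TorqSense specifications.
F0_HZ = 200.0e6                  # unloaded resonant frequency (assumed)
SENSITIVITY_HZ_PER_NM = 500.0    # frequency shift per newton-metre (assumed)

def torque_nm(measured_hz):
    """Torque from the measured resonance; the sign gives the direction."""
    return (measured_hz - F0_HZ) / SENSITIVITY_HZ_PER_NM

torque = torque_nm(F0_HZ + 25_000.0)   # a +25 kHz shift reads as +50 N*m
```

A shift below the reference frequency simply comes out negative, which is how the sensor distinguishes the two directions of torque.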

A SAW transducer is able to sense torque in both directions, and provides fast mechanical and electrical responses. As the method is non-contact, it also offers complete freedom from the slip rings, brushes and complex electronics often found in traditional torque measurement systems. SAW devices also have a high immunity to magnetic forces, allowing their use in, for example, motors, where other analogue technologies are very susceptible to electronic interference.

More detail:
In its simplest form, a SAW transducer consists of two interdigital arrays of thin metal electrodes deposited on a highly polished piezoelectric substrate such as quartz. The electrodes that comprise these arrays alternate polarities so that an RF signal of the proper frequency applied across them causes the surface of the crystal to expand and contract and this generates the surface wave.

These interdigital electrodes are generally spaced at half or quarter wavelengths of the operating centre frequency. Since the surface acoustic wave velocity is about 10⁻⁵ of the speed of light, an acoustic wavelength is much smaller than its electromagnetic counterpart.

For example, a signal at 100 MHz with a free-space wavelength of three metres would have a corresponding acoustic wavelength of about 30 microns. This results in the SAW’s unique ability to incorporate an incredible amount of signal processing or delay in a very small volume. As a result of this relationship, physical limitations exist at higher frequencies, when the electrodes become too narrow to fabricate with standard photolithographic techniques, and at lower frequencies, when the devices become impractically large. Hence, at this time, SAW devices are most typically used from 10 MHz to about 3 GHz.
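The 30-micron figure can be checked directly, since acoustic wavelength is just velocity divided by frequency:

```python
# Checking the figure above: acoustic wavelength = velocity / frequency.
saw_velocity_m_s = 3.0e3    # typical SAW velocity on quartz, ~1e-5 of c
frequency_hz = 100.0e6

wavelength_m = saw_velocity_m_s / frequency_hz   # 3e-5 m, i.e. ~30 microns
```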

Applications
SAW-based torque sensors have been used around the world and in many fields, from test rigs to wind turbines and generators based on tidal or river flows. They are used extensively in the high tech world of the development of engines and gearboxes for Formula 1. Pharmaceutical companies employ them to monitor the pumps micro-dosing active ingredients into medicines and tablets. Torque feedback systems can be used by security firms to determine the direction their movable CCTV cameras are facing so that they can efficiently watch premises under their protection.

Today, as industrial engineers automate manufacturing and processing operations, they are increasingly turning to torque monitoring to generate the vital operating and production data that maintains production and efficiency.

@sensortech #PAuto

Managing dust risks at quarries!

16/10/2019
In this article, Josh Thomas from instrumentation specialist Ashtead Technology, discusses the risks associated with dust at quarries, and highlights the vital role of monitoring.


Background
Almost all quarrying operations have the potential to create dust. Control measures should therefore be established to prevent the generation of levels that cause harm. These measures should be identified in the health and safety document, and measurements should be taken to monitor exposure and demonstrate the effectiveness of controls.

Many minerals contain high levels of silica, so quarrying activities of these materials generate silica dust known as respirable crystalline silica (RCS) and particular care must be taken to control exposure. Guidance is available from the British Health & Safety Executive (HSE); see document HS(G) 73 Respirable crystalline silica at quarries. Sandstone, gravel and flint typically contain over 70% crystalline silica, shale contains over 40% and granite can contain up to 30%. Inhaling RCS can lead to silicosis which is a serious and irreversible lung disease that can cause permanent disablement and early death. There is an increased risk of lung cancer in workers who have silicosis, and it can also be the cause of chronic obstructive pulmonary disease (COPD).

The British Control of Substances Hazardous to Health Regulations 2002 (COSHH) requires employers to ensure that exposure is prevented or, where this is not reasonably practicable, adequately controlled. The COSHH definition of a substance hazardous to health includes dust of any kind when present at a concentration in air equal to or greater than 10 mg/m3 8-hour time-weighted average of inhalable dust, or 4 mg/m3 8-hour TWA of respirable dust. This means that any dust will be subject to COSHH if people are exposed to dust above these levels. Some dusts have been assigned specific workplace exposure limits (WELs) and exposure to these must comply with the appropriate limits. For example, the WEL for RCS is 0.1 mg/m3 8-hour TWA.
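As a worked example of the 8-hour TWA arithmetic behind these limits (the sampled concentrations and durations are assumed for illustration):

```python
def twa_8h(samples):
    """8-hour time-weighted average from (concentration mg/m3, hours) pairs."""
    return sum(c * t for c, t in samples) / 8.0

# Assumed shift: 6 h at 0.08 mg/m3 of RCS, then 2 h at 0.05 mg/m3.
twa = twa_8h([(0.08, 6.0), (0.05, 2.0)])
# roughly 0.0725 mg/m3, below the 0.1 mg/m3 WEL for RCS
```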

The Quarries Regulations 1999 (GB) cover all surface mineral workings, and include tips and stockpiles, as well as areas used for crushing, screening, washing, drying and bagging. Buildings and other structures are also included, as are common areas and prospecting sites. The Regulations were created to protect the health and safety of quarry staff, as well as others that may be affected by quarrying activities, such as those living, passing or working nearby, or visiting the site.

The role of monitoring
In order to assess the risks posed by dust, it is necessary to undertake workplace monitoring (inside buildings, vehicle cabs, etc.) as well as environmental monitoring in and around the quarry. The technology for doing so is similar, but different instruments are available for each application. Ashtead supplies personal air sampling pumps when it is necessary to conduct compliance monitoring, or when the identification and measurement (in a laboratory) of a specific dust type, such as RCS, is required.

Once the dust risks at a quarry have been assessed, ongoing monitoring is more often conducted with direct reading instruments that employ optical techniques to measure the different particulate fractions. Portable battery-powered instruments such as the TSI SidePak and the DustTrak are ideal for this purpose and feature heavily in Ashtead’s fleet of instruments for both sale and rental.

Installed TSI DTE

The same dust monitoring technology is employed by the TSI DustTrak Environmental (DTE), which has been developed specifically for applications such as dust monitoring at quarries. Fully compliant with stringent MCERTS performance requirements, the DTE employs a ‘cloud’ based data management system, which provides users with easy access to real-time data on dust levels, with the optional addition of other sensors. Alarm conditions can be set by users so that text and email alerts are issued when thresholds are exceeded. The DTE monitors PMTotal, PM10, PM2.5 and PM1.0 mass fractions simultaneously, which provides detailed information on the type of dust present and means that alarms can be set for specific fractions.
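Per-fraction alarm logic of the kind described can be sketched as below. The thresholds are invented for illustration, and the DTE's own alarms are configured through TSI's cloud software rather than user code; only the pattern of checking each mass fraction against its own limit carries over.

```python
# Assumed thresholds in ug/m3; invented figures, purely for illustration.
THRESHOLDS = {"PM10": 50.0, "PM2.5": 25.0, "PM1.0": 15.0}

def exceeded(readings):
    """Return, sorted, the fractions at or above their alarm threshold."""
    return sorted(f for f, v in readings.items()
                  if f in THRESHOLDS and v >= THRESHOLDS[f])

alarms = exceeded({"PM10": 62.0, "PM2.5": 12.0, "PM1.0": 15.0})
```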

Clearly, dust monitors can perform a vital role in helping to protect safety at working quarries. However, a TSI DTE was recently hired from Ashtead Technology to perform monitoring prior to the commencement of quarrying operations, so that baseline dust levels could be established for comparison once the quarry is operational. Monitoring prior to operations is important, because airborne dust at a quarry is not necessarily derived from the quarry alone; local agricultural or industrial activities may also contribute to the particulate burden. This also highlights the advantages of 24/7 monitoring because dust pollution may be intermittent, so continuous monitors such as the DTE are able to identify peaks and thereby assist in the attribution of sources.
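The value of continuous data for attributing intermittent peaks can be illustrated with a minimal sketch; the baseline, peak factor and readings below are invented, not measured values:

```python
# Illustrative sketch: flag intermittent peaks in a continuous dust record
# by comparing each reading against a pre-operational baseline level.

BASELINE_UG_M3 = 12.0   # baseline established before quarrying began
PEAK_FACTOR = 3.0       # a reading this many times baseline counts as a peak

def find_peaks(series_ug_m3):
    """Return indices of readings well above the pre-operational baseline."""
    return [i for i, value in enumerate(series_ug_m3)
            if value > BASELINE_UG_M3 * PEAK_FACTOR]

hourly = [11.8, 13.0, 40.5, 12.2, 55.0, 12.9]
peaks = find_peaks(hourly)  # the two elevated readings stand out
```

Correlating the timing of such peaks with site activity (or, with wind data, their direction) is what supports attribution to the quarry or to neighbouring sources.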

Ashtead Technology fitted the DTE mentioned above with a solar panel and rechargeable battery so that it could operate unattended for extended periods in a remote location. With web-based access to the data, site visits were minimised and costs lowered. This equipment was hired from Ashtead to avoid capital expenditure, and looking forward, the client is planning to add a Lufft wind monitor to the rental, because data on wind speed and direction helps with modelling and with the identification of dust pollution sources.

Summary
Ideally, quarry site monitoring should be undertaken prior to the commencement of operations to establish baseline levels for that site. Risk assessments can then be undertaken around the site and within buildings and vehicles/machinery. However, conditions can change significantly, so continuous monitoring is preferable. Changes in quarry practices and weather can affect environmental conditions, and workplace exposure can be affected by a wide range of factors such as broken filter bags, spillage, insufficient cleaning, filter blockage and dry (instead of wet) drilling or cutting.

With a variety of applications for dust monitoring, it is important that appropriate technology is employed, so the Ashtead Technology instrument fleet has been developed to meet almost every need, and technical advice is available to help consultants and quarry operators ensure that dust hazards are effectively managed.


Managing NOx gas emissions from combustion

26/09/2019
Pollution can only be managed effectively if it is monitored effectively.

James Clements

As political pressure increases to limit the emissions of the oxides of nitrogen, James Clements, Managing Director of the Signal Group, explains how the latest advances in monitoring technology can help.

Nitrogen and oxygen are the two main components of atmospheric air, but they do not react at ambient temperature. However, in the heat of combustion, such as in a vehicle engine or within an industrial furnace or process, the gases react to form nitric oxide (NO) and nitrogen dioxide (NO2). This is an important consideration for the manufacturers of combustion equipment because emissions of these gases (collectively known as NOx) have serious health and environmental effects, and are therefore tightly regulated.

Nitrogen dioxide gas is a major pollutant in ambient air, responsible for large numbers of premature deaths, particularly in urban areas where vehicular emissions accumulate. NO2 also contributes to global warming and in some circumstances can cause acid rain. A wide range of regulations therefore exist to limit NOx emissions from combustion sources ranging from domestic wood burners to cars, and from industrial furnaces and generators to power stations. The developers of engines and furnaces therefore focus attention on the NOx emissions of their designs, and the operators of this equipment are generally required to undertake emissions monitoring to demonstrate regulatory compliance.

The role of monitoring in NOx reduction
NOx emissions can be reduced by:

  • reducing peak combustion temperature
  • reducing residence time at the peak temperature
  • chemical reduction of NOx during the combustion process
  • reducing nitrogen in the combustion process

These primary NOx reduction methods frequently involve extra cost or lower combustion efficiency, so NOx measurements are essential for the optimisation of engine/boiler efficiency. Secondary NOx reduction measures are possible by either chemical reduction or sorption/neutralisation. Naturally, the effects of these measures also require accurate emissions monitoring and control.

Choosing a NOx analyser
In practice, the main methods employed for the measurement of NOx are infrared, chemiluminescence and electrochemical. However, emissions monitoring standards are mostly performance based, so users need to select analysers that are able to demonstrate the required performance specification.

Rack Analyser

Infrared analysers measure the absorption of infrared light as it passes through a gas sample. In Signal’s PULSAR range, Gas Filter Correlation technology enables the measurement of just the gas or gases of interest, with negligible interference from other gases and water vapour. Alternatively, FTIR enables the simultaneous speciation of many different species, including NO and NO2, but it is costly and, in common with other infrared methods, is significantly less sensitive than chemiluminescence detection (CLD).

Electrochemical sensors are low cost but generally offer lower levels of performance. Gas diffuses into the sensor, where it is oxidised or reduced, generating a current that is limited by the rate of diffusion; the output from these sensors is therefore proportional to the gas concentration. However, users should take into consideration potential cross-sensitivities, as well as rigorous calibration requirements and limited sensor longevity.
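The linear response described above is what makes a simple two-point calibration workable. A minimal sketch, with an assumed zero offset and sensitivity rather than values from any real sensor datasheet:

```python
# Minimal sketch of a diffusion-limited sensor's linear response: convert a
# measured current back to concentration via an assumed two-point calibration.

ZERO_CURRENT_NA = 5.0         # assumed sensor output in clean air (nA)
SENSITIVITY_NA_PER_PPM = 2.5  # assumed additional current per ppm of gas

def concentration_ppm(current_na):
    """Invert the linear response: gas concentration from measured current."""
    return (current_na - ZERO_CURRENT_NA) / SENSITIVITY_NA_PER_PPM

reading = concentration_ppm(30.0)  # (30 - 5) / 2.5 = 10.0 ppm
```

The cross-sensitivities mentioned above mean that, in practice, this inversion is only as trustworthy as the calibration gas and the sensor's selectivity allow.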

The chemiluminescence detector (CLD) method of measuring NO is based on the use of a controlled amount of Ozone (O3) coming into contact with the sample containing NO inside a light sealed chamber. This chamber has a photomultiplier fitted so that it measures the photons given off by the reaction that takes place between NO and O3.

NO is oxidised by the O3 to become NO2, and photons are released as part of the reaction. This chemiluminescence only occurs with NO, so in order to measure NO2 it is necessary to first convert it to NO. The NO2 value is added to the NO reading, and the sum equates to the NOx value.
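The arithmetic behind the two-pass CLD measurement can be written out explicitly. The figures below are illustrative, not readings from any particular analyser:

```python
# Sketch of the CLD arithmetic: NO is measured directly, then the sample is
# passed through an NO2->NO converter and measured again. NO2 is the increase
# after conversion, and NOx is the sum of NO and NO2.

def nox_split(no_ppm, total_after_conversion_ppm):
    """Return (NO, NO2, NOx) in ppm from the two CLD readings."""
    no2_ppm = total_after_conversion_ppm - no_ppm
    return no_ppm, no2_ppm, no_ppm + no2_ppm

# e.g. 8.0 ppm NO measured directly, 9.5 ppm after the converter:
no, no2, nox = nox_split(8.0, 9.5)  # NO2 = 1.5 ppm, NOx = 9.5 ppm
```

Note that NOx equals the post-converter reading by construction; the split into NO and NO2 is what the two-pass measurement adds.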

Most of the oxides of nitrogen coming directly from combustion processes are NO, but much of this is further oxidised to NO2 as the NO mixes with air (which is 20.9% oxygen). For regulatory monitoring, NO2 is generally the required measurement parameter, but for combustion research and development NOx is the common measurand. Consequently, chemiluminescence is the preferred measurement method for development engineers at manufacturer laboratories working on new technologies to reduce NOx emissions in the combustion of fossil fuels. For regulatory compliance monitoring, NDIR (Non-Dispersive Infrared) is more commonly employed.

Typical applications for CLD analysers therefore include the development and manufacture of gas turbines, large stationary diesel engines, large combustion plant process boilers, domestic gas water heaters and gas-fired factory space heaters, as well as combustion research, catalyst efficiency, NOx reduction, bus engine retrofits, truck NOx selective catalytic reduction development and any other manufacturing process which burns fossil fuels.

These applications require better accuracy than regulatory compliance monitoring, because any savings in the choice of analyser are negligible in comparison with the market benefits of developing engines and furnaces with superior efficiency and cleaner emissions.

Signal Group always offers non-heated, non-vacuum CLD analysers for combined cycle gas turbine (CCGT) power stations because these stations emit lower than average NOx levels. NDIR analysers typically have a range of 100 ppm, whereas CLD analysers are much more sensitive, with a lower range of 10 ppm. Combustion processes operating with de-NOx equipment will need this superior level of sensitivity.

There is a high proportion of NO2 in the emissions of CCGT plants because they run with high levels of air in the combustion process, so it is necessary to convert NO2 to NO prior to analysis. Most CLD analysers are supplied with converters, but NDIR analysers are not, so a converter is normally installed separately when NDIR is used.

In the USA, permitted levels for NOx are low, and many plants employ de-NOx equipment, so CLD analysers are often preferred. In Europe, the permitted levels are coming down, but there are fewer CCGT Large Plant operators, and in other markets such as India and China, permitted NOx emissions are significantly higher and NDIR is therefore more commonly employed.

In England, the Environment Agency requires continuous emissions monitoring systems (CEMS) to have a range no more than 2.5 times the permitted NOx level, so, for Signal Group as a manufacturer of both CLD and NDIR analysers, this can be a determining factor when deciding which to recommend. The UK has a large number of CCGT power plants in operation, and Signal Group has a high number of installed CEMS at these sites, but very few new plants have been built in recent years.
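The Environment Agency sizing rule mentioned here reduces to simple arithmetic. The permit levels in the example are illustrative, not actual permit values:

```python
# Sketch of the range-sizing rule: a CEMS analyser range must be no more
# than 2.5 times the permitted NOx level for the plant.

RANGE_FACTOR = 2.5

def max_allowed_range(permit_level_ppm):
    """Largest analyser range acceptable for a given permitted NOx level."""
    return RANGE_FACTOR * permit_level_ppm

def range_acceptable(analyser_range_ppm, permit_level_ppm):
    return analyser_range_ppm <= max_allowed_range(permit_level_ppm)

# Against an illustrative 10 ppm permit, a 100 ppm NDIR range is too coarse,
# while a 10 ppm CLD range is acceptable.
ndir_ok = range_acceptable(100.0, 10.0)
cld_ok = range_acceptable(10.0, 10.0)
```

This is why low permitted levels push operators towards the more sensitive CLD technique, as the article notes for plants fitted with de-NOx equipment.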

New NOx analysis technology
Signal Group recently announced the launch of the QUASAR Series IV gas analysers which employ CLD for the continuous measurement of NOx, Nitric Oxide, Nitrogen Dioxide or Ammonia in applications such as engine emissions, combustion studies, process monitoring, CEMS and gas production.

Chemiluminescence Analyser

The QUASAR instruments exploit the advantages of heated vacuum chemiluminescence, offering higher sensitivity with minimal quenching effects, and a heated reaction chamber that facilitates the processing of hot, wet sample gases without condensation. Signal’s vacuum technology improves the signal to noise ratio, and a fast response time makes the instruments ideal for real-time reporting applications. A non-vacuum version is also offered for trace NOx measurements such as RDE (Real Driving Emissions) on-board vehicle testing, for which a 24 VDC version is available.

A key feature of these latest instruments is their communications flexibility – all of the new Series IV instruments are compatible with 3G, 4G, GPRS, Bluetooth, Wi-Fi and satellite communications; each instrument has its own IP address and runs on Windows software. This provides users with simple, secure access to their analysers at any time, from almost anywhere.

In summary, it is clear that the choice of analyser is dictated by the application, so it is important to discuss this with appropriate suppliers/manufacturers. However, with the latest instruments, Signal’s customers can look forward to monitoring systems that are much more flexible and easier to operate. This will improve NOx reduction measures, and thereby help to protect both human health and the environment.