Remote monitors track river restoration success

10/05/2013
Remote monitoring of restoration work on beautiful English river using advanced sensing and telemetry technology.

East Anglia is one of the most distinctive areas of England; that part of the country north of London and south of the inlet known as the Wash. It encompasses the counties of Norfolk, Suffolk, Cambridgeshire and Essex, and is generally flat, stretching to the famous Broads, beloved of inland sailors and wildlife lovers. Water is an ever-present feature and it needs to be protected for environmental and biodiversity reasons.

The Norfolk Rivers Trust has installed a remote river monitoring station that has been tracking water quality and flow before and after river restoration work at an area of ecological importance on the River Nar.

Picturesque view of the River Nar below Castle Acre (Pic: Norfolk Rivers Trust)

Rising in chalk hills to the east of the village of Tittleshall, the river flows south for 2.5 km until it reaches Mileham, then predominantly west for 39.5 km through the villages of Litcham, Castle Acre, West Acre and Narborough until it reaches the tidal Ouse at King’s Lynn. The river rises on chalk and flows over chalk formations as far as Narborough; downstream of Narborough the underlying geology is more complex, consisting of a series of clays and greensands. This makes the Nar one of only a few remaining fenland chalk streams. In line with the requirements of the Water Framework Directive, the project is designed to ensure that the Nar achieves good ecological status by 2015, and in doing so it aims to improve the habitat for wildlife and promote biodiversity. The river monitoring station incorporates an Adcon GPRS telemetry unit from OTT Hydrometry, which automatically collects data and feeds a website, providing easy access for the project team.

The Problem
Agricultural runoff is a particular problem in the Anglian region because the light sandy soils are easily eroded during heavy rainfall. Fertilisers add to the problem because they can be washed from fields into water courses. As a result, many Norfolk rivers contain high levels of nitrate and phosphate. Excessive levels of these nutrients can lead to eutrophication, the symptoms of which include vigorous growth of blanket weed; this change in water quality lowers dissolved oxygen levels in streams and rivers, and harms wildlife.

In the past, the Nar channel has been made straighter, wider and deeper; initially to improve navigation, and later to improve drainage. However, this has had a detrimental effect on wildlife.

The River Nar also suffers from sediment deposition arising from point sources such as land drains, and from diffuse sources such as run-off resulting from cultivation in wet periods. This has affected species that rely on gravel beds at some stage in their lifecycle; brown trout, for example, need sediment-free gravel in which to lay their eggs.

The River Nar Project
Assisted by funds from WWF-UK, the Coca-Cola Partnership and the Catchment Restoration Fund, the Norfolk Rivers Trust has established a £609k (€720k) river and flood plain restoration project to reduce pollution in the River Nar and improve the habitat for wildlife.

The project began in June 2012 and includes work to change the course of the river from a straight incised channel to a meandering route, reconnecting the river to the floodplain and creating new habitats. This channel restoration work was completed in October 2012. The project also includes the creation of reed beds and other in-ditch options to trap sediment before it enters the River Nar. Four reed beds have so far been installed at different points in the River Nar catchment, and the work has also included the dredging of an existing pond.

Monitoring
Prior to the commencement of the project, the Norfolk Rivers Trust measured water quality by collecting weekly samples and transferring them to its laboratory for analysis. This was a time-consuming and expensive activity and only produced spot data for the moment that a sample was taken. Consequently, events that took place at night or between sampling visits went undetected, so there were clear advantages to be gained from continuous monitoring.

In order to establish a continuous monitoring station for water quality and flow, OTT Hydrometry provided a Hydrolab Minisonde water quality monitor and an Adcon A755 Remote Telemetry Unit (RTU). In combination with a bed mounted Doppler flow meter (provided by the Environment Agency), the station is able to provide a continuous record of the river’s condition.

The Hydrolab Minisonde 5 takes measurements for turbidity, flow, conductivity, temperature and luminescent dissolved oxygen (LDO) every 15 minutes. The collected flow and water chemistry data is then stored and transmitted every hour via the RTU to an online server hosted by OTT Hydrometry. This allows information to be downloaded and analysed in the Trust’s office without the need for regular site visits. Data can be accessed at any time, from anywhere, using the Adcon app.

Operating on extremely low power, and designed specifically for the collection and transmission of remote monitoring data, ADCON RTUs are able to utilise a variety of communication methods depending on site conditions. For example, radio represents a low-cost alternative in areas with poor GSM coverage and where line of sight is possible, with repeaters if necessary.

The monitoring site on the Nar has some GSM coverage, but the signal is poor, so an ADCON A755 RTU was chosen to communicate via GPRS. The A755 RTU has been developed specifically for areas with low signal, because it stores all monitoring data when signal strength is too low for transmission, and then sends the information when signal coverage improves, sending the backed up data first.
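The store-and-forward behaviour described above is simple to express in code. Below is a minimal sketch in Python of the general idea; the class name and signal threshold are illustrative assumptions, not Adcon’s firmware API.

from collections import deque

class StoreAndForwardRTU:
    # Minimal sketch of store-and-forward telemetry buffering:
    # readings are queued locally and, once signal strength is
    # adequate, the backlog is transmitted oldest-first.

    def __init__(self, min_signal_dbm=-95):  # illustrative threshold
        self.min_signal_dbm = min_signal_dbm
        self.backlog = deque()

    def log_reading(self, timestamp, values):
        self.backlog.append((timestamp, values))

    def try_transmit(self, signal_dbm, send):
        # Hold all data while coverage is too poor to transmit.
        if signal_dbm < self.min_signal_dbm:
            return 0
        sent = 0
        while self.backlog:  # backed-up (oldest) data is sent first
            send(self.backlog.popleft())
            sent += 1
        return sent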

The monitoring equipment was installed at the end of July 2012 and restoration work began on 8th October 2012. Emphasising the importance of monitoring before and after the restoration work, project officer Helen Mandley says: “To be able to judge the success of the project it is essential that we are able to compare water quality data from the old river channel to the new river channel, because we need to improve water quality in order to improve the biodiversity of the river.”

Results
In addition to water quality and flow monitoring, ecological assessments have been undertaken for water voles and other small mammals, macrophytes, aquatic invertebrates, vegetation and fish. However, before a reliable assessment of the project’s success can be undertaken, it will be necessary to evaluate data over an extended period so that seasonal effects can be taken into consideration.

Pre- and post-restoration data on ecology, water quality and flow will be assessed in September 2013, and it is hoped that this will provide clear evidence that the project has had a significant effect on water quality and biodiversity.

Helen hopes to continue the project beyond 2013, commenting: “We currently monitor downstream of one of the new reed beds, but in the future we would like to place more monitoring equipment upstream of the reed bed to really see the differences, particularly in levels of turbidity and conductivity.”

The current phase of the project is due to run until the end of 2013, but a series of ‘restoration units’ has been identified by the River Nar Steering Group (which includes the Norfolk Rivers Trust), each applying restorative work to a specific section of the river. These units extend to 2027 but will be reliant on the availability of future funding.

Clearly, environmental monitoring is essential for the evaluation and ongoing management of remediation projects, and OTT’s UK Managing Director Simon Wills says: “This project is a good example of how simple and low-cost it can now be to create a monitoring station that is sufficiently flexible to collect and transmit data from a variety of monitors. Advances in sensor, datalogging, communications and power management technology have combined to dramatically improve the effectiveness of remote data collection, which means that fewer site visits are necessary; thereby saving a great deal of time and money that can be spent on restoration.”


W.A.G.E.S. for cost reduction!

07/01/2013

This paper from Endress+Hauser discusses the growing recognition of the need to monitor and control energy efficiency in utilities.

1. Introduction
Production plants in all industries are under increasing pressure to measure the cost of their utilities:

– Water
– Air
– Gas (e.g. Natural Gas, other gases or fuels)
– Electricity and
– Steam

Interestingly, this W.A.G.E.S. trend is independent of the type of industry: it can be seen in small breweries and on large chemical sites alike.

One important driver for this pressure is the rise in the cost of energy. The cost of natural gas for industrial applications has more than tripled within less than ten years and the price for electricity in Europe has risen by 30% within less than 4 years.

Certification according to EMAS and the ISO 14000 series also requires companies to measure their energy streams using calibrated instruments.

To find out more about how you can benefit from E+H’s experience in Energy Monitoring Solutions, you can receive a free copy of their Energy Saving Book from their site!
More information about the Endress+Hauser Energy Monitoring Solution is available online.

The utilities have frequently been neglected in the past. Now, however, they are coming increasingly into focus. Many companies still measure natural gas and electricity only at the custody transfer point. Yet from these few measurements, important parameters such as specific energy consumption are determined, answering questions like: how much energy does it take to make a ton of product? Moreover, these measurements are often only taken on a monthly or even yearly basis. For a comparatively small investment, it is possible to set up energy monitoring systems that measure the consumption of each utility close to the point of use. These measurements can then be used to build meaningful relationships between energy consumption and its driving factors, enabling the customer to

• Control their energy consumption with a better resolution (application-wise and time-wise)
• Identify and justify energy reduction projects (where is most energy consumed? Which changes are possible?)
• Detect poor performance earlier (are the boiler’s heating surfaces fouling?)
• Get support for decision making (should the contract with the provider of electricity be changed?)
• Report performance automatically (which Energy Accountability Centre/shift etc. is performing best? Did exceptions occur?)
• Audit historical operations
• Get evidence of success (did promises made by a manufacturer of energy efficient equipment come true?)
• Get support for energy budgeting and management accounting
• Provide the energy data to other systems (e.g. existing SCADA)

2. What is energy management?

Picture 1: The Energy Management Cycle.

Energy management can be seen as a cyclic operation. Everything starts with basic data collection: energy consumption is measured and converted to appropriate units. For most of the utilities, these conversions require the highest attention:

– the conversion from volumetric units (e.g. natural gas measured by turbine meters, steam measured by DP devices or vortex meters) to corrected volume, mass or energy is often done incorrectly, resulting in errors of typically 10–30%
– many devices are incorrectly installed, resulting in similar error ranges and
– if the basic data is wrong, the analysis will be wrong and all action taken will be based on wrong information.
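To illustrate the first point: converting a raw volumetric gas reading to reference conditions requires a pressure and temperature correction, and omitting it (or using the wrong reference conditions) easily produces errors of the size quoted above. The following is a minimal sketch assuming ideal-gas behaviour; a full fiscal calculation would also apply a compressibility (Z) correction.

def corrected_volume(v_raw_m3, p_line_bar_a, t_line_c,
                     p_ref_bar_a=1.01325, t_ref_c=0.0):
    # Ideal-gas correction of a raw volumetric meter reading to
    # reference conditions; custody-transfer calculations would also
    # include the compressibility ratio Z_ref/Z_line.
    t_line_k = t_line_c + 273.15
    t_ref_k = t_ref_c + 273.15
    return v_raw_m3 * (p_line_bar_a / p_ref_bar_a) * (t_ref_k / t_line_k)

# Gas metered at 4 bar(a) and 20 DegC: 100 raw m3 is ~368 standard m3,
# so ignoring the correction would misstate consumption several-fold.
print(corrected_volume(100.0, 4.0, 20.0))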

The simplest form of data collection is paper and pencil. It is amazing how many people in industry still have to walk around the factory every month to find certain meters and take the readings. Modern systems perform this automatically: modern recorders, whether stand-alone devices or so-called “software recorders”, are able to record data in the commonly used 15 min. or 30 min. intervals. If these intervals are not sufficient, data collection as often as every 100 ms is possible.
Most modern data collection systems are even able to collect the data of up to 30 devices using bus communication and pass the data on using “Field Gates”.

3. Data analysis
If data collection is the basis of it all, data analysis is the heart: it converts the raw energy measurements into meaningful information.

A first basic way consists in analyzing the 15-min or 30-min data profiles:

– What is the base-load of the application? Why is energy still consumed when nothing is being produced? How can this base-load be reduced?
– What is the typical maximum load during productive hours? How can the maximum load be reduced? (This is important e.g. for electricity contracts)
– What is the typical load distribution? How can a more uniform load-distribution be obtained?

For this purpose, different load-management policies are available (e.g. peak-clipping).
It is even more meaningful to relate energy consumption to a driving factor. Examples are:

– how much heating energy is consumed compared to how cold the weather is (so-called degree days)
– how much energy is consumed to make a ton of product
– how much electricity is consumed in order to light a building compared to the hours of day-light.

Since all of these parameters relate energy consumption to a relevant driver, they are generally called “Specific Energy Consumptions” (SECs).
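As a worked illustration (with invented figures, not data from this paper), a heating SEC can be computed by dividing each period’s energy use by the degree days of the same period; a rising series signals drift:

# Monthly heating energy (MWh) and heating degree days, same months.
energy_mwh = [120.0, 98.0, 76.0, 45.0]
degree_days = [400, 310, 240, 130]

# Specific energy consumption: MWh per degree day, month by month.
sec = [e / dd for e, dd in zip(energy_mwh, degree_days)]
print(["%.3f" % s for s in sec])  # a rising trend indicates drift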

Tracking such a factor enables the customer to detect whether a certain process is drifting over time, i.e. becoming less efficient. Such a drift can have multiple causes:

– the amount of leakage in a compressed air grid grows because of a lack of maintenance
– the specific energy consumption for making steam rises because of a lack of maintenance of steam traps (steam traps fail open)
– the specific energy consumption for heating a building rises because of fouling of heat-exchanger surfaces

Generally, comparing the energy consumption with a driver will reveal a linear relationship. In certain applications, this linear relationship also shows an intercept that does not equal zero.

If no actions are taken, the trend will be as follows:

– the intercept grows (examples: increasing leakage in a compressed air application or due to failing steam traps)
– the slope of the linear relationship grows (loss of efficiency e.g. because of fouling heat-exchangers)

Customers, however, will strive to

– reduce the intercept and
– reduce the slope of the linear relationship.

The linear relationship found can now be used as a target for the future. One example: if in the past it has taken 4 GJ of energy to make a ton of steam, we expect this same value for the future, too – unless we take any actions to improve efficiency.

We can now compare the real energy consumption to the expected one and record the differences. If this difference exceeds a certain value, a warning will be generated.


Picture 2: The Control Graph for controlling deviations from a pre-set target. If the control limits are exceeded, an alarm can be generated.
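In code, this targeting procedure amounts to fitting the historical line, predicting the expected consumption for each period’s driver value, and flagging deviations beyond a control limit. A minimal sketch follows; the figures and the control limit are illustrative only.

import numpy as np

# Historical driver (tons of steam) and energy use (GJ), roughly
# 4 GJ per ton as in the example above.
driver = np.array([80.0, 95.0, 110.0, 120.0, 135.0])
energy = np.array([325.0, 382.0, 447.0, 489.0, 546.0])

# Fit energy = slope * driver + intercept as the target line.
slope, intercept = np.polyfit(driver, energy, 1)

def check(period_driver, period_energy, limit_gj=10.0):
    expected = slope * period_driver + intercept
    deviation = period_energy - expected
    if abs(deviation) > limit_gj:  # control limit exceeded
        print("warning: %+.1f GJ deviation from target" % deviation)
    return deviation

check(100.0, 425.0)  # about +20 GJ above target, so a warning is raised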

We can also take these differences and total them up over time in the so-called CUSUM (cumulative sum) chart.

Picture 3: The CUSUM chart. It acts as a totalizer and can reveal savings achieved.

This chart acts like a bank-account: If the process becomes less efficient, the CUSUM chart will run away from the zero line. In the picture the process has become more efficient, however. In our example, an economizer was installed improving a steam boiler’s efficiency. We can now read directly from the chart that compared to former performance the investment into the economizer saved the company 1100 MWh of energy within 15 weeks.
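The CUSUM itself is just a running total of the period deviations (actual minus expected); a curve moving below zero represents cumulative savings against the old target, as in the economizer example. A minimal sketch with illustrative weekly figures:

from itertools import accumulate

# Weekly deviations from target in MWh; negative values mean better
# than target, e.g. after fitting an economizer to a steam boiler.
deviations = [-60, -75, -70, -80, -65, -72, -78]

cusum = list(accumulate(deviations))
print(cusum[-1])  # cumulative saving to date relative to the target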

Where can this data analysis be done?
Recording performance, analyzing data every 15 or 30 minutes and displaying current specific energy consumption values can easily be done using modern recorders that display these values close to the process. These recorders can already perform even complex math operations. Thus, employees running certain processes can be directly involved and start asking questions:

– Why are certain shifts more efficient than others?
– Why was the specific energy consumption stable for months but started drifting recently?

These analysis techniques and also the “targeting” procedure described above can also be performed in Energy Monitoring software.


Picture 4: Set-up of a typical full-blown energy monitoring information system

4. Communication/reporting
Recipients of energy reports can be found at different levels of the hierarchy, from operations personnel to top management, and in different areas of a company (production/operation/engineering, controlling, energy and eco management).

The reports must provide information to enable the user to act. Operational staff needs to know when a problem has occurred as quickly as possible and know what they should do about it. Senior management, on the other hand, needs summary information to know that procedures and systems are working well. In order to design reports, it is important to understand who needs reports and why.

Reports to senior management might include:

– a summary of last year’s costs, broken down into EACs (energy accountable centers)
– a summary of the current year’s performance on a monthly basis

• against budget
• against the previous year
• against targets

– a note of the savings (or losses) achieved to date and how they were achieved
– a note of additional savings opportunities and what actions are ongoing to address them

A new report to management should be issued each month and be available in time for board meetings.

Operations management will be responsible for operating processes and plant efficiency. They will need to know on a shift, daily, weekly or monthly basis (depending on the nature of the process and the level of energy use) what energy has been used and how this compares with various targets. The information will be used to

– measure and manage the effectiveness of operations personnel and process plant and systems
– identify problem areas quickly
– provide a basis for performance reporting (to executives)

Operations personnel need to know when a problem has occurred and what needs to be done to rectify it. This information needs to be provided in a timely manner, which might mean within a few minutes of the event for a major energy-using process, or within a day or a week.

Engineers associated with operations will need reports similar to those for operations personnel. Engineers may typically be involved with problems where there is more time to act (compared with process operators), for example, cleaning heat exchangers, solving a control problem or removing air from a refrigeration condenser.

Engineers who are not directly in operations but who provide support will need more detailed historical information. Typically, these individuals will be involved in analyzing historical performance, developing targets and modeling. They will require access to the plant data historian and will use analysis tools, ranging from commonly available spreadsheet software to advanced data mining and similar software.

Engineers that are involved in projects will need supporting data, for example, levels of energy use, process operating conditions, etc. They will also need access to the raw data in the historian and access to analysis tools.

The accounts department may be interested in actual energy usages and costs to compare with budgets. They will need information that is broken down by department so that costs can be allocated to related activities. Accurate costing of operations and the cost of producing goods can improve decisions regarding product pricing, for example, and the allocation of resources.

Energy and environmental managers will need summary data that identifies the performance achieved and trends, much like what executives and operations managers require. Like engineers, they may require more detailed information for specific analysis.

The environmental department may want energy consumption expressed as equivalent CO2 emissions, and the energy reports may need to be integrated into environmental reports that are more general. Summary information may be required for annual energy and environmental reporting and may be needed more frequently by regulatory bodies.

The energy manager may be involved in energy purchasing as well as efficiency. He may need information about the profile of energy use (using a half-hourly graph, for example), peak usage, nighttime usage, etc. The energy manager will also need access to the raw data in order to allow evaluation of purchasing options and to check bills.

We can see from this broad variety of requirements that modern Energy Management Information Systems have to be very flexible in creating these reports.

5. Taking the action
Results of implementing Energy Monitoring Information Systems in the UK indicate that, when properly implemented, such a system can save 5 to 15 percent of annual energy costs. As an initial approximation, 8 percent appears to be a reasonable estimate. [1]

Implementing an Energy Management Information System and taking action based on its output will typically result in savings of around 8 percent. Most of the experience with this approach comes from the UK, through the work of the Carbon Trust.
Further savings can be achieved through capital investment, e.g. in more efficient burners and boilers, economizers etc.

Savings strategies in energy management typically fall into the following four categories:

• Eliminate. Generally, one should question if certain processes or sections of a plant are really required or if they could be replaced. A simple example: eliminating dead legs of a plant.
• Combine. CHP is a well-known “combine” process: the generation of heat and electricity are combined. Another example is the use of waste heat from air compressors, e.g. for pre-heating factory air.
• Change equipment, person, place, or sequence. Equipment changes can offer substantial energy savings as the newer equipment may be more energy efficient. Changing persons, place, or sequences can offer energy savings as the person may be more skillful, the place more appropriate, and the sequence better in terms of energy consumption. For example, bringing rework back to the person with the skill and to the place with the correct equipment can save energy.
• Improve. Most energy management work today involves improvement in how energy is used in the process, because the capital expenditure required is often minimal. Examples include reducing excess combustion air to a minimum and reducing temperatures to the minimum required. Improvement does sometimes require large amounts of capital: insulation improvements, for example, can be expensive, but the energy savings can be large and product quality can improve.


Remote-control boat speeds reservoir surveys

10/12/2012

As the regulatory requirement, in Britain and elsewhere, to assess reservoirs and lakes expands to include smaller bodies of water, HR Wallingford has developed a remote control boat which is able to collect hydrometric data quickly, simply, safely and accurately.

ARC Boat

The ARC-Boat employs a sophisticated SonTek M9 Acoustic Doppler Profiler (ADP®), a 5-beam depth sounding device that scans the reservoir bed as the boat is guided across the water’s surface. Recorded data are analysed by SonTek HydroSurveyor software to produce accurate depth measurements in addition to 3-D maps of the entire water body. With a small amount of post-processing in GIS or 3D CAD, an accurate water volume can be determined.
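Conceptually, the post-processing step from a gridded bathymetric surface to a water volume is a simple summation of depth times cell area; the GIS or CAD package handles the interpolation onto the grid first. A minimal sketch of that final calculation, with an invented toy grid:

import numpy as np

def reservoir_volume(depth_grid_m, cell_size_m):
    # Volume of water above the surveyed bed from a regular grid of
    # depths (m) at survey water level; dry cells should be 0 or NaN.
    depths = np.nan_to_num(depth_grid_m)
    return depths.sum() * cell_size_m ** 2

# 3x3 toy grid on 10 m cells gives 1700 cubic metres.
grid = np.array([[1.0, 2.0, 1.5],
                 [2.0, 4.0, 2.5],
                 [1.0, 2.0, 1.0]])
print(reservoir_volume(grid, 10.0))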

Craig Goff, a reservoir Supervising Panel Engineer and dam specialist at HR Wallingford, has used the ARC-Boat in a trial project to assess five reservoirs and says: “This new method offers tremendous advantages over traditional manned boat techniques because it is faster, safer, more environmentally friendly and involves fewer staff and resources. All of this combines to mean that it saves a great deal of time and money. This is particularly important because the Flood and Water Management Act 2010 will necessitate the volumetric assessment of many water bodies that have previously been below the threshold and therefore outside of the ambit of the Reservoirs Act 1975.”

Reservoir regulations
As a result of residential and industrial development in recent decades, the levels of risk associated with many British reservoirs have changed, and the British Flood and Water Management Act 2010 has amended their Reservoirs Act 1975 to bring a more risk-based approach to reservoir regulation. The 2010 Act seeks to achieve this by:

1. reducing the capacity at which a reservoir will be regulated from 25,000m³ to 10,000m³
2. requiring all Undertakers with reservoirs over 10,000m³ to register their reservoirs with the Environment Agency
3. ensuring that only those reservoirs assessed as high risk are subject to full regulation

The reservoir sections of the 2010 Act are dependent upon the development of secondary legislation, which is likely to specify the reservoir capacity above which water bodies will be regulated. However, irrespective of the content of this secondary legislation, the Flood and Water Management Act 2010 has clearly generated an urgent need for reservoir assessment, and the application of the ARC-Boat for reservoir bathymetry is therefore propitious.

Technology
The ARC-Boat has been designed with a V-shaped hull to give optimal manoeuvrability and minimal air entrainment beneath the ADP, ensuring high quality data collection. The robust and reliable design, including grab handles fitted to the upper deck, means that the boat can be launched from the most difficult locations, and a unique detachable bow means that the ARC-Boat can easily be transported in an average sized car.

SonTek M9

The SonTek M9 is a 9 beam acoustic Doppler profiler, using 5 beams at any one moment for depth measurements from a wide footprint on the water bed. This means that the time spent ‘driving’ the boat is minimised in comparison with single beam instruments. Importantly, the M9 is able to operate in depths ranging from 15cm to over 40m.

The boat employs industry standard remote control with a minimum range in excess of 200m and Bluetooth communications provide data transmission to an onshore laptop.

Data Management
HydroSurveyor™ is a system designed to collect bathymetric, water column velocity profile, and acoustic bottom tracking data as part of a hydrographic survey. The two key components of the system are the HydroSurveyor™ Acoustic Doppler Profiler (ADP®) platform, and the powerful, yet user-friendly, data collection software.

With the HydroSurveyor™ platform, SonTek is able to offer an exclusive 5-beam depth sounding device, with built-in navigation, full water column velocity (currents) profiling, full compensation for speed of sound (with the CastAway-CTD), and integrated positioning solution.

HydroSurveyor real-time data shot!

Trial Results
Craig Goff is extremely pleased with the results of the initial trials on five reservoirs in southern England. He says: “The M9 performed very well, running from 8am to 4.30pm each day on a single set of batteries. We were able to conduct the surveys much faster than has ever been possible before, without the health and safety risks of putting staff over water and the environmental risks of diesel powered larger survey boats. Most importantly, however, we were able to produce high quality accurate data for a modest price and our client was very pleased with the results.”

Applications for the ARC-Boat
In addition to the smaller reservoirs that will have to be surveyed, larger reservoirs will be able to take advantage of the new technology to assist in operations such as the creation of sedimentation models. These models inform strategies to prevent capacity depletion and to extend the lives of reservoirs through flushing, excavation, dredging etc. Similarly, ARC-Boat surveys can be employed around submerged hydropower or draw off pipe intakes to assess sedimentation levels – a vitally important role because sediment can seriously damage turbines, or influence operation of scour pipes or water supply draw off pipes from reservoirs.

Summary
As a result of the Flood and Water Management Act 2010, the owners of small reservoirs will need to establish whether their water bodies are affected by the amended Reservoirs Act 1975, by determining an accurate volume figure for their reservoirs. Typically, this will include landowners, farmers and organisations such as the National Trust. However, the development of the ARC-Boat with the M9 and the latest HydroSurveyor™ software means that such work is now faster, safer and significantly lower cost. This is good news for the owners of smaller reservoirs, for whom any survey cost is a new cost.


In-pipe monitors turn dreams into reality

17/08/2012

The quality of tap water has improved enormously in recent decades, but for largely technological reasons, until recently, knowledge of water quality between the treatment works and the tap has been an almost impossible dream for water treatment and distribution network managers. In this article, Jo Cooper, Product Specialist at Intellitect Water, will look at the ways in which the challenges have been overcome in order to turn that dream into reality.

Background
Drinking water generally leaves treatment plants in excellent condition before entering a network of underground pipes, of varying age, that can extend for many miles. It has always been difficult to identify water quality problems until after the water has reached the consumer. An array of technological challenges had to be overcome before it would be possible to measure in-pipe water quality without incurring major capital or operational costs. However, following a substantial research and development programme, the Intellisonde monitors were launched in 2008 and have since found application worldwide in pipes and bypass flow cells for drinking water, rivers and wastewater installations. The remainder of this article will examine the most significant hurdles that were overcome in the development of this technology.

1. Water quality parameters

Sonde Head

Most traditional water quality sensors would be unsuitable for use inside pressurised pipes, so it was necessary for Intellitect’s engineers to develop sensor technologies that would be small, low cost, low power, robust and require almost no service or calibration. As a result, the head of an Intellisonde (see picture) is a mere 3.6cm in diameter, but fully populated it can deliver continuous water quality data for 12 parameters. Measurement options include Free Chlorine, Mono-chloramine, Dissolved Oxygen, Conductivity, pH, ORP/Redox, Flow, Pressure, Temperature, Turbidity and Colour. An ISE channel is also available for Fluoride or Ammonium. The measurements are flow independent because the unit features an automatic stirrer which starts when low flow is detected.

Intellisonde users are able to specify the required parameters such that monitors can be installed to meet customers’ specific needs.

2. Maintenance and consumables
Clearly, it would not be feasible to install a network of monitoring equipment in underground pipes that requires frequent recalibration, spares or maintenance. This was a major challenge because many traditional water quality monitoring technologies for parameters such as dissolved oxygen and chlorine require recalibration as often as fortnightly to address sensor drift. In addition to regular recalibration, these traditional membrane-covered amperometric sensors also necessitate occasional replacement of membranes and the sensors’ internal electrolyte solution.

Colorimetric water quality analysis techniques would also be unfeasible for an in-pipe monitoring application, because such techniques generate waste chemicals that could contaminate ground water if no waste water disposal facility is available and also necessitate a supply of reagents that would add to operational costs and create an unacceptable maintenance requirement.

The Intellisonde sensors are solid state and require no recalibration once they have settled following installation. Furthermore, no membranes or chemicals are required and no maintenance is necessary until the sensor head is serviced, quickly and easily, typically after 6 months.

3. Labour costs
As outlined above, traditional monitoring techniques can create a significant labour requirement for ongoing instrument service. However, there are two other issues that affect labour costs: firstly, instrument installation; and secondly, the labour incurred by water quality problem investigations.

Intellisonde requires just a standard inspection pit and 2” gate valve for installation. Other instrumentation may require a roadside cabinet, waste collection and mains power which increase the cost very significantly.

The Intellisonde has been designed to minimise labour costs. This has been achieved with miniature sensors and an extremely low power requirement; the probe can be fitted into large pressurised pipes (via a 3.8 or 5cm valve) as well as pipes only ten centimetres in diameter, for street level monitoring. An adjustable collar allows the sonde to be set to a pre-determined depth, ensuring maximum flexibility to adapt to local conditions. A variant of the sonde (Intellisonde SA – short application) enables continuous monitoring of a customer’s supply by insertion into a water metering box.

The cost of excavation is highly significant and represents a significant barrier to water quality problem investigations, particularly if excavation work is likely to become a nuisance to road users for example. This is an area in which Intellisondes offer enormous benefits because the provision of live in-pipe water quality data enables network managers to detect problems before they become serious and also to significantly improve the speed with which the location of a problem is identified, thereby avoiding unnecessary excavations.

4. Data collection
Intellisondes incorporate an internal datalogger that can be configured to measure and log data from the sensors at intervals from once per minute to once per hour. This data can be collected manually, but more commonly real-time data is transmitted through a variety of communication methods so that network managers have visibility of water quality.

Communications options include RS232 or RS485 serial outputs via MODBUS, Analogue Outputs and GPRS. As a result, Intellisondes can interface with almost any SCADA system or data collection platform. There are also two analogue inputs available for external sensors such as pressure transducers, and each of the 12 analogue voltage outputs can be connected to telemetry loggers for integration into an existing data collection system. GPRS transmission can occur after each sensor reading, enabling rapid response to any incidents.
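As an illustration of the MODBUS route, a SCADA-side poller might read the sonde’s channels as holding registers at a fixed logging interval. The sketch below uses the open-source pymodbus library (3.x); the register addresses, count and scaling are hypothetical, not Intellitect’s published register map.

import time
from pymodbus.client import ModbusSerialClient  # pymodbus 3.x

client = ModbusSerialClient(port="/dev/ttyUSB0", baudrate=9600)
client.connect()

while True:
    # Hypothetical map: 12 holding registers, one per channel, x100 scaling.
    rr = client.read_holding_registers(address=0, count=12, slave=1)
    if not rr.isError():
        readings = [r / 100.0 for r in rr.registers]
        print(readings)  # e.g. free chlorine, pH, turbidity, ...
    time.sleep(60)  # one-minute logging interval, as described above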

5. Multiple applications
Whilst in-pipe monitoring of drinking water quality is the main application for Intellisondes, the monitors have also been employed very successfully in a number of other applications. These include river monitoring for intake protection and environmental purposes, final effluent monitoring at wastewater treatment works and in swimming pools.


Latest analytical technology ensures biogas efficiency

28/05/2012

Anaerobic Digestion (AD) relies on the ability of specific micro-organisms to convert organic material into a gas that can be used to generate electricity. However, these bacteria require specific conditions if they are to function effectively and instrumentation specialist company Hach Lange has developed a range of online, portable and laboratory instruments that have enabled a large number of AD plants to maximise efficiency and prevent the risk of failure.

Introduction
In 2009, renewable energy accounted for just 3% of Great Britain’s energy supply. However, the Government there has a target to raise this contribution to 15% by 2020 as part of its strategy to fight climate change. Along with wind, solar and various other sources of renewable energy, AD has an important role to perform in helping to achieve the renewable energy target whilst also helping with the management of organic waste.

Biogas is generated in large anaerobic digesters: air-tight tanks in which bacterial digestion takes place in the absence of oxygen. Biogas is a combination of methane, carbon dioxide and trace amounts of many other gases; it can be burnt to produce electricity, which is then exported to the National Grid. Alternatively it can be further processed and refined to almost pure methane and injected into the national gas grid.

The remnant digestate can be used for a variety of purposes such as a nutritional additive to crops on arable land, much in the way manure is used, or as a landfill restoration material.

There are two types of biogas plant, determined by the substrate they use: co-fermentation plants and renewable raw material fermentation plants. In co-fermentation plants, substrates of non-renewable raw materials are used, such as residues from fat separators, food residues, flotation oil, industrial waste products (glycerol or oil sludge) and domestic organic waste. Renewable raw material fermentation plants utilise materials such as maize, grass, complete cereal plants and grains, sometimes together with manure slurry.

The need for testing and monitoring
Efficiency is vital to the success of a biogas production plant; bacteria require optimum conditions to effectively produce biogas from the digestion of organic matter. Plant operators therefore have a strong interest in the efficiency of their biogas plant and the activity of the bacteria. Consequently these production plants require reliable, on-site analysis in combination with continuously operating process instruments. Loading excessive levels of biomass into a digester may have severe economic consequences and could potentially lead to biomass inactivation and necessitate a cost-intensive restart. Conversely, under-loading a biomass digester could also have financial implications, because less electricity is produced and potential revenue is lost. Substrate amounts must be tailored to achieve the optimum rate of bacterial digestion.

The degradation process which occurs within the biogas plant digesters does so in a highly sensitive microbial environment. The digesting, methane-producing bacteria, for example, are highly temperature sensitive and most active within the ranges of around 35 to 40 DegC and 54 to approximately 57 DegC. The specific nature of the microbial environment inside the digesters must be maintained throughout fermentation to increase production and avoid inactivation of the highly responsive bacteria.

Monitoring equipment
Hach Lange provides portable, laboratory and online monitoring systems that facilitate examination at key points within the fermentation process, including eluate analysis, where the substrate is fed into the digester, but also within the digester itself. Online process analysis instrumentation can be employed to continuously maintain optimum conditions within the biogas plant and/or samples can be collected regularly for analysis.

Different analytical instruments are required for different stages of the fermentation process: at the substrate entry point; within the main digester; in post-fermentation tanks and to continuously monitor biogas production.

Process monitoring instruments used across the fermentation cycle allow operators to constantly supervise the anaerobic digestion rate and biogas production.

Hach Lange TIM 840 Titrator

One of the most important measurements for assessing fermentation progress is the FOS/TAC ratio. This is determined by the TIM 840 Titrator, and the values generated enable the system supervisor to identify potential process problems, such as the imminent inversion of digester biology, so that countermeasures can be initiated. FOS stands for Flüchtige Organische Säuren (volatile organic acids), while TAC stands for Totales Anorganisches Carbonat (total inorganic carbonate, i.e. alkaline buffer capacity).

To measure the FOS/TAC ratio with the TIM 840 titrator, 5ml of sample is added to a titration beaker containing a follower bar. 50ml of distilled water is then added and the measurement is started. The addition of reagents is then conducted automatically by the titrator which saves operator time and reduces the potential for human error. After about 5 minutes the TAC and FOS values are calculated automatically using a pre-programmed formula.

All measured values can be stored in the autotitrator and/or sent to a printer or PC.

The FOS/TAC ratio provides an indication of the acidification of the fermenter, which is an important measurement because a low acid content demonstrates that the rate of bacterial digestion is not high enough. Conversely, too high an acid content means bacterial digestion is exceeding required levels, due to an overloading of substrate.
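The arithmetic behind the ratio is a two-step acid titration. A commonly used empirical version is the Nordmann formula, for a 20 ml sample titrated with 0.1 N sulphuric acid; it is sketched below for illustration, and the TIM 840’s pre-programmed formula (which uses a 5 ml sample) may differ in its constants.

def fos_tac(ml_to_ph5, ml_ph5_to_ph44):
    # Nordmann approximation from a two-step 0.1 N H2SO4 titration of
    # a 20 ml sample:
    #   ml_to_ph5      - acid used reaching pH 5.0 -> TAC (mg CaCO3/l)
    #   ml_ph5_to_ph44 - acid used from pH 5.0 to 4.4 -> FOS (mg HAc/l)
    tac = ml_to_ph5 * 250.0
    fos = (ml_ph5_to_ph44 * 1.66 - 0.15) * 500.0
    return fos, tac, fos / tac

fos, tac, ratio = fos_tac(14.0, 3.0)
print(ratio)  # ~0.69; guideline tables typically treat >0.4 as overloaded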

Case Study:

Viridor’s Bredbury facility

Viridor’s Resource Recovery Facilities in Reliance Street, Newton Heath, Manchester and Bredbury, Stockport (GB).

At the Resource Recovery Facilities, which incorporate AD plants, the feedstock is derived from domestic waste collections – the ‘black bag’ portion that would otherwise be destined for landfill. Pre-sorting removes plastics, metals and glass, after which the waste is pulverised to produce a slurry that is passed to the AD plant. This slurry contains the organic fraction that is processed to produce biogas.

Steve Ivanec is responsible for ensuring that the plant operates at optimal efficiency. He says: “Monitoring is extremely important at this plant because of the variability of the feedstock – the organic content can fluctuate from one day to another, so we have to be able to respond very quickly.”

Steve’s team uses Hach Lange instruments to closely monitor the entire process and to ensure that the plant’s bacteria are provided with optimal conditions. These tests include chloride, pH, alkalinity and volatile fatty acids; the ratio of the latter two being the same as the FOS/TAC ratio, which is determined by a TIM Biogas titrator. In addition, samples are taken from the feed, the digesters and the effluent to monitor ammonia and COD with a Hach Lange spectrophotometer. This data is essential to ensure compliance with the plant’s discharge consent.

The Reliance Street plant utilises biogas to generate electricity and the residue from the AD process can be defined as a product rather than a waste because it complies with the BSI PAS110 Quality Protocol for Anaerobic Digestate (partly as a result of the monitoring that is undertaken). This product is termed ‘compost-like output’ (CLO) and can be landfilled, used as a landfill cover, or spread on previously developed land to improve that land. However, CLO cannot currently be applied to agricultural land used for growing food or fodder crops.

Summary
The Hach Lange test and monitoring equipment enables the operators of AD plants to ensure that the bacteria are provided with optimum conditions so that biogas production is as efficient as possible. As a result, less waste is sent for landfill and renewable energy is generated efficiently. This ensures the best possible return on investment and by reducing the use of fossil fuels for power generation, helps in the fight against climate change.


Power from the waves!

26/01/2012
New wave and tidal turbine concept promises clean, affordable energy

Practically infinite reliability was the defining requirement when researchers wanted to run a TorqSense torque transducer, from Sensor Technology, under the sea as part of extensive trials of a green energy turbine. Dependable, affordable energy from tidal streams and ocean currents could soon be a reality, with scale models of the innovative Evopod demonstrating the viability of the oceans as an energy source.

Developed by Tyne & Wear (GB) based Ocean Flow Energy, Evopod is a semi-submerged, floating, tethered tidal energy capture device. It uses a simple but effective mooring system that allows the free floating device to maintain optimum heading into the tidal stream. Installed as an individual device or as a tidal farm, the technology offers clean, green energy.

It overcomes the key concerns that have been expressed about tidal stream turbine installations. As a floating tethered device it imposes minimal disturbance on sensitive seabed ecosystems, and its single turbine rotates at such low speeds (10–20 rpm) that it is likely to pose little threat to marine wildlife. Further, Evopod’s novel mooring solution employs a tight envelope to reduce the size of the exclusion zone for shipping. A seabed region of one square kilometre can support enough Evopods to supply all the energy needs of up to 40,000 homes. This would reduce carbon dioxide emissions by 140,000 tonnes per annum if replacing power from a coal-fired power station.

An important milestone in the development of Oceanflow’s Evopod technology was reached on 6th March 2011 with the demonstration of grid connectivity by the company’s 1:10 scale trials unit in Strangford Narrows in County Down (N. Irl). The output from the Evopod can now be fed into the domestic mains circuit of the Queen’s University Marine Laboratory.

The Evopod employs a fixed pitch turbine driving a permanent magnet generator through a gearbox. Power control and data capture are essential for reliable energy generation. For an effective sensing solution to measure the torque and rotational speed of the turbine, Ocean Flow Energy turned to Sensor Technology and its TorqSense torque sensor.

Torque is a critical measurement as it indicates the power that can be derived from the system as well as giving an indication of the stresses on the turbine. But the marine environment and the nature of the turbine’s operation places unique performance requirements on the sensing equipment.

Ocean Flow Energy design engineer Roger Cox comments: “We had used TorqSense transducers before and had good experiences with them. We knew they were reliable in challenging applications, and would give us the quality data we needed as part of our proof of concept of the Evopod.”

TorqSense is a surface acoustic wave (SAW) based device. In a TorqSense transducer, surface waves are produced by passing an alternating voltage across the terminals of two interleaved comb-shaped arrays, laid onto one end of a piezoelectric substrate. A receiving array at the other end of the transducer converts the wave into an electric signal.

The frequency is dependent upon the spacing of the teeth in the array, and as the direction of wave propagation is at right angles to the teeth, any change in the substrate’s length alters the spacing of the teeth and hence the operating frequency. Tension in the transducer reduces the operating frequency while compression increases it. To measure the torque in a rotating shaft, two SAW sensors are bonded to the shaft at 45deg to the axis of rotation. When the shaft is subjected to torque, a signal is produced which is transmitted to a stationary pick-up via a capacitive couple comprising two discs, one of which rotates with the shaft, the other being static.
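Because the two sensors sit at 45deg on either side of the shaft axis, torque puts one into tension and the other into compression, so their resonant frequencies shift in opposite directions; the difference is, to first order, proportional to torque, while common-mode effects such as temperature cancel out. A minimal sketch of that signal processing, with an invented calibration constant:

def torque_from_saw(f1_hz, f2_hz, k_nm_per_hz=0.005):
    # Torque from the frequency split of two SAW resonators mounted at
    # +/-45 deg to the shaft axis. k_nm_per_hz is a per-device
    # calibration constant (illustrative value only); temperature moves
    # both frequencies together and cancels in the difference.
    return k_nm_per_hz * (f1_hz - f2_hz)

print(torque_from_saw(200_004_000, 199_996_000))  # 8 kHz split -> 40 N*m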

“Data is logged on board, but also transmitted back to shore so we can remotely monitor the operations,” says Cox. “We used TorqSense devices on the very first Evopod design to go into the sea, and they’ve been working reliably on our 1/10 scale test unit for five years. They are now being incorporated on our larger scale units, including a 35kW version. We are also developing a twin-turbine version: a 1/40th scale model has been tested in Newcastle University’s flume tank with the support of a NEEIC grant. At full scale the unit would be fitted with twin 1.2MW rated generators, each coupled to a 16m diameter three-bladed turbine. The unit would generate its combined rated output of 2.4MW in flow speeds of 3.2m/s or above.”
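Those ratings are consistent with the standard turbine power equation P = ½ρAv³Cp. A quick check for one 16 m rotor in a 3.2 m/s stream, assuming a seawater density of 1025 kg/m³ and a power coefficient of about 0.35 (both assumptions, not Oceanflow figures):

import math

rho = 1025.0  # assumed seawater density, kg/m3
d = 16.0      # rotor diameter, m (from the quote above)
v = 3.2       # flow speed, m/s (from the quote above)
cp = 0.35     # assumed power coefficient

area = math.pi * (d / 2) ** 2
p_watts = 0.5 * rho * area * v ** 3 * cp
print(p_watts / 1e6)  # ~1.2 MW per rotor, i.e. ~2.4 MW for the pair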

See also: Tidal turbine development in Ireland and Canada (9/7/2011) for a similar development using this equipment.


Automation market at crossroads

15/11/2011

Factories of the future to meld high technology with evolving market demands

The field of industrial automation (IA) is at a crossroads. All major IA vendors acknowledge, directly or indirectly, that the automation and control solutions (ACS) product portfolio is nearing saturation. A major trend underlining this development is the blurring of the boundaries between individual ACS products, in particular the programmable logic controller (PLC) and distributed control system (DCS) product lines.

Automation at crossroads!

A new analyst insight from Frost & Sullivan on the Automation & Control Systems (ACS) Market examines the current market scenario and future landscape, as well as detailing the game changers in factories of the future. On optimistic estimates, the European distributed control systems and programmable logic controllers markets are expected to reach €6.45 billion in 2017.

“Vendors have currently emerged with hybrid products that combine PLC and DCS functionality as a means to counter high competition and gain end-user recognition,” according to Frost & Sullivan Senior Research Analyst Karthik Sundaram. “Despite economic advantages, the emergence of such products has clouded end-user perception to a large extent, and it remains to be seen if this technical strategy yields expected results.”

Clearly noticeable is a significant shift from the traditional parameters determining the ACS market. Currently, it is a company’s product portfolio that yields the maximum influence in the ACS market space, followed closely by service support and cost considerations. This, however, is set to change.

“In the coming years, the emphasis on the IA product portfolio is likely to diminish,” he continued. “In contrast, the need for globalised service support, coupled with cost factors, is expected to gain significant momentum.”

As the ACS market steadily graduates towards the next level, it will offer IA vendors challenging opportunities for growth and excellence. Vendor participants will need to be in tune with on-going developments and enhance their ability to compete and succeed in the future of factories.

“Our vision of the factory of the future is catalysed by five mega trends – cyber security, mobile and wireless technology, enterprise ecosystem, cloud computing and sustainability. These mega trends will influence all aspects of an industrial enterprise.”

For instance, operating personnel in future factories will not be confined to work stations inside control rooms. The advent of tablets and mobile platforms will enable them to track production lines, perform maintenance operations and monitor process issues from their tablets – all while on the move.

The adoption of secure cloud computing technology will give factories access to relevant strategic data from the Internet to execute real-time decisions and enhance operational efficiency.

“In essence, future factories will have secure wireless networks supporting a highly automated production process, seamlessly interlinked with enterprise software working through the clouds,” he concluded. “A high-end factory will also involve collaborative manufacturing promoting operational excellence and aiding sustainability.”


New monitoring technology helps reveal Arctic secrets

14/11/2011

Pic: Catlin Arctic Survey!

Last month we featured an article by Quantitech’s Dominic Duggan on technology for measuring the gases trapped in the High Arctic which could tip climate scales. This time we describe how a group of Arctic researchers has used the latest monitoring technology from YSI Hydrodata to study the effects of climate change beneath the surface ice.

A group of Arctic researchers has employed the latest monitoring technology to investigate the effects of climate change, by measuring temperature and salinity in the water column beneath surface ice. The results of the investigation, which utilised YSI’s new ‘Castaway-CTD’, could cast new light on our understanding of the ways in which shifting ocean currents impact upon the climate in northern Europe.

The Catlin Arctic Survey is a unique collaboration between scientists and explorers, and the Castaway enabled the researchers to work very quickly in extremely hostile conditions because the device is small, portable and can be operated in the field without the aid of a computer.

Previous research looked at ice thickness and ocean acidification, but the latest Catlin Survey work has studied freshwater currents beneath the ice surface to help understand their effect on bottom-up ice melting, which is disrupting global ocean circulation.

Background
It is well established that the Arctic environment has a significant effect upon the global climate. For many years, climate scientists have raised concerns over future shifts in global weather systems and highlighted the role that the Arctic plays in such systems. The Arctic contributes heavily to the Thermohaline Circulation: a giant aquatic conveyor connecting the planet’s oceans, distributing heat, oxygen and nutrients. Changes to the Thermohaline Circulation, combined with the vast atmospheric positive feedback loops that occur within the Arctic (producing large quantities of methane from the melting permafrost), can have drastic repercussions on the global climate.

In 2011 the Catlin Arctic Survey was commissioned by the Catlin Group, operating from a temporary ice base on the Prince Gustav Adolf Sea, on the northernmost fringe of Canada’s Arctic archipelago, around 800 miles from the North Pole.

Organic Matter
A key measurement parameter for the team was Coloured Dissolved Organic Matter (CDOM), because high levels can result in 40% higher light absorption. In the Arctic, much of the CDOM is derived from three of Northern Russia’s vast river mouths. Commenting on the significance of CDOM, Dr Victoria Hill, a British-born Oceanographer, said: “Locally CDOM should act to increase thermal stratification, trapping heat near the surface. The water becomes more stable and there is reduced mixing. However, if surface ice melts, it creates an upper layer of fresh, cold water which does not mix. In the long run, the surface water becomes warmer and no longer sinks to form the deep and colder water that draws the Gulf Stream to Northern Europe.”

The researchers anticipated that the Arctic Ocean would be highly transparent, because the rivers contributing CDOM were frozen. However, the team determined that this was not the case. In fact, Dr. Hill revealed: “In the Chukchi, between 70 and 80% of solar radiation was being absorbed by CDOM.” In another data set, retrieved by Adrian McCullum of the Scott Polar Research Institute, concerning results were obtained from a sample of the water column: at a depth below 200m, the water was 1 DegC colder than expected. This significant change in normally stable, deep water suggests that the surface melt water was sinking, driving warmer water into contact with the surface ice. This sparked further interest in the variation of temperature in the Arctic Ocean.

Arctic Ocean profiling
Highly specialised equipment is normally necessary for profiling very deep water. However, YSI’s Castaway-CTD has been developed to provide a simple and accurate method for the rapid determination of conductivity, temperature and depth down to 100 metres. Incorporating GPS, sensors, data logging and a display in one compact instrument, the device is literally cast (or lowered) into the water and retrieved immediately; it automatically collects and computes the data, and users can see the results of their work straight away on a small display. A lightweight, easy-to-use hydrographic profiling instrument with high-resolution sampling of conductivity, temperature and depth, the Castaway was therefore an ideal tool for the investigation into CDOM’s effect on ocean temperatures, and a vital piece of sampling equipment for the Catlin Arctic Survey team.
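For readers curious about what the automatic ‘collect and compute’ step involves, the short sketch below shows how a raw conductivity, temperature and pressure cast is conventionally converted to practical salinity and depth, using the open-source Gibbs SeaWater (TEOS-10) library for Python. It illustrates the standard oceanographic formulae rather than YSI’s own firmware, and the cast values are invented, not survey data:

import numpy as np
import gsw  # Gibbs SeaWater (TEOS-10) toolbox: pip install gsw

# Invented readings from one downcast: conductivity (mS/cm),
# in-situ temperature (°C, ITS-90) and sea pressure (dbar).
C = np.array([28.0, 27.5, 26.9])
t = np.array([-1.4, -1.5, -1.6])
p = np.array([10.0, 50.0, 100.0])

SP = gsw.SP_from_C(C, t, p)   # practical salinity (PSS-78)
z = gsw.z_from_p(p, 80.0)     # height in metres at 80° N (negative below the surface)

for depth, salinity, temperature in zip(-z, SP, t):
    print(f"{depth:6.1f} m  S = {salinity:5.2f}  T = {temperature:5.2f} °C")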

Castaway CTD – user feedback
Ann Daniels, of the Catlin expedition team, was keen to stress the importance of the CTD to the success of the survey: “It was very lightweight, perfect for a long-range scientific expedition. The LCD display was very useful as it allowed the team to view information from the CTD while in the field, and allowed ‘live science’ to be relayed back to HQ by phone. It meant there was interest generated during the expedition rather than having to wait until the unit was returned to Britain.”

Easily deployed, the Castaway was cast into bore holes drilled in the Arctic ice and allowed to free-fall to depths of up to 100 metres, its sensors gathering data, including a temperature system able to respond within 200 milliseconds. The device is especially well designed for surveys in this extreme environment: a rugged, non-corrosive housing, a flow-through design, AA battery power and tool-free operation made the Castaway perfectly suited to an Arctic survey.

Commenting on the value of the Castaway to the survey team, Science Programme Manager Dr Tim Cullingford said: “The Castaway CTD was deployed by the explorer team for the Catlin Arctic Survey 2011 during March to May. The conditions at this time of year in the Arctic are extreme, with temperatures down to -40 °C. Nevertheless, the Castaway was successfully deployed through holes drilled in the ice to an ocean depth of 100 metres. Its compact nature meant that it was easy to handle (for example, keeping it warm just before deployment was simply done by placing it inside the explorer’s jacket). The screen allowed an immediate return of temperature and salinity readings, which were successfully relayed back to London HQ on a regular basis. In the round, the Castaway provided an easy and useful back-up to the data returned by our main CTD.”

Maintenance of the Castaway is very simple; a quick rinse with fresh water and the occasional scrubbing of the corrosion-resistant electrodes is all that is required to keep the Castaway-CTD in shape between recommended annual factory calibrations.

Warm water application
Recently, the Castaway-CTD was employed in a similar manner on BBC One’s Ocean Giants, narrated by Stephen Fry. The first episode of the three-part series investigated why blue whales, usually a migratory species, stay in the warmer waters around the Sri Lankan coastline. Marine biologist Asha de Vos wanted to study the features of this water that sustain the whales year round. Using the Castaway to record salinity and temperature at differing depths, she concluded: “Along our coastline, there are areas of mass upwelling of cold, nutrient-rich water from the depths. These upwellings provide the perfect conditions for whale food: krill.”

Whilst the Castaway retains all the advantages of a traditional CTD system, its additional appeal lies in its convenience, flexibility and speed, whether the instrument is being used in the freezing waters of the Arctic or the warm, tropical waters of Sri Lanka.


Gases trapped in High Arctic could tip climate scales!

07/10/2011
By Dominic Duggan, Quantitech.

Enormous quantities of greenhouse gases (GHGs) are locked within Arctic ice and frozen soils. With the threat of global warming, a clear understanding of the relationship between GHGs in the atmosphere and in the ice and soil is therefore vital, because melting permafrost could trigger a dangerous climate tipping point. There can be few more challenging environments for monitoring gases, but PhD researcher Martin Brummell from the University of Saskatchewan has successfully employed a Gasmet DX4015 FTIR analyser to do so in the High Arctic of Canada. This article explains the procedures and challenges of multiparameter gas detection in freezing, remote locations.

Is this beautiful Arctic scene hiding a climate tipping point?

Working in the field imposes a number of requirements on analytical equipment, but the extreme weather conditions of the High Arctic demand a level of capability that is rarely available as standard. Field work in such conditions must be simple, flexible and fast, but most importantly, Martin Brummell says, “The equipment must also be extremely reliable because you do not have the luxury of a local Quantitech engineer.

“The Gasmet DX4015 was also the ideal choice because, as an FTIR analyser, it is able to monitor almost any gas, which is normally a feature of mains powered laboratory instruments, but the DX4015 is portable and powered by a small generator, so it is ideal for monitoring in remote locations.”

Sampling and analysis in the Arctic
A set of simple, perforated steel tubes was driven into the soil, down to the permafrost. Inside these tubes, soil gases were allowed to reach equilibrium by diffusion over 24 hours, allowing Brummell to analyse gas concentrations to a depth of 1 metre. The procedure was simple and therefore reliably repeatable. Furthermore, measurement of gas concentrations at different depths enabled direct comparison with soil analysis.

Using FTIR in the ‘field’

Ready to measure!

The Gasmet DX4015 is a portable FTIR gas analyser for ambient air analysis. FTIR stands for Fourier transform infrared: the analyser is an interferometric spectroscopic instrument that measures in the infrared region of the electromagnetic spectrum, applying a Fourier transform to the interferometer’s output to obtain an absorption spectrum as a function of frequency or wavelength. As a result, the unit is able to analyse up to 50 gas compounds simultaneously. The analyser is typically set up to measure a variety of gases, including VOCs, acids, aldehydes, and inorganic compounds such as CO, CO2 and N2O.
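To illustrate the Fourier-transform step at the heart of the technique, the short numpy sketch below recovers a spectrum from a synthetic interferogram. It is emphatically not Calcmet’s algorithm, which also involves apodisation, phase correction and calibration against reference spectra; the two synthetic bands are simply placed at representative CO2 (about 2349 cm⁻¹) and CH4 (about 1306 cm⁻¹) wavenumbers:

import numpy as np

n = 4096           # interferogram samples
dx = 0.5e-4        # optical path difference step (cm)
x = np.arange(n) * dx

# Toy interferogram: two cosine components standing in for absorption
# bands, plus a little detector noise.
interferogram = (np.cos(2 * np.pi * 2349 * x)
                 + 0.5 * np.cos(2 * np.pi * 1306 * x)
                 + 0.05 * np.random.randn(n))

spectrum = np.abs(np.fft.rfft(interferogram))   # magnitude spectrum
wavenumber = np.fft.rfftfreq(n, d=dx)           # axis in cm^-1

print(f"strongest spectral feature near {wavenumber[np.argmax(spectrum)]:.0f} cm^-1")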

The DX4015 is operated using a laptop computer running Calcmet™ software, a program that not only controls the analyser but also undertakes the analysis. The software is capable of the simultaneous detection, identification and quantification of ambient gases, which gives the DX4015 its ability to analyse multiple gases in near real-time.
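Once a spectrum is in hand, multicomponent quantification is classically performed by least-squares fitting of library reference spectra. Whether this matches Calcmet’s proprietary approach is an assumption on our part, so the sketch below should be read as the generic textbook method, using entirely synthetic reference spectra:

import numpy as np

rng = np.random.default_rng(1)
x = np.arange(500)   # spectral channels (illustrative)

# Synthetic unit reference spectra: one Gaussian band per gas.
def band(centre, width):
    return np.exp(-0.5 * ((x - centre) / width) ** 2)

refs = np.column_stack([band(100, 8), band(250, 12), band(400, 6)])  # pretend CO2, CH4, N2O

true_ppm = np.array([380.0, 1.9, 0.32])   # invented concentrations
measured = refs @ true_ppm + 0.01 * rng.standard_normal(x.size)

# Solve refs @ c ≈ measured for the concentration vector c.
ppm, *_ = np.linalg.lstsq(refs, measured, rcond=None)
print("retrieved concentrations (ppm):", np.round(ppm, 2))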

The FTIR’s many beneficial traits, such as reliability, precision and flexibility, make it a vital piece of analytical equipment in a very wide variety of applications, including industrial emissions monitoring, occupational safety surveys, engine exhaust testing, process monitoring, leak detection, emergency response, and chemical spill and fire investigations, among many others.

Brummell’s use of the DX4015 on his most recent research expedition, investigating the soils of the polar deserts of the High Arctic, highlights the model’s capabilities in the field. The work was carried out on Ellesmere Island in the Baffin Region of Nunavut, Canada, where the DX4015 had to perform reliably in extreme environmental conditions. The analyser was used to monitor the production, consumption and atmospheric exchange of the greenhouse gases carbon dioxide (CO2), methane (CH4) and nitrous oxide (N2O), all three being major components of natural biogeochemical cycles. Each of these gases is released and taken up by soil microbes in the Arctic.

The DX4015 was used to examine both the flux of gases from the soil surface and the concentration profiles of gases in the soil’s active layer above the permafrost. In doing so the FTIR provides raw data consisting of gas concentrations in parts-per-million (ppm).
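As a rough illustration of how such ppm profiles can be turned into flux estimates, the sketch below applies Fick’s first law (J = −D·dC/dz) to an invented CO2 profile; both the concentrations and the effective diffusivity are assumed values for illustration, not figures from Brummell’s study:

import numpy as np

depth = np.array([0.0, 0.25, 0.5, 0.75, 1.0])               # m below the surface
co2_ppm = np.array([400.0, 900.0, 1600.0, 2600.0, 3800.0])  # FTIR readings (invented)

# Convert ppm (µmol gas per mol air) to µmol per m³ using the molar
# volume of air at roughly 0 °C and 1 atm (~0.0224 m³/mol).
conc = co2_ppm / 0.0224

D_eff = 5e-6                          # assumed effective diffusivity, m²/s
gradient = np.gradient(conc, depth)   # dC/dz, µmol m⁻⁴
flux = -D_eff * gradient[0]           # surface flux, µmol m⁻² s⁻¹

# With z measured downward, a negative J means gas diffusing up and out
# of the soil.
print(f"estimated surface CO2 flux: {flux:.2f} µmol m⁻² s⁻¹")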

Explaining his reasoning behind choosing the Gasmet DX4015, Martin Brummell highlighted some of the analyser’s key advantages: “The real-time nature of the Gasmet FTIR allows me to see results within minutes of setting up in the field. This permits me to make changes to the experimental design and to investigate unexpected results further whilst in the field. This contrasts with traditional methods of soil gas analysis, which employ lab-based gas chromatography systems and the collection of samples ‘blind’ in the field.”

Results
Surprisingly, the work revealed areas of strong CO2 and CH4 production immediately above the permafrost. Brummell believed this was the result of the different distribution of carbon in Arctic soils compared with warmer climes: carbon accumulates much deeper in Arctic soils owing to a process known as cryoturbation, the constant mixing and burying of organic matter, which fuels microbial activity at depth.

Comparison between the surface flux and the soil profile for each of the greenhouse gases was a key objective of Brummell’s investigation. Most notably, he observed a negative surface flux for N2O, yet no significant regions of consumption were identified. The location of the N2O sink is not yet clear, nor are the organisms and biogeochemical processes responsible.

Conclusions
Martin Brummell’s research provided a new, if complex, insight into the production, consumption and exchange of greenhouse gases and soil microbe pathways in the Arctic. His work highlighted the importance of reliability, ruggedness, flexibility and accuracy in the equipment employed for such work; above all, the ability of the DX4015 to provide simultaneous measurement of multiple gases in near real-time was a major advantage.

In comparison with all of the equipment that is necessary for research in Arctic conditions, one might imagine that a highly sensitive analytical instrument would be the most likely to be adversely affected. However, Martin Brummell found this not to be the case with the Gasmet DX4015: “In contrast to other field equipment I have used in the High Arctic, including self-destructing sledgehammers, unreliable generators and broken fibre-optic cables, the Gasmet DX4015 has never failed even in the most difficult field conditions. It has happily survived air-transport, inconsistent electrical supply, low temperatures, rain, snow, mud and all other insults, and always gives me accurate, precise measurements of gas concentrations.”


#EFMExpo – an industrial event in Cork

30/09/2011

Instrument Technology exhibit at EFM Exhibition in Cork

It is quite a number of years since the capital of Munster hosted an industrial exhibition, so it was a great pleasure to reacquaint ourselves with many old, and indeed new, friends at the EFM Expo held in the Silver Springs complex on the 27th and 28th of September.

The Exhibitors!
ABB Limited
ACE Control Systems
ADA Security Systems
ADI Ireland Ltd
Alpha Sign Nameplate & Decal
Alternative Heating & Cooling Ltd
Apex Fire Ltd

Bord Gáis Networks
Business Safety
Butler Transtest

Camfil Irl Ltd
Clarke Energy Ireland

Clasit Beecher
Complete Alternative Energy
Cooper Industries
Cross Hire
CSC LTD Chemical Systems
Cylon Active Energy

Dalkia

Edina Ltd
EFT Control Systems
EMC Energy
Eurotech Calibration Services Ltd

Finning (Ireland) Ltd
Firebird Boilers
Focus Hygiene Supplies Limited
Frontline Energy & Environmental

GARO Electric Irl
Gem Utilities
General Electronic Access Ltd
GSH Group Ireland

Honeywell
HSG Zander Ireland Ltd

In’Flector Ireland Ltd
Instrument Technology – ABB
Irish Cooling Towers
Irish Industrial Coatings
Irish Power & Process Ltd

Kellihers Electrical

Manotherm
MARK EIRE B.V.
Moloney & Associates

Newbridge Metal Products Ltd

O Neill Industrial Ltd

Phoenix Contact (Irl) Ltd
powerPerfector Ireland Ltd
Premium Power
PurchaseControl.com

Radio & Security Products Ltd
Rittal Ltd
RPS Group

Sartorius Mechatronics Ireland Ltd
Sartorius Stedim Ireland
Screenguard Ltd
Shred-it
Sirus Engineering Systems

TEMP TECHNOLOGY/ENER.G
Traka KMS Ltd

Watersave

The event promised the latest developments in Energy Management and Facilities Management, along with Safety, Health & Security. It was also an opportunity to meet industry experts at the concurrent seminars, which discussed the latest ideas, technology and services capable of helping plants to increase efficiency.

Over 60 companies exhibited, and some of them provided speakers for the almost 20 seminar talks given during the show.

Our principal interest was, of course, automation, and there were a number of instrument companies and system providers among the exhibitors. We offer a short impression of some of them here.

Manotherm is a company which hardly needs an introduction in Ireland. They have, since time immemorial it seems, been supplying Irish industry with control and instrumentation products, the basics of all process, manufacturing and construction industry automation requirements. They have been called the Instrumentation Supermarket and many times they have come to the rescue in solving a knotty problem with an instrument, sensor or valve, from their extensive stock.

John Watts of Schubert & Salzer

One of their principals, Schubert & Salzer, gave one of the seminars, on the role of control valves in reducing energy consumption, handling and maintenance costs. John Watts discussed their GS3 valve, a handy, light and highly accurate valve based on principles described many centuries ago by Leonardo da Vinci. Known as a sliding gate valve, the GS3’s seat design features a non-turbulent, straight-through flow path. The flow is broken into multiple streams, reducing its energy. The result is greater service life, quieter operation and a control valve that performs at the highest possible levels, even in extreme conditions.

Instrument Technology, who are now associates for marketing ABB’s line of process instruments, had a large selection of flow, pressure and temperature instruments from this range. They marketed the Fischer & Porter range for many years, and after the takeover of F&P by ABB their association with the larger entity is a logical development.

A new company to us was Eurotech Calibration Services (ECSL), where Kevin Davis showed the New Zealand-based Temprecord range of temperature mapping and monitoring instrumentation. Applications include the transportation of blood for transfusions, a truly critical application. The company provides calibration to the pharmaceutical, medical device, food and beverage industries, as well as to other sectors.

Representing the Yokogawa interest, Irish Power & Process displayed field instrumentation and calibration equipment, covering data acquisition, analytical, pressure and flow measurement, both wired and wireless, the latter using the ISA100 standard. The company also represents Fluke test and measurement equipment and Camille Bauer.

Phoenix Contact are leading-edge manufacturers of industrial control and automation solutions. They have an enviable reputation in the energy industry as suppliers of terminal blocks, DC UPSs and power supplies, surge protection devices, HMIs, IPCs and wireless communications.

The very active Ireland Section of the International Society of Automation (ISA) also had a stand under the watchful eye of Douglas Control & Automation’s Declan Lordan.

The event was organised by SDL Exhibitions with their usual flair and professionalism, and hopefully this successful show will see a return of industrial events as the economic situation improves in the years to come.

The busy Manotherm Stand at EFM