Real-time access to Antarctic tide data.

14/07/2020

One of the most important challenges when designing monitoring facilities in remote locations is resilience. Remote tide gauge systems operate in extremely harsh environments and require robust communications systems that almost never fail and can store large amounts of data locally as added protection against data loss. Scientists from the National Oceanography Centre (NOC) are therefore upgrading the South Atlantic Tide Gauge Network (SATGN) to include the latest low-power dataloggers with built-in satellite telemetry capability – the SatLink3 from OTT Hydromet.

Installation at Vernadsky, 1,400 km south of Argentina

The SATGN is maintained and operated by the National Oceanography Centre, which is the British centre of excellence for sea level monitoring, coastal flood forecasting and the analysis of sea levels. It is the focus for marine water level research in Britain and for the provision of advice for policy makers, planners and coastal engineers.

Satellite telemetry is becoming increasingly popular in many other parts of the world. “Some government and non-commercial organisations are able to utilise a variety of satellites free of charge,” explains OTT’s Nigel Grimsley. “However, the cost of transmitting data via satellite has reduced considerably recently, and now rivals the cost of cellular communications.”

The SATGN measures sea levels in some of the most remote places on Earth. Monitoring sites include Antarctic locations such as Rothera and Vernadsky, located around 1,400 km south of the southern tip of Argentina. Prior to the installation of this network there was a lack of information on sea level variations in the South Atlantic and a bias in tide gauge records towards the more densely populated Northern Hemisphere. Over the last 30 years, data from the SATGN have improved estimates of global sea level change, such as those reported by the Intergovernmental Panel on Climate Change.

The NOC at Liverpool operates and maintains the SATGN providing near real-time sea level data for operational purposes and scientific research. This has helped to provide a long-term sea level record that is used by British scientists and the wider scientific community to monitor the Antarctic Circumpolar Current (ACC) variability. The data is also being used to help in the ‘ground truthing’ of satellite altimetry as well as the evaluation of climate variability on various timescales including longer term changes. In addition, the data is being used by local communities to provide essential information for both government and port authorities.

Monitoring/telemetry system upgrade
In recent years, the SATGN has undergone a refurbishment programme to reduce running costs and to safeguard local populations and infrastructure by providing tsunami monitoring capability and improving resilience. These new gauges couple Global Navigation Satellite System (GNSS) land level monitoring technology with tsunami capable radar and pressure sensors, transmitting data in near real-time by satellite based communications systems to operational monitoring centres.

As part of this ongoing NOC programme, the tide gauges' main datalogger and transmitter have been upgraded to incorporate OTT's new Sutron SatLink3. The first site to receive this upgrade was the Vernadsky station in Antarctica, which is now operated by Ukrainian scientists; it will soon be followed by the tide gauge at King Edward Point, South Georgia.

A further advantage of the upgrade is the SatLink3’s ability to communicate via Wi-Fi with wireless devices, including smart phones, tablets and computers. This means that local staff can connect wirelessly to the logger from a few metres away, which is a major advantage during inclement weather conditions.

Sensors
The SatLink3 datalogger is capable of accepting readings from a wide variety of sensors, with two independent SDI-12 channels, five analogue channels, one 4-20 mA channel and two digital inputs. The Vernadsky station includes a barometric pressure sensor, a radar level sensor installed over a heated and insulated stilling well (which keeps the inner core free of ice), and two OTT PLS pressure level sensors which provide accurate measurements of water depth.
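
As a simple illustration of how such a channel mix might be mapped to the parameters above (the channel assignments and parameter names below are assumptions for illustration, not the NOC's actual configuration), a short Python sketch:

```python
# Hypothetical channel map for a SatLink3-style logger at a tide gauge site.
# The channel assignments and parameter names are illustrative assumptions only.
CHANNEL_MAP = {
    "sdi12_1":  "barometric_pressure_hpa",   # SDI-12 channel 1
    "sdi12_2":  "radar_level_m",             # SDI-12 channel 2 (over the stilling well)
    "analog_1": "pls_pressure_level_1_m",    # OTT PLS pressure level sensor 1
    "analog_2": "pls_pressure_level_2_m",    # OTT PLS pressure level sensor 2
}

def label_reading(channel: str, value: float) -> dict:
    """Attach a parameter name to a raw channel reading."""
    return {"parameter": CHANNEL_MAP.get(channel, "unassigned"), "value": value}

print(label_reading("sdi12_2", 1.73))
# -> {'parameter': 'radar_level_m', 'value': 1.73}
```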

Tide Gauge Hut at Vernadsky, Antarctica

The network uses the Geostationary Operational Environmental Satellite (GOES) system to transmit data. GOES is operated by the National Environmental Satellite, Data, and Information Service of the United States' National Oceanic and Atmospheric Administration (NOAA). One-minute averaged data is transmitted every 15 minutes, and the data is then made freely available on the IOC Sea Level Station Monitoring Facility website.
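
To illustrate that reporting scheme, the sketch below accumulates one-minute averages and packs them into a 15-minute message; the sample rate, message format and function names are assumptions for illustration rather than the SatLink3's actual firmware behaviour.

```python
# Illustrative sketch: accumulate one-minute averages and emit a batch
# every 15 minutes, mimicking the GOES reporting interval described above.
from statistics import mean

def one_minute_average(samples_1hz):
    """Average 60 one-second water-level samples into a single value (metres)."""
    return round(mean(samples_1hz), 3)

def build_goes_message(minute_averages):
    """Pack 15 one-minute averages into a single transmission payload."""
    assert len(minute_averages) == 15
    return ",".join(f"{v:.3f}" for v in minute_averages)

# Example: 15 minutes of synthetic 1 Hz level data
minutes = [one_minute_average([1.70 + 0.001 * s for s in range(60)]) for _ in range(15)]
print(build_goes_message(minutes))
```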

Summary
By upgrading to the SatLink3 logger/transmitter, the NOC is enhancing the resilience of the South Atlantic Tide Gauge Network. Jeff Pugh from the Marine Physics and Ocean Climate Group at the NOC, says: “The data from this network informs models that assist with projections relating to climate change, and others which provide advance warnings that can help protect life and property. Given the remote locations of the monitoring sites, it is vitally important, therefore, that the instruments are extremely reliable, operating on low power, with very little requirement for service or spares. By transmitting almost live data via satellite, these monitoring systems enable the models to deliver timely warnings; advance notice of tsunami, for example, can be of critical importance.”

@_Enviro_News @NOCnews #OTThydromet #Environment #PAuto

Greenhouse reduces Carbon Dioxide emissions.

17/04/2020
The Dutch horticultural sector aims to be climate-neutral by 2040. Scientists at Wageningen University & Research have therefore recently built a new demonstration greenhouse, 'Greenhouse 2030', in an effort to find ways to reduce CO2 emissions as well as eliminating the need for crop protection chemicals and optimizing the use of water and nutrients.

Greenhouses helping to reduce greenhouse gas emissions

Scientists at Wageningen University & Research (WUR) in the Netherlands have employed Vaisala carbon dioxide sensors in their research greenhouses for over a decade. Carbon dioxide is an extremely important measurement parameter in plant science, not just because plants need carbon dioxide to grow, but also because environmental emissions contribute to climate change, so enormous threats and opportunities surround this gas. As a world-renowned research organisation, the institute relies partly on the accuracy and reliability of its sensors for the value of its work, so it is important that its researchers do not compromise on sensor quality.

Wageningen has been one of the driving forces in research and technology development for greenhouse horticulture in the Netherlands. The institute’s expertise in the greenhouse cultivation of ornamental, fruit and vegetable crops is unique, and together with growers and technology partners, it has developed new cultivation systems, climate control systems, revolutionary greenhouse cover materials and other innovations. The application of these new technologies has made greenhouse horticulture in the Netherlands a world leader.

The Plant Research Institute operates over 100 greenhouse compartments at its Bleiswijk site, which means that researchers are able to generate a wide variety of environmental conditions. Typical environmental variables include light, water, growing medium, nutrients, (biological) pest/disease control, temperature, humidity and of course carbon dioxide (CO2); all of which have significant effects on crop yields.

The Dutch horticultural sector aims to be climate-neutral by 2040. The Wageningen researchers have therefore recently built a new demonstration greenhouse, 'Greenhouse 2030', for the cultivation of vegetables, fruit and flowers, in an effort to find ways to reduce CO2 emissions as well as eliminating the need for crop protection chemicals and optimizing the use of water and nutrients. Pests and diseases are preferably tackled biologically, and the energy-efficient greenhouse reuses water and nutrients as much as possible, leading to cleaner cultivation and improved yields.

Carbon Dioxide in Greenhouses
Carbon dioxide is a by-product of many processes in the oil, gas and petrochemical industries, but it is also required by plants to grow through photosynthesis, so Dutch greenhouse operators have collaborated with the country's industrial sector to utilise this byproduct and thereby contribute to the fight against climate change by lowering the country's net CO2 emissions. Globally, many greenhouse operators burn natural gas to generate CO2, but this also generates heat that may not be needed in the summer months, so the utilisation of an industrial byproduct is significantly preferable.

Carbon dioxide was first delivered to Dutch greenhouses in 2005 via a pipe network established by the company Organic Carbon Dioxide for Assimilation of Plants (OCAP). Commercial greenhouse operators pay for this CO2 supply, which is largely derived from a bioethanol plant. A key feature of the Institute's research is work to optimise the utilisation of CO2, along with other plant growth variables. For example, the Institute has developed a simulation tool for CO2 dosing, the 'CO2-viewer', which monitors and displays the effects of a grower's dosing strategy. For instance, it enables the evaluation of CO2 dosing around midday compared with dosing in the morning. The computational results of such an evaluation take all relevant greenhouse building characteristics and climate control settings into account.

Monitoring Carbon Dioxide

CO2 Probe

After around 10 years of operation, the institute is replacing around 150 of the older model probes with a newer model. The calibration of all probes is checked prior to the commencement of every project, utilising certified reference gases. It is important that calibration data is traceable, so each probe's calibration certificate is retained and subsequent calibration checks are documented. A portable CO2 monitor (a Vaisala GM70 with a GMP252 CO2 probe) is also used as a validation tool to check installed probes, even though further calibration is not necessary.

Currently, the Institute’s installed probes provide 4-20 mA signals which feed into ‘climate computers’ that are programmed to manage the greenhouses automatically. This system also raises alarms if CO2 levels approach dangerous levels for any reason.
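
As an illustration of how a climate computer might scale such a loop signal and apply an alarm (the 0–5,000 ppm span and the alarm level are assumed values for illustration, not the Institute's actual settings), a minimal sketch:

```python
# Illustrative sketch: convert a 4-20 mA probe signal to a CO2 concentration
# and flag an alarm. The measurement span and alarm threshold are assumptions.
SPAN_PPM = (0.0, 5000.0)      # assumed measurement range of the probe
ALARM_PPM = 5000.0            # assumed alarm threshold

def current_to_ppm(current_ma: float) -> float:
    """Linear scaling of a 4-20 mA loop signal onto the probe's CO2 range."""
    lo, hi = SPAN_PPM
    return lo + (current_ma - 4.0) / 16.0 * (hi - lo)

def check_alarm(current_ma: float) -> bool:
    """True if the scaled reading reaches the alarm level."""
    return current_to_ppm(current_ma) >= ALARM_PPM

print(current_to_ppm(7.2))    # about 1000 ppm with the assumed span
print(check_alarm(19.5))      # True
```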

CO2 Sensor Technology
Carbon dioxide absorbs light in the infrared (IR) region at a wavelength of 4.26 μm. This means that when IR radiation is passed through a gas containing CO2, part of the radiation is absorbed, and this absorbance can be measured. The Vaisala CARBOCAP® carbon dioxide sensor features an innovative micro-machined, electrically tunable Fabry-Perot Interferometer (FPI) filter. In addition to measuring CO2 absorption, the FPI filter enables a reference measurement at a wavelength where no absorption occurs. When taking the reference measurement, the FPI filter is electrically adjusted to switch its passband from the absorption wavelength to a non-absorbing wavelength. This reference measurement compensates for any changes in the light source intensity, as well as for contamination or dirt accumulation in the optical path. Consequently, the CARBOCAP® sensor is highly stable over time, and by incorporating both measurements in one sensor, this compact technology can be built into small probes, modules and transmitters.
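
The principle can be sketched as a simple calculation: the signal at the absorbing wavelength is normalised by the reference signal, so common-mode drift from lamp ageing or dirty optics cancels out. The calibration constants below are invented for illustration and do not represent Vaisala's actual algorithm.

```python
# Illustrative sketch of the dual-wavelength principle described above.
import math

def co2_from_intensities(i_abs, i_ref, i_abs_zero, i_ref_zero, k=1.0e-4, path_cm=5.0):
    """
    i_abs, i_ref           : detector intensities at the absorbing / reference wavelengths
    i_abs_zero, i_ref_zero : the same intensities measured in CO2-free gas (zero point)
    Returns an approximate CO2 concentration (ppm) via a Beer-Lambert style relation.
    """
    # Normalising by the reference band removes common-mode drift (lamp ageing, dirt).
    transmittance = (i_abs / i_ref) / (i_abs_zero / i_ref_zero)
    absorbance = -math.log(transmittance)
    return absorbance / (k * path_cm)     # k and path length are invented constants

# Prints an illustrative concentration of roughly 100 ppm for a ~5% absorption dip
print(round(co2_from_intensities(0.95, 1.00, 0.999, 1.00), 1))
```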

The CARBOCAP® technology means that the researchers don’t have to worry about calibration drift or sensor failure.

Carbon Dioxide Plant Science Research
Two projects are currently underway evaluating the effects of different CO2 levels on plant production: one is studying soft fruit and the other tomatoes. With CO2 playing such an important role in both plant growth and climate change, the value of accurate measurements of this gas continues to grow. Most of the greenhouses are now connected to the institute's Ethernet, and a wide variety of new sensors are continually being added to the monitoring network, providing an opportunity to utilise new 'smart' sensors.

Summary
The accuracy, stability and reliability of the CO2 sensors at Bleiswijk are clearly vitally important to the success of the Institute’s research, particularly because data from one greenhouse are often compared with data from others.

The CO2 supply has a cost; it is therefore important that this resource is monitored and supplied effectively so that plant production can be optimized.

Clearly, moves to lower the use of fossil fuels and develop more efficient energy management systems will help to reduce CO2 emissions from the greenhouse sector. However, the importance of CO2 utilization is set to grow, given the 2040 climate-neutral target and the world’s need to find new and better ways to capture CO2 emissions in ways that are both sustainable and economically viable.

#Horticulture #Environment @VaisalaGroup @_Enviro_News


Flood monitoring.

27/01/2020
Monitoring is an essential component of natural flooding management, helping to define appropriate measures, measure their success, keep stakeholders informed, identify mistakes, raise alarms when necessary, inform adaptive management and help guide future research.

Great Fen showing Holme Fen woods top left and new ponds and meres in April

Flooding is a natural process, but it endangers lives and causes heavy economic loss. Furthermore, flood risk is expected to increase with climate change and increased urbanisation, so a heavy responsibility lies with those that allocate funding and formulate flood management strategy. In the following article, Nigel Grimsley from OTT Hydromet explains how the success of such plans (both the design and implementation) depend on the accuracy and reliability of the monitoring data upon which they rely.

Climate projections for Britain suggest that rainfall will increase in winter and decrease in summer, and that individual rainfall events may increase in intensity, especially in winter. This points to an increased risk of flooding.

Emphasising the urgent need for action on flood risk, (British) Environment Agency chairwoman Emma Howard Boyd has said that on current trends global temperature could rise by between 2 °C and 4 °C by 2100, and some communities may even need to move because of the risk of floods. Launching a consultation on the agency's flood strategy, she said: "We can't win a war against water by building away climate change with infinitely high flood defences."

In response, Mike Childs, head of science at Friends of the Earth, said: “Smarter adaptation and resilience building – including natural flood management measures like tree-planting – is undeniably important but the focus must first and foremost be on slashing emissions so that we can avoid the worst consequences of climate chaos in the first place.”

Historically, floodplains have been exploited for agricultural and urban development, which has increased the exposure of people, property and other infrastructure to floods. Flood risk management therefore focused on measures to protect communities and industry in affected areas. However, flood risk is now addressed on a wider catchment scale so that initiatives in one part of a catchment do not have negative effects further downstream. This catchment based approach is embodied within the EU Floods Directive 2007/60/EC, and in recent years, those responsible for flood management have increasingly looked for solutions that employ techniques which work with natural hydrological and morphological processes, features and characteristics to manage the sources and pathways of flood waters. These techniques are known as natural flood management (NFM) and include the restoration, enhancement and alteration of natural features but exclude traditional flood defence engineering that effectively disrupts these natural processes.

NFM seeks to create efficiency and sustainability in the way the environment is managed by recognising that when land and water are managed together at the catchment scale it is possible to generate whole catchment improvements with multiple benefits.

Almost all NFM techniques aim to slow the flow of water and whilst closely connected, can be broadly categorised as infiltration, conveyance and storage.

Infiltration
Land use changes such as set-aside, switching arable to grassland or restricted hillside cropping, can improve infiltration and increase water retention. In addition, direct drilling, ‘no-till’ techniques and cross slope ploughing can have a similar effect. These land use techniques are designed to reduce the soil compaction which increases run-off. Livestock practices such as lower stocking rates and shorter grazing seasons can also help. Field drainage can be designed to increase storage and reduce impermeability, which is also aided by low ground pressure vehicles. The planting of shrubs and trees also helps infiltration and retention by generating a demand for soil moisture, so that soils have a greater capacity to absorb water. Plants also help to bind soil particles, resulting in less erosion – the cause of fertility loss and sedimentation in streams and rivers.

Conveyance
Ditches and moorland grips can be blocked to reduce conveyance, and river profiles can be restored to slow the flow. In the past, peats and bogs have been drained to increase cropping areas, but this damages peatlands and reduces their capacity to retain water and store carbon. The restoration of peatland therefore relies on techniques to restore moisture levels. Pumping and drainage regimes can be modified, and landowners can create strategically positioned hedges, shelter belts and buffer strips to reduce water conveyance.

Storage
Rivers can be reconnected with restored floodplains and river re-profiling, leaky dams, channel works and riparian improvements can all contribute to improved storage capability. In urban areas permeable surfaces and underground storage can be implemented, and washlands and retention ponds can be created in all areas. As mentioned above, the re-wetting of peatland and bogs helps to increase storage capacity.

Many of the effects of NFM might be achieved with the re-introduction of beavers, which build dams that reduce peak flows, create pools and saturate soil above their dams. The dams also help to remove pollutants such as phosphates. Beavers do not eat fish, instead preferring aquatic plants, grasses and shrubs during the summer and woody plants in winter. Beavers are now being introduced in a number of areas in trials to determine their value in the implementation of NFM. One of the key benefits offered by beavers is their ability to quickly repair and rebuild dams that are damaged during extreme weather. However, whilst the potential benefits of beavers are well known, several groups have expressed concern with the prospect of their widespread introduction. For example, farmers and landowners may find increased areas of waterlogged land due to blocked drainage channels. In addition, dams present a threat to migratory fish such as salmon and sea trout.

Beavers are native to Britain and used to be widespread, but they were hunted to extinction during the 17th century. However, other non-native species such as signal crayfish can have a detrimental effect on flood protection because they burrow into river banks causing erosion, bank collapse and sediment pollution. Signal crayfish are bigger, grow faster, reproduce more quickly and tolerate a wider range of conditions than the native white-clawed crayfish. Signal crayfish are also voracious predators, feeding on fish, frogs, invertebrates and plants, and as such can create significant negative ecological effects.

NFM benefits
NFM provides protection for smaller flood events, reduces peak flooding and delays the arrival of the flood peak downstream. However, it does not mitigate the risk from extreme flood events. Effective flood management strategy therefore tends to combine NFM with hard engineering measures. Nevertheless, NFM generally provides a broader spectrum of other benefits.

The creation of new woodlands and wetlands produces biodiverse habitats with greater flood storage capacity. They also enable more species to move between habitats. NFM measures that reduce soil erosion, run-off and sedimentation also help to improve water quality and thereby improve habitats. In particular, these measures reduce nutrient and sediment loading lower in the catchment: two issues which can have dramatic effects on water quality and amenity.

Land use and land management measures help to reduce the loss of topsoil and nutrients. This improves agricultural productivity and lowers the cost of fertilizers. Furthermore, a wide range of grants are available for NFM measures, such as the creation of green spaces and floodplains, to make them more financially attractive to farmers and landowners.

Many NFM measures help in the fight against climate change. For example, wetlands and woodlands are effective at storing carbon and removing carbon dioxide from the atmosphere. Measures that reduce surface run off and soil erosion, such as contour cultivation, can also reduce carbon loss from soil.

Monitoring NFM
Given the wide range of potential NFM benefits outlined above, the number and type of parameters to be monitored are likely to be equally diverse. Baseline data is essential if the impacts of implemented measures are to be assessed, but this may not always be achievable. For example, it may only be possible to collect one season of data prior to a five year project. However, it may be possible to secure baseline data from other parties. In all instances data should of course be accurate, reliable, relevant and comparable.

Monitoring data should be used to inform the design of NFMs. For example, a detailed understanding of the ecology, geomorphology, hydrology and meteorology of the entire catchment will help to ensure that the correct measures are chosen. These measures should be selected in partnership with all stakeholders, and ongoing monitoring should provide visibility of the effects of NFM measures. Typically stakeholders will include funders, project partners, local communities, landowners, regulators and local authorities.

Since NFM measures are designed to benefit an entire catchment, it is important that monitoring is also catchment-wide. However, this is likely to be a large area so there will be financial implications, particularly for work that is labour-intensive. Consequently, it will be necessary to prioritise monitoring tasks and to deploy remote, automatic technology wherever it is cost-effective.

OTT ecoN sensor with wiper

Clearly, key parameters such as rainfall, groundwater level, river level and surface water quality should be monitored continuously in multiple locations if the benefits of NFM are to be measured effectively. It is fortunate therefore that all of these measurements can be taken continuously 24/7 by instruments that can be left to monitor in remote locations without a requirement for frequent visits to calibrate, service or change power supplies. As a business OTT Hydromet has been focused on the development of this capability for many years, developing sensors that are sufficiently rugged to operate in potentially aggressive environments, data loggers with enormous capacity but with very low power requirement, and advanced communications technologies so that field data can be instantly viewed by all stakeholders.

Recent developments in data management have led to web-enabled solutions such as Hydromet Cloud which, via a website and app, delivers the backend infrastructure to receive, decode, process, display and store measurement data from nearly any remote hydromet monitoring station or sensor on a cloud-based data hosting platform. As a consequence, alarms can be raised automatically, which facilitates integration with hard engineering flood control measures. Hydromet Cloud also provides access to both current and historic measurement data, enabling stakeholders to view the status of an entire catchment on one screen.
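
The kind of alarm logic such a platform can apply to incoming readings might look like the sketch below; the station names, trigger levels and message format are assumptions for illustration, not Hydromet Cloud's actual API.

```python
# Illustrative sketch of threshold alarms on incoming river-level telemetry.
# Station names and trigger levels are hypothetical.
RIVER_LEVEL_ALARM_M = {"station_A": 2.4, "station_B": 1.8}   # assumed trigger levels

def process_reading(station: str, level_m: float, alerts: list) -> None:
    """Raise an alarm message if a reading exceeds the station's trigger level."""
    threshold = RIVER_LEVEL_ALARM_M.get(station)
    if threshold is not None and level_m >= threshold:
        alerts.append(f"ALARM: {station} level {level_m:.2f} m >= {threshold:.2f} m")

alerts = []
process_reading("station_A", 2.55, alerts)
print(alerts)   # -> ['ALARM: station_A level 2.55 m >= 2.40 m']
```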

Holme Fen – a monitoring lesson from the 1850s

Holme Fen post

Surrounded by prime agricultural land to the south of Peterborough (Cambridgeshire, GB), the fens originally contained many shallow lakes, of which Whittlesey Mere was the largest, covering around 750 hectares in the summer and around twice that in the winter. Fed by the River Nene, the mere was very shallow and was the last of the 'great meres' to be drained and thereby converted to cultivatable land.

Led by William Wells, a group of local landowners funded and arranged the drainage project, which involved the development of a newly invented steam powered centrifugal pump which was capable of raising over 100 tons of water per minute by 2 or 3 feet. A new main drain was constructed to take water to the Wash. Conscious of the likely shrinking effect of drainage on the peaty soil, Wells instigated the burial of a measurement post, which was anchored in the Oxford Clay bedrock and cut off at the soil surface. In 1851 the original timber post was replaced by a cast iron column which is believed to have come from the Crystal Palace in London.

By installing a measurement post, Wells demonstrated remarkable foresight. As the drainage proceeded, the ground level sank considerably; by 1.44 metres in the first 12 years, and by about 3 metres in the first 40 years. Today, around 4 metres of the post is showing above ground, recording the ground subsidence since 1852. The ground level at Holme Post is now 2.75 metres below sea level – the lowest land point in Great Britain.

Several complications have arisen as a result of the drainage. Firstly, there has been a huge impact on local ecology and biodiversity with the loss of a large area of wetland. Also, as the ground level subsided, it became less sustainable to pump water up into the main drain.

Holme Fen is now a National Nature Reserve, managed by Natural England, as is the nearby Woodwalton Fen. They are both part of the Great Fen Project, an exciting habitat restoration project, involving several partners, including the local Wildlife Trust, Natural England and the Environment Agency. At Woodwalton, the more frequent extreme weather events that occur because of climate change result in flooding that spills into the reserve. In the past, this was a good example of NFM as the reserve provided a buffer for excess floodwater. However, Great Fen Monitoring and Research Officer Henry Stanier says: “Floodwater increasingly contains high levels of nutrients and silt which can harm the reserve’s ecology, so a holistic, future-proof strategy for the area is necessary.”

Applauding the farsightedness of William Wells, Henry says: “As a conservationist I am often called in to set up monitoring after ecological recovery has begun, rather than during or even before harm has taken place. At the Wildlife Trust, we are therefore following the example provided by Wells, and have a network of monitoring wells in place so that we can monitor the effects of any future changes in land management.

“For example, we are setting up a grant funded project to identify the most appropriate crops for this area; now and in the future, and we are working with OTT to develop a monitoring strategy that will integrate well monitoring with the measurement of nutrients such as phosphate and nitrate in surface waters.”

Summary
Monitoring provides an opportunity to measure the effects of initiatives and mitigation measures. It also enables the identification of trends so that timely measures can be undertaken before challenges become problems, and problems become catastrophes.

Monitoring is an essential component of NFM, helping to define appropriate measures, measure their success, keep stakeholders informed, identify mistakes, raise alarms when necessary, inform adaptive management and help guide future research.

#Environment @OTTHydromet @EnvAgency @friends_earth


High frequency monitoring needed to protect UK rivers!

29/06/2018
Nigel Grimsley from OTT Hydrometry describes relatively new technologies that have overcome traditional barriers to the continuous monitoring of phosphate and nitrate.

The science behind nutrient pollution in rivers is still poorly understood despite the fact that nitrate and phosphate concentrations in Britain’s rivers are mostly unacceptable, although an element of uncertainty exists about what an acceptable level actually is. Key to improving our understanding of the sources and impacts of nutrient pollution is high-resolution monitoring across a broad spectrum of river types.

Background

Green Box Hydro Cycle

Phosphates and nitrates occur naturally in the environment, and are essential nutrients that support the growth of aquatic organisms. However, water resources are under constant pressure from both point and diffuse sources of nutrients. Under certain conditions, such as warm, sunny weather and slow moving water, elevated nutrient concentrations can promote the growth of nuisance phytoplankton, causing algal blooms (eutrophication). These blooms can dramatically affect aquatic ecology in a number of ways. High densities of algal biomass within the water column, or, in extreme cases, blankets of algae on the water surface, prevent light from reaching submerged plants. Also, some algae, and the bacteria that feed on decaying algae, produce toxins. In combination, these two effects can lower dissolved oxygen levels and potentially kill fish and other organisms. In consequence, aquatic ecology is damaged and the water becomes unsuitable for human recreation and more expensive to treat for drinking purposes.

In its State of the Environment report, February 2018, the British Environment Agency said: “Unacceptable levels of phosphorus in over half of English rivers, usually due to sewage effluent and pollution from farm land, chokes wildlife as algal blooms use up their oxygen. Groundwater quality is currently deteriorating. This vital source of drinking water is often heavily polluted with nitrates, mainly from agriculture.”

Good ecological status
The EU Water Framework Directive (WFD) requires Britain to achieve 'good status' for all water bodies (including rivers, streams, lakes, estuaries, coastal waters and groundwater) by 2015; however, only 36% of water bodies were classified as 'good' or better in 2012. Nutrient water quality standards are set by the Department for Environment, Food & Rural Affairs (DEFRA); phosphorus standards, for example, vary according to the alkalinity of the river and its height above mean sea level. Interestingly, the standards were initially set in 2009, but in 75% of rivers with clear ecological impacts of nutrient enrichment the existing standards produced phosphorus classifications of good or even high status, so the phosphorus standards were lowered.

Highlighting the need for a better understanding of the relationships between nutrients and ecological status, Dr Mike Bowes from the Centre for Ecology & Hydrology has published research, with others, in which the effects of varying soluble reactive phosphate (SRP) concentrations on the growth rate of periphyton (the mixture of algae and microbes that typically covers submerged surfaces) were determined in nine different rivers from around Britain. In all of these experiments, significantly increasing SRP concentrations in the river water for sustained periods (usually c. 9 days) did not increase periphyton growth rate or biomass. This indicates that in most rivers phosphorus concentrations are in excess, and therefore the process of eutrophication (typified by excessive algal blooms and loss of macrophytes – aquatic plants) is not necessarily caused by intermittent increases in SRP.

Clearly, more research is necessary to more fully understand the effects of nutrient enrichment, and the causes of algal blooms.

Upstream challenge
Headwater streams represent more than 70% of the streams and rivers in Britain; however, because of their number, location and the lack of regulatory requirement for continuous monitoring, they are rarely monitored for nutrient status. Traditional monitoring of upland streams has relied on either manual sampling or the collection of samples from automatic samplers. Research has shown that upland streams are generally less impaired by nutrient pollution than lowland rivers, but because of their size and limited dilution capacity they are more susceptible to nutrient impairment.

References
• Bowes, M. J., Gozzard, E., Johnson, A. C., Scarlett, P. M., Roberts, C., Read, D. S., et al. (2012a). Spatial and temporal changes in chlorophyll-a concentrations in the River Thames basin, UK: are phosphorus concentrations beginning to limit phytoplankton biomass? Sci. Total Environ. 426, 45–55. doi: 10.1016/j.scitotenv.2012.02.056
• Bowes, M. J., Ings, N. L., McCall, S. J., Warwick, A., Barrett, C., Wickham, H. D., et al. (2012b). Nutrient and light limitation of periphyton in the River Thames: implications for catchment management. Sci. Total Environ. 434, 201–212. doi: 10.1016/j.scitotenv.2011.09.082
• Dodds, W. K., Smith, V. H., and Lohman, K. (2002). Nitrogen and phosphorus relationships to benthic algal biomass in temperate streams. Can. J. Fish. Aquat Sci. 59, 865–874. doi: 10.1139/f02-063
• McCall, S. J., Bowes, M. J., Warnaars, T. A., Hale, M. S., Smith, J. T., Warwick, A., et al. (2014). Impacts of phosphorus and nitrogen enrichment on periphyton accrual in the River Rede, Northumberland, UK. Inland Waters 4, 121–132. doi: 10.5268/IW-4.2.692
• McCall, S. J., Hale, M. S., Smith, J. T., Read, D. S., and Bowes, M. J. (2017). Impacts of phosphorus concentration and light intensity on river periphyton biomass and community structure. Hydrobiologia 792, 315–330. doi: 10.1007/s10750-016-3067-1

Monitoring technology
Sampling for laboratory analysis can be a costly and time-consuming activity, particularly at upland streams in remote locations with difficult access. In addition, spot sampling reveals nutrient levels at a specific moment in time, and therefore risks missing concentration spikes. Continuous monitoring is therefore generally preferred, but in the past this has been difficult to achieve with the technology available because of its requirement for frequent re-calibration and mains power.

High resolution SRP monitoring has been made possible in almost any location with the launch by OTT Hydromet of the 'HydroCycle PO4', a battery-powered wet chemistry analyser for the continuous analysis of SRP. Typically, the HydroCycle PO4 is deployed directly into the river for monitoring purposes, but recent work by the Environment Agency has deployed it in a flow-through chamber for measuring extracted water.

The HydroCycle PO4 methodology is based on US EPA standard methods, employing pre-mixed, colour coded cartridges for simple reagent replacement in the field. Weighing less than 8kg fully loaded with reagents, it is quick and easy to deploy, even in remote locations. The instrument has an internal data logger with 1 GB capacity, and in combination with telemetry, it provides operators with near real-time access to monitoring data for SRP.

The quality of the instrument's data is underpinned by QA/QC processing in conjunction with an on-board NIST standard, delivering scientifically defensible results. Engineered to take measurements at high oxygen saturation, and with a large surface area filter for enhanced performance during sediment events, the instrument employs advanced fluidics that are resistant to the bubbles that can plague wet chemistry sensors.

Environment Agency application
The National Laboratory Service Instrumentation team (NLSI) provides support to all high resolution water quality monitoring activities undertaken across the Agency, underpinning the EA's statutory responsibilities such as the WFD, the Urban Waste Water Directive and Statutory Surface Water Monitoring Programmes. It also makes a significant contribution to partnership projects such as Demonstration Test Catchments and Catchment Sensitive Farming. Technical Lead Matt Loewenthal says: "We provide the Agency and commercial clients with monitoring systems and associated equipment to meet their precise needs. This includes, of course, nutrient monitoring, which is a major interest for everyone involved with water resources."

Matt's team has developed water quality monitoring systems that deliver high resolution remote monitoring with equipment that is quick and easy to deploy. There are two main options. The 'green box' is a fully instrumented cabinet that can be installed adjacent to a water resource, drawing water and passing it through a flow-through container with sensors for parameters such as temperature, dissolved oxygen, ammonium, turbidity, conductivity, pH and chlorophyll a. Each system is fitted with telemetry so that real-time data is made instantly available to users on the cloud.

Conscious of the need to better understand the role of P in rivers, Matt's team has integrated a HydroCycle PO4 into its monitoring systems as a development project.

Matt says: "It's currently the only system that can be integrated with all of our remote monitoring systems. Because it's portable, and runs on 12 volts, it has been relatively easy to integrate into our modular monitoring and telemetry systems.

“The HydroCycle PO4 measures SRP so if we need to monitor other forms of P, we will use an auto sampler or deploy a mains-powered monitor. However, monitoring SRP is important because this is the form of P that is most readily available to algae and plants.”

Explaining the advantages of high resolution P monitoring, Matt refers to a deployment on the River Dore. “The data shows background levels of 300 µg P/l, rising to 600 µg P/l following heavy rain, indicating high levels of P in run-off.”
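
A simple way to flag such rain-driven rises in a continuous record is to compare each reading against a rolling baseline, as in the sketch below; the window length and trigger factor are assumed values for illustration only.

```python
# Illustrative sketch: flag soluble reactive phosphorus (SRP) readings that rise
# well above a rolling baseline, as in the River Dore example above.
from statistics import median

def flag_srp_spikes(series_ug_per_l, window=24, factor=1.5):
    """Return indices where SRP exceeds `factor` times the median of the preceding window."""
    spikes = []
    for i in range(window, len(series_ug_per_l)):
        baseline = median(series_ug_per_l[i - window:i])
        if series_ug_per_l[i] > factor * baseline:
            spikes.append(i)
    return spikes

# 24 hourly readings near 300 ug P/l, then a rain-driven rise towards 600 ug P/l
series = [300.0] * 24 + [450.0, 600.0, 580.0]
print(flag_srp_spikes(series))   # -> [25, 26] with these assumed settings
```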

Nitrate
Similar to phosphates, excessive nitrate levels can have a significant impact on water quality. In addition, nitrates are highly mobile and can contaminate groundwater, with serious consequences for wells and drinking water treatment. Nitrate concentrations are therefore of major interest to the EA, but traditional monitoring technology has proved inadequate for long-term monitoring because of a frequent recalibration requirement. To address this need, which exists globally, OTT Hydromet developed the SUNA V2, which is an optical nitrate sensor, providing high levels of accuracy and precision in both freshwater and seawater.

The NLSI has evaluated the SUNA V2 in well water and Matt says: “It performed well – we took grab samples for laboratory analysis and the SUNA data matched the lab data perfectly. We are therefore excited about the opportunity this presents to measure nitrate continuously, because this will inform our understanding of nitrate pollution and its sources, as well as the relationship between groundwater and surface water.”

Summary
The new capability for high-resolution monitoring of nutrients such as phosphorus will enable improved understanding of its effects on ecological status, and in turn will inform decisions on what acceptable P concentrations will be for individual rivers. This is vitally important because the cost of removing P from wastewater can be high, so the requirements and discharge limits that are placed on industrial and wastewater companies need to be science based and supported by reliable data. Similarly, nitrate pollution from fertilizer runoff, industrial activities and wastewater discharge, has been difficult to monitor effectively in the past because of the technology limitations. So, as improved monitoring equipment is developed, it will be possible to better understand the sources and effects, and thereby implement effective prevention and mitigation strategies.

@OTTHydrometry @EnvAgency @CEHScienceNews #Water #Environment

Energy & Environment – 3 big predictions!

25/04/2012

Frost & Sullivan has released its three big predictions for 2012 for the global energy and environment market. Industrial convergence, smart technology and distributed generation will be the key topics in 2012 and beyond.

Based on a survey of several thousand companies conducted in December 2011, this research paper highlights areas of growth.  “Data and opinions of key stakeholders, combined with analysis and commentary from Frost & Sullivan industry experts, have been used to present key market highlights, hot growth topics, global and regional hot spots, areas of market convergence, and bold predictions for 2012,” explains John Raspin, Director and Partner at Frost & Sullivan.

Convergence and Value Chain Integration
Energy and environment players see the greatest convergence today from within their own sector, as well as from the industrial automation and ICT industries. The convergence between the energy and automotive markets is also highly significant; it relates primarily to the emergence of electric vehicles and e-mobility, which is driving innovation in batteries, energy storage, transmission and distribution infrastructure, battery charging and the integration of mobility into the smart home. Convergence opportunities also exist in the water sector, driven by innovation across the treatment technology, process control, and automation and instrumentation sectors.

Value chains in the renewable energy industry, which are less consolidated and stretch across hundreds of components, have presented and continue to present many opportunities. Wind power is the most attractive segment for European companies. Solar PV still has a positive growth outlook, but opportunities for European players are slowly vanishing as the global market becomes more dominated by Chinese players. Chemical and material companies have begun to take significant steps further down the value chain to acquire key technology and solution capabilities in the fast-growing and high-potential energy and environment markets.

Smart Technology
Smart technology is going to play a key role in the future development of the energy and environment sector with efficiency improvements at the centre of the market evolution.  Smart grids, buildings, homes, cities and water networks will all become a reality this decade, thus creating far-reaching market growth opportunities.

Distributed Generation
There is, and will continue to be, an increasing focus on the deployment of small-scale renewable energy close to the point of consumption. This will create opportunities for suppliers of micro-generation technologies, as well as requiring new strategies and business models from power generation companies and utilities. The growth of distributed generation, the legislation surrounding this market, and the integration of distributed generation into the grid are emerging trends, especially in the developed markets.

The key themes outlined above feature heavily in Frost & Sullivan’s energy and environment research programme for 2012.  A key driver for the research is the work Frost & Sullivan has been conducting around Mega Trends which are driving new and emerging market segments for key industry participants.

See the presentation!
Frost & Sullivan insight on the three big predictions for 2012 and beyond is available on SlideShare.


The best cooling solutions!

15/12/2011
The heat generated by datacenters is around ten times greater than it was 10 years ago.

A study by Frost & Sullivan’s Gautham Gnanajothi

Datacenter technology has reached a point of no return in recent times. The servers used in datacenters have evolved, shrinking in physical size while increasing in performance, but this has considerably increased their power consumption and heat densities. The heat generated by the servers in datacenters is now around ten times greater than it was 10 years ago and, as a result, traditional computer room air conditioning (CRAC) systems have become overloaded. New strategies and innovative cooling solutions must therefore be implemented to match the high-density equipment. The increase in rack-level power density has raised thermal management challenges over the past few years, and reliable datacenter operations are disrupted by the hot spots created by such high-density equipment.

Emerson's global data center (St Louis, MO, USA) uses the latest energy-efficiency technologies, precision cooling products, and efficiency strategies.

Some of the main datacenter challenges faced today are adaptability, scalability, availability, life cycle costs and maintenance. Flexibility and scalability are the two most important aspects any cooling solution must possess; these, combined with redundant cooling features, will deliver optimum performance. The two main datacenter cooling challenges are airflow and space, and both can be overcome with the use of innovative cooling solutions. Some of the cooling techniques used in datacenters are discussed below.

Aisle Containment

Aisle containment strategies have gained immense popularity among data center operators, and this trend is expected to continue. Using hot aisle and cold aisle containment, energy-efficient best practice in server rooms can be achieved. Whether hot aisle or cold aisle containment is used depends on the type of application. Most data centers have a standard hot aisle/cold aisle layout, and aisle containments are refinements of this layout, in which each successive aisle is designated either a hot aisle or a cold aisle. The backs of the server racks face the hot aisle, into which they exhaust hot air. In a cold aisle, the racks are aligned so that the equipment inlets face each other across the aisle. There is usually a raised floor system, known as the plenum, under which cool air from the CRAC or the computer room air handler (CRAH) flows to perforated floor tiles. These tiles are located in the cold aisles and channel the cool air into the server inlets at the front of the racks; the heated air is then exhausted via the hot aisle. With hot aisle/cold aisle containment, the cool air can be directed closer to the server inlets, thereby increasing energy efficiency.

Rows of server racks at the computer center at CERN in Switzerland. (Pic CERN)

However, there are a couple of challenges faced by the aisle containment approach. The first is "bypass air", which arises when cool air fails to enter the servers. The second is "recirculation", where heated exhaust air flows back into the cold aisle through empty space or over the top of the racks. These two conditions are known for creating hot spots in server rooms. Data center operators use barriers made of plastic sheet, cardboard and similar materials to seal the cold aisles so that hot air does not re-enter them.

High-density Supplemental Cooling
Data center densities have increased from 2 to 3 kW per rack to in excess of 30 kW per rack, so a different cooling approach is needed to meet high-density requirements. This is where supplemental cooling comes in. It uses two different approaches: rear door heat exchangers and overhead heat exchangers. Rear door heat exchangers come to the rescue of a struggling CRAC by conditioning the hot air and returning it to the room at a colder temperature; they require a chilled water source and a connection to a remote chiller unit. Overhead heat exchangers, as the name suggests, are suspended above the server rows. They complement hot aisle/cold aisle containment by drawing hot air from the hot aisle exhaust, conditioning it, and sending cool air to the cold aisles. Supplemental cooling takes pressure off the CRAC unit.

Liquid Cooling

The Aurora supercomputer from Eurotech, which uses liquid cooling.

With the rise in the number of applications and services that require high-density configurations, liquid cooling is generating a lot of interest among data center operators. As the name suggests, it brings the liquid (either chilled water or refrigerant) closer to the heat source for more effective cooling. In contrast to a CRAC unit, which is isolated in a corner of the room, liquid cooling solutions are embedded in rows of server racks, suspended from the ceiling, or installed in close relationship with one or more server racks. There are two types of liquid cooling, in-row and in-rack; both require chilled water (or refrigerant) supply and return piping, which is run either overhead or beneath the floor to each individual cooler.

Closed Coupled Cooling
Another remedy for high-density computing is closed coupled cooling, where the distant air conditioner is moved closer to the computing load. The latest generation of cooling products can be described by this term. Although the solutions vary in configuration and capacity, their approach is the same: they bring the heat transfer as close as possible to the source, the server rack. By doing so, the inlet air is delivered more precisely and the exhaust air is captured efficiently. There are two configurations in which it operates, closed loop and open loop. In the open loop configuration, the air stream interacts with the room environment to an extent, whereas the closed loop configuration is completely independent of the room in which it is installed; it creates a microclimate within the enclosure because the rack and the heat exchanger work exclusively with one another.

A present-day high-density data center has thousands of racks, each with multiple computing units. Each computing unit contains multiple microprocessors, each dissipating about 250 W of power, so the heat dissipation from a rack containing such computing units exceeds 10 kW. Assuming that a present-day data center has 1,000 racks and more than 30,000 square feet of space, it would require around 10 MW of power for the computing infrastructure alone. Future datacenters, which will be even bigger with more servers, will have greater power requirements and will therefore need more energy-efficient and innovative cooling solutions.
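
The arithmetic behind those figures can be laid out as a short worked example; the processors-per-rack figure is an assumption chosen to match the roughly 10 kW rack load quoted above.

```python
# Worked version of the arithmetic above: processors dissipating ~250 W each
# lead to ~10 kW per rack, and a 1,000-rack facility to roughly 10 MW of IT load.
PROCESSOR_W = 250
PROCESSORS_PER_RACK = 40          # assumption: gives 10 kW per rack
RACKS = 1_000

rack_kw = PROCESSOR_W * PROCESSORS_PER_RACK / 1_000
facility_mw = rack_kw * RACKS / 1_000
print(f"{rack_kw:.0f} kW per rack, {facility_mw:.0f} MW for {RACKS:,} racks")
# -> 10 kW per rack, 10 MW for 1,000 racks
```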

Conclusion
There are a number of cooling solutions available in the marketplace; however, no single cooling solution is suitable for all kinds of data center applications. The choice often depends on factors such as room layout, installation densities and geographic location. On the whole, when the different cooling solutions are compared, liquid cooling is proving itself an efficient and effective solution for high-density data centers because it brings the cooling liquid closer to the heat source. This type of cooling solution is gaining popularity among data center operators because the units are embedded in rows of server racks and do not take up floor space; they are also retrofit-friendly, which means that the data center can stay operational as the units are brought online. Data centers would benefit from using liquid cooling solutions for their high-density servers, and this would be the best way forward.
