High frequency monitoring needed to protect UK rivers!

29/06/2018
Nigel Grimsley from OTT Hydrometry describes relatively new technologies that have overcome traditional barriers to the continuous monitoring of phosphate and nitrate.

The science behind nutrient pollution in rivers is still poorly understood despite the fact that nitrate and phosphate concentrations in Britain’s rivers are mostly unacceptable, although an element of uncertainty exists about what an acceptable level actually is. Key to improving our understanding of the sources and impacts of nutrient pollution is high-resolution monitoring across a broad spectrum of river types.

Background

Green Box Hydro Cycle

Phosphates and nitrates occur naturally in the environment, and are essential nutrients that support the growth of aquatic organisms. However, water resources are under constant pressure from both point and diffuse sources of nutrients. Under certain conditions, such as warm, sunny weather and slow-moving water, elevated nutrient concentrations can promote the growth of nuisance phytoplankton, causing algal blooms (eutrophication). These blooms can dramatically affect aquatic ecology in a number of ways. High densities of algal biomass within the water column, or, in extreme cases, blankets of algae on the water surface, prevent light from reaching submerged plants. Also, some algae, and the bacteria that feed on decaying algae, produce toxins. In combination, these two effects can lower dissolved oxygen levels and potentially kill fish and other organisms. In consequence, aquatic ecology is damaged and the water becomes unsuitable for human recreation and more expensive to treat for drinking purposes.

In its State of the Environment report, February 2018, the British Environment Agency said: “Unacceptable levels of phosphorus in over half of English rivers, usually due to sewage effluent and pollution from farm land, chokes wildlife as algal blooms use up their oxygen. Groundwater quality is currently deteriorating. This vital source of drinking water is often heavily polluted with nitrates, mainly from agriculture.”

Good ecological status
The EU Water Framework Directive (WFD) requires Britain to achieve ‘good status’ for all water bodies (including rivers, streams, lakes, estuaries, coastal waters and groundwater) by 2015. However, only 36% of water bodies were classified as ‘good’ or better in 2012. Nutrient water quality standards are set by the Department for Environment, Food & Rural Affairs (DEFRA); phosphorus standards, for example, vary according to the alkalinity and the height above mean sea level of the river. Interestingly, the standards were initially set in 2009, but in 75% of rivers with clear ecological impacts of nutrient enrichment, the existing standards produced phosphorus classifications of good or even high status, so the phosphorus standards were subsequently lowered.

Highlighting the need for a better understanding of the relationships between nutrients and ecological status, Dr Mike Bowes from the Centre for Ecology & Hydrology has published research, with others, in which the effects of varying soluble reactive phosphate (SRP) concentrations on the growth rate of periphyton (the mixture of algae and microbes that typically covers submerged surfaces) were determined in nine different rivers around Britain. In all of these experiments, significantly increasing SRP concentrations in the river water for sustained periods (usually c. 9 days) did not increase periphyton growth rate or biomass. This indicates that in most rivers phosphorus concentrations are already in excess, and therefore the process of eutrophication (typified by excessive algal blooms and loss of macrophytes, i.e. aquatic plants) is not necessarily caused by intermittent increases in SRP.

Clearly, more research is necessary to more fully understand the effects of nutrient enrichment, and the causes of algal blooms.

Upstream challenge
Headwater streams represent more than 70% of the streams and rivers in Britain; however, because of their number, their location and the lack of any regulatory requirement for continuous monitoring, they are rarely monitored for nutrient status. Traditional monitoring of upland streams has relied on either manual sampling or the collection of samples from automatic samplers. Research has shown that upland streams are generally less impaired by nutrient pollution than lowland rivers, but because of their small size and limited dilution capacity they are more susceptible to nutrient impairment.

References
• Bowes, M. J., Gozzard, E., Johnson, A. C., Scarlett, P. M., Roberts, C., Read, D. S., et al. (2012a). Spatial and temporal changes in chlorophyll-a concentrations in the River Thames basin, UK: are phosphorus concentrations beginning to limit phytoplankton biomass? Sci. Total Environ. 426, 45–55. doi: 10.1016/j.scitotenv.2012.02.056
• Bowes, M. J., Ings, N. L., McCall, S. J., Warwick, A., Barrett, C., Wickham, H. D., et al. (2012b). Nutrient and light limitation of periphyton in the River Thames: implications for catchment management. Sci. Total Environ. 434, 201–212. doi: 10.1016/j.scitotenv.2011.09.082
• Dodds, W. K., Smith, V. H., and Lohman, K. (2002). Nitrogen and phosphorus relationships to benthic algal biomass in temperate streams. Can. J. Fish. Aquat. Sci. 59, 865–874. doi: 10.1139/f02-063
• McCall, S. J., Bowes, M. J., Warnaars, T. A., Hale, M. S., Smith, J. T., Warwick, A., et al. (2014). Impacts of phosphorus and nitrogen enrichment on periphyton accrual in the River Rede, Northumberland, UK. Inland Waters 4, 121–132. doi: 10.5268/IW-4.2.692
• McCall, S. J., Hale, M. S., Smith, J. T., Read, D. S., and Bowes, M. J. (2017). Impacts of phosphorus concentration and light intensity on river periphyton biomass and community structure. Hydrobiologia 792, 315–330. doi: 10.1007/s10750-016-3067-1

Monitoring technology
Sampling for laboratory analysis can be a costly and time-consuming activity, particularly at upland streams in remote locations with difficult access. In addition, spot sampling reveals nutrient levels at a specific moment in time, and therefore risks missing concentration spikes. Continuous monitoring is therefore generally preferred, but in the past this has been difficult to achieve with the technology available because of its requirement for frequent re-calibration and mains power.
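As a simple illustration of why spot sampling can miss short-lived nutrient peaks, the sketch below compares a once-daily grab sample with a continuous 30-minute record on a purely synthetic SRP time series; all of the numbers are hypothetical and are not taken from any real monitoring data.

```python
# Illustrative sketch only: synthetic data showing how a daily spot sample
# can miss a short-lived phosphate spike that continuous monitoring captures.
# All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

minutes_per_sample = 30
days = 7
n = days * 24 * 60 // minutes_per_sample          # continuous record, one value per 30 min

baseline = 100.0                                   # background SRP, ug P/l (hypothetical)
srp = baseline + rng.normal(0.0, 5.0, n)           # noisy baseline

# Add a 6-hour run-off spike on day 3, peaking at roughly 3x background
spike_start = 3 * 48 + 20                          # index into the 30-minute series
spike_len = 12                                     # 12 x 30 min = 6 hours
srp[spike_start:spike_start + spike_len] += 200.0 * np.hanning(spike_len)

# A daily grab sample taken at 09:00 each morning
grab_indices = [d * 48 + 18 for d in range(days)]  # 18 x 30 min = 09:00
grab_samples = srp[grab_indices]

print(f"Max seen by continuous record : {srp.max():.0f} ug P/l")
print(f"Max seen by daily grab samples: {grab_samples.max():.0f} ug P/l")
```

In this synthetic example the daily grab sample sees only the background concentration, while the continuous record captures the short run-off spike; this is the gap that continuous monitors are intended to close.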

High resolution SRP monitoring has been made possible in almost any location with the launch by OTT Hydromet of the ‘HydroCycle PO4’, a battery-powered wet chemistry analyser for the continuous analysis of SRP. Typically, the HydroCycle PO4 is deployed directly into the river for monitoring purposes, but in recent work the Environment Agency has deployed it in a flow-through chamber for measuring extracted water.

The HydroCycle PO4 methodology is based on US EPA standard methods, employing pre-mixed, colour-coded cartridges for simple reagent replacement in the field. Weighing less than 8 kg fully loaded with reagents, it is quick and easy to deploy, even in remote locations. The instrument has an internal data logger with 1 GB capacity and, in combination with telemetry, provides operators with near real-time access to SRP monitoring data.
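The article does not state the record size or measurement interval used, so the back-of-envelope sketch below uses assumed figures purely to illustrate that a 1 GB logger is unlikely to be the limiting factor for long deployments; for a wet chemistry analyser, reagent and battery life are more likely constraints.

```python
# Back-of-envelope sketch: how long might a 1 GB on-board logger last?
# Record size and logging interval are assumptions for illustration only;
# they are not specifications of the HydroCycle PO4.
logger_capacity_bytes = 1 * 1024**3      # 1 GB
bytes_per_record = 200                   # assumed: timestamp + SRP value + QA/QC fields
records_per_hour = 2                     # assumed 30-minute measurement interval

records_capacity = logger_capacity_bytes / bytes_per_record
hours = records_capacity / records_per_hour
print(f"Approximate logging endurance: {hours / 24 / 365:.0f} years")
```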

The quality of the instrument’s data is underpinned by QA/QC processing in conjunction with an on-board NIST standard, delivering scientifically defensible results. Engineered to take measurements at high oxygen saturation, and with a large surface area filter for enhanced performance during sediment events, the instrument employs advanced fluidics that are resistant to the bubbles that can plague wet chemistry sensors.

Environment Agency application
The National Laboratory Service Instrumentation team (NLSI) provides support to all high resolution water quality monitoring activities undertaken across the Agency, underpinning the EA’s statutory responsibilities such as the WFD, the Urban Waste Water Directive and Statutory Surface Water Monitoring Programmes. It also makes a significant contribution to partnership projects such as Demonstration Test Catchments and Catchment Sensitive Farming. Technical Lead Matt Loewenthal says: “We provide the Agency and commercial clients with monitoring systems and associated equipment to meet their precise needs. This includes, of course, nutrient monitoring, which is a major interest for everyone involved with water resources.”

Matt’s team has developed water quality monitoring systems that deliver high resolution remote monitoring with equipment that is quick and easy to deploy. There are two main options. The ‘green box’ is a fully instrumented cabinet that can be installed adjacent to a water resource, drawing water and passing it through a flow-through container with sensors for parameters such as temperature, dissolved oxygen, ammonium, turbidity, conductivity, pH and chlorophyll a. Each system is fitted with telemetry so that real-time data is made instantly available to users on the cloud.
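The data format and telemetry protocol used by the ‘green box’ are not described in the article; the sketch below simply illustrates the general pattern of packaging a multi-parameter reading and pushing it to a cloud endpoint. The station ID, endpoint URL and field names are hypothetical and are not the Agency’s or OTT’s actual API.

```python
# Illustrative sketch of the general pattern only: a multi-parameter reading
# packaged as JSON and pushed to a cloud endpoint. The station ID, endpoint
# URL and field names are hypothetical, not the EA's or OTT's actual API.
import json
from datetime import datetime, timezone
from urllib import request

reading = {
    "station": "green-box-01",                      # hypothetical station ID
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "temperature_c": 14.2,
    "dissolved_oxygen_mg_l": 9.1,
    "ammonium_mg_l": 0.05,
    "turbidity_ntu": 3.8,
    "conductivity_us_cm": 450,
    "ph": 7.6,
    "chlorophyll_a_ug_l": 2.3,
}

req = request.Request(
    "https://telemetry.example.org/api/readings",    # placeholder endpoint
    data=json.dumps(reading).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# request.urlopen(req)  # not executed here: the endpoint above is a placeholder
```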

Conscious of the need to better understand the role of P in rivers, Matt’s team has integrated a HydroCycle PO4 into its monitoring systems as a development project.
Matt says: “It’s currently the only system that can be integrated with all of our remote monitoring systems. Because it’s portable, and runs on 12 volts, it has been relatively easy to integrate into our modular monitoring and telemetry systems.

“The HydroCycle PO4 measures SRP so if we need to monitor other forms of P, we will use an auto sampler or deploy a mains-powered monitor. However, monitoring SRP is important because this is the form of P that is most readily available to algae and plants.”

Explaining the advantages of high resolution P monitoring, Matt refers to a deployment on the River Dore. “The data shows background levels of 300 µg P/l, rising to 600 µg P/l following heavy rain, indicating high levels of P in run-off.”

Nitrate
Similar to phosphates, excessive nitrate levels can have a significant impact on water quality. In addition, nitrates are highly mobile and can contaminate groundwater, with serious consequences for wells and drinking water treatment. Nitrate concentrations are therefore of major interest to the EA, but traditional monitoring technology has proved inadequate for long-term monitoring because of a frequent recalibration requirement. To address this need, which exists globally, OTT Hydromet developed the SUNA V2, an optical nitrate sensor that provides high levels of accuracy and precision in both freshwater and seawater.

The NLSI has evaluated the SUNA V2 in well water and Matt says: “It performed well – we took grab samples for laboratory analysis and the SUNA data matched the lab data perfectly. We are therefore excited about the opportunity this presents to measure nitrate continuously, because this will inform our understanding of nitrate pollution and its sources, as well as the relationship between groundwater and surface water.”

Summary
The new capability for high-resolution monitoring of nutrients such as phosphorus will enable improved understanding of their effects on ecological status, which in turn will inform decisions on what acceptable P concentrations should be for individual rivers. This is vitally important because the cost of removing P from wastewater can be high, so the requirements and discharge limits that are placed on industrial and wastewater companies need to be science-based and supported by reliable data. Similarly, nitrate pollution from fertilizer runoff, industrial activities and wastewater discharge has been difficult to monitor effectively in the past because of technological limitations. As improved monitoring equipment is developed, it will be possible to better understand the sources and effects of nutrient pollution, and thereby implement effective prevention and mitigation strategies.

@OTTHydrometry @EnvAgency @CEHScienceNews #Water #Environment

Energy & Environment – 3 big predictions!

25/04/2012

Frost & Sullivan has released its three big predictions for 2012 for the global energy and environment market. Industrial convergence, smart technology and distributed generation will be the key topics in 2012 and beyond.

Based on a survey of several thousand companies conducted in December 2011, this research paper highlights areas of growth.  “Data and opinions of key stakeholders, combined with analysis and commentary from Frost & Sullivan industry experts, have been used to present key market highlights, hot growth topics, global and regional hot spots, areas of market convergence, and bold predictions for 2012,” explains John Raspin, Director and Partner at Frost & Sullivan.

Convergence and Value Chain Integration
Energy and environment players see the greatest convergence today from within their own sector as well as from the industrial automation and ICT industries. The convergence between the energy and automotive markets is also highly significant; it relates primarily to the emergence of electric vehicles and e-mobility, which is driving innovation in batteries, energy storage, transmission and distribution infrastructure, battery charging, and the integration of mobility into the smart home. Convergence opportunities also exist in the water sector, driven by innovation across the treatment technology, process control, and automation & instrumentation sectors.

Value chains in the renewable energy industry, which are less consolidated and stretch across hundreds of components, have presented and continue to present many opportunities. Wind power is the most attractive segment for European companies. Solar PV still has a positive growth outlook, but opportunities for European players are slowly vanishing as the global market becomes more dominated by Chinese players. Chemical and material companies have begun to take significant steps further down the value chain to acquire key technology and solution capabilities in the fast-growing and high-potential energy and environment markets.

Smart Technology
Smart technology is going to play a key role in the future development of the energy and environment sector with efficiency improvements at the centre of the market evolution.  Smart grids, buildings, homes, cities and water networks will all become a reality this decade, thus creating far-reaching market growth opportunities.

Distributed Generation
There is, and will be, an increasing focus on the deployment of small-scale renewable energy close to the point of consumption. This will create opportunities for suppliers of micro-generation technologies, as well as requiring new strategies and business models from power generation companies and utilities. The growth of distributed generation, the legislation surrounding this market, and the integration of distributed generation into the grid are emerging trends, especially in developed markets.

The key themes outlined above feature heavily in Frost & Sullivan’s energy and environment research programme for 2012.  A key driver for the research is the work Frost & Sullivan has been conducting around Mega Trends which are driving new and emerging market segments for key industry participants.

See the presentation!
Frost & Sullivan insight on the three big predictions for 2012 and beyond is available on SlideShare.


The best cooling solutions!

15/12/2011
The heat generated by datacenters is ten times greater than it was around 10 years ago

A study by Frost & Sullivan’s Gautham Gnanajothi

Datacenter technology has reached a point of no return in recent times. The servers used in datacenters have evolved: they have reduced in physical size but increased in performance. The trouble is that this has considerably increased their power consumption and heat densities. Thus, the heat generated by the servers in datacenters is currently ten times greater than it was around 10 years ago; as a result, traditional computer room air conditioning (CRAC) systems have become overloaded. Hence, new strategies and innovative cooling solutions must be implemented to match the high-density equipment. The increase in rack-level power density has given rise to thermal management challenges over the past few years, and reliable datacenter operations are disrupted by the hot spots created by such high-density equipment.

Emerson's global data center (St. Louis, MO, US) uses the latest energy-efficiency technologies, precision cooling products, and efficiency strategies.

Some of the main datacenter challenges faced in the current scenario are adaptability, scalability, availability, life cycle costs, and maintenance. Flexibility and scalability are the two most important aspects any cooling solution must possess; these, combined with redundant cooling features, will deliver optimum performance. The two main datacenter cooling challenges relate to airflow and space. These challenges can be overcome with the use of innovative cooling solutions, some of which are discussed below.

Aisle Containment

Aisle containment strategies have gained immense popularity among data center operators in the past, and this trend is expected to continue. With the use of hot aisle and cold aisle containment, energy-efficient best practice in server rooms can be achieved. Whether hot aisle or cold aisle containment is used depends on the type of application. Most data centers have a standard hot aisle/cold aisle layout, and aisle containments are refinements of this layout. In these layouts, each successive aisle is designated either a hot aisle or a cold aisle. In the hot aisle, the banks of server racks exhaust hot air. In a cold aisle, the server racks are aligned so that the equipment inlets face each other from opposite sides of the aisle. There is usually a raised floor system, known as the “plenum”, under which the cool air from the CRAC or computer room air handler (CRAH) flows to perforated floor tiles. These floor tiles are located in the cold aisles and direct the cool air into the server inlets at the front of the racks; the heated air is then exhausted via the hot aisle. With hot aisle/cold aisle containment, the cool air can be directed closer to the server inlets, thereby increasing energy efficiency.

Rows of server racks at the computer center at CERN in Switzerland. (Pic: CERN)

However, there are a couple of challenges faced by the aisle containment solution. The first is “bypass air”, which arises when the cool air fails to enter the servers. The other is “recirculation”, where the heated exhaust air flows back into the cold aisle through empty space or over the top of the racks. These two conditions are known for creating hot spots in server rooms. Data center operators use sheets made of plastic, cardboard, and so on to make barriers for the cold aisles so that the hot air does not re-enter the cold aisle.

High-density Supplemental Cooling
Data center densities have increased from 2 to 3 kW per rack to in excess of 30 kW per rack. A different cooling approach needs to be implemented to meet these high-density requirements, and this is where supplemental cooling comes into play. It uses two different approaches: “rear door heat exchangers” and “overhead heat exchangers”. Rear door heat exchangers come to the rescue of the struggling CRAC by conditioning the hot air and returning it to the room at a colder temperature. They require a chilled water source and a connection to a remote chiller unit. The overhead heat exchangers, as the name suggests, are suspended above the server rows. They complement hot aisle/cold aisle containment by drawing hot air from the hot aisle exhaust, conditioning it, and sending cool air to the cold aisles. Supplemental cooling takes pressure off the CRAC unit.
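As a rough indication of what removing 30 kW per rack implies for a water-cooled rear door heat exchanger, the sketch below applies the basic heat balance Q = m_dot x c_p x dT; the 6 K chilled-water temperature rise is an assumption used only for illustration, not a product specification.

```python
# Rough sizing sketch: chilled-water flow needed for a rear-door heat exchanger
# on a 30 kW rack, using Q = m_dot * c_p * dT. The 6 K water temperature rise
# is an assumption for illustration, not a product specification.
rack_heat_load_kw = 30.0          # high-density rack figure from the article
cp_water_kj_per_kg_k = 4.186      # specific heat of water
delta_t_k = 6.0                   # assumed chilled-water temperature rise

mass_flow_kg_s = rack_heat_load_kw / (cp_water_kj_per_kg_k * delta_t_k)
volume_flow_m3_h = mass_flow_kg_s * 3600 / 1000   # approx. 1 kg per litre of water

print(f"Required flow: {mass_flow_kg_s:.2f} kg/s "
      f"(~{volume_flow_m3_h:.1f} m^3/h) per 30 kW rack")
```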

Liquid Cooling

The Aurora supercomputer from Eurotech, which uses liquid cooling.

With the rise in the number of applications and services that require high-density configurations, liquid cooling is generating a lot of interest among data center operators. As the name suggests, it brings the liquid (either chilled water or refrigerant) closer to the heat source for more effective cooling. In contrast to a CRAC unit, which is isolated in a corner of the room, liquid cooling solutions are embedded in rows of server racks, suspended from the ceiling, or installed in close proximity to one or more server racks. There are two types of liquid cooling, “in-row liquid cooling” and “in-rack liquid cooling”; both require chilled water (or refrigerant) supply and return piping, which is run either overhead or beneath the floor to each individual cooler.

Closed Coupled Cooling
Another remedy for high-density computing is closed coupled cooling, in which the distant air conditioner is moved closer to the computing load. The latest generation of cooling products can be described by the term closed coupled cooling. Although these solutions vary in terms of configuration and capacity, their approach is the same: bring the heat transfer as close as possible to the source, which is the server rack. By doing so, the inlet air is delivered more precisely and the exhaust air is captured efficiently. There are two configurations in which it operates, “closed loop” and “open loop”. In the open loop configuration, the air stream will tend to interact with the room environment to an extent. The closed loop configuration, however, is completely independent of the room in which it is installed; it creates a micro-climate within the enclosure because the rack and the heat exchanger work exclusively with one another.

The present high-density computing data center has thousands of racks, each with multiple computing units. The computing units include multiple microprocessors, each dissipating about 250 W of power, and the heat dissipation from racks containing such computing units exceeds 10 kW. Assuming a present-day data center with 1,000 racks and more than 30,000 square feet, this would require around 10 MW of power for the computing infrastructure. Future datacenters, which will be even bigger with more servers, will have greater power requirements and will hence require even more energy-efficient and innovative cooling solutions.
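The sketch below reproduces the arithmetic in the paragraph above; the per-rack processor count is an assumption chosen only to reach roughly 10 kW per rack.

```python
# Worked version of the arithmetic in the paragraph above: 1,000 racks,
# each dissipating a little over 10 kW from ~40 microprocessors at 250 W.
# The per-rack processor count is an assumption used only to reach ~10 kW.
watts_per_processor = 250
processors_per_rack = 40          # assumed, to give ~10 kW per rack
racks = 1_000

rack_heat_kw = watts_per_processor * processors_per_rack / 1000
site_load_mw = rack_heat_kw * racks / 1000

print(f"Heat per rack : {rack_heat_kw:.0f} kW")
print(f"Site IT load  : {site_load_mw:.0f} MW for {racks:,} racks")
```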

Conclusion
There are a number of cooling solutions available in the marketplace; however, there is not one particular cooling solution that is suitable for all kinds of data center applications. The choice often depends on many factors, such as room layout, installation densities and geographic location. On the whole, when the different cooling solutions are compared, liquid cooling is proving itself an efficient and effective solution for high-density data centers because it brings the cooling liquid closer to the heat source. This type of cooling solution is gaining popularity among data center operators because the units are embedded in rows of server racks and do not take up floor space. They are also retrofit-friendly, which means that the data center can stay operational as the units are brought online. Data centers would benefit from using liquid cooling solutions for their high-density servers, and this would be the best way forward.
