It all began with the War of the Currents…

24/01/2020

Today, people greatly appreciate having electrical energy available at the flip of a switch, seemingly at any time and for any occasion. But where does electricity actually come from? The answer most people would give you is: “from the wall socket, of course”. So does this automatically settle the question of security of supply? More on this later.

If we compare the history of electric current with the 75 years of the history of Camille Bauer Metrawatt AG, it is easy to see how they were interlinked at certain times in the course of their development. Why is that?

It all began with the War of the Currents – an economic dispute about a technical standard

It was around 1890 when the so-called War of the Currents started in the USA. At that time, the question was whether the direct current favoured by Thomas Alva Edison (1847-1931) or the alternating current promoted by Nikola Tesla (1856-1943) and financially supported by George Westinghouse (1846-1914) was the more suitable technology for supplying the United States of America with electrical energy over large areas and constructing power grids. Despite Edison General Electric’s market strength at that time (the company was renamed General Electric in 1892), it soon became clear that the alternating voltage championed by Nikola Tesla was rapidly gaining the upper hand, not least because its approximately 25% lower transmission losses weighed unquestionably in its favour. Soon afterward came the breakthrough for alternating voltage as the means of transmitting electrical energy. Initially, the main target application was electric lighting, which had been spurred on by Edison’s invention of the incandescent lamp. The reasons for this were logical: Westinghouse was initially a lighting manufacturing company and wanted to secure as great a market share as possible.

As developments continued, it is no surprise that already by 1891, in Germany for example, the first long-distance transmission of electrical energy was put into operation, over a distance of more than 170 km from Lauffen am Neckar to Frankfurt am Main. It was a technological breakthrough using three-phase current technology. However, this was by no means the end of the story for direct current. Not least because of digitalization, electromobility, decentralized energy supplies, etc., DC voltage has experienced a full-blown renaissance and is now treated almost as a brand-new topic.

The Camille Bauer story.
The foundation of the Camille Bauer company dates back to 1900, immediately after the War of the Currents just described, at a time when electricity was rapidly gaining in importance. At the turn of the century, the Camille Bauer company, named after its founder Camille Bauer-Judlin, began importing measuring instruments for the trendy new phenomenon called “electricity” into Switzerland for sale to the local market. Some years later, in 1906, Dr. Siegfried Guggenheimer (1875-1938), formerly a research scientist for Wilhelm Conrad Röntgen (1845-1923), who in 1901 had become the first winner of the Nobel Prize in Physics, founded a start-up company in Nuremberg, Germany, trading under his own name. The company was engaged in the production and sale of electrical measuring instruments. However, because Dr. Guggenheimer was of Jewish descent, pressure from the Nazis forced him to rename the company in 1933, creating Metrawatt AG.

Four technological segments.

In 1919, a man by the name of Paul Gossen entered the picture. He was so dissatisfied with his employment with Dr. Guggenheimer that he founded his own company in Erlangen, near Nuremberg, and for decades the two rivals were continuously in fierce competition with one another. In 1944, towards the end of the Second World War, Camille Bauer could see that its importing business had virtually come to a standstill. All the factories of its suppliers, which were mainly in Germany (for example Hartmann & Braun, Voigt & Haeffner, Lahmeyer, etc.), had been converted to supplying materials for the war. At this point, a decision had to be made quickly. Camille Bauer’s original trading company, located in Basel (CH), undertook a courageous transformation: in order to survive, it turned itself into a manufacturing company. In a first step, the recently formed manufacturing company Matter, Patocchi & Co. AG in Wohlen (CH) was taken over, in order to get the business up and running quickly with the necessary operating resources at its disposal. Thus the Swiss manufacturing base in Wohlen in the canton of Aargau was born.

The story does not end there. In 1979, Camille Bauer was taken over by Röchling, a family-owned company in Mannheim, Germany. At that time, Röchling wanted to quit the iron and steel business and enter the field of I&C technology. Later, in 1993, Gossen in Erlangen and Metrawatt in Nuremberg were reunited in a single company, after Röchling became owner of the Gossen holding company as a result of the acquisition of the Bergmann Group from Siemens in 1989, and Metrawatt was acquired from ABB in 1992. At the same time, Camille Bauer’s German sales operation in Frankfurt-Dreieich also became part of the company. Today the companies operate globally and successfully under the umbrella brand of GMC-I (Gossen Metrawatt Camille-Bauer-Instruments).

A new era.
The physics of electric current have not changed over the course of time. However, business conditions have changed drastically, especially over the last 5-10 years. Catch phrases such as the free electricity market, collective self-consumption, renewable energy sources, PV, wind power, climate targets, reduction of CO2 emissions, e-mobility, battery storage, Tesla, smart meters, digitalization, cyber security, network quality, etc. are all areas of interest for both people and companies. And last but not least, with today’s protest demonstrations, climate change has become a political issue. We will have to see what results from this. At the very least, the catch phrases mentioned above are perfect for developing scenarios for electricity supply security. And it really is the case that the traditional electricity infrastructure, which is often as old as Camille Bauer Metrawatt itself, was not designed for the new types of energy behaviour, either on the consumer side or on the decentralised feed-in side. As a result, it is ever more important to have a growing number of intelligent systems, which need to work from basic data obtained from precise measurements in order to avoid outages, blackouts and the resulting damage.

The overall diversity of these new clusters of topics has prompted Camille Bauer Metrawatt AG to once more face the challenges with courage and above all to do so in an innovative and productive way. In this spirit, Camille Bauer Metrawatt AG develops, produces and distributes its product range globally in 4 technological segments.

These are:
(1) Measurement & Display,
(2) Power Quality,
(3) Control & Monitoring,
(4) Software, Systems and Solutions.

Through its expert staff, modern tools and external partners, Camille Bauer Metrawatt is able, for example, to analyse power quality and detect power quality problems. In addition, the Camille Bauer Metrawatt Academy, founded in 2019, focuses on knowledge transfer by experienced lecturers, with the latest and most important topics as its main priority. Furthermore, we keep in very close contact with customers, authorities, associations, specialist committees, educational institutions, practice-oriented experts and the scientific community in order to continually provide the requisite solutions to the market and interested parties.

#Camille_Bauer_Metrawatt #PAuto @irishpwrprocess


Monitoring and managing the unpredictable.

07/06/2018
Energy sector investments in big data technologies have exploded. In fact, according to a study by BDO, the industry’s expenditure on this technology increased tenfold in 2017 compared to the previous year, with the firm attributing much of this growth to the need for improved management of renewables. Here, Alan Binning, Regional Sales Manager at Copa-Data UK, explores three common issues for renewables: managing demand, combining distributed systems and reporting.

Renewables are set to be the fastest-growing source of electrical energy generation in the next five years. However, this diversification of energy sources creates a challenge for existing infrastructure and systems. One of the most notable changes is the switch from reliable to fluctuating power.

Implementing energy storage
Traditional fossil-fuel plants operate at a predetermined level, providing a consistent and predictable amount of electricity. Renewables, on the other hand, are a much less reliable source. For example, energy output from a solar farm can drop without warning due to clouds obscuring sunlight from the panels. Similarly, wind speeds cannot be reliably forecast. To prepare for this fluctuation in advance, research and investment into energy storage systems are on the rise.

For example, wind power ramp events are a major challenge, which makes developing energy storage mechanisms essential. The grid may not always be able to absorb excess wind power created by an unexpected wind speed increase. Ramp control applications allow the turbine to store this extra power in the battery instead. When combined with reliable live data, these systems can develop informed models for intelligent distribution.
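
To make the dispatch decision concrete, the following is a minimal sketch of the logic a ramp-control application might apply during such an event. The threshold values, battery sizes and function names are illustrative assumptions, not a description of any particular vendor's product.

```python
def dispatch(generated_kw: float, grid_limit_kw: float,
             battery_soc_kwh: float, battery_capacity_kwh: float,
             interval_h: float = 0.25) -> dict:
    """Split turbine output between the grid and a battery during a ramp event.

    All names and limits here are illustrative; a real ramp controller would
    also respect inverter ratings, charge-rate limits and market signals.
    """
    to_grid = min(generated_kw, grid_limit_kw)                 # export what the grid can absorb
    surplus_kw = generated_kw - to_grid                        # excess caused by the ramp event
    headroom_kwh = battery_capacity_kwh - battery_soc_kwh      # remaining storage capacity
    to_battery = min(surplus_kw, headroom_kwh / interval_h)    # store what fits in this interval
    curtailed = surplus_kw - to_battery                        # anything left must be curtailed
    return {"to_grid_kw": to_grid, "to_battery_kw": to_battery, "curtailed_kw": curtailed}

# Example: an unexpected wind speed increase pushes output above the export limit.
print(dispatch(generated_kw=2500, grid_limit_kw=2000,
               battery_soc_kwh=150, battery_capacity_kwh=250))
```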

Britain has recently become home to one of the largest energy storage projects to use EV batteries. While it is not the first time car batteries have been used for renewable power, the Pen y Cymoedd wind farm in Wales has connected a total of 500 BMW i3 batteries to store excess power.

Combining distributed systems
Control software is the obvious solution to better monitor this fluctuating source of power. However, many renewable energy generation sites, like solar PV and wind farms, are distributed across a wide geographical scope and are therefore more difficult to manage without sophisticated software.

Consider offshore wind farms as an example. The world’s soon-to-be-largest offshore wind farm is currently under construction 74.5 miles off the Yorkshire coastline. To accurately manage these vast generation sites, the data from each asset needs to be combined into a single entity.

This software should be able to combine many items of distributed equipment, whether that’s an entire wind park or several different forms of renewable energy sources, into one system to provide a complete visualisation of the grid.

Operators could go one step further by overlaying geographical information system (GIS) data onto the software. This could provide a map-style view of renewable energy parks, or even the entire generation asset base, allowing operators to zoom in on the map to reveal greater levels of detail. This provides a full, functional map enabling organisations to make better-informed decisions.
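
As a rough illustration of what combining distributed equipment into one system can look like at the data level, here is a small sketch that merges assets from several sites, together with their GIS coordinates, into a single structure that a map-style view could be driven from. The asset names, sites, coordinates and outputs are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    """One generation asset (turbine, PV inverter, ...) with its GIS position."""
    name: str
    site: str
    kind: str           # e.g. "wind" or "solar"
    lat: float
    lon: float
    output_kw: float

# Assets from several distributed sites merged into one list (a "single entity").
fleet = [
    Asset("WTG-01", "Offshore Park A", "wind", 53.88, 0.12, 3400.0),
    Asset("WTG-02", "Offshore Park A", "wind", 53.89, 0.13, 0.0),      # currently stopped
    Asset("INV-07", "Solar Farm B", "solar", 52.41, -1.51, 850.0),
]

def site_totals(assets):
    """Aggregate output per site, for the top level of a map-style display."""
    totals = {}
    for a in assets:
        totals[a.site] = totals.get(a.site, 0.0) + a.output_kw
    return totals

print(site_totals(fleet))                            # overview: one figure per site
print([a.name for a in fleet if a.output_kw == 0])   # drill-down: idle assets
```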

Reporting on renewables
Controlling and monitoring renewable energy is the first step to better grid management. However, it is what energy companies do with the data generated from this equipment that will truly provide value. This is where reporting is necessary.

Software for renewable energy should be able to visualise data in an understandable manner so that operators can see the types of data they truly care about. For example, wind farm owners tend to be investors and therefore generating profit is a key consideration. In this instance, the report should compare the output of a turbine and its associated profit to better inform the operator of its financial performance.

Using intelligent software, like zenon Analyzer, operators can generate a range of reports about any information they would like to assess, and the criteria can differ massively depending on the application and the objectives of the operator. Reporting can range from a basic table of outputs to a much more sophisticated report that includes the site’s performance against certain key performance indicators (KPIs) and predictive analytics. These reports can be generated from archived or live operational data, allowing long-term trends to be recognised while still enabling operators to react quickly to maximise operating efficiency.
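
By way of illustration, the snippet below shows the kind of calculation such a report might contain, comparing each turbine's output with its revenue and a capacity-factor KPI. The tariff, turbine rating and monthly figures are assumptions made for the example, and the code is a generic sketch rather than the zenon Analyzer API.

```python
# Illustrative only: the kind of per-turbine figures a monthly report might show.
TARIFF_PER_MWH = 55.0        # assumed sale price, GBP/MWh
RATED_MW = 3.0               # assumed turbine rating
HOURS_IN_PERIOD = 24 * 30    # one month

monthly_output_mwh = {"WTG-01": 820.0, "WTG-02": 640.0, "WTG-03": 910.0}  # example data

for turbine, mwh in monthly_output_mwh.items():
    revenue = mwh * TARIFF_PER_MWH
    capacity_factor = mwh / (RATED_MW * HOURS_IN_PERIOD)   # output vs. theoretical maximum
    print(f"{turbine}: {mwh:7.1f} MWh  "
          f"revenue £{revenue:9,.0f}  "
          f"capacity factor {capacity_factor:5.1%}")
```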

As investments in renewable energy generation continue to increase, the need for big data technologies to manage these sites will also continue to grow. Managing these volatile energy sources is still a relatively new challenge. However, with the correct software to combine the data from these sites and report on their operation, energy companies will reap the rewards of these increasingly popular energy sources.


Celebrating Northern Europe’s Automation Engineers.

08/12/2017

NIDays welcomed hundreds of delegates from across Northern Europe to the historic Sandown Park Racecourse in England in November 2017 for its annual conference and exhibition, an event designed to educate and inspire the engineering community. Delegates were given exclusive access to innovative technologies and could explore NI’s latest software in a full day of keynote speeches, technical presentations and hands-on sessions.

Northern European Engineering Impact Awards
The night before, some of Northern Europe’s best engineers attended the prestigious Engineering Impact Awards, which celebrated the most innovative engineering applications based on NI hardware and software.

Coventry University’s Dr Bo Tan won ‘Application of the Year’ for his system that combines passive WiFi sensing hardware and machine learning algorithms to monitor the health, activity and well-being of patients within nursing homes, allowing staff to improve their levels of efficiency and care.

Other winners include:

Advanced Manufacturing: Paving the Way for Industry 4.0 with Smart, Reconfigurable Manufacturing Machines
Biomedical: Combining Passive WiFi Sensing and Machine Learning Systems to Monitor Health, Activity and Well-Being within Nursing Homes
Education: Teaching Electronics to the Next Generation of Engineers using VirtualBench
Innovative Research: Unlocking Fusion Energy – Our Path to a Sustainable Future
Test and Validation: Saab Elevates Testing of the World’s Most Cost-Effective Fighter Plane
Wireless Communication: Using the LabVIEW Communications System Design Suite to Increase Spectral Efficiency for Wireless Communication

“The Northern European EIAs were incredible this year. The breadth of applications showed what our products can do in the hands of world-class scientists and engineers!” says Dave Wilson, Vice President – Product Marketing for Software, Academia and Customer Education.

NIDays
Professors, researchers and design engineers were amongst the audience of the morning keynote ‘Testing and Deploying the Next Generation of Technology’ hosted by NI VP Dave Wilson. In this session, NI experts explained how the NI platform is accelerating innovation in applications ranging from transportation safety to the IoT.

During the afternoon keynote, Stuart Dawson, Chief Technology Officer at the University of Sheffield’s (GB) Advanced Manufacturing Research Centre (AMRC), was welcomed to the stage to discuss how super-trends like Industry 4.0, energy and the electrification of transportation are changing the way we live and work. Charlotte Nicolaou, Software Field Marketing Engineer, walked through how NI is continuing the LabVIEW legacy with the evolution of the world’s most productive and efficient engineering software, introducing LabVIEW NXG 2.0 and other new software releases including NI Package Manager.

Delegates had a chance to ‘dirty their hands!’

Delegates also had the opportunity to view application-specific demonstrations showcasing the latest NI products and technology in the Expo Area, with plenty of NI engineers on hand to discuss delegates’ engineering challenges and technical questions. Participants also enjoyed an array of track sessions, including LabVIEW Power Programming and Test & RF Hands-On, giving users the opportunity to learn practical skills and network with specialists and peers.

Throughout the day, several guest presenters took to the stage including Jeff Morgan and Garret O’Donnell of Trinity College Dublin (IRL) and Niklas Krakau from Saab Aeronautics who discussed their incredible application enabling efficient testing of the world’s most cost-effective fighter plane, the Saab Gripen E.

Attentive Audience!

“NIDays allows us to highlight game-changing industry trends, whilst unveiling new, innovative technologies. However, it is the attendees, presenters, partners and exhibitors that provide the conference’s true highlights. What was my favourite part of the day? Learning how Coventry University is using WiFi signals to wirelessly monitor patient health through-walls? Meeting elite researchers and heads of industry during the dedicated networking sessions? Taking a tour of Cardiff University’s historic race car? Or sampling a ‘perfect pint’ of ale, courtesy of the robot bartender from Leeds University? NIDays was packed with inspiring moments and experiences that I will remember for a long, long time to come” says Richard Roberts, Senior Academic Technical Marketing Engineer.

Twelve exhibitors joined the lively atmosphere of the main exhibition hall, including Amfax, Austin Consultants and the Formula Student Silverstone 2017 winners, Cardiff Racing, who proudly displayed their history-making Formula Student car. Many more NI customers and partners filled the hall with their impressive applications, some of which won awards at the Engineering Impact Awards the previous evening.

@NIukie #PAuto #TandM #NIDays @NIglobal

Future factory – a moderator’s impression!

01/02/2016

Read-out was asked to moderate the automation stream at the National Manufacturing & Supplies conference held last week (26th January 2016) outside Dublin. In their wisdom the organisers selected “Future Factory!” as the title for this half-day seminar, and 11 speakers were organised to speak on their particular subjects for about 15 minutes each. This format was replicated in the more than a dozen other seminars held on this one day.

Long queues lasted well into the morning to enter the event!

We were a little sceptical that this would work, but with the help of the organisers and the discipline of the speakers the time targets were achieved. Another target achieved was the number of attendees at the event, as well as at this particular seminar.
In all, between exhibitors, speakers and visitors, well over 3,000 packed the venue. That was probably far more than the organisers had anticipated, and hopefully a potent sign that the economy is again on the upturn. Indeed, the event was so successful that it was trending (#MSC16) on Twitter for most of the day.

Seminar
But back to our seminar. If you google the term Future Factory you get back 207 million links, yet it is difficult to find a simple definition of what it means. The term automation is similarly difficult to define, though the Irish term “uathoibriú” is perhaps a bit clearer, literally meaning “self-working.”

Good attendance at the Seminar

Background
The world of automation has changed to an extraordinary degree, and yet in other ways it remains the same. The areas where it has experienced the least change are sensing – a thermometer is a thermometer – and final control – a valve is a valve. Where it has changed almost to the point of unrecognisability is in that bit in the middle: what one does with the signal from the sensor to activate the final control element.

It started with single-parameter, dedicated indicators/controllers/recorders in the sixties, which transmitted either pneumatically (3-15 psi) or electrically (4-20 mA). Gradually (relatively speaking) most instruments became electronic, smaller in size and multifunctional. The means of communication changed too, and fieldbus communication became more common for interacting with computers, which themselves were developing at breakneck speed. Then transmission via wireless became more common, and finally came the internet and the ability to control a process from the computer that we call the intelligent phone. There are problems with these latter technologies, internet and cellphone, of course. One is that the reach of the internet is focussed at present on areas of high population. Another is the danger of infiltration of systems by hostile or mischievous strangers. The importance of security protocols is something that has only recently become apparent to automation professionals.
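
As a small aside on those live-zero signal standards, the sketch below shows how a 4-20 mA transmitter reading is typically scaled into engineering units, and why a reading well below 4 mA can be flagged as a broken loop rather than a genuine low value. The 0-100 °C span and the 3.6 mA fault threshold are assumed for the example.

```python
def ma_to_engineering(current_ma: float, low: float = 0.0, high: float = 100.0) -> float:
    """Scale a 4-20 mA transmitter signal to an engineering range (here 0-100 °C, assumed).

    The "live zero" at 4 mA is what lets a reading near 0 mA be recognised as a
    broken loop or failed transmitter rather than a genuine low measurement.
    """
    if current_ma < 3.6:   # well below the live zero: treat as a loop fault
        raise ValueError(f"{current_ma} mA - loop open or transmitter failed?")
    return low + (current_ma - 4.0) * (high - low) / 16.0

print(ma_to_engineering(4.0))    # 0.0   (bottom of range)
print(ma_to_engineering(12.0))   # 50.0  (mid-scale)
print(ma_to_engineering(20.0))   # 100.0 (top of range)
```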

• Many of the presentations are available on-line here. The password is manufac2016

The Presentations
Maria Archer of Ericsson spoke on enabling and facilitating IoT in the manufacturing industry. Diving straight into the topic, she drew on her experience of big data, e-commerce, media, cyber security, IoT and connected devices.

The second speaker was Cormac Garvey of Hal Software, who addressed supply chain prototyping. The supply chain ecosystem is incredibly complex, usually requiring significant integration of each supplier’s standards and processes with the manufacturer’s. Cormac introduced the concept of supply chain prototyping, where easy-to-use, standards-based technology is used to wireframe out the entire supply chain ecosystem prior to integration, thus significantly reducing cost, time and risk on the project. This wireframe can then be used as a model for future integration projects.

Two speakers from the Tralee Institute of Technology, Dr. Pat Doody and Dr. Daniel Riordan spoke on RFID, IoT, Sensor & Process Automation for Industry 4.0. They explained how IMaR’s (Intelligent Mechatronics and RFID) expertise is delivering for their industrial partners and is available to those aiming to become a part of Industry 4.0.

Smart Manufacturing – the power of actionable data was the topic addressed by Mark Higgins of Fast Technology. He shared his understanding of the acute issues companies face on their journey to Business Excellence and how leveraging IT solutions can elevate the business to a new point on that journey.

Assistant Professor (Mechanical & Manufacturing Engineering) at TCD, Dr Garret O’Donnell, explained how one of the most significant initiatives in the last two years has been the concept of the fourth industrial revolution promoted by the National Academy of Science and Engineering in Germany (ACATECH), known as Industrie 4.0. (Industrie 4.0 was first used as a term in Germany in 2011.)

Another speaker from Fast Technologies, Joe Gallaher, addressed the area of Robotics and how Collaborative Robots are the “Game Changer” in the modern manufacturing facility.

Dr. Hassan Kaghazchi of the University of Limerick and Profibus spoke on PROFINET and Industrie 4.0. Industrial communications systems play a major role in today’s manufacturing systems. The ability to provide connectivity, handle large amounts of data, uptime, open standards, safety, and security are the major deciding factors. His presentation showed how PROFINET fits into the Industrial Internet of Things (Industrie 4.0).

Maurice Buckley CEO NSAI

The CEO of NSAI, the Irish National Standards Authority, Maurice Buckley explained how standards and the National Standards Authority of Ireland can help Irish businesses take advantage of the fourth industrial revolution and become more prepared to reap the rewards digitisation can bring.

The next two speakers stressed the impact of low forecast accuracy on the bottom line and how this could be addressed. Jaap Piersma, a consultant with SAS UK & Ireland, explained that the impact of low forecast accuracy on business performance is high in industry, but with the right tools, the right approach and experienced resources you can achieve very significant results and benefits for your business. Following him was Dave Clarke, Chief Data Scientist at Asystec, who maintains the company strategy for big data analytics service development for customers. He showed how incredible business opportunities are possible by harnessing the massive data sets generated in the machine-to-machine and person-to-machine hyper-connected IoT world.

The final speaker, David Goodstein, Connected Living Project Director at GSMA, described new form-factor mobile SIMs which are robust and remotely manageable, and which are an essential enabler for applications and services in the connected world.

All in all a very interesting event and useful to attendees. Papers are being collected and should be available shortly on-line.

It is hoped to do it all again next year on 24th January 2017- #MSC17.

See you there.

@NationalMSC #MSC16 #PAuto #IoT


What’s the future for the electronics instrumentation sector?

11/12/2015

Looking back at the past 10-15 years of the electronic instrumentation industry, it is certainly disappointing to realize that the market for new test equipment in 2015 is about the same size or less. What does this tell us and will the industry perform better in the future?

Recently, Frost & Sullivan published three market insights about the future of the electronics industry: what will determine it, where the new opportunities for growth are, and how to stay profitable in a changing economic environment.

These market insights are summarised below:

Jessy Cavazos – Frost & Sullivan

“In the past decade, the electronics instrumentation industry did not maximize the revenue opportunity coming from the move towards connectivity and the proliferation of electronics as most companies missed out on dramatic changes happening in the customer base,” says Jessy Cavazos, Industry Director for Test & Measurement, Frost & Sullivan.

Over the next 5-10 years, 5G and other technologies will take the electronics instrumentation market to higher frequencies, spelling significant growth opportunities for test manufacturers. The move towards a more connected, zero-latency, and autonomous world will certainly provide room for growth for the electronic instrumentation market. With the Internet of Things (IoT), a myriad of devices will be connected to the Internet. While low latency will not be provided for all applications and devices in the short term due to costs, the desire for low or no latency for a number of devices and applications is here and will provide opportunities for test manufacturers.

While wireless communications and aerospace and defense will remain significant end-user segments for electronic test and measurement equipment, demand is likely to increase in smaller end-user segments such as automotive and industrial electronics due to the greater integration of wireless technology in various devices.

The world is also on the path to becoming more autonomous, with mobile robots, drones, and autonomous cars. While all of these technologies will translate into demand for electronic instrumentation, some, such as the autonomous car, will generate significant opportunities for test manufacturers due to the onus put on safety. Leading automotive OEMs are currently embracing automated driving, translating into significant R&D opportunities for test manufacturers.

The hyper-connectivity of customers will also call for a greater focus from test manufacturers on their go-to-market channels. While online channels have grown in importance for mid- and low-end test equipment, this trend is also relevant to more expensive, high-end test equipment from a digital marketing perspective.

“The next decade will not come without challenges for the electronics instrumentation industry. However, trends are favorable to the future growth of the electronic test and measurement market. Test manufacturers must not only be aware of the evolution of technologies and related test requirements but also expand their horizons to understand the impact of other trends on their business,” summarised Ms. Cavazos.


Demand for IoT testing and monitoring equipment.

28/06/2015

As the trend towards connected living and the Internet of Things (IoT) continues to permeate home, work and city solutions, the need to keep tabs on a myriad of connected devices will thrust the global IoT testing and monitoring equipment market into the spotlight. The incorporation of machine-to-machine (M2M) communication – central to IoT deployment – as well as modules that require less power and bandwidth will bring with it several challenges that turn into a boon for testing and monitoring vendors.

New analysis from Frost & Sullivan on the global IoT testing and monitoring equipment market finds that the market earned revenues of $346.9 million in 2014 and estimates this to reach $900.1 million in 2021.
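
As a quick cross-check of those figures, the implied compound annual growth rate over the 2014-2021 period can be worked out directly from the quoted numbers. The short calculation below is just that arithmetic and is not part of the Frost & Sullivan analysis itself.

```python
# Implied compound annual growth rate from the quoted figures (2014 -> 2021).
revenue_2014 = 346.9   # $ million, earned
revenue_2021 = 900.1   # $ million, forecast
years = 2021 - 2014

cagr = (revenue_2021 / revenue_2014) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 14-15% per year
```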

“As the escalating number of connected devices adds breadth to the IoT concept, solutions that can proactively monitor, test and zero in on anomalies in the infrastructure will garner a sustained customer base,” said Frost & Sullivan Measurement and Instrumentation Research Analyst Rohan Joy Thomas. “The incorporation of new testing and wireless standards will broaden testing requirements and further aid development in IoT testing and monitoring equipment.”

Educating end users on the importance of interoperability and the requirement for specialised testing equipment is vital for market success. Currently, the lack of end-user awareness of the need for proactive solutions stalls the large-scale use of IoT testing and monitoring equipment. End users’ inability to identify the most appropriate solution from a plethora of near-identical systems also limits adoption.

High capital expenditure associated with procuring equipment coupled with inadequate standardisation around IoT adds to the challenge. Such concerns over high investment costs and standardisation should abate as IoT matures in the years ahead.

“Industry vendors must fill the gaps in their product portfolio in order to facilitate an open testing environment and lay the foundation for long-term growth,” concluded Thomas. “To that end, building partnerships with or acquiring participants from other industry niches will help solution providers extend their horizons in the global IoT testing and monitoring equipment market.”


Monitoring nuclear waste legacy ponds!

22/04/2014

Following a rigorous assessment period, EXO water quality monitoring sondes from Xylem Analytics are being deployed in what is arguably one of the most hostile environments imaginable – nuclear waste legacy storage ponds at the Sellafield nuclear reprocessing site in Cumbria, in the North West of England.

Background
One of the major challenges facing Sellafield Ltd is the safe decommissioning of the First Generation Magnox Storage Pond (FGMSP), a nuclear fuel storage facility that was originally built in the 1950s and 1960s, as part of Britain’s expanding nuclear programme, to receive, store and cool irradiated Magnox fuel prior to reprocessing.

In the 1970s a lengthy shutdown at the Magnox Reprocessing Plant, combined with increased throughput of fuel due to electricity shortages, caused spent fuel to be stored in the pond for longer than the designed period, which led to increased fuel corrosion and radiation levels.

Over the years the pond has accumulated significant quantities of waste materials: sludges from corrosion of fuel cladding, skips of fuel, and fuel fragments and other debris that has blown into the pond. Standing above ground, this 5m deep open pond, holding some 14,000 cubic metres of contaminated water (approximately the size of two Olympic swimming pools), is considered a decommissioning priority. To assist with future retrievals, a detailed knowledge of the facility’s inventory, gained through visual inspection of the pond, is needed.

Despite high levels of radioactivity, this open pond appears to intermittently bloom with a range of microorganisms that cloud the water, reducing visibility and hampering inspection and retrieval operations.

Sellafield Ltd is the company responsible for safely delivering decommissioning, reprocessing and nuclear waste management activities on behalf of the Nuclear Decommissioning Authority (NDA), and a project team led by Xavier Poteau has specific responsibility for transferring monitoring technologies to the FGMSP pond.

FGMSP Pond (Image supplied courtesy of Sellafield Ltd)

Water passing through the pond reaches the Sellafield Ion Exchange Effluent Plant (SIXEP), which removes radioactivity from liquid feeds from a number of plants across the Sellafield site. The plant settles out and filters solids, uses a carbonation process to neutralise the alkaline pond water, and then employs ion exchange to remove radionuclides.

Why monitor?
Water samples are routinely collected from the pond for laboratory analysis, and analytical data is reported to the Environment Agency and the NDA. In addition to this regulatory requirement, water quality data is also required to inform efficient operation of SIXEP and to ensure that legacy fuel is stored in optimal conditions. For example, the water is caustic dosed to maintain a pH of around 11.5 which reduces the speed of nuclear fuel degradation.

Water monitoring challenges

Preparing to test the water! (Image supplied courtesy of Sellafield Ltd)

As a result of physical restrictions, it has only been possible to take water samples from specific locations around the edge of the pond and, being radioactive, routine samples have to be limited to about 100ml to remain within laboratory guidelines. Sampling is also an arduous, time-consuming process; two people have to be involved, and each sampler has to wear a PVC suit and facemask, two pairs of PVC waterproof gloves and a pair of Kevlar gloves to ensure that the gloves are not accidentally punctured. The samplers are also only allowed to be close to the pond for a limited time.

Instrumentation might appear to be the obvious solution, but again, there are several challenges, not least of which is that gamma spectrum analysis has to be conducted on a sample in a lab. In addition, electrical instruments often fail in a radioactive environment, so the general assumption is that they will do so, unless proven otherwise. Continuous monitoring probes, similar to those employed by the water industry, are not feasible because of the wiring that would be required. However, portable instruments offer the potential to reduce the volume and frequency of water sampling.

Trials with EXO sondes
The EXO2 sondes are multiparameter, 6-port water quality monitors that have been developed for remote, long-term monitoring applications. Employed globally by regulatory authorities, researchers, industrial companies and those responsible for the protection of water resources, the EXO sondes are the result of many years of development and feedback from thousands of users all over the world. As a result, these instruments are lightweight and rugged, with internal batteries and datalogging capability for long-term monitoring applications. The EXO sondes operate on extremely low power and incorporate a range of features that minimise maintenance requirements and avoid biofouling. For example: wet-mateable connectors resist corrosion; components are isolated to prevent short-circuits; welded housings and double o-rings prevent leaks; and high-impact plastic and titanium resist impact damage.

The ‘smart’ EXO sensors are easily interchangeable and users are able to select the sensors that best meet their needs. The FGMSP project team, for example, uses sensors for pH, temperature, conductivity, turbidity, fDOM (Fluorescent Dissolved Organic Matter – a surrogate for Coloured DOM), Blue-green Algae and Chlorophyll.

Initially, the FGMSP project team trialled an extended deployment version of the YSI 6600 multiparameter water quality monitoring sonde – a predecessor of the EXO. “This enabled us to assess the quality of the YSI sensors and demonstrate that they were able to operate well in a radioactive environment,” comments Technical Specialist Marcus Coupe, adding: “The launch of the EXO was of great interest to us because, with Bluetooth communications and smart sensors that retain their calibration data, the EXO offered an opportunity to dramatically reduce time spent at the pond.

“The snap-on probes are calibrated in the laboratory and can then be quickly and simply swapped with those that have been deployed on an EXO sonde. This means that the main part of the sonde can be left onsite while the sensors are quickly swapped, and the Bluetooth comms enable us to collect 18,600 sets of data in less than 20 minutes.”

Commenting further on the success of the EXO trials, Xavier Poteau says: “It has been common experience in the nuclear industry to have to apply significant adaptations to electrical equipment, so that it is able to function correctly in a radioactive environment, and this can incur a heavy cost and time penalty. However, the EXO sondes have performed very well ‘off the shelf’ which is a sign of good design.”

ROV with EXO probe (Image supplied courtesy of Sellafield Ltd)

As part of their work with the EXO sondes, the FGMSP project team has deployed an EXO sonde with a submersible remotely operated vehicle (ROV). This enabled the team to monitor water quality at previously unachievable locations. “Any loss of visibility in the pond can potentially cause a significant risk to operations within the legacy ponds, as well as potentially slowing down future retrievals, so the ability to deploy an EXO with a ROV offers a valuable insight into understanding the challenge, and moves us from single point sampling to a more 3D-like data stream,” adds Marcus Coupe.

Looking forward
Neill Cornwell from Xylem Analytics has been involved with the trials at Sellafield from the start. He says: “A lot of hard work has gone into the process of demonstrating EXO’s suitability for deployment in the nuclear sector; not only has the equipment had to perform well in challenging conditions, but we have also had to demonstrate a high level of technical and service support.

“Naturally, we are very pleased that the sondes have performed so well, and further instruments are now being deployed in other applications at the Sellafield site. For example, a slimmer version of the EXO, the EXO1, is being used to monitor the effluent distribution tanks because the only access is via narrow pipes and the EXO1 is ideal because its outer diameter is just 1.85 inches.”

The data from the FGMSP sondes compare favourably with the results of laboratory analysis, so Xavier Poteau believes “a high level of confidence is being established in the EXO data and this means that we will be able to reduce the amount of sampling that we undertake, which will save a great deal of time, hassle and money.

“I strongly believe that our experience could be beneficial to the wider audience as well as the nuclear industry.”

EXO2 Titanium Bulkhead