Two million mag meters plus…

02/05/2016

Endress+Hauser has produced over two million electromagnetic flowmeters since 1977 – more than any other manufacturer, the company claims. “This magic number stands for high-quality measuring technology and, above all, satisfied customers in all kinds of industries,” says Bernd-Josef Schäfer, Managing Director of Endress+Hauser Flowtec AG, the centre of competence for flow measuring technology.

The company’s success story as a manufacturer of electromagnetic flowmeters began in the middle of the 1970s. In order to enter the water and wastewater market which was emerging at that time, Endress+Hauser purchased the company Flowtec in Bern in 1977 and moved it to a new location in Reinach (Basel-Landschaft, Switzerland). This is where Endress+Hauser started to produce flowmeters with just three employees in former military barracks.

Work was done on an on-demand basis. “Whereas today,” says Bernd-Josef Schäfer, “our production spans six sites around the globe – in Switzerland, France, the USA, China, India, and Brazil – and boasts state-of-the-art logistics. This infrastructure is what has enabled us to produce two million electromagnetic flowmeters to date in accordance with required quality standards.”

To put this into context: These two million electromagnetic flowmeters could measure a volume corresponding to four times the flow rate of the Amazon. Each production site also features precise calibration facilities which are regularly checked by national accreditation bodies and which guarantee consistently high measuring quality for each individual device.

Constant innovation guarantees customer satisfaction
The company’s success, which spans almost 40 years, is due to many factors. In particular, its inventive talent has enabled Endress+Hauser to keep offering its customers new, groundbreaking devices capable of measuring all kinds of fluids, such as water, milk, acids, alkalis, or ore slurry, with the greatest accuracy.

With clever innovations such as the precision measurement of difficult fluids (Autozero, 1981), microprocessor control (Variomag, 1984), two-wire technology (Eximag, 1987), or the operating matrix (Tecmag, 1990), Endress+Hauser has always managed to stay one step ahead of the competition.


In 1993, all of these device variants were brought together to form a single product family under the name of “Proline”. Alongside this family, however, Endress+Hauser also produces flowmeters for very particular applications – for example, filling bottles at one-second intervals.

Looking to the future with Proline 
Since 1993, the Proline device family has undergone constant development to ensure that it meets the prevailing requirements in a wide range of industries. Following the second generation launched in 2000, the third and most recent Proline generation (2012) offers a multitude of unique functions and device properties.

This means that system operators will not only be able to retrieve measurement and diagnostic data via display, WLAN, web server, or fieldbus, but will also be able to monitor the process comprehensively and, if necessary, check the functioning of a flowmeter during operation.
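Diagnostic data from field devices of this kind is commonly grouped into the NAMUR NE 107 status categories. As an illustration only (the codes and the mapping below are generic NE 107 conventions, not taken from Endress+Hauser documentation), a monitoring system might classify incoming diagnostic events like this:

```python
# Generic NAMUR NE 107 status categories; the letter prefix of a
# diagnostic code indicates the category. Codes here are invented.
NE107_CATEGORIES = {
    "F": "Failure",
    "C": "Function check",
    "S": "Out of specification",
    "M": "Maintenance required",
}

def classify(diagnostic_code: str) -> str:
    """Map a diagnostic code such as 'F270' to its NE 107 category."""
    category = NE107_CATEGORIES.get(diagnostic_code[:1])
    if category is None:
        raise ValueError(f"unknown diagnostic code: {diagnostic_code}")
    return category

print(classify("F270"))  # Failure
print(classify("M500"))  # Maintenance required
```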

Bernd-Josef Schäfer sees the future of Endress+Hauser optimistically: “Innovations such as these enable us to align our product portfolio consistently with the needs of every industry. We are looking ahead to our three-millionth electromagnetic flowmeter with great confidence.”

@Endress_Hauser #PAuto

Food & Pharmaceutical Futures.

21/03/2016

ISA’s first international symposium outside of North America is adjudged a success.


From the time it was first mooted for Ireland, in Philadelphia early in 2015, the planning for the 3rd ISA Food & Pharmaceutical Symposium was embraced with enthusiasm by the local Ireland Section. Since then the ISA’s Food & Pharma Division, under the able directorship of Canadian Andre Michel, has ploughed forward, overcoming setbacks and the not inconsiderable distance between North America and the capital of Munster. Chair of the symposium and former Ireland Section President Dave O’Brien directed a strong committee charged with ensuring that this, the first such international symposium organised by the ISA outside North America, would be a resounding success.

And it was.

Venues were assessed, speakers recruited and the various minutiae associated with organising an international event were discussed, duties assigned and problems solved over many late-night transatlantic telephone conferences. Drawing on the experience of the ISA staff in North Carolina and the Ireland Section’s many years’ experience of organising table-top events and conferences, a very creditable event was staged at the Rochestown Park Hotel. With some justification the Symposium Chair could state before the event started: “We have assembled a truly outstanding program this year, featuring some of the world’s most accomplished experts in serialization, process optimization, cyber security and alarm management, to name a few. These experts will speak on the vital issues affecting food and drug manufacturers and distributors. We are delighted to have the opportunity to bring this event to Ireland for its first time outside of the United States!” Indeed, upwards of 200 registrants attended the two-day event, and it was notable that the bulk of them stayed until the final sessions were completed.

All through the event highlights were tweeted (and retweeted on the Ireland Section’s own Twitter account) with the hashtag #FPID16. See also the ISA official release after the event: Food & Pharma symposium almost doubles in size!


ISA President Jim Keaveney (3rd from right) with some of the speakers at the FPID Symposium

Technology and Innovation for 2020 Global Demands
Two fluent keynote speakers may be said to have set the tone: Paul McKenzie, Senior Vice President, Global Biologics Manufacturing & Technical Operations at Biogen (who addressed “Driving Change Thru Innovation & Standards”), and Dr Peter Martin, VP and Edison Master, Schneider Electric (“Innovation and a Future Perspective on Automation and Control”). The event was also graced with the presence of the ISA International President for 2016, Mr Jim Keaveney.

We will highlight a few of the sessions here!

Serialization:
The important subject of serialization affects all levels of the pharmaceutical business, especially in view of deadlines in the USA and the EU. From an overview of the need and the technology to a deep dive into user requirements, this session provided the latest information on worldwide requirements and helped point towards the solution needed in each facility. Speakers, as in most sessions, were drawn from standards, vendor and user organisations as well as state enforcement agencies.

Track & Trace:
In the parallel Food thread of the symposium the role of track and trace technologies was examined. Between product safety, output quality, variability and the uniqueness of customer requirements, manufacturers are facing increasing demands on the traceability of raw materials, the real-time status of manufactured goods and the tracking of product genealogy throughout the value chain, from a single line to the multiple sites of global manufacturers. The evolution of the data systems and technologies on offer means greater benefits for industry, and presenters Vision ID and Crest showed these solutions and the advantages of modernization.

 

Both threads came together for much of the event, mirroring the similarity of many of the technologies and requirements of each sector.

Digitalization:
Digitalization in industry shows how bringing the worlds of automation and digitalization together provides true and advanced paperless manufacturing, with more complex devices and interconnected data systems. This is an enabler of integrated operations within industry. The use of MES as a core concept to create a digital plant, with optimized solutions and data-driven services, was explained, and a practical example of a plant was discussed, showing the journey to paperless manufacturing and a real pharmaceutical strategy of integrating automated and manual operations.

 


Eric Cosman makes a point!

Cybersecurity:
Of course this is one of the key topics in automation in this day and age. Without the proper preventative measures, an industrial cyber-attack can contribute to equipment failure, production loss or regulatory violations, with possible negative impacts on the environment or public welfare. Incidents of attacks on critical network infrastructure and control systems highlight vulnerabilities in the essential infrastructure of society, such as the smart grid, which may become more of a focus for cybercriminals in the future. As well as guarding against threats from external sources, steps ought to be taken to protect control and automation systems from internal threats, which can cripple a company for days or months. This session highlighted the nature of these threats, how systems and infrastructure can be protected, and methods to minimize attacks on businesses.

 

Automation Challenges for a Greenfield Biotech Facility:
These were outlined in this session in the pharmaceutical thread. Recent advances in biotechnology are helping prepare for society’s most pressing challenges. As a result, the biotech industry has seen extensive growth and considerable investment over the last number of years. Automation of Biotech plants has become increasingly important and is seen as a key differentiator for modern biotech facilities. Repeatable, data rich and reliable operations are an expectation in bringing products to market faster, monitor and predict performance and ensure right first time delivery. This session provided the most topical trends in automation of biotech facilities and demonstrated how current best practices make the difference and deliver greater value to businesses.

Process Optimization and Rationalization:
Meanwhile, in the Food & Beverage thread, it was shown how incremental automation improvement keeps competitiveness strong, while corporate control system standardization leads to constant demand for increases in production and quality.

Industry 4.0 (Digital Factory: Automate to Survive):

Networking

Networking between sessions

The fourth industrial revolution is happening! This session asked how global industry and Ireland are positioned, and what this means for manufacturers and industry as a whole. Data-driven technologies, the Internet of Things (IoT) and cyber-physical systems all integrate intelligently in a modern manufacturing facility. Enterprise Ireland and the IDA headlined this topic along with the ICMR (Irish Centre for Manufacturing Research) and vendors Rockwell and Siemens.

OEE and Automation Lifecycle: Plant lifecycle and Operational Equipment Effectiveness

Networking2

More networking

Worldwide, much of the over 60 billion euro of installed control systems is reaching the end of its useful life. Some of these controls have been operational since the ’80s and ’90s; their owners invested significantly in developing their intellectual property, and much of what was good then is still good now, though some aspects still need to evolve with the times. This requires funding, time and talent. For quite some time now there has been a shortage of skilled automation staff at many companies, leading organizations to outsourcing, partnerships and collaboration with SMEs to help manage the institutional knowledge of their installed control systems. With corporate leadership sensitive to returns to shareholders, approval hurdle rates for plant renovation are usually high when it comes to refreshing these control systems. In many manufacturing facilities, engineers and production managers have been asked to cut costs and yet still advance productivity. To solve this dilemma, many world-class facilities continue to focus on driving improvements through the use of automation and information technology. Some are finding that using existing assets in conjunction with focused enhancement efforts can take advantage of both worlds. Here we were shown great examples of where innovation and such experiences are helping to create real value in automation modernization.

 


Alarm management:
And of course, no matter how sophisticated systems are, alarms are always required and necessary. DCSs, SCADA systems, PLCs and safety systems all use alarms. Ineffective alarm management is a contributing factor in many major process accidents, so this was an important session on which to end the symposium.

The social aspect of this event was not forgotten: following a wine reception there was an evening of networking with music at the end of the first day.

Training Courses:
On the Wednesday, although the symposium itself had finished, there were two formal all-day training courses. These covered Introduction to Industrial Automation Security and the ANSI/ISA-62443 Standards (IC32C – leader Eric Cosman, OIT Concepts) and Introduction to the Management of Alarm Systems (IC39C – leader Nick Sands, DuPont). These, and other, ISA courses are regularly held in North America, and the Ireland Section occasionally arranges for them in Ireland.

All in all, the Ireland Section and its members may feel very proud looking back on a very well organised and informative event, summed up in an email from one of the attendees: “Thank you all, it was the best symposium I attended in the last 10 years!”

Well done!


#FPID16 #PAuto #Pharma #Food

Beyond smoke and mirrors!

07/01/2016

Three things you didn’t know about IIoT examined by Martyn Williams, Managing Director of COPA-DATA UK.

The human brain is a wonderful thing that works tirelessly from the day we are born until the day we die, only stopping on special occasions, like when presenting in front of large audiences. We’ve been studying the brain for many centuries, but we still know relatively little about the trillions of connections that make it work. Creating a road map of the brain is a bit like trying to map out the Industrial Internet of Things (IIoT). IIoT is a concept that has intrigued industry for several years now, but much like the human brain, is not yet fully understood.

To gain a better understanding of the IIoT universe, we need to look at specifics. We need to understand how hardware and software, communication protocols and the human connection come together to support a stable and flexible interaction that enhances production, control and efficiency in industrial environments.

Machines to machines
Every time you form a new memory, new connections are created in the brain, making the system even more complex than before. Similarly, IIoT relies on many-to-many applications or groups of nodes to accomplish a single task. The plural of “machine” is important when discussing IIoT because it highlights the complexity of the system.

For example, on a sandwich biscuit production line, the biscuit sandwiching machine at the heart of the line should be able to communicate with the previous elements of the process, as well as the ones that come after it. The mixing, cutting and baking machines at the very start of the production process should also be able to “speak” to the conveyers, the pile packing sandwich machine, the cream feed system, lane multiplication and packaging machines. This level of communication allows the production line to be more flexible and cater for a wider range of biscuit varieties.

Regardless of whether we’re talking about biscuits, automotive manufacturing or even smart grids, IIoT has communication requirements that go beyond the standard client/server needs and conventional thinking.

Instead, the nodes act as peers in a network, each making decisions and reporting to other nodes.

Besides performing core tasks, the production system is also connected to an enterprise level that can automatically issue alarms, collect and analyse data and even make predictions or recommendations based on this analysis.
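The peer-to-peer reporting and the enterprise-level alarming described above can be sketched in a few lines. This is a minimal illustration only; the node names, readings and alarm limit are all invented:

```python
class Node:
    """A machine node that reports its readings to every linked peer."""
    def __init__(self, name):
        self.name = name
        self.peers = []
        self.inbox = []   # (sender, reading) pairs received from peers

    def link(self, other):
        # Peer relationships are mutual: each node can report to the other.
        self.peers.append(other)
        other.peers.append(self)

    def report(self, reading):
        for peer in self.peers:
            peer.inbox.append((self.name, reading))

def enterprise_alarms(nodes, limit):
    """Enterprise layer: collect every reading seen anywhere, flag breaches."""
    alarms = []
    for node in nodes:
        for sender, reading in node.inbox:
            if reading > limit:
                alarms.append((sender, reading))
    return alarms

# Three stations of a hypothetical biscuit line, linked as peers.
oven = Node("oven")
sandwicher = Node("sandwicher")
packer = Node("packer")
oven.link(sandwicher)
sandwicher.link(packer)

oven.report(185.0)       # oven temperature, seen by the sandwicher
sandwicher.report(42.0)  # line speed, seen by oven and packer

print(enterprise_alarms([oven, sandwicher, packer], limit=100.0))
```

The point of the sketch is the topology: no central master, each node both produces and consumes reports, and the enterprise layer only aggregates and analyses.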

A common language
IIoT will only work if it uses a compatible language across systems and industries. To help achieve this objective, industry giants AT&T, Cisco, General Electric, IBM and Intel founded the Industrial Internet Consortium in 2014. The Consortium aims to accelerate the development and adoption of interconnected machines and intelligent analytics.

As IIoT cuts across all industry sectors, from manufacturing to energy, common standards, harmonised interfaces and languages are crucial for successful implementation of the concept. The consortium hopes to lower the entry barriers to IIoT by creating a favourable ecosystem that promotes collaboration and innovation. The next step is to facilitate interoperability and open standards, allowing machines or systems from different original equipment manufacturers (OEMs) to communicate with each other and with control systems.

The old and the new
Perhaps one of the biggest challenges when it comes to implementing IIoT on a larger scale comes from integrating legacy systems with the latest generation of smart factory equipment.

Learning new things changes the structure of the brain and similarly, in manufacturing, implementing new automation equipment usually results in changes across the entire system. The solution is to use standards-based protocol gateways to integrate legacy systems in brownfield environments. This allows organisations to free data from proprietary constraints and use it for real-time and historical data collection and analysis.
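A gateway of this kind essentially translates raw register values from a legacy controller into engineering units a modern data system can consume. The register map and scale factors below are invented for illustration, not taken from any real protocol profile:

```python
# Hypothetical register map: address -> (signal name, scale factor).
# A legacy controller exposes raw integer counts; the gateway scales
# them into engineering units and republishes plain dictionaries.
REGISTER_MAP = {
    40001: ("flow_m3_h", 0.1),      # counts of 0.1 m3/h
    40002: ("pressure_bar", 0.01),  # counts of 0.01 bar
}

def gateway_translate(raw_registers):
    reading = {}
    for address, raw in raw_registers.items():
        name, scale = REGISTER_MAP[address]
        reading[name] = raw * scale
    return reading

print(gateway_translate({40001: 1234, 40002: 250}))
```

Once the data is in this neutral form, it is free of the proprietary constraints mentioned above and can feed both real-time and historical analysis.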

There is as much risk in sticking to a single vendor based on the current installed base as there is in accepting these new concepts, with multiple new vendors and interoperability between intelligent devices. This is something we have experienced at first hand within the energy and infrastructure sector, through the concepts behind IEC 61850 and interoperability.

Much like the human brain, the Industrial Internet of Things is always changing and there are still a lot of questions to be answered before we fully understand its requirements, implementation and potential. Luckily, these conversations are taking place and new ideas are put into practice every day. The next step is to figure out an easy way of practically implementing IIoT innovations in manufacturing environments across the world.


The future is (almost) now!

29/11/2015

Buzzwords fly around in industry like wasps at a picnic. Industry 4.0 is one of these hugely popular concepts, particularly when it comes to manufacturing. Here Steve Hughes, managing director of REO UK, gives further insight into Industry 4.0.


The first industrial revolution saw the development of mechanisation using water and steam power. The second was the introduction of electricity in manufacturing environments, which facilitated the shift to mass production. The digital revolution happened during our lifetime, using electronics and IT to further automate manufacturing.

Industry 4.0 is the fourth in this series of industrial revolutions. Although it is still, relatively speaking, in its infancy, the idea relies on sophisticated software and machines that communicate with each other to optimise production.

In Industry 4.0, strong emphasis is placed on the role of intelligent factories. They are energy efficient organisations based on high-tech, adaptable and ergonomic production lines. Smart factories aim to integrate customers and business partners, while also being able to manufacture and assemble customised products.

Industry 4.0 is more about machines doing the work and interpreting the data, than relying on human intelligence. The human element is still central to the manufacturing process, but fulfils a control, programming and servicing role rather than a shop floor function.


At Siemens’ Amberg plant, Simatic PLCs manage the production of PLCs!

The Siemens (IW 1000/34) Electronic Works facility in Amberg (D), is a good example of the next generation of smart plants. The 108,000 square-foot high-tech facility is home to an array of smart machines that coordinate everything from the manufacturing line to the global distribution of the company’s products.

Despite the endless variables within the facility, a Gartner industry study conducted in 2010 found that the plant boasts a reliability rate of more than 99 per cent, with only 15 defects in every million finished products.
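The arithmetic behind the quoted figure is worth making explicit: 15 defects per million finished products corresponds to a yield of 99.9985 per cent, comfortably above the "more than 99 per cent" reliability cited.

```python
# Convert defects per million into a percentage yield.
defects_per_million = 15
yield_percent = (1 - defects_per_million / 1_000_000) * 100
print(f"{yield_percent:.4f}%")  # 99.9985%
```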

Thanks to the data processing capacity of Industry 4.0-ready devices, it is possible to generate the information, statistics and trends that allow manufacturers to make their production lean and more fuel efficient.

If you work in the food manufacturing industry, you probably know that many production lines today operate at less than 60 per cent efficiency, which means there is considerable room for improvement. Saving electricity and water are also key requirements for modern plant managers, who can achieve their eco-friendly goals by using smart plant connectivity.

The great news is that a lot of the technology associated with Industry 4.0 already exists. The not so great news is that implementing it will probably cost your company a pretty penny, especially if you aim to be an early adopter.

What the future holds
For most automation companies, the move will be a gradual one, an evolution rather than a revolution. This is why continuity with older systems will still be essential for manufacturing in the years to come.

Industry 4.0 will ultimately represent a significant change in manufacturing and industry. In the long run, the sophisticated software embedded in factory equipment could help machines self-regulate and make more autonomous decisions. Decentralisation also means tasks currently performed by a central master computer will be taken over by system components.

In years to come, geographical and data boundaries between factories could become a thing of the past, with smart plants joining up sites located in different places around the world.

Industry 4.0 is an excellent opportunity for industries to apply their skills and technologies to gradually start the shift towards smarter factories. New technologies will also lead to more flexible, sustainable and eco-friendly production and manufacturing lines. The first step is taking the Industry 4.0 concept from the land of buzzwords, to the land of research and development.


Wash, Rinse, Dry: Cleaning mass-produced automotive parts!

04/08/2014
High quality components keep vacuum cleaning plant running smoothly

The Multiclean-D-4-4-F full vacuum plant from Höckh is a true giant among washing machines. While the drum of a household washing machine can hold six kilograms at any one time, an industrial washing machine recently delivered to a German customer can take two 600 kg loads of metal parts for the automotive industry.

Festo’s technology keeps the twin-chamber cleaning plant running smoothly.

The Multiclean-D-4-4-F full vacuum plant from Höckh is a true giant among washing machines.


In metalworking, greases and special emulsions protect cutting tools against wear. While this is good for the machines, it leaves a residue on the metal parts and must be removed before further processing. Assembly processes or surface treatments such as galvanising or painting require clean parts. Depending on the application, aqueous cleaning solutions or solvents can be used.

Solvents are preferable to aqueous cleaners for oily mass produced parts for the automotive industry as they are quick, economical and resource-saving. A new twin-chamber perchlorethylene-based cleaning plant from Höckh Metall-Reinigungsanlagen GmbH has raised the bar with operation under full vacuum.

When integrated into the production cycle, it increases part throughput significantly. Up to ten crates filled with pressed and stamped parts pass through the system every hour in a three shift operation. State-of-the-art valve terminal technology from Festo contributes to this excellent performance.

Everything in one chamber
The capacity of the huge washing machine for metal parts is simply enormous. In addition to rapidly cleaning large volumes of metal parts in either a 65° or 98° wash with liquid or vaporous perchlorethylene, the system also dries the parts using a vacuum after they have been washed. And all of this in less than 15 minutes per crate.

Before that, the pressed parts are transported in bulk. Forklift trucks move the parts in crates measuring approx. 900 x 800 x 850 mm and with a total weight of between 500 and 600 kg. To select the right program, the system operator simply scans the bar code on the accompanying ticket. As soon as he has left the loading area, automatic feeding begins and the crate is transported to the next free process chamber. To achieve the required capacity of 10 batches per hour in a three-shift operation, the process has been divided between two chambers.

The door of the giant washing machine drum is closed by a standard cylinder.


The loading gantry then loads the rotating crate holder and a Festo standard cylinder DNG with a stroke of 180 cm closes the sliding door of the process chamber vacuum tight. When it reaches the last few centimetres, a clever toggle lever mechanism ensures it is firmly closed.

When the door reaches the last few centimetres, a clever toggle lever mechanism ensures it is firmly closed.


10 batches per hour
Depending on the parts type, this is then followed by an individual cleaning programme, which can be made up of various modules such as evacuation of the process chamber to process vacuum, pre-washing in the spray process, flood cleaning (full bath) from tank one, post washing in the spray process, flood cleaning (full bath) from tank two, vapour degreasing with solvent vapour and vacuum drying. A limit value encoder monitors the drying process so that only completely dry, solvent free parts are removed from the process chamber.
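The modular programme described above can be pictured as an ordered sequence of steps, with the dryness check gating the unload. This is a loose sketch only; the step names and the residual-solvent limit value are invented:

```python
# The cleaning modules named in the article, in process order.
PROGRAMME = [
    "evacuate_chamber",
    "pre_wash_spray",
    "flood_clean_tank_1",
    "post_wash_spray",
    "flood_clean_tank_2",
    "vapour_degrease",
    "vacuum_dry",
]

def run_programme(steps, residual_solvent_mg, dry_limit_mg=1.0):
    """Run the steps in order; stand-in for the limit value encoder:
    extend vacuum drying if solvent residue is still above the limit."""
    log = list(steps)
    if residual_solvent_mg >= dry_limit_mg:
        log.append("extend_vacuum_dry")
    log.append("unload")
    return log

print(run_programme(PROGRAMME, residual_solvent_mg=0.2))
```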


The cleaned parts then pass through a cooling tunnel on the unloading roller conveyor so that the crates can be packed directly for shipment. To achieve maximum flexibility, the system was designed as three separate modules.

For cleaning there are two identical, completely independent cleaning modules with process chamber, twin tank, distillation plant, pumps and filters. Because of standalone operation, one module can be switched off in the event of maintenance or low capacity utilisation and the system can continue to operate at half capacity. Both cleaning modules are connected to a central supply module, which houses the vacuum pumps as well as the activated carbon absorber for process air preparation.

If required, the total vacuum capacity of over 1,000 m³/h can be divided in variable ratios between the two process chambers. This ensures a throughput of 10 batches per hour, which is very high for the size of the chamber and the complexity of the process.

Reliable process engineering
This demanding process is kept running smoothly by a variety of Festo components. These include valve terminals type CPX/MPA with Profibus control. These valve terminals look after all of the process engineering, activate the angle seat valves and the actuators, ensure the crates are locked and control the liquid transport and the vacuum.

Thanks to ‘intelligence on the terminal’, the cleaning plant from Höckh does not require any additional multi-pin cables. The MS series service unit ensures correct and reliable compressed air preparation. The latest Festo technology also offers a condition monitoring option. Values such as maximum, peak and average consumption as well as effective and apparent power are displayed.
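The monitored values named above are straightforward to derive from raw samples. The sketch below shows the idea with invented sample data; in particular, apparent power is taken as Vrms × Irms, and effective power additionally needs the power factor (all values here are illustrative, not from the Festo option):

```python
# Invented air-consumption samples in litres per minute.
consumption_l_min = [52.0, 61.5, 58.2, 70.3, 49.9]

peak = max(consumption_l_min)
average = sum(consumption_l_min) / len(consumption_l_min)

# Invented electrical figures for the power calculations.
v_rms, i_rms, power_factor = 24.0, 1.5, 0.9
apparent_power_va = v_rms * i_rms            # S = Vrms * Irms
effective_power_w = apparent_power_va * power_factor  # P = S * cos(phi)

print(peak, round(average, 2), apparent_power_va, effective_power_w)
```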


Cavity pressure monitoring ensures zero defect injection moulding!

16/06/2014

At German injection moulding specialist neo-plastic Dr. Doetsch Diespeck GmbH, the quality of large-scale production of injection moulded parts is not left to chance. Cavity pressure is used to determine the switchover to holding pressure for process optimization, while cavity pressure-based monitoring from Kistler Instruments, using either direct or indirect cavity pressure measurement, provides quality assurance and ensures minimum rejects. The medium-sized German company focuses on producing high-quality technical components, mostly for manufacturers of ball bearings and linear guides and for the automotive industry.

Injection moulding of hinge covers is a typical example of seamless in-line quality assurance. These flat, palm-size SEBS parts protect the sensitive electric seat adjustment systems during the production of foamed car seat systems. The seat manufacturer inserts the injection moulded covers into a mould, where they form a very tight bond with the seat during foaming. Although these inserts are installed in a concealed place, they need to be precisely moulded to ensure that they are fully functional.

The injection moulding machine for this project, acquired in 2008, was equipped with a machine control system that provided outputs for pressure signals and integrated cavity pressure monitoring. Each cavity of the 2+2-cavity hot-runner family mould for the production of right-side and left-side hinge covers is equipped with Kistler 2.5 mm pressure sensors.

For other projects, the company also deploys Kistler’s CoMo Injection system. “CoMo is fully configured for analysing and monitoring injection moulding processes. When it comes to direct comparison, machine control systems provide rather limited analysis options,” managing director Patrick Freiherr von Twickel reports.

With new, medium-term projects with six or seven-digit annual output rates, neo-plastic operates with cavity pressure technology right from the start. This applies to the production of a small technical PA46 breaker plate with a shot weight of only 3.5 grams. The brand new 8-cavity mould, made by the company’s in-house mould engineering department, is equipped with eight direct 1 mm pressure sensors. Again, the CoMo Injection process monitoring system will control the process by means of cavity pressure-dependent switchover and guarantees the quality of the moulded parts by monitoring the pressure curves. “Without sensors, this project would generate massive problems due to underfed parts. Automatic switchover makes the process significantly more stable.”
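The principle of cavity-pressure-dependent switchover can be sketched simply: the machine switches from injection to holding pressure as soon as the cavity sensor crosses a set threshold, and a curve that never reaches the threshold indicates an underfed part. The values below are illustrative and not taken from the CoMo Injection system:

```python
def switchover_index(pressure_curve_bar, threshold_bar):
    """Return the sample index at which to switch to holding pressure,
    or None if the threshold is never reached (e.g. an underfed part)."""
    for i, p in enumerate(pressure_curve_bar):
        if p >= threshold_bar:
            return i
    return None

curve = [0, 15, 80, 190, 310, 420, 450]          # cavity pressure over time
print(switchover_index(curve, threshold_bar=300))         # switches at index 4
print(switchover_index([0, 10, 25], threshold_bar=300))   # None: reject part
```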

Faster setup changes and restarts
How long does it take before the investment in a cavity pressure monitoring system pays off? Von Twickel: “This is hard to pin down. There are many positive influences. Just think of the cost of complaints and the subsequent sorting effort. With the new system, we have removed that risk completely. Cavity pressure dependent switchover also facilitates and speeds up any setup changes: after ten shots with the new mould, the quality is perfect again. During the active production process, lot variations or changes of flow are registered immediately and can be remedied directly. Assuming an output rate of 200,000 parts per year, I would expect the system to have paid off after 18 months.”

At neo-plastic, the CoMo Injection monitoring system is not operated in fixed connection with one single machine, but, like the moulds, is flexibly used on several machines of similar size. Everywhere the system is applied, the process achieves stable conditions, no matter whether the machines are electric or hydraulic, and independent of their age.

After several years of experience, von Twickel can sum up the benefits of cavity pressure measurement and the integration of Kistler sensors and systems: “I can look into the cavity. That is an unbeatable advantage. I have not encountered any other method that delivers similar information,” he says. “Today, we are working in the mould, not in the machine.”


Continuous mercury monitoring benefits cement plants

15/05/2014
Antti Heikkilä from Gasmet Technologies highlights the challenges faced by mercury monitoring in cement kilns, and explains how a new continuous mercury monitoring system addresses these issues and provides process operators with an opportunity to improve environmental performance and demonstrate compliance with forthcoming legislation.

Background
The production of cement clinker and lime in rotary kilns is responsible for 10.7% of mercury emissions to air (3,337 kg), according to a recent study. Most of the mercury and mercury compounds pass through the kiln and preheater; they are only partly adsorbed by the raw gas dust, depending on the temperature of the waste gas. For these reasons, monitoring and controlling emissions of mercury to air is important, and steps are being taken in several countries to impose emission limits. In the European Union BREF guidance for cement kilns (CLM BREF), mercury has a BAT-associated emission level of 0.05 mg/Nm3 (50 µg/Nm3) for the half-hour average.

New monitoring technology

Figure 1

Gasmet Technologies has launched a new continuous mercury emission monitoring system (CMM) based on the cold vapour atomic fluorescence (CVAF) measurement principle. The analyser is integrated in an air-conditioned cabinet together with a vacuum pump, an automatic calibrator and a nitrogen gas generator. The sample gas is extracted from the process duct with a dilution probe and heated sample line specially designed for sampling mercury from harsh process conditions (see figure 1 right). The analyser has a detection limit of 0.02 µg/Nm3, and the lowest measuring range for total mercury concentration is 0 – 10 µg/Nm3 when a dilution rate of 1:50 is used in the sample extraction probe.

Since the CMM analyser employs a CVAF spectrometer, its sensitivity is excellent, and the main source of measurement uncertainty that the analyser and system design must address is the quenching effect, whereby other gases present in the sample, such as O2 and H2O, lower the fluorescence signal from mercury atoms. To avoid these adverse effects, a dilution sampling approach is used, with synthetic nitrogen from a nitrogen generator inside the analyser cabinet as the dilution gas. As the detection limit of the analyser is much lower than needed to monitor mercury in low µg/Nm3 ranges, dilution does not compromise the sensitivity of the instrument. At the same time, dilution reduces the quenching effect by lowering the concentration of interfering gases by a factor of 50. Measuring mercury in a gas consisting of 98% nitrogen guarantees consistent measurement regardless of the fuel or emission abatement techniques used in the plant.
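
The arithmetic behind the dilution step is straightforward: at 1:50, every interfering gas is reduced by a factor of 50, and the diluted sample is roughly 49/50 = 98% nitrogen. The sketch below illustrates this with hypothetical flue-gas values, not the actual figures from the article’s table 1.

```python
# Effect of 1:50 dilution on interfering (quenching) gas concentrations.
# Flue-gas values below are illustrative, not from the article's table 1.

DILUTION_FACTOR = 50

def diluted(vol_percent):
    """Concentration of an interferent after 1:50 dilution with nitrogen."""
    return vol_percent / DILUTION_FACTOR

flue_gas = {"O2": 8.0, "H2O": 15.0, "CO2": 20.0}  # vol-%, hypothetical
for gas, c in flue_gas.items():
    print(gas, diluted(c))  # each interferent drops by a factor of 50

nitrogen_fraction = (DILUTION_FACTOR - 1) / DILUTION_FACTOR
print(nitrogen_fraction)  # ≈ 0.98, matching the "98% nitrogen" figure
```

This is why the measurement stays consistent across fuels: whatever the original flue-gas matrix, the spectrometer always sees an almost pure nitrogen background.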

The CVAF spectrometer measures atomic mercury vapour (Hg0), so to measure total mercury, including oxidised forms, a thermal catalytic converter is used to convert all forms of mercury, such as mercury chloride, into atomic mercury. The converter is close-coupled with the fluorescence cell to minimise the risk of recombination reactions in which atomic mercury converts back to oxidised forms between the converter and the spectrometer.

The system has been field tested on various types of industrial plant (a coal-fired power plant, a hazardous waste incinerator, a sulphuric acid plant and a cement plant) to characterise the suitability and long-term stability of the sample probe and dilution system in various processes. Given the reactive nature of mercury, special care has been taken to ensure that mercury in the flue gas is not absorbed into dust accumulating in the sample probe filters. Mercury reacts readily with limestone dust, resulting in analyte loss and an increased response time of the analyser. The Gasmet CMM solution includes a smaller filter element, which minimises the amount of dust deposited on the filter, and a two-stage blowback mechanism that first removes dust from the filter element and then expels it from the probe tube back into the process.

Field test at Finnish Cement Plant

Figure 2

The CMM was installed on the emission stack of a rotary kiln cement plant with an Electrostatic Precipitator (ESP) for particulate emission control (see figure 2 above). The test period lasted 30 days. The fuels used during the test included coal, petroleum coke and recovered fuels. The flue gas composition at the measurement point is summarised in table 1. During the field trial, the raw mill was periodically stopped and the variation in mercury levels was monitored together with changes in other process parameters. Average mercury concentration when the raw mill was running was 6 to 8 µg/Nm3 and when the raw mill was stopped, the concentrations could increase to 20 – 40 µg/Nm3. The plant had an emission limit value of 50 µg/Nm3 for total mercury.

Figure 3

Figure 3 (above) shows a typical 24-hour period of emissions including raw mill on and raw mill off conditions. In addition to Hg0 concentration, the dust loading and raw mill state are shown because these are the main parameters expected to have an impact on the mercury analyser.

Results
The main goal of the test was to ensure the stability and repeatability of mercury measurement in demanding process conditions and to determine whether cement dust causes analyte loss and increased response time in the sample extraction probe.

The only process variable which clearly correlates with mercury concentration is the raw mill on/off state. When the raw mill is on, the variation in dust loading or other gas concentrations (O2, H2O, acid gases such as SO2 and HCl) does not correlate with variation observed in mercury concentration. When the raw mill is switched off, all gases including mercury undergo a change in concentration but this is clearly brought about by the raw mill state.

In order to estimate the repeatability of the Hg measurement at zero and span levels, the CMM analyser was configured to perform zero tests with synthetic nitrogen and span tests with Hg0 test gas generated by the mercury calibrator in the CMM system at 4-hour intervals. The normal test interval required by the analyser is 24 hours, but in the interest of generating more test data, the interval was shortened for this test. All test gases are injected into the probe upstream of the particle filters, so the test gas has to pass through the potentially contaminated filters.

Figure 4

The results from six repeated span/zero test cycles are shown in figure 4 (above). The target level for the span check was 6.5 µg/Nm3, and the average span level was 6.60 ± 0.036 µg/Nm3. The average result for the zero check was -0.006 ± 0.037 µg/Nm3. If the dust accumulating in the sample extraction probe were to cause analyte loss during span tests, the later tests would show a decrease from the span check target value, but this was not observed. If the dust in the probe were to lengthen the response time (a memory effect), the later tests would show a slower response than the first tests. Again, there was no systematic change, and tests 1–6 exhibited very consistent results.
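
A “mean ± standard deviation” repeatability figure of this kind is computed directly from the repeated check readings. The six span readings below are illustrative values chosen around the 6.5 µg/Nm3 target, not the actual field-test data.

```python
# How a span-check repeatability figure (mean ± standard deviation) is
# obtained from repeated checks. Readings are hypothetical, not the
# actual field-test data reported in the article.
import statistics

span_readings = [6.56, 6.62, 6.58, 6.64, 6.60, 6.60]  # µg/Nm3, illustrative

mean = statistics.mean(span_readings)
stdev = statistics.stdev(span_readings)  # sample standard deviation
print(f"{mean:.2f} ± {stdev:.3f} µg/Nm3")  # → 6.60 ± 0.028 µg/Nm3
```

A systematic downward drift in the later readings, rather than random scatter around the mean, would have been the signature of analyte loss in the probe.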

The span and zero checks also provided an opportunity to characterise the response time of the analyser, since the span test at a known concentration is followed by a zero check at zero concentration. The data from all six tests were combined into one dataset by synchronising the moment when the span/zero check cycle was started. A Boltzmann sigmoidal curve (eqn 1) was fitted to the experimental data using the GRG nonlinear fitting routine in the Microsoft Excel Solver package. The parameters of the response curve are summarised in table 2. The response time was evaluated as T90-10, the time interval between a reading representing 90% of the span check value and a reading representing 10% of the span check value. The response time from this calculation was 10.15 minutes, or just over two measurement cycles (measurement data is obtained as 5-minute rolling averages of the mercury concentration). The live data from the emissions shows peaks of comparable sharpness, but these were not subjected to the same analysis as the span/zero check data.
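
Once the Boltzmann parameters are fitted, T90-10 follows in closed form: solving the sigmoid for the 90% and 10% crossing times gives T90-10 = dt · ln 81, where dt is the fitted width parameter. The derivation and the parameter value below are my own reconstruction (the width is chosen so the result matches the ~10 minute figure in the article; the actual fitted parameters are in table 2).

```python
# Closed-form T90-10 response time from Boltzmann sigmoid parameters.
# The width parameter dt below is hypothetical, chosen to reproduce the
# ~10 minute response time reported in the article.
import math

def boltzmann(t, a1, a2, t0, dt):
    """Boltzmann sigmoid: transitions from a1 to a2 around t0 with width dt."""
    return a2 + (a1 - a2) / (1 + math.exp((t - t0) / dt))

def t90_10(dt):
    """Solving boltzmann(t) = a2 + f*(a1 - a2) gives t = t0 + dt*ln((1-f)/f),
    so T90-10 = dt * (ln 9 - ln(1/9)) = dt * ln 81."""
    return dt * math.log(81)

dt = 2.31  # minutes, hypothetical fitted width parameter
print(round(t90_10(dt), 2))  # ≈ 10.15 minutes
```

Because the sigmoid is symmetric, the same expression applies to both the span-to-zero fall and a zero-to-span rise, so one fitted width characterises the analyser’s step response in both directions.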

Summary
The requirements for a continuous mercury monitoring system in a cement plant are as follows:

  • capable of measuring a low baseline level with high sensitivity when the raw mill is on and the fuel feed contains low levels of metals
  • capable of measuring excursions to higher concentrations when the raw mill is off
  • low cross-interference from gases such as SO2
  • no analyte loss or other sampling issues in high dust loading
  • stable calibration and simplified calibration check routine with built-in calibration gas generator.

The main application areas for continuous mercury monitoring systems have so far been hazardous and municipal waste incineration and coal-fired power stations, whose conditions differ from those in cement plants, so care must be taken to ensure that the monitoring system, and especially its sample extraction probe, is suitable for the process conditions. This study demonstrates that a CVAF spectrometer and a dilution sampling approach can be used successfully in this application.

