Managing NOx gas emissions from combustion.

26/09/2019
Pollution can only be managed effectively if it is monitored effectively.

James Clements

As political pressure increases to limit the emissions of the oxides of nitrogen, James Clements, Managing Director of the Signal Group, explains how the latest advances in monitoring technology can help.

Nitrogen and oxygen are the two main components of atmospheric air, but they do not react at ambient temperature. However, in the heat of combustion, such as in a vehicle engine or within an industrial furnace or process, the gases react to form nitric oxide (NO) and nitrogen dioxide (NO2). This is an important consideration for the manufacturers of combustion equipment because emissions of these gases (collectively known as NOx) have serious health and environmental effects, and are therefore tightly regulated.

Nitrogen dioxide gas is a major pollutant in ambient air, responsible for large numbers of premature deaths, particularly in urban areas where vehicular emissions accumulate. NO2 also contributes to global warming and in some circumstances can cause acid rain. A wide range of regulations therefore exist to limit NOx emissions from combustion sources ranging from domestic wood burners to cars, and from industrial furnaces and generators to power stations. The developers of engines and furnaces therefore focus attention on the NOx emissions of their designs, and the operators of this equipment are generally required to undertake emissions monitoring to demonstrate regulatory compliance.

The role of monitoring in NOx reduction
NOx emissions can be reduced by:

  • reducing peak combustion temperature
  • reducing residence time at the peak temperature
  • chemical reduction of NOx during the combustion process
  • reducing nitrogen in the combustion process

These primary NOx reduction methods frequently involve extra cost or lower combustion efficiency, so NOx measurements are essential for the optimisation of engine/boiler efficiency. Secondary NOx reduction measures are possible by either chemical reduction or sorption/neutralisation. Naturally, these measures also require accurate emissions monitoring to verify and control their effect.

Choosing a NOx analyser
In practice, the main methods employed for the measurement of NOx are infrared, chemiluminescence and electrochemical. However, emissions monitoring standards are mostly performance based, so users need to select analysers that are able to demonstrate the required performance specification.

Rack Analyser

Infrared analysers measure the absorption of infrared light as it passes through a gas sample. In Signal’s PULSAR range, Gas Filter Correlation technology enables the measurement of just the gas or gases of interest, with negligible interference from other gases and water vapour. Alternatively, FTIR enables the simultaneous speciation of many different species, including NO and NO2, but it is costly and, in common with other infrared methods, significantly less sensitive than CLD (chemiluminescence detection, described below).
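For readers who want to see the principle behind these absorption measurements, the underlying relationship is the Beer-Lambert law: absorbance is proportional to concentration and optical path length. The short sketch below is purely illustrative – the absorption coefficient, path length and intensities are assumed values, and it does not model the proprietary Gas Filter Correlation optics:

```python
import math

def concentration_from_absorbance(i_incident, i_transmitted, epsilon, path_cm):
    """Beer-Lambert law: A = log10(I0/I) = epsilon * c * L, so c = A / (epsilon * L).

    epsilon  - molar absorption coefficient (L mol^-1 cm^-1), assumed value
    path_cm  - optical path length through the sample cell, in cm
    Returns concentration in mol/L.
    """
    absorbance = math.log10(i_incident / i_transmitted)
    return absorbance / (epsilon * path_cm)

# Hypothetical example: 5% attenuation over a 50 cm measurement cell
c = concentration_from_absorbance(1.00, 0.95, epsilon=150.0, path_cm=50.0)
print(f"Estimated concentration: {c:.2e} mol/L")
```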

Electrochemical sensors are low cost but generally offer lower levels of performance. Gas diffuses into the sensor, where it is oxidised or reduced, producing a diffusion-limited current that is proportional to the gas concentration. However, users should take into consideration potential cross-sensitivities, as well as rigorous calibration requirements and limited sensor longevity.
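Because the diffusion-limited current is proportional to concentration, an electrochemical reading is essentially the measured current divided by a calibration factor. The sketch below shows that conversion with hypothetical sensitivity and zero-offset values; real sensors also need corrections for temperature, drift and cross-interfering gases:

```python
def ppm_from_current(current_na, sensitivity_na_per_ppm, baseline_na=0.0):
    """Convert a diffusion-limited sensor current (nA) to a gas concentration (ppm).

    The sensitivity comes from calibration against a certified gas and drifts as
    the sensor ages, hence the rigorous recalibration requirements noted above.
    """
    return (current_na - baseline_na) / sensitivity_na_per_ppm

# Hypothetical NO2 sensor: 25 nA/ppm sensitivity, 2 nA zero offset
print(ppm_from_current(current_na=52.0, sensitivity_na_per_ppm=25.0, baseline_na=2.0))  # ~2 ppm
```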

The chemiluminescence detector (CLD) method of measuring NO is based on bringing a controlled amount of ozone (O3) into contact with the sample containing NO inside a light-sealed chamber. A photomultiplier fitted to this chamber measures the photons given off by the reaction between NO and O3.

NO is oxidised by the O3 to become NO2, and photons are released as part of the reaction. This chemiluminescence only occurs with NO, so in order to measure NO2 it is necessary to first convert it to NO. The NO2 value is then added to the NO reading, and the sum equates to the NOx value.
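In practice many CLD analysers measure two channels – the sample bypassing the converter (NO only) and the sample passed through it (NO plus converted NO2) – and derive NO2 by difference. The sketch below illustrates that bookkeeping; the converter efficiency figure is an assumption for illustration, not a Signal specification:

```python
def cld_nox(no_ppm, nox_via_converter_ppm, converter_efficiency=0.98):
    """Derive NO2 and total NOx from the two CLD measurement channels.

    no_ppm                - reading with the sample bypassing the converter (NO only)
    nox_via_converter_ppm - reading with NO2 converted to NO (NO + NO2)
    converter_efficiency  - fraction of NO2 actually converted (assumed value)
    """
    no2_ppm = max(0.0, (nox_via_converter_ppm - no_ppm) / converter_efficiency)
    return {"NO": no_ppm, "NO2": no2_ppm, "NOx": no_ppm + no2_ppm}

print(cld_nox(no_ppm=45.0, nox_via_converter_ppm=52.0))
# {'NO': 45.0, 'NO2': ~7.1, 'NOx': ~52.1}
```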

Most of the oxides of nitrogen coming directly from combustion processes are NO, but much of it is further oxidised to NO2 as the NO mixes with air (which is 20.9% oxygen). For regulatory monitoring, NO2 is generally the required measurement parameter, but for combustion research and development NOx is the common measurand. Consequently, chemiluminescence is the preferred measurement method for development engineers at manufacturer laboratories working on new technologies to reduce NOx emissions in the combustion of fossil fuels. For regulatory compliance monitoring, NDIR (Non-Dispersive Infrared) is more commonly employed.

Typical applications for CLD analysers therefore include the development and manufacture of gas turbines, large stationary diesel engines, large combustion plant process boilers, domestic gas water heaters and gas-fired factory space heaters, as well as combustion research, catalyst efficiency, NOx reduction, bus engine retrofits, truck NOx selective catalytic reduction development and any other manufacturing process which burns fossil fuels.

These applications require better accuracy than regulatory compliance monitoring, because savings in the choice of analyser are negligible in comparison with the market benefits of developing engines and furnaces with superior efficiency and cleaner emissions.

Signal Group always offers non-heated, non-vacuum CLD analysers for combined cycle gas turbine (CCGT) power stations because these stations emit lower than average NOx levels. NDIR analysers typically have a range of 100 ppm, whereas CLD analysers are much more sensitive, with a lower range of 10 ppm. Combustion processes operating with de-NOx equipment will need this superior level of sensitivity.

There is a high proportion of NO2 in the emissions of CCGT plants because they run with high levels of air in the combustion process, so it is necessary to convert NO2 to NO prior to analysis. Most CLD analysers are supplied with converters, but NDIR analysers are not, so converters are normally installed separately when NDIR is used.

In the USA, permitted levels for NOx are low, and many plants employ de-NOx equipment, so CLD analysers are often preferred. In Europe, the permitted levels are coming down, but there are fewer CCGT Large Plant operators, and in other markets such as India and China, permitted NOx emissions are significantly higher and NDIR is therefore more commonly employed.

In England, the Environment Agency requires continuous emissions monitoring systems (CEMS) to have a range no more than 2.5 times the permitted NOx level, so, as a manufacturer of both CLD and NDIR analysers, Signal Group finds that this can be a determining factor when deciding which analysers to recommend. The UK has a large number of CCGT power plants in operation and Signal Group has a high number of installed CEMS at these sites, but very few new plants have been built in recent years.
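As a worked illustration of that range rule (the permit figure below is hypothetical, purely to show the arithmetic):

```python
def max_cems_range(permitted_nox, factor=2.5):
    """Maximum allowable CEMS measurement range under a rule that the range
    may be no more than `factor` times the permitted NOx level."""
    return factor * permitted_nox

# Hypothetical permit of 25 ppm NOx: the range must not exceed 62.5 ppm,
# which would rule out a fixed 100 ppm NDIR range but suit a more sensitive CLD.
print(max_cems_range(25.0))
```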

New NOx analysis technology
Signal Group recently announced the launch of the QUASAR Series IV gas analysers, which employ CLD for the continuous measurement of NOx, nitric oxide, nitrogen dioxide or ammonia in applications such as engine emissions, combustion studies, process monitoring, CEMS and gas production.

Chemiluminescence Analyser

The QUASAR instruments exploit the advantages of heated vacuum chemiluminescence, offering higher sensitivity with minimal quenching effects, and a heated reaction chamber that facilitates the processing of hot, wet sample gases without condensation. Signal’s vacuum technology improves the signal-to-noise ratio, and a fast response time makes the instruments ideal for real-time reporting applications. However, a non-vacuum version is available for trace NOx measurements such as RDE (Real Driving Emissions) on-board vehicle testing, for which a 24 VDC option is offered.

A key feature of these latest instruments is their communications flexibility – all of the new Series IV instruments are compatible with 3G, 4G, GPRS, Bluetooth, WiFi and satellite communications; each instrument has its own IP address and runs on Windows software. This provides users with simple, secure access to their analysers at any time, from almost anywhere.

In summary, it is clear that the choice of analyser is dictated by the application, so it is important to discuss this with appropriate suppliers/manufacturers. However, with the latest instruments, Signal’s customers can look forward to monitoring systems that are much more flexible and easier to operate. This will improve NOx reduction measures, and thereby help to protect both human health and the environment.


Bob Lally – Piezoelectric sensing technology pioneer.

27/03/2018

Molly Bakewell Chamberlin, president of Embassy Global LLC, pays a touching tribute to an important instrument pioneer and innovator. She acknowledges the help of Jim Lally, retired Chairman of PCB Group, in preparing this eulogy.

Bob Lally (1924-2018)

During my earliest days in the sensors industry, at PCB Piezotronics (PCB), I can still remember the excitement which accompanied publication of my first technical article. It was a primer on piezoelectric sensing technology, which ran some 15 years ago in the print edition of Sensors. About a month later, I recall receiving a package at PCB, containing both a copy of my article and a congratulatory letter. The article was covered in a sea of post-it notes, filled with new insights and explanatory diagrams. I recall marveling at the sheer kindness of anyone taking such time and interest in the work. I sent an immediate thank-you, then received yet another encouraging response. From that time onward, nearly every time I published an article, another friendly envelope would arrive. I’d look forward to them, and the opportunities for learning and growth they offered.

As I’d soon come to know, those envelopes were sent by none other than PCB founder Bob Lally, who passed away last month at the age of 93. For me, Bob was my PCB pen pal who, along with his brother Jim, helped me to develop a real appreciation for piezoelectric sensing technology. They made it fun. I also had the privilege of learning quite a bit about this kind, brilliantly complex and insightful person who was so helpful to me. To the sensors industry, Bob’s technical contributions were legendary. What is less well known about Bob, however, is his equally remarkable history, first as a decorated veteran of WWII and later as an innovator in STEM education.

After graduating from high school in 1942, Bob entered military service, as part of the United States Army which helped liberate mainland Europe during World War II. His service was recognised with two Bronze Stars for bravery. When the hostilities ended, Bob returned home, and was able to benefit from a special U.S. government program which funded the university education of military veterans and their families. This benefit allowed Bob to attend the University of Illinois at Urbana-Champaign, where he earned both Bachelor of Science and Master of Science degrees in Mechanical Engineering with a minor in Mathematics. He graduated with high honors, as University co-salutatorian, in 1950. Bob also later continued this commitment to lifelong learning via studies at both Purdue and the State University of New York at Buffalo (NY USA).

Bob’s first engineering job upon graduation was as a guidance and control engineer at Bell Aircraft Corp. (Bell) in Buffalo (NY USA), a position in which he served for four years. He worked in test flight control systems R&D for experimental aircraft, glide bombs and guided missiles. He also supervised the inertial guidance group. It was from his work at Bell that Bob first learned about the application of piezoelectric sensing technology for the dynamic measurement of physical parameters, such as vibration, pressure, and force. That technology was first developed by Bob’s colleague, Walter P. Kistler, the Swiss-born physicist who had successfully integrated piezoelectric technology into Bell’s rocket guidance and positioning systems.

Original PCB Piezotronics facility in the family home of Jim Lally, ca 1967. Bob Lally, centre background, is operating a DuMont oscilloscope in the Test department.
Jim Lally, left foreground, leads the Sales department.

In 1955, Bob and some of his Bell colleagues decided to form what was the original Kistler Instrument Company. That company sought to further commercialize piezoelectric sensing technologies for an expanded array of applications and markets, beyond the aerospace umbrella. In addition to his role as co-founder, Bob remained at the original Kistler Instrument Company for 11 years, serving as VP of Marketing, while continuing his roles in engineering, production, testing, and sales. Upon learning that the company was being sold to a firm out of Washington State, Bob decided to form PCB Piezotronics. Established in 1967, PCB specialized in the development and application of integrated electronics within piezoelectric sensors for the dynamic measurement of vibration, pressure, force and acceleration. The original PCB facility had rather humble beginnings, with all sales, marketing, R&D and operations running from the basement of Jim Lally’s family home.

IR-100 Award plaque, presented to Bob Lally, 1983.

It was also in this timeframe that Bob became world-renowned for his capability to successfully integrate piezoelectric sensing technology into mechanical devices, setting a new industry standard for test and measurement. He was awarded multiple U.S. patents for these innovations, including the modally-tuned piezoelectric impact hammer, pendulum hammer calibrator, and gravimetric calibrator, all for the modal impact testing of machines and structures. The modally tuned impulse excitation hammer was further recognized with a prestigious IR-100 award, as one of the top 100 industry technical achievements of 1983.

Bob was also renowned for his successful commercialization of a two-wire accelerometer with built-in electronics. That concept was marketed by PCB as integrated circuit piezoelectric, or ICP. Bob’s 1967 paper for the International Society of Automation (ISA), “Application of Integrated Circuits to Piezoelectric Transducers”, was among the first formally published technical explanations of this concept. As Bob had detailed, the application of this technology made the sensors lower cost, easier to use and more compatible with industrial environments. Subsequent widespread industry adoption of these accelerometers created new markets for PCB, such as industrial machinery health monitoring, and formed a major cornerstone for the company’s success. In 2016, PCB was acquired by MTS Systems Corporation; it now employs more than 1,000 people worldwide, with piezoelectric sensing technologies still among its core offerings.

Beyond Bob’s many R&D accomplishments, he is known for his invaluable contributions to the establishment of industry standards and best practices, as a member of the technical standards committees of the Society of Automotive Engineers (SAE), Society for Experimental Mechanics (SEM), and Industrial Electronics Society (IES), among others. Bob also served on the ISA Recommended Practices Committee for Piezoelectric Pressure Transducers and Microphones, as well as the ASA Standards Committee for Piezoelectric Accelerometer Calibration. Many of the standards that Bob helped to develop, as part of these committees, remain relevant today.

Upon retirement, Bob remained committed to the education and training of the next generation of sensors industry professionals. He often gave tutorials and donated instrumentation for student use. Bob later continued that work as an adjunct professor at the University of Cincinnati. In the mid-2000s, he began to develop an innovative series of Science, Technology, Engineering, and Math (STEM) educational models. Each was designed to provide a greater understanding of various sensing technologies, their principles of operation, and “real life” illustrations of practical applications.

STEM sensing model, with adjustable pendulums, by Bob Lally.

Among Bob’s final works was a unique STEM model consisting of three adjustable connected pendulums. That model was used to illustrate the concept of energy flex transference and the influence of physical structural modifications on structural behavior. Bob continued his mentoring and STEM work nearly right up until his passing. He did so with unwavering dedication and enthusiasm, despite being left permanently disabled from his combat injuries.

In addition to co-founding two of the most successful sensor manufacturers in history and his many R&D accomplishments, Bob’s generosity of spirit shall remain an important part of his legacy. I, like many, remain truly grateful for the selfless and meaningful contributions of Bob Lally to my early professional development, particularly in my technical article work. It is an honour to tell his story.

• He is survived by his son, Patrick (Kathi) Lally of Orchard Park, New York; his grandson, Joshua Lally; his surviving siblings, Jim, MaryAnn (Wilson), and Patricia; and his many nieces, nephews, friends and colleagues.

• Special thanks to Jim, Kathi and Patrick Lally for their support and contributions to this article.

• All pictures used here are by kind courtesy of the Lally family.

Train derailment prompts contaminated land investigation.

11/01/2018

A train derailment in Mississippi resulted in ground contamination by large quantities of hazardous chemicals, and environmental investigators have deployed sophisticated on-site analytical technology to determine the extent of the problem and to help formulate an effective remediation strategy. Here Jim Cornish from Gasmet Technologies discusses this investigation.

Jim Cornish

On March 30th 2015 a long freight train, transporting a variety of goods including lumber and chemicals, wound its way through the state of Mississippi (USA). At around 5pm, part of the train failed to negotiate a curved portion of the track in a rural area near Minter City, resulting in the derailment of nine railcars, one of which leaked chemicals onto agricultural farmland and woodlands. Emergency response and initial remediation activities were undertaken, but the remainder of this article will describe an environmental investigation that was subsequently conducted by Hazclean Environmental Consultants using a portable multiparameter FTIR gas analyzer from Gasmet Technologies.

Background
Over 17,000 gallons of Resin Oil Heavies were released from the railcar, and the main constituent of this material is dicyclopentadiene (DCPD). However, in addition to DCPD, Resin Oil Heavies also contains a cocktail of other hydrocarbons including ethylbenzene, indene, naphthalene, alpha-methyl styrene, styrene, vinyl toluene, 1,2,3-trimethylbenzene, 1,2,4-trimethylbenzene, 1,3,5-trimethylbenzene and xylenes.

DCPD is highly flammable and harmful if swallowed or inhaled. Its camphor-like odor may induce headaches and symptoms of nausea, and as a liquid or vapor, DCPD can be irritating to the eyes, skin, nose, throat or respiratory system. DCPD is not listed as a carcinogen; however, DCPD products may contain benzene, which is listed as a human carcinogen. DCPD is not inherently biodegradable, and it is toxic to aquatic organisms with the potential to bioaccumulate.

It is a colorless, waxy, flammable solid or liquid, used in many products, ranging from high quality optical lenses through to flame retardants for plastics and hot melt adhesives. As a chemical intermediate it is used in insecticides, as a hardener and dryer in linseed and soybean oil, and in the production of elastomers, metallocenes, resins, varnishes, and paints. DCPD-containing products are also used in the production of hydrocarbon resins and unsaturated polyester resins.

Emergency Response
Emergency response phase activities were performed from March 31 through May 2, 2015. Response objectives and goals were formally documented by utilizing Incident Action Plans for each operational period. Activities between April 11 and April 28, 2015 were summarized in weekly reports and submitted to the Mississippi Department of Environmental Quality (MDEQ) and the Environmental Protection Agency (EPA).

Approximately 10,189 gallons of the leaked product were recovered, leaving 5,458 gallons to contaminate the farmland surface and subsurface soil, surface waters, groundwater and ambient air. The site contamination problem was exacerbated by heavy rainfall and associated stormwater runoff, which caused the unrecovered product to migrate from the spill site.

Taking account of the high rainfall levels that followed the event, it was calculated that contaminated stormwater runoff from the immediate project site (10 acres receiving 8.7 inches of rainfall) was 2,362,485 gallons, less the volume retained by emergency retention berms. Approximately 207,000 gallons of contaminated stormwater were collected during the emergency response, in addition to approximately 7,870 tons of impacted material which were excavated for disposal. Following removal of the gross impacted material, the site was transferred into Operation and Maintenance status, conducted in accordance with a plan approved by MDEQ.
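The runoff figure is straightforward to reproduce with the standard acre-inch conversion (one acre-inch of water is roughly 27,154 US gallons). The sketch below shows the arithmetic, which lands within a fraction of a percent of the reported 2,362,485 gallons; the retained volume is left as a parameter because it is not stated:

```python
GALLONS_PER_ACRE_INCH = 27_154  # approximate US gallons in one acre-inch of water

def runoff_gallons(area_acres, rainfall_inches, retained_gallons=0.0):
    """Estimate stormwater runoff volume from a site, less any volume
    captured by retention berms (a runoff coefficient of 1 is assumed)."""
    return area_acres * rainfall_inches * GALLONS_PER_ACRE_INCH - retained_gallons

print(runoff_gallons(10, 8.7))  # ~2.36 million gallons before retention
```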

Ongoing site contamination
Groundwater and soil samples were collected and analyzed in 2015 and 2016, producing analytical data which confirmed that widespread soil and groundwater contamination still existed at the site. Further remediation was undertaken, but the landowners were extremely concerned about the fate of residual chemicals and contracted Hazclean Environmental Consultants to conduct a further investigation.

“The affected land is used for agricultural purposes, producing crops such as soybeans and corn,” says Hazclean President, E. Corbin McGriff, Ph.D., P.E. “Consequently, there were fears that agricultural productivity would be adversely affected and that chemicals of concern might enter the food chain.
“This situation was exacerbated by the fact that the landowners could still smell the contamination and initial investigation with PID gas detectors indicated the presence of volatile organic compounds (VOCs).”

Hazclean’s Joseph Drapala, CIH, managed and conducted much of the site investigation work. He says: “While PID gas detectors are useful indicators of organic gases, they do not offer the opportunity to quantify or speciate different compounds, so we spoke with Jeremy Sheppard, the local representative of Gasmet Technologies, a manufacturer of portable FTIR (Fourier Transform Infrared) gas analyzers.

Soil Vapor Analysis with FTIR

“Jeremy explained the capabilities of a portable, battery-powered version of the Gasmet FTIR gas analyzer, the DX4040, which is able to analyze up to 25 gases simultaneously, producing both qualitative and quantitative measurements. Gasmet was therefore contacted to determine whether this instrument would be suitable for the Mississippi train spill application.

“In response, Gasmet confirmed that the DX4040 would be capable of measuring the target species and offered to create a specific calibration so that these compounds could be analyzed simultaneously on-site.”

Site investigation with FTIR analysis
A sampling zone was defined to capture potential contamination, and measurements were taken for surface and subsurface soil, groundwater, and surface and subsurface air for a range of VOCs.

Vapor Well

The area-wide plan resulted in the installation of four permanent monitoring wells for groundwater sampling, twenty vapor monitoring wells, and twenty test borings for field screening. The test borings indicated the presence of VOCs which were further characterized by sampling specific soil sections extracted from the parent core.

In addition to the almost instantaneous, simultaneous measurement of the target compounds, the Gasmet DX4040 stores sample spectra, so that post-measurement analyses can be undertaken on a PC running Gasmet’s Calcmet™ Pro software, providing analytical capability from a library of 250 compounds. “The Gasmet DX4040 was manufacturer-calibrated for dicyclopentadiene, benzene, ethylbenzene, naphthalene, styrene, toluene, 1,2,3-trimethylbenzene, 1,2,4-trimethylbenzene, 1,3,5-trimethylbenzene and m-, o- and p-xylenes and total xylenes at a detection range of 0.01 ppm to 100 ppm in air,” Joseph reports, adding: “The ability to compare recorded spectra with the Calcmet Pro library is a major advantage because it enables the measurement of unknown compounds.”

The operating procedures for the DX4040 specify a simple, convenient daily calibration with zero gas prior to each monitoring activity. However, in addition to using nitrogen as the zero gas, Joseph also employed specialty DCPD gases certified at 1 ppm and 5 ppm as calibration check and response (akin to bump testing) gases.
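A calibration or response (“bump”) check of this kind simply compares the analyser’s reading of the certified cylinder against an acceptance tolerance. The sketch below illustrates the idea; the ±10% tolerance is an assumption for illustration rather than a Gasmet or Hazclean acceptance criterion:

```python
def calibration_check(reading_ppm, certified_ppm, tolerance=0.10):
    """Return True if the analyser reading agrees with the certified gas
    concentration within the given fractional tolerance (assumed 10%)."""
    return abs(reading_ppm - certified_ppm) <= tolerance * certified_ppm

# Hypothetical responses to the 1 ppm and 5 ppm DCPD check gases
print(calibration_check(1.04, 1.0))  # True: within tolerance
print(calibration_check(4.30, 5.0))  # False: response check fails, investigate before monitoring
```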

Site screening
The test borings provided soil samples that were vapor-tested on-site as part of the screening process. Vapor from the extracted soil samples was analyzed by placing the soil samples in vessels at ambient temperature and connecting the DX4040 in a closed loop from the vessel, so that air samples could be continually pumped from the vessel to the analyzer and returned to the vessel. This screening activity helped to determine the location for vapor wells.

All soil samples were screened with the DX4040 and those with the highest reading from each boring were sent for laboratory analysis.

Vapor wells were fitted with slotted PVC liners and capped. Before monitoring, the cap was replaced with a cap containing two ports to enable the DX4040 to be connected in a similar closed-loop monitoring system to that which was employed for the soil samples.

Conclusions
As a result of this investigation, Hazclean was able to determine that the release of DCPD in the vapor state, as measured in the vapor monitoring wells, is a result of surface and subsurface contamination in the soil and groundwater, and that this contamination will persist into the future.

Vapor analysis data provided by the DX4040 identified DCPD, benzene, styrene and xylene previously adsorbed on soil and/or wetted surfaces undergoing diffusion and evaporation. The adsorption, diffusion and evaporation of DCPD and the other compounds released and spread across the farmland is a mechanism that explains the vapor concentrations found in the vapor monitoring wells, as well as the ambient malodor problem.

The long term release of DCPD and other VOCs will continue to occur in the impact area unless a larger remediation project is conducted to remove soil and groundwater contamination. Furthermore, Hazclean recommends that, as a result of the effectiveness of the Gasmet DX4040 in this investigation, the same technology should be employed in any subsequent screening activities, using the same Gasmet calibration configuration.

Summarizing, Joseph Drapala says: “The Gasmet DX4040 was an essential tool in this investigation. Screening activities should have the ability to detect and identify the target compounds, as well as any secondary compounds that may have already been present on-site or could have been produced as a result of chemical interactions.
“As an FTIR gas analyzer, the DX4040 meets these requirements, providing enormous analytical capability through Gasmet’s Calcmet software. However, the instrument is also small, lightweight and battery powered which makes it ideal for field investigations.”


#EMrex On top of the world!

19/10/2015
Pic tweeted by @NewEngControls

Unfortunately we were unable, in any detail, to follow this year's Emerson User Group love-in. It took place high up in the Rocky Mountain city of Denver (CO USA) and certainly looked like a very full programme, with the usual enthusiastic plethora of tweets submerging our Twitter feed, such as “Great venue, great presentations, great networking, great week – I’m #Elevated!” from @ChristopAmstutz. And, obviously singing from the same hymn sheet, @MCChow_88 tweeted “I truly had my experience elevated this past week @EmersonExchange!” Obviously all were on a higher plane – or altitude – than us mere mortals at sea level!

A new item (as far as I can remember) was a feature whereby those who were unable to be in Denver were invited to participate in the final “Ask the Experts” session – seven gurus with all the information and knowledge on the topics featured in the four packed days of information sharing. The @EmersonExchange Twitter account constantly referred to the various forum posts, questions and solutions of interest to users.

As always, Jim Cahill, Mr Emerson On-Line, was ever present, guiding, pointing and highlighting interesting happenings, speakers and events.

Newsletters and videos were published on a daily basis, which helped inform people who were not present about what was happening. This was also useful for those attending who had to make choices as to which presentation to attend.

A very comprehensive account of the highlights has been written by Gary Mintchell, “Wireless, Enhanced Sensing Lead Emerson Product Announcements,” which is “a summary—running through many of the new products introduced to the press and analysts during Emerson Exchange 2015.” He also made a video of his experience:

During the week, although we were unable to follow events in detail, we did have a link on our home page which allowed visitors to follow things.

Daily Reports
Day 1
Day 2
Day 3

Videos from Major Sessions

FieldComm technologies at #EMrex (report: Control-online, 25 Nov ’15)

Our Reports for earlier EMrex Events

The next User Group Meeting is scheduled for Europe, in Brussels (B), in April 2016. EMrex Americas is planned for later next year in the capital of Texas, Austin – 24-28 October 2016. And if you wish to plan even further ahead, the 2017 event is to be held in Minneapolis.


A fascinating story: Trash to gas project to help life on Mars!

30/11/2014
If you are travelling to Mars on a journey that will last for several months, you need to maintain good breathing air quality and you need to manage your resources very carefully. This article describes research on the off-gases from astronaut waste; checking that they are not harmful and figuring out if they can be converted into water, oxygen and rocket propellant.

As part of a project to measure the effects of long-term isolation on astronauts, small groups of individuals have been selected to live in a tiny ‘Habitat’ perched on the upper slope of a volcano in Hawaii. In doing so, the project team has contributed to the understanding of issues that would confront a manned mission to Mars.

NASA’s Anne Caraccio analyzing waste gases during simulated Mars mission

For example, NASA’s Anne Caraccio studied off-gases from the crew’s trash with a portable Gasmet FTIR gas analyzer. “Waste from the crew’s everyday activities is routinely sorted and stored, but we need to know the composition of the off-gases from these materials for health and safety reasons, and also to determine whether these gases could be utilised beneficially,” Anne reports.

The work was undertaken during the second of four HI-SEAS (Hawaiʻi Space Exploration Analog and Simulation) missions, which involved living with five other crew members for a period of 120 days in a two-story, solar-powered dome just 11 metres in diameter, with a small attached workshop the size of a shipping container. In addition to the completion of a range of tasks that were set by the project, each crew member conducted their own research, which in Anne’s case was known as ‘Trash to Gas’, a programme working on the development of a reactor to convert waste from long-duration missions into useful commodities such as water, life-support oxygen and rocket propellant.

The main objective of the second HI-SEAS mission was to evaluate the performance and the social and psychological status of the crew members whilst they lived in cramped isolated conditions in a lava rock environment that resembled Mars.

Crew members were allowed outside of the Habitat, but in order to do so they had to wear simulated spacesuits and undergo a five-minute mock compression/decompression. Since the FTIR gas analyser is portable (14 kg), Anne was able to conduct additional monitoring both inside and outside the Habitat in order to compare data with the waste off-gas measurements. “Size, weight and portability are obviously of major importance on a project such as this, but the main advantage of this technology was its ability to measure a large number of compounds simultaneously; I measured 24 VOCs such as acetaldehyde, methane and ethylene, but the instrument also stores spectra for the measurements so it is possible to retrospectively analyze data if it becomes necessary to look for a particular compound at a later stage.”

Anne’s monitoring provided a clear view of the most important gases within the Habitat. For example, stored waste had the highest relative levels of ethanol (due to crew members’ hygiene wipes and cleaning products) and water vapor (due to residual water from food and plant waste). The laboratory where plants were growing had the lowest relative level of methane. The waste bins had higher relative levels of nitrous oxide and pentane, and the bathroom had the highest levels of acetaldehyde.

The FTIR gas analyser, a DX4040, was supplied by the company Gasmet Technologies. “We were very pleased to be able to help with this project,” says Gasmet’s Jim Cornish. “The simultaneous monitoring of multiple compounds is a common application for our FTIR analyzers; however, they are usually employed to measure gases in stack emissions, industrial processes, greenhouse gas research and in hazmat scenarios. We usually tell prospective customers that advanced FTIR technology is simple to use; ‘it’s not rocket science’ we tell them, so I guess we will have to rephrase that now.”

The waste produced during the HI-SEAS mission was measured for the entire mission, although this was a shorter period than would be expected of an actual long-duration mission. The Trash-to-Gas reactor processed HI-SEAS waste simulant at the Kennedy Space Center, with results demonstrating that a future reactor would be most efficient with specific material processing cycles to maximize the desired output. Automation will also be needed in the future Trash-to-Gas reactor because the current technology would require too much of a crew member’s logistical time. The Trash-to-Gas reactor first converts waste into carbon dioxide, which is then mixed with hydrogen in a Sabatier reaction to produce methane and water.
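The Sabatier step itself is simple stoichiometry (CO2 + 4H2 -> CH4 + 2H2O). The sketch below works out the ideal mass balance per kilogram of carbon dioxide produced by the reactor, assuming complete conversion; real yields would of course be lower:

```python
# Approximate molar masses in g/mol
M_CO2, M_H2, M_CH4, M_H2O = 44.01, 2.016, 16.04, 18.02

def sabatier_yields(co2_kg):
    """Ideal hydrogen demand and methane/water yields for the Sabatier
    reaction CO2 + 4 H2 -> CH4 + 2 H2O, assuming complete conversion."""
    mol_co2 = co2_kg * 1000.0 / M_CO2
    return {
        "h2_required_kg": 4 * mol_co2 * M_H2 / 1000.0,
        "methane_kg": mol_co2 * M_CH4 / 1000.0,
        "water_kg": 2 * mol_co2 * M_H2O / 1000.0,
    }

print(sabatier_yields(1.0))  # per kilogram of CO2 from processed waste
```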

The Kennedy Space Center Trash-to-Gas reactor processed three waste types and produced 9% of the power that would have been needed during the HI-SEAS mission. As part of the psychological assessment, each member of the crew completed regular surveys and kept diaries. They also wore ‘sociometric’ badges that recorded conversation patterns and voice tone.

Commenting on the psychological results of the project, Anne says “The crew were essentially strangers when they entered the Habitat, which is unlike a typical space mission in which the crew would have worked and trained together for a number of months or even years. Nevertheless, the crew coped extremely well with living and working in such close proximity, and there were no significant periods of stress in my opinion.”

The third HI-SEAS mission began on October 15, 2014. Again, a six-member crew will conduct a similar mission, with the exception that it will last for eight months. Anne says: “Participation in these missions requires a real passion for science, technology and space travel. The application process includes a class 2 flight medical, a personal research project proposal, essays, interviews and educational requirements, all of which is similar to the NASA astronaut application procedure.” Looking forward, she says: “The technology to travel to Mars has not yet been fully developed, but it is anticipated that a human mission could be possible in the future. The journey to Mars would take around one year, so I hope that our Trash-to-Gas research will contribute to the science that could make such a mission possible.”


#HUG2014 Americas’ symposium adjudged success!

19/06/2014

A record number of attendees were at the annual gathering of Honeywell customers from across a wide range of industries throughout the Americas.

Other stories from HUG 2015
In addition to the Americas conference, Honeywell will also hold HUG events in Queensland, Australia (Sept. 21-23) and The Hague in The Netherlands (Nov. 10-14).

More than 1,300 customers, distributors and Honeywell leaders and engineers attended the 2014 Honeywell Users Group (HUG) Americas symposium held between June 2nd and 6th in San Antonio, Texas. The conference brought together many of the world’s largest process manufacturers to discuss how to apply new technologies to overcome challenges facing their respective industries and operations.

Attendance was 25 percent higher than the 2013 Americas conference held in Phoenix, Ariz., with about 40 percent attending HUG for the first time. The record number of attendees represented more than 475 companies from 36 countries and more than 60 industries. Almost 200 participants attended Honeywell’s Channel Partner conference, which was held in parallel with HUG.

“Honeywell’s User Group is a great opportunity for us, and our customers, to share and discuss issues they are facing in an open, collaborative environment,” said Vimal Kapur, president of Honeywell Process Solutions (HPS). “A number of our technologies for industrial processors had their beginnings in discussions with customers at HUG. The information shared here helps us develop products and solutions that help our customers overcome their specific issues or issues common to their industries.”

Key themes for this year’s event centered around delivering information and improving collaboration. Honeywell announced availability of its Experion® Orion Console, an advanced display technology designed to reduce operator fatigue and improve situational awareness through features such as improved ergonomics and a large, flexible, high-definition display. Other highlighted technologies designed to better deliver information and improve collaboration included:

• Uniformance® Release 320, software which helps plant managers make better and faster decisions with superior data management, and significantly improved event investigation and trend analysis.

• Intuition® Operations Logbook Release 100, software that provides a unique tool to better log operational activities in a plant, resulting in a more-effective workforce better able to minimize incidents, improve operations and meet regulatory requirements.

• Intuition® Executive Release 220.5, which features improved dashboard call-up performance and enables multiple site deployments for a corporate-wide view of safety, operations and business metrics.

In addition to control room technologies, the company also featured technology to deliver better information in the field and in remote locations. For example, the company showcased its new SmartLine™ industrial temperature transmitters, which can improve overall plant and employee efficiency in harsh and noisy process environments. SmartLine transmitters use advanced displays to show process data in graphical formats and communicate control room messages, and use modular components to simplify field repairs.

Modularity was also a theme with SCADA applications, such as the Honeywell RTU2020 Remote Terminal Unit – a modular and scalable controller capable of all remote automation and control applications. In conjunction with SCADA technologies, the RTU2020 provides an integrated solution to solve complex remote automation requirements in applications such as remote well-head monitoring.

Finally, Honeywell showcased a new approach to project execution called LEAP™, which combines several proven technologies to help companies more quickly design, build and start up their plants. LEAP project services combine HPS’ proprietary hardware and software, virtualization and cloud engineering to reduce risk and total automation costs by up to 30 percent. The approach represents a significant departure from the way plants are typically designed and built, by using lean execution methodologies and parallel workflows to keep automation systems off critical implementation paths.


The US cybersecurity framework for implementation!

14/02/2014
A unique, public-private partnership effort now turns to the plan’s implementation

The official rollout of the US Cybersecurity Framework, recognized this past Wednesday in an announcement delivered by President Barack Obama, represents the completion of a successful partnership effort among The White House, the Automation Federation and its founding organization, the International Society of Automation (ISA). Now, the second phase of the partnership—working together to implement the framework—begins.

US President Obama

The US Cybersecurity Framework, the result of a year-long initiative to develop a voluntary how-to guide for American industry and operators of critical infrastructure to strengthen their cyber defenses, is a key deliverable from the Executive Order on “Improving Critical Infrastructure Cybersecurity” that President Obama announced in his 2013 State of the Union address.

During the past year, representatives of the Automation Federation and the International Society of Automation (ISA) have been assisting the US government—at the White House’s request—to help develop and refine a draft of the US Cybersecurity Framework. Both organisations were sought out as essential government advisors given their expertise in developing and advocating for industrial automation and control system (IACS) security standards. The ANSI/ISA99, Industrial Automation and Control Systems Security standards (known internationally as ISA99/IEC 62443), are recognized globally for their comprehensive, all-inclusive approach to IACS security.

ISA’s IACS security standards are among the framework’s recommendations because they’re designed to prevent and mitigate potentially devastating cyber damage to industrial plant systems and networks—commonly used in transportation grids, power plants, water treatment facilities, and other vital industrial settings. Without these defenses in place, an industrial cyberattack can result in plant shutdown, operational and equipment impairment, severe economic and environmental damage, and public endangerment.

A significant step forward in protection
President Obama, in his statement released last Wednesday in Washington, DC, said that “cyber threats pose one of the gravest national security dangers that the United States faces. I am pleased to receive the Cybersecurity Framework, which reflects the good work of hundreds of companies, multiple federal agencies and contributors from around the world.”

The 41-page framework takes a risk-management approach that allows organizations to adapt to “a changing cybersecurity landscape and responds to evolving and sophisticated threats in a timely manner,” according to the document.

Though the adoption of the framework is voluntary, the Department of Homeland Security (DHS) has established the Critical Infrastructure Cyber Community (C3) Voluntary Program to increase awareness and use of the Cybersecurity Framework. The C3 Voluntary Program will connect companies, as well as federal, state and local partners, to DHS and other federal government programs and resources that will assist their efforts in managing their cyber risks. Participants will be able to share lessons learned, receive guidance and learn about free tools and resources.

Terry Ives, Automation Federation Chair

Attending the Wednesday launch event in the nation’s capital was a contingent of Automation Federation officials, including Michael Marlowe, Automation Federation Managing Director and Director of Government Relations; Terry Ives, 2014 Chair of the Automation Federation; and Leo Staples, a past Chair of the Automation Federation who serves as leader of the Automation Federation’s Cybersecurity Framework team.

“Given that the risk of cyberattacks targeted to industrial automation and control systems across all industry sectors continues to grow, it’s important that the Automation Federation and ISA have been actively involved in the development of this national cybersecurity initiative,” said Ives. “The Cybersecurity Framework provides an effective, comprehensive approach for industry sectors to determine their vulnerability to these kinds of attacks and the means to mitigate them.”

Moving forward to implementation 
“Now that the Cybersecurity Framework has been officially launched by the Obama administration, we have been asked by The White House and the National Institute of Standards and Technology (NIST) to assist in the framework’s implementation,” reports Marlowe. “We are actively underway in planning a series of implementation seminars throughout the US and as far away as London.”

In fact, the first implementation seminar is to be conducted on Friday, 21 February 2014 in Birmingham (AL USA). The seminar will be sponsored by the Automation Federation and the Alabama Technology Network, a Working Group of the Automation Federation.

At the seminar, representatives from the White House, NIST and leading cybersecurity subject matter experts will outline the provisions and details of the Cybersecurity Framework, and will illustrate why IACS security standards are such fundamental components of the plan and its implementation.