Managing NOx gas emissions from combustion.

26/09/2019
Pollution can only be managed effectively if it is monitored effectively.

James Clements

As political pressure increases to limit the emissions of the oxides of nitrogen, James Clements, Managing Director of the Signal Group, explains how the latest advances in monitoring technology can help.

Nitrogen and oxygen are the two main components of atmospheric air, but they do not react at ambient temperature. However, in the heat of combustion, such as in a vehicle engine or within an industrial furnace or process, the gases react to form nitric oxide (NO) and nitrogen dioxide (NO2). This is an important consideration for the manufacturers of combustion equipment because emissions of these gases (collectively known as NOx) have serious health and environmental effects, and are therefore tightly regulated.

Nitrogen dioxide gas is a major pollutant in ambient air, responsible for large numbers of premature deaths, particularly in urban areas where vehicular emissions accumulate. NO2 also contributes to global warming and in some circumstances can cause acid rain. A wide range of regulations therefore exist to limit NOx emissions from combustion sources ranging from domestic wood burners to cars, and from industrial furnaces and generators to power stations. The developers of engines and furnaces therefore focus attention on the NOx emissions of their designs, and the operators of this equipment are generally required to undertake emissions monitoring to demonstrate regulatory compliance.

The role of monitoring in NOx reduction
NOx emissions can be reduced by:

  • reducing peak combustion temperature
  • reducing residence time at the peak temperature
  • chemical reduction of NOx during the combustion process
  • reducing nitrogen in the combustion process

These primary NOx reduction methods frequently involve extra cost or lower combustion efficiency, so NOx measurements are essential for the optimisation of engine/boiler efficiency. Secondary NOx reduction measures are possible by either chemical reduction or sorption/neutralisation. Naturally, the effects of these measures also require accurate emissions monitoring and control.

Choosing a NOx analyser
In practice, the main methods employed for the measurement of NOx are infrared, chemiluminescence and electrochemical. However, emissions monitoring standards are mostly performance based, so users need to select analysers that are able to demonstrate the required performance specification.

Rack Analyser

Infrared analysers measure the absorption of infrared light as it passes through a gas sample. In Signal’s PULSAR range, Gas Filter Correlation technology enables the measurement of just the gas or gases of interest, with negligible interference from other gases and water vapour. Alternatively, FTIR enables the simultaneous speciation of many different species, including NO and NO2, but it is costly and, in common with other infrared methods, significantly less sensitive than CLD.

Electrochemical sensors are low cost and generally offer lower levels of performance. Gas diffuses into the sensor where it is oxidised or reduced, which results in a current that is limited by diffusion, so the output from these sensors is proportional to the gas concentration. However, users should take into consideration potential cross-sensitivities, as well as rigorous calibration requirements and limited sensor longevity.

The chemiluminescence detector (CLD) method of measuring NO is based on bringing a controlled amount of ozone (O3) into contact with the sample containing NO inside a light-sealed chamber. The chamber is fitted with a photomultiplier that measures the photons given off by the reaction between NO and O3.

NO is oxidised by the O3 to become NO2, and photons are released as part of the reaction. This chemiluminescence only occurs with NO, so in order to measure NO2 it is necessary to first convert it to NO. The NO2 value is then added to the NO reading to give the NOx value.
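The NO/NO2/NOx arithmetic described above can be sketched in a few lines of Python. This is a minimal illustration only; the function name and the readings are invented and are not part of any Signal instrument's interface:

```python
def nox_from_cld(no_ppm, nox_ppm):
    """Derive NO2 from a CLD analyser's two readings.

    A CLD responds only to NO, so the sample is measured twice:
    directly (NO channel) and after a converter has reduced NO2
    to NO (NOx channel). NO2 is then obtained by difference.
    """
    no2_ppm = nox_ppm - no_ppm
    return {"NO": no_ppm, "NO2": no2_ppm, "NOx": nox_ppm}

# Hypothetical readings: 42 ppm on the NO channel, 55 ppm on the NOx channel
result = nox_from_cld(42.0, 55.0)
print(result["NO2"])  # 13.0 ppm of NO2, by difference
```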

Most of the oxides of nitrogen coming directly from combustion processes are NO, but much of it is further oxidised to NO2 as the NO mixes with air (which is 20.9% oxygen). For regulatory monitoring, NO2 is generally the required measurement parameter, but for combustion research and development NOx is the common measurand. Consequently, chemiluminescence is the preferred measurement method for development engineers at manufacturer laboratories working on new technologies to reduce NOx emissions in the combustion of fossil fuels. For regulatory compliance monitoring, NDIR (Non-Dispersive Infrared) is more commonly employed.

Typical applications for CLD analysers therefore include the development and manufacture of gas turbines, large stationary diesel engines, large combustion plant process boilers, domestic gas water heaters and gas-fired factory space heaters, as well as combustion research, catalyst efficiency, NOx reduction, bus engine retrofits, truck NOx selective catalytic reduction development and any other manufacturing process which burns fossil fuels.

These applications require better accuracy than regulatory compliance monitoring, because savings in the choice of analyser are negligible in comparison with the market benefits of developing engines and furnaces with superior efficiency and cleaner emissions.

Signal Group always offers non-heated, non-vacuum CLD analysers for combined cycle gas turbine (CCGT) power stations because these stations emit lower than average NOx levels. NDIR analysers typically have a range of 100 ppm, whereas CLD analysers are much more sensitive, with a lower range of 10 ppm. Combustion processes operating with de-NOx equipment will need this superior level of sensitivity.

There is a high proportion of NO2 in the emissions of CCGT plants because they run with high levels of air in the combustion process, so it is necessary to convert NO2 to NO prior to analysis. Most CLD analysers are supplied with converters, but NDIR analysers are not, so a converter is normally installed separately when NDIR is used.

In the USA, permitted levels for NOx are low, and many plants employ de-NOx equipment, so CLD analysers are often preferred. In Europe, the permitted levels are coming down, but there are fewer CCGT Large Plant operators, and in other markets such as India and China, permitted NOx emissions are significantly higher and NDIR is therefore more commonly employed.

In England, the Environment Agency requires continuous emissions monitoring systems (CEMS) to have a range no more than 2.5 times the permitted NOx level. As a manufacturer of both CLD and NDIR analysers, Signal Group finds this can be a determining factor when deciding which analysers to recommend. The UK has a large number of CCGT power plants in operation, and Signal Group has a high number of installed CEMS at these sites, but very few new plants have been built in recent years.
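The 2.5-times range rule lends itself to a trivial check. A sketch, with hypothetical permit values rather than any actual site figures:

```python
def max_cems_range(permitted_nox_ppm):
    """Largest analyser range allowed under the Environment Agency's
    2.5x rule for continuous emissions monitoring systems."""
    return 2.5 * permitted_nox_ppm

def range_acceptable(analyser_range_ppm, permitted_nox_ppm):
    """True if the analyser's range satisfies the 2.5x rule."""
    return analyser_range_ppm <= max_cems_range(permitted_nox_ppm)

# For a hypothetical 30 ppm permit the range may not exceed 75 ppm,
# so a typical 100 ppm NDIR range fails while a tighter CLD range passes.
print(range_acceptable(100, 30))  # False
print(range_acceptable(75, 30))   # True
```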

New NOx analysis technology
Signal Group recently announced the launch of the QUASAR Series IV gas analysers which employ CLD for the continuous measurement of NOx, Nitric Oxide, Nitrogen Dioxide or Ammonia in applications such as engine emissions, combustion studies, process monitoring, CEMS and gas production.

Chemiluminescence Analyser

The QUASAR instruments exploit the advantages of heated vacuum chemiluminescence, offering higher sensitivity with minimal quenching effects, and a heated reaction chamber that facilitates the processing of hot, wet sample gases without condensation. Signal’s vacuum technology improves the signal-to-noise ratio, and a fast response time makes the instruments ideal for real-time reporting applications. However, a non-vacuum version is available for trace NOx measurements such as RDE (Real Driving Emissions) on-board vehicle testing, for which a 24VDC version is available.

A key feature of these latest instruments is the communications flexibility – all of the new Series IV instruments are compatible with 3G, 4G, GPRS, Bluetooth, Wi-Fi and satellite communications; each instrument has its own IP address and runs on Windows software. This provides users with simple, secure access to their analysers at any time, from almost anywhere.

In summary, it is clear that the choice of analyser is dictated by the application, so it is important to discuss this with appropriate suppliers/manufacturers. However, with the latest instruments, Signal’s customers can look forward to monitoring systems that are much more flexible and easier to operate. This will improve NOx reduction measures, and thereby help to protect both human health and the environment.


Smart manufacturing standards.

28/11/2018

A major international standards program on smart manufacturing will receive end-user input from the International Society of Automation (ISA), the developer of widely used international consensus standards in key areas of industrial automation, including cybersecurity, safety and enterprise-control integration.

In early November (2018), the International Electrotechnical Commission held the first meeting of a new IEC systems committee on smart manufacturing in Frankfurt (D). An IEC systems committee is intended to set high-level interfaces and functional requirements that span multiple work areas across the IEC and its partner, the International Organization for Standardization (ISO), to achieve a coordinated standards development plan.

The definition of smart manufacturing to be used by the new IEC systems committee is:

Manufacturing that improves its performance aspects with integrated and intelligent use of processes and resources in cyber, physical and human spheres to create and deliver products and services, which also collaborates with other domains within an enterprise’s value chain. (Performance aspects can include agility, efficiency, safety, security, sustainability or other indicators. Enterprise domains, in addition to manufacturing, can include engineering, logistics, marketing, procurement, sales or other domains.)

Major supplier and government organizations from across the globe were well represented at the Frankfurt meeting, but participation from end users in industrial processing and manufacturing was noticeably low. However, ISA’s long-standing focus in its consensus industry standards on end-user performance, safety and security will be important in filling that void, as is already evident in widely used IEC standards that are based on original ISA standards:

  • ISA-99/IEC 62443: Industrial Automation & Control Systems Security
  • ISA-95/IEC 62264: Enterprise-Control System Integration
  • ISA-88/IEC 61512: Batch Control
  • ISA-84/IEC 61511: Functional Safety
  • ISA-18/IEC 62682: Management of Alarms
  • ISA-100/IEC 62734: Wireless Systems for Automation 

ISA’s participation will be facilitated through an IEC organizational liaison by which ISA standards and technical reports, both published and in development, can be directly circulated and reviewed within the systems committee as appropriate.

“The liaison status will enable ISA to participate more efficiently than would the traditional country-based structure of the IEC,” points out Charley Robinson, ISA’s Director of Standards, who attended the Frankfurt meeting. “This is important and appropriate because ISA’s standards development committees are open to experts from any country.”

In fact, experts from more than 40 countries participate in ISA standards—many on the committees that developed the original work for the widely used IEC standards noted above.

@ISA_Interchange #PAuto @IECStandards @isostandards

Blockchain: the future of food traceability?

20/04/2018
Shan Zhan, global business manager at ABB’s food and beverage business, looks at how blockchain* can be used to enhance food traceability.

“The Blockchain can change… well, everything.” That was the prediction of Goldman Sachs in 2015. There has been a lot of talk in the media recently about blockchain, particularly around Bitcoin and other cryptocurrencies, but just as the investment banking giant predicted, the technology is starting to have more wide-reaching impacts on other sectors.

A report from research consultancy Kairos Future describes blockchain as a foundational building block for the digitalization of society. With multinationals such as IBM and Walmart driving a pilot project using blockchain technology for traceability, the food and beverage industry needs to consider how its traceability data should be protected.

The United Nations recognizes food security as a key priority, especially in developing countries. While most countries must abide by strict traceability regulations, which are particularly strong in the EU, other regions may not have the same standards or the data may be at risk of fraud.

Food fraud is described by the Food Safety Net Services (FSNS) as the act of purposely altering, misrepresenting, mislabeling, substituting or tampering with any food product at any point along the farm-to-table food supply chain. Laws have existed to protect consumers against such harm since the thirteenth century. The first recorded instance dates from the reign of the English monarch King John, when England introduced laws against diluting wine with water or adulterating flour with chalk.

The crime still exists to this day. While malicious contamination intended to damage public health is a significant concern, a bigger problem is the mislabeling of food for financial gain. The biggest areas of risk are bulk commodities such as coffee and tea, composite meat products and Marine Stewardship Council (MSC) labelled fish. For example, lower-cost types of rice such as long-grain are sometimes mixed with a small amount of higher-priced basmati rice and sold as the latter. By using blockchain technology in their traceability records, food manufacturers can prevent this from happening.

Blockchain is a type of distributed ledger technology that keeps a digital record of all transactions. The records are broadcast to a peer-to-peer (P2P) network consisting of computers known as nodes. Once a new transaction is verified, it is added as a new block to the blockchain and cannot be altered. And as the authors of Blockchain Revolution explain, “the blockchain is an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value”.

When supplier and customer records are collected manually so that the end manufacturer can trace the entire process, the confidential data of suppliers is not protected. Blockchain technology anonymizes the data while still providing enough information to verify that the supply chain is up to standard.

In the case of mislabeled basmati rice, blockchain technology would prevent food fraud because the volume of an ingredient recorded leaving the supply chain cannot exceed the volume recorded entering it. Any discrepancy would flag the product as fraudulent.
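That mass-balance test can be sketched very simply. The quantities and tolerance below are illustrative only; a real ledger would aggregate signed transactions per ingredient across many parties:

```python
def flags_fraud(total_in_kg, total_out_kg, tolerance_kg=0.0):
    """Flag a product when more of an ingredient is recorded leaving
    the supply chain than was recorded entering it."""
    return total_out_kg > total_in_kg + tolerance_kg

# 10 t of basmati entered the chain, yet 14 t of "basmati" was sold on:
print(flags_fraud(10_000, 14_000))  # True
print(flags_fraud(10_000, 9_500))   # False
```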

Not only can it help to monitor food ingredients, it can also monitor the conditions at the production facility. These are often very difficult to verify and, even if records are taken, they can be falsified. A photo or digital file can be taken to record the situation, such as a fish being caught, to show that it complies with the MSC’s regulations on sustainably caught seafood.

The blockchain will then create a secure digital fingerprint for this image that is recorded in the blockchain, known as a hash. The time and location of the photograph will be encrypted as part of this hash, so it cannot be manipulated. The next supplier in the blockchain will then have a key to this hash and will be able to see that their product has met the regulations.
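The fingerprint itself is essentially a cryptographic hash over the image and its metadata. A minimal sketch using Python's standard hashlib follows; the metadata fields and values are invented for illustration, and real systems would also involve digital signatures and key management, which are omitted here:

```python
import hashlib
import json

def fingerprint(image_bytes, timestamp, latitude, longitude):
    """SHA-256 digest binding an image to its capture time and location."""
    metadata = json.dumps(
        {"time": timestamp, "lat": latitude, "lon": longitude},
        sort_keys=True,
    ).encode()
    return hashlib.sha256(image_bytes + metadata).hexdigest()

original = fingerprint(b"...jpeg bytes...", "2018-04-20T09:30:00Z", 51.5, -0.1)
tampered = fingerprint(b"...jpeg bytes...", "2018-04-20T10:30:00Z", 51.5, -0.1)
print(original != tampered)  # True: changing any input changes the digest
```

Because the digest changes if either the image or its metadata is altered, the next party in the chain can verify the record without ever seeing the supplier's confidential details.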

Food and beverage manufacturers can also use blockchain to verify that conditions at their production facilities are being met, or to securely transfer any other data along the production line. While the technology is not yet mature enough to implement across all food and beverage supply chains, increased digitalization, and being at the forefront of investment in these technologies, will help plant managers to prepare their supply chains against the food fraud threat.

* The Wikipedia entry on Blockchain!

@ABBgroupnews #PAuto #Food @FSNSLABS @MSCecolabel

Compliance – more than just red tape.

03/07/2016

A growing customer demand for regulatory compliance combined with increased competition amongst manufacturers has made SCADA software a minimum requirement for the pharmaceutical industry. Here, Lee Sullivan, Regional Manager at COPA-DATA UK discusses why today, more than ever before, regulatory compliance is crucial for the pharmaceutical industry.

FDA statistics are forcing the industry to identify and implement improvements to its manufacturing processes. In its latest reports (published in Automation World), the FDA identified a significant increase in the number of drug shortages reported globally. With 65% of these drug shortage instances directly caused by issues in product quality, it’s clear that if more pharma manufacturers aimed to meet the criteria outlined in FDA and other industry standards, drug shortages and quality issues would certainly become less frequent.

The compulsion to become compliant obviously differs from company to company and standard to standard but one thing is certain: development starts with batch software. The range and capabilities of batch software vary immensely, but there are three factors to consider before making a choice: flexibility, connectivity and ergonomics.

Flexibility
To assist the complex processes of pharmaceutical manufacturing, batch software needs to be flexible. The software should manage traceability of raw materials through to the finished product and communicate fluently with other equipment in the facility. To ensure it provides a consistent procedure and terminology for batch production, the software should also be in line with the ISA-88 standard.

To meet increasing demand for personalised medication, manufacturers are seeking out batch software that is capable of creating flexible recipes, which are consistently repeatable. Traditional batch control creates one sequence of each process-specific recipe. While this model may be ideal for high volume manufacturing where the recipe does not change, today’s pharmaceutical production requires more flexibility.

To remain competitive, manufacturers need to provide compliance in a quick and cost-effective manner. Modern batch control software ensures flexibility by separating the equipment and recipe control. This allows the operator to make changes in batch recipes without any modifications to the equipment or additional engineering, thus saving the manufacturer time and money.

Connectivity
To avoid complications, manufacturers should choose independent software that supports a wide range of communication protocols. COPA-DATA’s zenon, for example, offers more than 300 high-performance interfaces that function in accordance with the ‘plug-and-play principle’. This makes it easy to implement and the user can start to collect, process and monitor production data straight away.

The communication model of the batch software extends upwards to fully integrate into manufacturing execution systems (MES) and business enterprise resource planning (ERP) systems. This links the raw material from goods-in through to the finished product at the customer’s site. The strong communication platform includes all layers of a production environment and extends to these systems.

Having no association with specific hardware providers ensures that regardless of the make and age of equipment, the batch software will be fully compatible and integrate seamlessly. Using this high level of connectivity minimises disruptions and quality problems, while also allowing pharmaceutical companies to collect data from the entire factory to archive digital records and ensure compliance across the processing line – allowing manufacturers to establish a fully connected smart factory.

Ergonomics
Lastly, understanding and using batch software should be stress free. As the pharmaceutical industry becomes more complex and more manufacturers begin exploring the realms of smart manufacturing, factory operators should be able to control and change batch production without complications.

By using software fitted with clear parameters and access levels, operators gain the ability to create and validate recipes, monitor the execution of production and review the performance of industrial machinery – without accidentally altering or changing existing recipes and data. Reducing the amount of software engineering makes the operator’s life easier and minimises potential problems that could arise.

The benefits of complying with various industry regulations and standards do not stop with an enhanced Quality Management System (QMS). More customers will buy from you because you appear more reliable, and your supply chain will see improved production indicators such as increased OEE, reduced wastage and fewer recalls. On top of all of these benefits, you also improve product and thus patient safety.

To comply with industry standards, pharmaceutical companies should take steps to modernise their manufacturing processes, beginning with upgrading their batch control software. Anything else would be like putting the cart before the horse.

 

@copadata  @COPADATAUK #PAuto #SCADA

Future factory – a moderator’s impression!

01/02/2016

Read-out was asked to moderate the automation stream at the National Manufacturing & Supplies Conference held last week outside Dublin (26th January 2016). In their wisdom, the organisers selected “Future Factory!” as the title for this half-day seminar, and there were 11 speakers organised to speak on their particular subjects for about 15 minutes each. This format was replicated in the more than a dozen other seminars held on this one day.


Long queues lasted well into the morning to enter the event!

We were a little sceptical that this would work, but with the help of the organisers and the discipline of the speakers the time targets were achieved. Another target achieved was the number of attendees at the event, as well as at this particular seminar.
In all, between exhibitors, speakers and visitors, well over 3000 packed the venue. This was probably far more than the organisers had anticipated, and hopefully a potent sign that the economy is again on the upturn. Indeed it was so successful that it was trending (#MSC16) on Twitter for most of the day.

Seminar
But back to our seminar. If you google the term “Future Factory” you get back 207 million links, yet it is difficult to find a simple definition of what it means. The term “automation” is similarly difficult to define, though the Irish term “uathoibriú” is perhaps a little clearer, literally meaning “self-working.”


Good attendance at the Seminar

Background
The world of automation has changed to an extraordinary degree and yet in other ways it remains the same. The areas where it has experienced least change are sensing – a thermometer is a thermometer – and final control – a valve is a valve. Where it has changed almost to the point of unrecognisability is in the bit in the middle: what one does with the signal from the sensor to activate the final control element.

Instruments evolved from the single-parameter dedicated indicators, controllers and recorders of the sixties, which transmitted either pneumatically (3-15psi) or electrically (4-20mA). Gradually (relatively speaking) most instruments became electronic, smaller in size and multifunctional. The means of communication changed too: fieldbus communication became more common for interacting with computers, which were themselves developing at breakneck speed. Then transmission via wireless became more common, and finally came the internet and the ability to control a process from the computer we call the intelligent phone. There are problems with these latter, internet/cellphone, technologies of course. One is that the reach of the internet is focussed at present on areas of high population. Another is the danger of infiltration of systems by hostile or mischievous strangers. The importance of security protocols is one that has only recently become apparent to automation professionals.
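The 4-20 mA signalling mentioned above maps a measurement onto a current range with a "live zero": anything below 4 mA indicates a fault rather than a genuine zero reading. A minimal sketch, with invented range values:

```python
def current_to_engineering(i_ma, range_lo, range_hi):
    """Map a 4-20 mA transmitter signal onto an engineering range."""
    if not 4.0 <= i_ma <= 20.0:
        # Below 4 mA the loop is probably broken; that is the point
        # of the live-zero offset.
        raise ValueError("signal outside 4-20 mA range; wiring fault?")
    return range_lo + (i_ma - 4.0) / 16.0 * (range_hi - range_lo)

# A temperature transmitter spanning 0-200 degC, reading 12 mA (mid-scale):
print(current_to_engineering(12.0, 0.0, 200.0))  # 100.0
```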

• Many of the presentations are available on-line here. The password is manufac2016

The Presentations
Maria Archer of Ericsson spoke on enabling and facilitating IoT in the manufacturing industry. Diving straight into the topic, she drew on her experience of big data, e-commerce, media, cyber security, IoT and connected devices.

The second speaker was Cormac Garvey of Hal Software, who addressed supply chain prototyping. The supply chain ecosystem is incredibly complex, usually requiring significant integration of each supplier’s standards and processes with the manufacturer’s. Cormac introduced the concept of supply chain prototyping, where easy-to-use, standards-based technology is used to wireframe the entire supply chain ecosystem prior to integration, significantly reducing cost, time and risk on the project. This wireframe can then be used as a model for future integration projects.

Two speakers from the Tralee Institute of Technology, Dr. Pat Doody and Dr. Daniel Riordan spoke on RFID, IoT, Sensor & Process Automation for Industry 4.0. They explained how IMaR’s (Intelligent Mechatronics and RFID) expertise is delivering for their industrial partners and is available to those aiming to become a part of Industry 4.0.

Smart Manufacturing – the power of actionable data was the topic addressed by Mark Higgins of Fast Technology. He shared his understanding of the acute issues companies face on their journey to Business Excellence and how leveraging IT solutions can elevate the business to a new point on that journey.

Assistant Professor (Mechanical & Manufacturing Engineering) at TCD, Dr Garret O’Donnell, explained how one of the most significant initiatives of the last two years has been the concept of the fourth industrial revolution promoted by the German National Academy of Science and Engineering (ACATECH), known as Industrie 4.0. (Industrie 4.0 was first used as a term in Germany in 2011.)

Another speaker from Fast Technologies, Joe Gallaher, addressed the area of Robotics and how Collaborative Robots are the “Game Changer” in the modern manufacturing facility.

Dr. Hassan Kaghazchi of the University of Limerick and Profibus spoke on PROFINET and Industrie 4.0. Industrial communications systems play a major role in today’s manufacturing systems. The ability to provide connectivity, handle large amount of data, uptime, open standards, safety, and security are the major deciding factors. This presentation shows how PROFINET fits into Industrial Internet of Things (Industrie 4.0).


Maurice Buckley CEO NSAI

The CEO of NSAI, the Irish National Standards Authority, Maurice Buckley explained how standards and the National Standards Authority of Ireland can help Irish businesses take advantage of the fourth industrial revolution and become more prepared to reap the rewards digitisation can bring.

The next two speakers stressed the impact of low forecast accuracy on the bottom line and how this could be addressed. Jaap Piersma, a consultant with SAS UK & Ireland, explained that the impact of low forecast accuracy on business performance is high in industry, but that with the right tools, the right approach and experienced resources you can achieve very significant results and benefits for your business. He was followed by Dave Clarke, Chief Data Scientist at Asystec, who maintains the company strategy for big data analytics service development for customers. He showed how incredible business opportunities are possible by harnessing the massive data sets generated in the machine-to-machine and person-to-machine hyper-connected IoT world.

The final speaker, David Goodstein, Connected Living Project Director at the GSMA, described new form-factor mobile SIMs, robust and remotely manageable, which are an essential enabler for applications and services in the connected world.

All in all a very interesting event and useful to attendees. Papers are being collected and should be available shortly on-line.

It is hoped to do it all again next year on 24th January 2017- #MSC17.

See you there.

@NationalMSC #MSC16 #PAuto #IoT


The future of regulatory compliance in the pharmaceutical industry.

29/01/2016

In this short article, Martyn Williams, managing director of Copa-Data in Great Britain, explains the steps pharmaceutical manufacturers can take on the road to compliance.

The pharmaceutical industry today operates in one of the world’s most heavily regulated environments. Over the past few years, the industry has experienced significant regulatory change and, looking to the future, the strict nature of the industry doesn’t appear to be slackening.

The repercussions of failing to comply with industry standards can be detrimental for pharmaceutical manufacturers. It’s no secret that the integrity and reputation of pharmaceutical brands is integral to their success. As a result, even the smallest failure can be irreversibly damaging.

With industry standards surrounding crucial elements like product integrity, energy efficiency, health and safety and product testing, there is more pressure than ever on manufacturers to take steps towards compliance. In this elaborate regulatory landscape, how do manufacturers ensure production operates in an efficient and effective way?

Validation-friendly technology
Digitisation of processes and the emergence of the Industrial Internet of Things (IIoT) have transformed the entire manufacturing industry. However, for pharmaceutical manufacturers, the benefits of IIoT go far beyond an increase in automated productivity.

The introduction of smart devices enables real-time data reporting to central control systems, naturally reducing manual intervention and minimising adverse events during production. Paired with validation-friendly software, an IIoT-enabled factory provides live monitoring of regulatory reporting, potentially reducing the validation effort many manufacturers face with a risk-based approach. This can greatly help to maximise production agility, allowing manufacturers to respond to change and ultimately increase profitability.

For example, with batch control being a key step of the validation process, the combination of a smart factory and intelligent SCADA software couldn’t be more valuable. By automatically generating a reliable audit trail, electronic signatures and real-time reports, compliance with complicated pharmaceutical standards like FDA 21 CFR Part 11 becomes far easier to achieve.

The cloud and compliance
As manufacturers embrace IIoT, migrating to the cloud is the obvious solution to house and manage the growing expanse of production data. However, the cloud does more than just collect and store data, it allows manufacturers to gain actionable insights, directly from it.

Predictive analysis, for example, produces an intelligent forecast of when and where industrial equipment is most likely to fail. Using this data, contingency plans can be made to ensure, regardless of equipment failure, pharmaceutical production will continue to meet regulatory standards.

In addition, cloud computing enables simplified regulatory submissions. Using data stored in the cloud, manufacturers can feed digitised production information directly to regulatory bodies. This can be particularly helpful in speeding up the lengthy process of clinical trials, as well as reporting after a drug’s launch.

With the ever increasing risks concerning drug counterfeiting, efficiency challenges, adaptations to modern day agile processes and the industry-wide efforts to implement serialisation, it will be interesting to witness how IIoT can continue to solve these challenges moving forward.

Getting the green light
Over the past decade, the global focus on environmental sustainability has been hard to ignore. In industry, initiatives such as the European Union’s Energy Savings Opportunity Scheme (ESOS) and the voluntary certification ISO 50001, have put pressure on manufacturers to jump on the efficiency bandwagon. Using the same IIoT and smart software combination, organisations can gather comprehensive data from the entire factory and subsequently meet these efficiency requirements.

@copadata #PAuto #PHarma

Countdown to February 2016 – Preparing for CEC Level VI.

09/10/2015

Jon Vallis – Sales and Marketing Manager, Ideal Power, discusses the impending regulations.

Successes in reducing the no-load power of external power supplies have resulted in estimated savings of 32 billion kWh of energy, a reduction in CO2 emissions of more than 24 million tons and an annual saving of $2.5 billion (€2.2bn or £1.64bn), according to US EPA (Environmental Protection Agency) figures. These figures are impressive, and the result of a decade of industry regulation and programmes, both voluntary and mandatory. The drive for 'greener' energy, however, means that an upcoming standard, due to be implemented in February 2016, will further reduce the energy consumed by external power adapters. In some cases, the new limits could be as little as 20% of the levels allowed by previous standards.

The CEC (California Energy Commission) Level VI standard is due to come into effect on February 10, 2016 and will bring some significant revisions and definitions. OEMs will need to be aware of new performance thresholds, direct and indirect operation models, and exemptions.

The last 10 years have seen rapid development in the reduction of power consumption by external power supplies. In 2004, the CEC introduced the first mandatory standard for energy efficiency, aligned with EnergyStar Tier 1 and Australia's MEPS (Minimum Energy Performance Standards). The EU introduced the ErP (Energy related Products) phase 1 directive in 2010 and harmonised CEC and EISA (Energy Independence and Security Act) requirements the following year under the ErP Directive 2009/125/EC. Around the same time, the EnergyStar certification ceased to apply to power supplies.

Today, all external power supplies must meet CEC Level IV for the USA and Canada and Level V if shipped to the EU.

The next step, driven by CEC Level VI, is to further reduce no-load power in single and multiple voltage external power supplies. It brings multiple output power supplies into the regulation and also addresses external power supplies below 250W. The standard sets no-load power thresholds for single voltage external AC/DC power supplies, in both low voltage and basic models, across categories of 1W and below, above 1W to 49W, above 49W to 250W, and above 250W. There are also no-load power limits for multiple voltage external power supplies in the same power ranges.
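The compliance check itself amounts to a table lookup against the nameplate power band. The sketch below illustrates the shape of such a check only: it collapses the low-voltage/basic-voltage distinction, and the limit values are placeholders for illustration. The published CEC/DoE Level VI tables should be consulted for the authoritative figures.

```python
# Illustrative no-load limits in watts, keyed by (supply type, power band).
# These values are placeholders, NOT the official Level VI limits.
NO_LOAD_LIMITS_W = {
    ("single", "<=1W"): 0.100,
    ("single", ">1-49W"): 0.100,
    ("single", ">49-250W"): 0.210,
    ("single", ">250W"): 0.500,
    ("multiple", "<=1W"): 0.300,
    ("multiple", ">1-49W"): 0.300,
    ("multiple", ">49-250W"): 0.300,
    ("multiple", ">250W"): 0.300,
}

def power_band(nameplate_w):
    """Map nameplate output power to its regulatory band."""
    if nameplate_w <= 1:
        return "<=1W"
    if nameplate_w <= 49:
        return ">1-49W"
    if nameplate_w <= 250:
        return ">49-250W"
    return ">250W"

def no_load_compliant(supply_type, nameplate_w, measured_no_load_w):
    """True if the measured no-load draw is within the band's limit."""
    limit = NO_LOAD_LIMITS_W[(supply_type, power_band(nameplate_w))]
    return measured_no_load_w <= limit
```

For example, a hypothetical 65W single-voltage adapter falls in the above-49W-to-250W band and would be judged against that band's no-load limit.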

Another distinction of CEC Level VI is how it treats direct operation power supplies, i.e. those that can power the end product without the assistance of a battery, versus indirect operation devices, i.e. those which are not battery chargers but which cannot operate the end product without the assistance of a battery. The new limits apply to direct operation power supplies, while EISA 2007 will continue to govern the limits for indirect operation power supplies.

Exemptions to CEC Level VI include any device that may need FDA (Food and Drug Administration) approval for medical use; devices that power the charger of a detachable battery pack, or charge the battery of a product that is fully, or primarily, motor-operated; and products made available as a service or spare part by the end-product manufacturer before July 1, 2008.

The EU is revising its EcoDesign Directive (or ErP II for energy related products) which is considered to be a parallel standard. Countries such as Canada and Australia are expected to adopt CEC Level VI.

OEMs must ensure compliance with all regulations for whichever region their products are shipped to. For help in meeting the increased levels of energy saving required to comply with CEC Level VI, contact the IP Support team at Ideal Power, the power conversion experts.

Ideal Power provides external power supplies and adapters, open frame, encapsulated and DIN Rail power conversion products and battery chargers to markets including industrial, computing, medical, communications, LED lighting, security, consumer and leisure. Products are sourced from market-leading suppliers including EOS and Glary.

Ideal Power’s British-based IP Support Engineering Team is available to work with customers’ design engineers to create modified-standard and fully custom designs for quantities from 100 pieces upwards. The IP Support team has over 40 years of experience in the power supply industry and provides local assistance throughout the whole design process, from concept, design, quotations, samples and testing to approvals and standards compliance, shipping, stock-holding and after-sales service.


Changing industry standards for OEMs, engineers & contractors: one year on!

04/10/2015

In 1998, Google was founded, the first Apple iMac was introduced and the legendary Windows ’98 was released by Microsoft. In a less glamorous but equally important corner of industry, a new commission was being formed to revise the complex IEC 60439 industry standard, which governed the safety and performance of electrical switchgear assemblies. Although Windows ‘98 has long been consigned to history, the new industry standard – BS EN 61439 – only became mandatory on November 1, 2014.

One year on, Pat McLaughlin, Boulting Technology’s Operations Director, evaluates how original equipment manufacturers, panel builders, electrical engineers, consulting engineers and contractors have been affected by the new BS EN 61439 standard.


Why a new standard?
In a market where the demand to optimise and reduce costs combines with a growing need for assembly flexibility, a new set of standards was needed to guarantee the performance of low voltage switchgear assemblies.

Switchgear and Control Gear assemblies are multifaceted and have an endless number of component combinations. Before the introduction of the new standard, testing every conceivable variant was not only time consuming and costly, but impractical.

The intricate character of assemblies also meant that many did not fit into the previous two testing categories: Type Tested Assembly (TTA) and Partially Type-Tested Assembly (PTTA). For example, panels which were too small to be covered by TTA and PTTA fell outside the standard. Finally, in the case of a PTTA, ensuring the safety and suitability of a design often depended strictly on the expertise and integrity of the manufacturer.

Design verification
The major change introduced by the new BS EN 61439 standard refers to testing. It states that the capabilities of each assembly will be verified in two stages: design verification and routine verification. This means the new standard completely discards the type-tested (TTA) and partially type-tested assemblies (PTTA) categories in favour of design verification.

Although BS EN 61439 still regards type testing as the preferred option for verifying designs, it also introduces a series of alternative routes to design verification.

The options include using an already verified design for reference, calculation and interpolation. The BS EN 61439 standard specifies that specific margins must be added to the design, when using anything other than type testing.

One of the main benefits of the new design verification procedure is its flexibility. Under the old BS EN 60439 specification, customers would demand a Type Test certificate for each assembly, particularly incoming air circuit breakers, which was very expensive and time consuming.

The new standard allows users and specifiers to pertinently define the requirements of each application. Annex D of the BS EN 61439 standard provides a list of 13 categories of verification required, what testing method can be used and what comparisons can be made. To optimise testing time, the standard allows the ratings of similar variants to be derived without testing, provided the ratings of critical variants have been established by test.

Dividing responsibility
The second major change implemented by the new industry standard refers to the responsibilities of each party involved in the design, test and implementation of low voltage switchboard assemblies. Unlike BS EN 60439, which stated the OEM or the system manufacturer was solely responsible throughout the testing programme, the new standard divides the responsibilities between the OEM and the assembly manufacturer, or panel builder.

The new standard recognises that several parties may be involved between concept and delivery of a switchboard assembly. The OEM is responsible for the basic design verification. In addition, the assembly manufacturer is responsible for the completion of the assembly and for routine verification.

For innovators like Boulting Technology, the new BS EN 61439 has brought more freedom and flexibility when designing switchboard assemblies. For example, Boulting Technology has designed and launched the Boulting Power Centre, a range of low voltage switchboards available in 25kA, 50kA, 80kA and 100kA fault ratings, with current ratings up to 6,300A.

Although change is never much fun, it’s what technology and industry are all about. If this wasn’t the case, we would all still be using Windows 98 or the indestructible Nokia 5110.


The future of CCD image sensors: Are we seeing the end of an era?

30/03/2015
Sony recently announced its intention to close its 200 mm CCD wafer line in Kagoshima and to stop the manufacture of the majority of Sony’s industrial CCD (charge-coupled device) sensors.

Mark Williamson, Director – Corporate Market Development of Stemmer Imaging, explains how machine vision users and his customers are affected by this decision.

Mark Williamson


Question: CCD sensors have been the key enabler of the imaging and machine vision market, with Sony being the largest vendor of CCDs to this market. What has driven this decision?

Williamson: Before the CCD arrived, video cameras were based on tube technology, which was free-running only and came from the broadcast industry. When CCD technology launched, it became possible to add specialist features to cameras to enable triggering, and hence the ability to synchronise to the production line. This enabled the explosive growth of industrial vision, which developed into the industry we know today. However, while Sony CCDs have the largest market share in industrial imaging, the biggest markets for image sensors have been higher-volume ones such as consumer cameras, mobile phones, CCTV and broadcast. The importance of the CCD to mankind was recognised by a Nobel Prize in Physics in 2009. In the last few years there has been a big shift from CCD to CMOS in these high volume markets, which has left the CCD wafer line very underutilised even with the high number of machine vision sensors sold. This makes the factory no longer financially viable.

Question: Historically CCD sensors have outperformed CMOS (complementary metal-oxide semiconductor) sensors in terms of image quality. Will Sony’s decision reduce the availability of high image quality sensors?

Williamson: Absolutely not. CMOS has traditionally had a reputation for lower image quality; however, recent sensors have surpassed the image quality of Sony CCDs in terms of noise and dynamic range. This, coupled with the numerous advantages of CMOS sensors such as speed, lower power consumption, less support electronics and the elimination of tap balancing, is the natural evolution of technology. The higher-end CCDs from ON Semiconductor (formerly Truesense and Kodak) and the full frame CCDs used in professional photography from Teledyne DALSA are still available for high-end applications, although over time CMOS will affect this market segment too.

Question:  Are there other advantages of CMOS over CCD technology?

Williamson: From a manufacturing point of view, CMOS sensors can be built on standard wafer lines, which opens up mainstream manufacturing capacity and competition. From a technical point of view, the ability to mix the sensor and support circuits on one device simplifies camera design and allows additional features to be integrated. Multiple regions of interest and linear scaling of frame rate versus readout region provide application flexibility and high dynamic range modes. Additionally, the reduction in oversaturated image bleed makes the cameras more tolerant of changing illumination.
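The frame-rate scaling mentioned above can be sketched to first order: readout time is roughly proportional to the number of rows read out, plus a fixed per-frame overhead. This is a simplified model for illustration; real sensors differ in how exposure, reset and interface overheads behave.

```python
def roi_frame_rate(full_fps, full_height, roi_height, overhead_fraction=0.0):
    """First-order estimate of CMOS frame rate for a reduced readout
    region: row readout scales with rows read, plus a fixed per-frame
    overhead expressed as a fraction of the full-frame period."""
    frame_time = 1.0 / full_fps
    overhead = frame_time * overhead_fraction       # fixed cost per frame
    row_time = (frame_time - overhead) / full_height
    return 1.0 / (overhead + row_time * roi_height)
```

Under this model, halving the readout height of a hypothetical 100 fps, 1000-row sensor with no fixed overhead doubles the frame rate; any fixed overhead makes the speed-up less than linear.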

Question: What is the share of CCD cameras compared to CMOS cameras Stemmer is selling today?

Williamson: In 2010, 22% of the cameras we sold were based on CMOS sensors. This rose to 58% in 2014, with 32% of cameras using Sony CCDs and the remainder other high-end CCDs. With nearly all new camera designs using CMOS, the prediction is that within a further four years the natural shift will take the CMOS market share to approximately 80%.

Question: What is your expectation of how Sony’s decision will change that ratio in the future?

Williamson: Although Sony has announced the closure, production will not cease until 2017, with the last deliveries in 2020 or even later, depending on the sensor model. This time scale is designed to follow the natural declining trend, which is expected to continue and maybe slightly accelerate. With the attractive price:performance ratio of new CMOS cameras, new designs are expected to use CMOS anyway.

Question: What are your plans with regards to the announcement?

Williamson: While Sony CCD availability will continue until at least 2020, camera manufacturers will need to commit to quantities much earlier. Each camera manufacturer may choose a different approach: committing to stock sensors, or asking customers to make future commitments. Stemmer Imaging are liaising with all our camera manufacturers to agree their policies, and we will communicate them if they have any effect on availability to customers. Some models will be available even after 2020.

Question: What is your advice for imaging and machine vision integrators and users that have used CCD cameras in the past?

Williamson: If you build an OEM product that utilises a CCD camera, we believe there is no immediate need to change the camera. If there is any risk of your particular camera being made obsolete, we will inform you, normally with six months' notice, under our End of Life programme. However, when selecting a product for a new application we would recommend CMOS sensor based cameras, as availability will be longer and the price:performance ratio will be better. If CCD capability is important, remember that CCD sensors are available from other companies.

Question:  Are Sony leaving the machine vision sensor market by discontinuing its CCD sensors?

Williamson: Sony have been innovating with CMOS sensors for some time and are investing significantly in expanding their CMOS wafer production capability. They have announced their first CMOS global shutter sensor family, under the Pregius name, aimed directly at the imaging and machine vision market. The first model, the IMX174, is already shipping in a number of our cameras and outperforms the Sony CCD equivalent. With a clear roadmap of further models, we will still see Sony sensors in the machine vision market.

Question: Which other players are in the imaging and machine vision sensor market?

Williamson: Over the last 10 years we have seen many small companies launch CMOS image sensors addressing the low-cost or high-speed markets where CCDs could not compete. In recent years a number of these have become significant providers through a combination of innovation and acquisition. While Sony has been the dominant supplier of CCDs to the imaging and machine vision markets, this dominance is not evident with CMOS, giving more market choice. Key players besides Sony are ON Semiconductor, CMOSIS, e2v and Teledyne DALSA. Our direct relationship with Teledyne DALSA gives us influence over their sensor strategy, so customer needs are valuable input.

Question: With so many manufacturers and sensors, how do I choose what's right for my application?

Williamson: Like any product, each manufacturer's design has advantages and disadvantages. Stemmer Imaging has an in-house EMVA 1288 camera testing facility which is used to characterise cameras, and hence the sensors, beyond the spec sheet. With this capability, our immense knowledge of sensor and camera technologies, and access to the largest number of camera manufacturers and possibly all sensors relevant to our market, we are well placed to advise customers on the sensors and cameras best suited to their application. When you are ready to migrate to the new generation of CMOS sensors, we are here to assist.

• See also: What is the difference between CCD and CMOS image sensors in a digital camera? (How stuff works!)

Foundation & Hart to merge?

26/09/2013

It has finally been formally acknowledged: after many years of co-operation, the Fieldbus Foundation and the HART Communication Foundation are strongly considering pooling their resources. Where the proposed merger leaves other standards, particularly Profibus, remains to be seen.

The Fieldbus Foundation and the HART Communication Foundation have entered into discussions on the potential for merging the two organizations into a single industry foundation dedicated to the needs of intelligent device communications in the world of process automation.

The chairmen of the two organizations—Dr. Gunther Kegel of the Fieldbus Foundation and Mr. Mark Schumacher of the HART Communication Foundation—issued the following statement on behalf of their Boards of Directors:

“We believe combining the resources and capabilities of each foundation into a single organization will provide significant benefits to both end users and suppliers. For end users, a single organization that combines the power of both Fieldbus Foundation and HART Communication Foundation would provide a full solution that addresses every conceivable aspect of field communications and intelligent device management for the process industries. For suppliers, a single organization would create efficiencies in resource utilization, consistency of processes and procedures, and would deliver significant improvements in member services and support.”

The Fieldbus Foundation and HART Communication Foundation have worked extensively together in the past and have a long history of cooperation. For example, the two organizations worked together on the development of common international standards such as Electronic Device Description Language (EDDL) and, most recently, the development of the Field Device Integration (FDI) specification. The merger offers significant potential to harmonize many aspects of the two protocols, making it easier for end users and suppliers to implement the technology and obtain the full benefits of each technology in plant operations and maintenance.

In preliminary discussions, the presidents of the two organizations, Richard J. Timoney of the Fieldbus Foundation and Ted Masters of the HART Communication Foundation, added that many synergies already exist and closed by commenting:

“We are both confident that today’s decision to investigate the merger of these two organizations provides momentum for a major step forward in the evolution of intelligent devices and the world of industrial communications.”

More details are given in this Question & Answer paper published with this announcement!

The Fieldbus Foundation and HART Communication Foundation have signed a memorandum of understanding for a possible merger of the two organizations. This proposed merger is still in the exploratory phase and is not yet guaranteed. Here are some answers to frequently asked questions about the merger.

Q: Is the merger a foregone conclusion, with an agreement to merge the two organizations that has been approved by the Boards of Directors?
A: No. What has been agreed is that each organization will appoint a study team to review the possibility of merging the organizations based on an increased value of a single organization, as well as significant benefits to their respective memberships and the automation industry in general.

Q: Would this be a true merger or an acquisition of one organization by another?
A: The merger would be a true merger of equals and not an acquisition of any one organization by another. A combined organization of Fieldbus Foundation and HART technologies could better leverage the complementary benefits of the technologies. The new combined organization would create more cooperation and collaboration. In addition, improved economies of scale would be realized through merging training and education; seminars; testing and registration; participation at trade shows, conferences, and events; online presence; and social media strategies.

Q: I am a member of only one of the Foundations. How would a merger affect my future membership?
A: Membership in either one of the existing foundations would carry over into the new proposed organization with the same rights and benefits that members enjoy today.

Q: If I were a member of both Foundations, how would this affect my membership costs?
A: While we have begun an analysis of our respective memberships, we have not yet defined the membership model as it relates to membership dues. Members of both foundations should see increased efficiencies and reduced total costs as more and more standards, processes and procedures are harmonized. Over time, we anticipate suppliers recognizing more efficiency compared to membership in both organizations.

Q: If the investigation were successful, when would a merger likely happen?
A: There is still a lot of exploratory work to do in regard to due diligence in the financial and legal arenas. Everything we do must meet strict criteria in terms of benefitting our membership and the broader automation market, including our mutual end users. Once that is done, there are board and membership votes and, if successful, legal filings. Our target is to have everything completed by mid-2014.

Q: Who will decide if the merger is to proceed?
A: The decision to proceed with the merger will flow through three steps. First, the study team will prepare a report and recommendation for each board of directors. Once that is completed, the boards will individually vote to proceed or not. Finally, if both boards vote to proceed with the merger, the proposal will go to a member vote in both organizations.

Q: What are some of the goals of the proposed new merged organization?
A: There are a number of goals:

• Collaboration on new and existing technologies.
• Fully integrated marketing strategy to advance the extensive use of digital devices.
• Improved products and services.
• Increased market share of digital field devices in the total device market.

Q: Would the technologies and protocols of both Foundations continue to exist and evolve on their own?
A: Both the FOUNDATION fieldbus and HART specifications would continue to exist separately and evolve. Each protocol would retain and maintain its own brand name, trademarks, patents and copyrights. The proposed organization would continue to seek areas of logical harmonization just as we have with EDDL and FDI.

Q: How would the proposed organization deal with the different wireless strategies that exist?
A: The proposed organization would continue to support the wireless strategies that exist today within each organization.

Q: How would the proposed merger affect the current activities regarding FDI?
A: Both organizations are totally committed to the FDI project and would continue to support FDI as the sole integration technique for smart devices.

Q: Would the two organizations move to a single location?
A: Pending approval of the merger, the plan is to co-locate both organizations into a single facility as soon as it is practical.

Q: How would the merger affect host system, and device testing and registration?
A: Both the Fieldbus Foundation and HART Communication Foundation are currently working on common device and host testing procedures under the FDI Cooperation initiative. That is one of the major benefits of the FDI project. Although elements of those tests may differ based on the structure of the protocols, there are many elements that the two organizations share in common. We anticipate that we will move toward a common set of procedures for both device and host testing, and a common registration process.