Is AI all it is cracked up to be?

28/03/2017
In this article, Stephen Parker, CEO of Parker Software, examines whether artificial intelligence is all it’s cracked up to be.

If planet Earth had been created one year ago, the human species would be just ten minutes old. Putting this into context, the industrial era would have kick-started a mere two seconds ago. Thanks to human influence, the pace of technological advancement on Earth is astonishing. However, we are already on the verge of the next change. The potential of artificial intelligence has been discussed by scientists since the 1950s and modern technological advances are finally bringing this technology to the masses. 

Research suggests that artificial intelligence could be as ‘smart’ as human beings within the next century. Originally, human programmers were required to handcraft knowledge items painstakingly. Today, however, one-off algorithms can teach machines to take on and develop knowledge automatically, in the same way a human infant would. Artificial intelligence has reached a critical tipping point and its power is set to impact every business, in every industry sector.

Already, 38 per cent of enterprises are using artificial intelligence in their business operations and this figure is set to grow to 62 per cent by 2018. In fact, according to predictions by Forrester, investments in artificial intelligence technology will increase three-fold in 2017. These figures mean that the market could be worth an estimated $47 billion by 2020. 

Intelligent assistance
One of the most notable applications of AI from the past few years is the creation of intelligent assistants. Intelligent assistants are interactive systems that can communicate with humans to help them access information or complete tasks. This is usually accomplished with speech recognition technology; think Apple’s Siri, Microsoft’s Cortana or Amazon’s Alexa. Most of the intelligent assistants that we are familiar with today are consumer facing and are somewhat general in the tasks they can complete. However, these applications are now making their way into more advanced customer service settings.

While there is certainly a space for these automated assistants in the enterprise realm, there is a debate as to whether this technology could fully replace a contact centre agent.

Automation is widely recognised as a valuable tool for organisations to route the customer to the correct agent. However, completely handing over the reins of customer management to a machine could be a step too far for most businesses. Even the most advanced AI platforms only hold an IQ score equivalent to that of a four-year-old and, naturally, businesses are unlikely to entrust their customer service offering to a child.

The human touch
Automated processes are invaluable for speeding up laborious processes and completing monotonous customer service tasks. But as any customer service expert will tell you, the human touch is what elevates good service to an excellent experience for the customer. Simple tasks will no doubt be increasingly managed and completed using automation and AI-enabled agent support systems, whereas complex issues will still require the careful intervention of a human agent.

During a TED Talk on artificial intelligence, philosopher and technologist Nick Bostrom claimed that “machine intelligence is the last invention that humanity will ever need to make.” However, contact centre agents needn’t hang up their headsets just yet. Artificial intelligence won’t be replacing the call centre agent any time soon. The only guarantee is that the role of a call centre agent will continue to evolve. After all, the industrial revolution was only two seconds ago.

@ParkerSoftware #PAuto

It IS rocket science!

13/03/2017

Graham Mackrell, managing director of Harmonic Drive, explains why its strain wave gears have been the top choice in space for over forty years.

Anything that goes into space is seen as the pinnacle of human creation. Astronauts are highly trained and are at the peak of physical fitness, space shuttles are crafted by large teams of expert engineers and all the technology used is so high-tech it’s as if it belongs to science fiction.

Driving on Mars!

Many decades ago, the first Harmonic Drive gears were sent into space during the Apollo 15 mission. Even from the beginnings of the space race, the expectations for the technology used were high. The equipment used in space had to be reliable, compact and lightweight, and given the increasing demands of today’s space missions, it must now also be highly accurate, with zero backlash and high torque capacity.

When aerospace engineers were recently designing a new space rover, they looked to Harmonic Drive gears for reliability. Due to the obvious difficulties of performing repairs in space, a high mean time between equipment failures is a high priority. Harmonic Drive products achieve this by prioritising quality throughout the entire design and manufacturing process.

It is vital that aerospace gears are thoroughly tested before they are sent to customers, ensuring that they always receive a quality product. At Harmonic Drive, we test products using the finite element method (FEM). This process simulates real-world physics to ensure that the product is capable of surviving in space. For example, structural testing is carried out to ensure the product is robust and that a space rover travelling over rough terrain will not damage the actuators used in its wheels. Thermal properties are also important, as aerospace gears are often exposed to both extremes of the temperature range; these are therefore assessed during the initial design process.

Also considered in the design process is the part count of the aerospace gears. Harmonic Drive uses a low part count, which means the gears are maintenance-free. In addition, there is a lower chance of components failing, giving the gears a high Mean Time Between Failures (MTBF). The low part count also contributes to the compactness and light weight of the gears, a feature essential in space.

Another key feature for aerospace gears is high torque capacity combined with zero backlash. This is essential for systems which communicate the location of the rover to the control room. If traditional, high-backlash gears were used, the system would misreport the rover’s location. This would cause problems when the rover is used to survey uncharted areas of planets and could lead to inaccurate mapping. Thanks to the emphasis on high precision in Harmonic Drive gears, this problem can be avoided.
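
The scale of this effect is easy to estimate. Below is a minimal sketch, assuming a hypothetical backlash figure of 10 arcmin for a conventional gearhead; the function and the numbers are illustrative, not Harmonic Drive specifications:

```python
import math

def lateral_drift(backlash_arcmin: float, distance_m: float) -> float:
    """Worst-case lateral position error if the full gear backlash
    appears as an uncorrected heading offset over a straight run."""
    backlash_rad = math.radians(backlash_arcmin / 60.0)
    return distance_m * math.sin(backlash_rad)

# A hypothetical conventional gearhead with 10 arcmin of backlash,
# over a 100 m survey traverse:
error_conventional = lateral_drift(10.0, 100.0)    # roughly 0.29 m off course
# A zero-backlash strain wave gear contributes no such offset:
error_zero_backlash = lateral_drift(0.0, 100.0)    # 0.0 m
```

Even a fraction of a degree of lost motion translates into tens of centimetres of mapping error over a long traverse, which is why zero backlash matters for survey work.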

The numerous quality processes that Harmonic Drive undertakes have led to recognition from a number of accrediting bodies. Harmonic Drive products are AS9100 certified, a specific aerospace standard for the design, manufacture and sale of precision gear reducers, servo-actuators and electro-mechanical positioning systems.

To be the pinnacle of global technology, there are no shortcuts. Components used in aerospace technology must be subjected to rigorous testing to ensure they are reliable and safe and have a long product life.

• The MARS adventure: The NASA site.
@HarmonicDriveUK #PAuto #Robotics @StoneJunctionPR

Cybersecurity pitfalls!

09/03/2017

Jonathan Wilkins, marketing director of obsolete industrial parts supplier EU Automation, discusses three cyber security pitfalls that industry should prepare for: the weaponisation of everyday devices; older attacks, such as Heartbleed and Shellshock; and vulnerabilities in industrial control systems.

IBM X-Force® Research
2016 Cyber Security Intelligence Index

In 2016, IBM reported that manufacturing was the second most cyber-attacked industry. With new strains of ransomware and other vulnerabilities created every week, what should manufacturers look out for in the new year?

‘Weaponisation’ of everyday devices
The advantages of accessing data from smart devices include condition monitoring, predictive analytics and predictive maintenance, all of which can save manufacturers money.

However, recent attacks have proved that these connected devices can quickly become weapons, programmed to attack the heart of any business and shut down facilities. In a recent distributed denial of service (DDoS) attack, everyday devices were used to bring down some of the most visited websites in the world, including Twitter, Reddit and Airbnb.

Such incidents raise a clear alarm signal that manufacturers should run their production line on a separate, highly secure network. For manufacturers that use connected devices, cyber security is even more important, so they should conduct regular cyber security audits and ensure security protocols are in place and up-to-date.

Don’t forget the oldies
According to the 2016 Manufacturing Report, manufacturers are more susceptible to older attacks, such as Heartbleed and Shellshock. Heartbleed is a serious vulnerability in the OpenSSL cryptographic library that allows attackers to eavesdrop on communications and steal data directly from users, while Shellshock is a comparable flaw in the Bash shell.

Industrial computer systems generally aren’t updated or replaced as often as consumer technology, which means that some still have the original OpenSSL software installed. A fixed version has since been released, meaning that manufacturers can avoid this type of attack simply by updating their systems.
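
An audit of installed versions is the obvious first step. As a hedged sketch: Heartbleed (CVE-2014-0160) affected OpenSSL 1.0.1 through 1.0.1f and was fixed in 1.0.1g, so a version string alone is enough for a first-pass check. The helper below is illustrative only, not a substitute for a proper vulnerability scan:

```python
import re

def vulnerable_to_heartbleed(version: str) -> bool:
    """True if an OpenSSL version string falls in the Heartbleed range.
    1.0.1 through 1.0.1f are affected; 1.0.1g and later are patched."""
    m = re.fullmatch(r"1\.0\.1([a-z]?)", version)
    if not m:
        return False  # other release branches were not affected
    return m.group(1) < "g"  # "", "a".."f" sort before "g"

print(vulnerable_to_heartbleed("1.0.1f"))  # True: update required
print(vulnerable_to_heartbleed("1.0.1g"))  # False: patched release
```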

Keeping industrial control
Manufacturers understand the need to protect their networks and corporate systems from attacks, but their industrial control systems also pose a risk. If an attacker deploys ransomware to lock down manufacturing computers, it could cause long periods of downtime, loss of production and the scrapping of products that are in production when the attack happens.

This is particularly true in the era of Industry 4.0, where devices are connected and processes are automated. One of the most effective means of safeguarding automated production systems is cell protection. This form of defence is especially effective against man-in-the-middle attacks, whereby the attacker has the ability to monitor, alter and inject messages in a communications system.

In its report, IBM also stated that cyber security awareness in the manufacturing industry is lower than in other sectors. The truth is that any company can be the target of a cyber attack. The only way to avoid a cyber security breach is by planning ahead and preparing for the unexpected.

#PAuto @StoneJunctionPR @IBMSecurity

Communication analysis: Industrial Ethernet & Wireless v Fieldbus.

06/03/2017

Industrial Ethernet and Wireless growth is being accelerated by the increasing need for industrial devices to get connected and by the Industrial Internet of Things. This is the main finding of HMS Industrial Networks’ annual study of the industrial network market. Industrial Ethernet now accounts for 46% of the market (38% last year). Wireless technologies are also coming on strong, now at 6% (4%) market share. Combined, industrial Ethernet and Wireless now account for 52% of the market, while fieldbuses are at 48%.

Fieldbus vs. industrial Ethernet and wireless
HMS’s estimation for 2017, based on the number of new installed nodes within factory automation in 2016. The estimation draws on several market studies and HMS’s own sales statistics.

HMS Industrial Networks now presents its annual analysis of the industrial network market, which focuses on new installed nodes within factory automation globally. As an independent supplier of products and services for industrial communication and the Internet of Things, HMS has substantial insight into the industrial network market. Here are some of the trends it sees within industrial communication in 2017.

Industrial Internet of Things is boosting Industrial Ethernet growth
According to HMS, industrial Ethernet is growing faster than in previous years, with a growth rate of 22%. Industrial Ethernet now makes up 46% of the global market, compared to 38% last year. EtherNet/IP and PROFINET are tied for first place, with PROFINET dominating in Central Europe and EtherNet/IP leading in North America. Runners-up globally are EtherCAT, Modbus-TCP and Ethernet POWERLINK.

Anders Hansson

“We definitely see an accelerated transition towards various industrial Ethernet networks when it comes to new installed nodes,” says Anders Hansson, Marketing Director at HMS. “The transition to industrial Ethernet is driven by the need for high performance, integration between factory installations and IT-systems, as well as the Industrial Internet of Things in general.”

Wireless is redefining the networking picture
Wireless technologies are growing quickly, at 32%, and now account for 6% of the total market. Within Wireless, WLAN is the most popular technology, followed by Bluetooth. “Wireless is increasingly being used by machine builders to realize innovative automation architectures and new solutions for connectivity and control, including Bring Your Own Device (BYOD) solutions via tablets or smartphones,” says Anders Hansson.

Fieldbus is still growing, but the growth is slowing down
Fieldbuses are still the most widely used type of network, with 48% of the market. They are still growing, as many users ask for the traditional simplicity and reliability that fieldbuses offer, but the growth rate is slowing, currently around 4% compared to 7% last year. The dominant fieldbus is PROFIBUS, with 14% of the total world market, followed by Modbus-RTU and CC-Link, both at 6%.

Regional facts
In Europe and the Middle East, PROFIBUS is still the leading network, while PROFINET has the fastest growth rate. Runners-up are EtherCAT, Modbus-TCP and Ethernet POWERLINK.
The US market is dominated by the CIP networks, where EtherNet/IP has overtaken DeviceNet in terms of market share.
In Asia, a fragmented network market is very visible. No network stands out as truly market-leading, but PROFIBUS, PROFINET, EtherNet/IP, Modbus and CC-Link are widely used. EtherCAT continues to establish itself as a significant network, and CC-Link IE Field is also gaining traction.

More and more devices are getting connected
“The presented figures represent our consolidated view, taking into account insights from colleagues in the industry, our own sales statistics and overall perception of the market,” says Anders Hansson. “It is interesting to see that industrial Ethernet and Wireless combined now account for more than half of the market at 52%, compared to fieldbuses at 48%. The success of a series of industrial Ethernet networks and the addition of growing Wireless technologies confirms that the network market remains fragmented, as users continue to ask for connectivity to a variety of fieldbus, industrial Ethernet and wireless networks. All in all, industrial devices are getting increasingly connected, boosted by trends such as Industrial Internet of Things and Industry 4.0. From our point of view, we are well-suited to grow with these trends, since HMS is all about ‘Connecting Devices.’”
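
Taken together, the shares and growth rates quoted above can be projected forward. The sketch below is a naive extrapolation for illustration only, not an HMS forecast; it simply grows each segment by its stated rate and renormalises:

```python
# Shares (% of new installed nodes, 2017 estimate) and annual growth
# rates, as quoted in the article.
networks = {
    "industrial Ethernet": {"share": 46, "growth": 0.22},
    "fieldbus":            {"share": 48, "growth": 0.04},
    "wireless":            {"share": 6,  "growth": 0.32},
}

def project_shares(networks):
    """Grow each segment by its rate, then renormalise to 100%."""
    grown = {k: v["share"] * (1 + v["growth"]) for k, v in networks.items()}
    total = sum(grown.values())
    return {k: round(100 * g / total, 1) for k, g in grown.items()}

next_year = project_shares(networks)
# Under these assumptions, industrial Ethernet alone overtakes fieldbus.
```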

 @HMSAnybus #PAuto #IoT

Changing bad gear oil habits.

28/02/2017
Here, Mark Burnett, VP of the Lubricants and Fuel Additives Innovation Platform at the water, energy and maintenance solutions provider NCH Europe, explores how businesses can improve the effectiveness of their gear oil.

Benjamin Franklin once said, “it is easier to prevent bad habits than to break them.” This rings true for the industrial sector, where it is easier to form a habit of good predictive maintenance than to recover from machinery breakage or downtime.

However, this is easier said than done. Predictive maintenance requires constant vigilance in order to be effective, ensuring that maintenance engineers know when it is the right time to lubricate bearings, apply a rust-preventative coating or treat their water supply. These tasks will vary in frequency, so there can be a steep learning curve to getting it right.

Unfortunately, we all know that problems do not wait until you’re ready and, especially with gear oil changes, failure to get it right often leads to problems. Changing oil too soon, for example, leads to higher costs as more changes will be needed than necessary. Conversely, forgetting to change the oil at the right time increases the likelihood of machine damage and breakage, which itself leads to elevated operational costs.

Despite both extremes leading to increased business costs, only 20 per cent of oil changes happen at the right time. This is not surprising when considering the fact that many variables can determine how regularly oil needs changing. While many engineers may fill up a machine and expect it to require a change after a certain amount of time, it is actually the quality of the oil itself that must be measured.

This is understandably difficult without a comprehensive approach to industrial gear oil analysis. In order to reliably measure the quality of the oil and when a change is due, engineers must identify the quantities of external contamination and metal wear, as well as the general condition of the oil.

For example, oxidation is a naturally occurring process that affects oil over time. In the presence of oxygen, the oil begins to break down, which reduces its service life. Oxidation also produces sludge that makes equipment work harder and drives up operating costs.

If left long enough, the acidity of oxidised oil will steadily increase, resulting in corrosion and pitting. While problematic over extended periods, this rising acidity allows a more accurate assessment of oil condition. By measuring increases in the system’s total acid number (TAN), maintenance engineers and plant managers can identify when the oil acidity is approaching the maximum acceptable level and act accordingly.

However, TAN only accounts for one part of overall gearbox system condition and there are many other considerations such as the operational health of the machinery itself. It is crucial that engineers consider all aspects to ensure optimum performance.
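
The TAN-based part of that assessment reduces to a simple threshold rule. Below is a minimal sketch, assuming an illustrative limit of 2.0 mg KOH/g and an 80 per cent warning level; real limits depend on the oil and the equipment manufacturer:

```python
def oil_change_due(tan_readings, tan_max=2.0, warn_fraction=0.8):
    """Flag an oil change when TAN (mg KOH/g) approaches the maximum
    acceptable level. tan_max and warn_fraction are illustrative."""
    latest = tan_readings[-1]
    if latest >= tan_max:
        return "change now"
    if latest >= warn_fraction * tan_max:
        return "schedule change"
    return "ok"

history = [0.4, 0.7, 1.1, 1.7]    # hypothetical monthly TAN samples
status = oil_change_due(history)  # "schedule change": 1.7 >= 0.8 * 2.0
```

The point of a service programme like NOSP is precisely to supply the measured TAN values that make such a rule meaningful, rather than relying on elapsed time alone.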

To this end, NCH Europe has developed the NCH Oil Service Program (NOSP) to help businesses keep their machinery in working order and their oil changes timely. Samples of gear oil are analysed and user-friendly reports are generated so that plant managers can see accurate results at a glance, giving a clear overview of equipment condition and the TAN of the oil.

Accurate analysis helps to prevent engineers falling into the bad habit of incorrect oil management. By combining this insight with an effective cleaning solution and a suitable gear oil, further bad oil-change habits and breakages can be kept at bay.

@NCH_Europe #PAuto


The ‘ins and outs’ of air quality monitoring!

20/02/2017
The British National Institute for Health and Care Excellence (NICE) recently issued draft guidance on ‘Air pollution – outdoor air quality and health.’ 

Here, Jim Mills, Managing Director of Air Monitors Ltd, explains why there will need to be more funding for monitoring if the mitigation measures mentioned in the guidance are to be implemented effectively. Jim also highlights the close relationship between outdoor air quality and the (often ignored) problems with indoor air quality.

The NICE guidelines are being developed for Local Authority staff working in: transport, planning, air quality management and public health. The guidance is also relevant for staff in healthcare, employers, education professionals and the general public.

Covering road-traffic-related air pollution and its links to ill health, the guidelines aim to improve air quality and so prevent a range of health conditions and deaths. Unfortunately, on the day that the draft guideline was published, most of the national media focused on one relatively minor recommendation relating to speed bumps. ‘Where physical measures are needed to reduce speed, such as humps and bumps, ensure they are designed to minimise sharp decelerations and consequent accelerations.’ Measures to encourage ‘smooth driving’ are outlined; however, the guidelines also address a wide range of other issues, which, in combination, would help tackle urban air pollution.

Public sector transport services should implement measures to reduce emissions, but this is an area that could involve the greatest financial cost.

Many local authorities would doubtless comment that they are already implementing many of the guideline recommendations, but point to budgetary constraints on measures that involve upfront costs. This was raised on BBC Radio 4 when the guidance was discussed on 1st December.

AQMesh Pod

The NICE guidelines recommend the inclusion of air quality issues in new developments to ensure that facilities such as schools, nurseries and retirement homes are located in areas where pollution levels will be low. LAs are also urged to consider ways to mitigate road-traffic-related air pollution and consider using the Community Infrastructure Levy for air quality monitoring. There are also calls for information on air quality to be made more readily available.

LAs are also being urged to consider introducing clean air zones including progressive targets to reduce pollutant levels below the EU limits, and where traffic congestion contributes to poor air quality, consideration should be given to a congestion charging zone. The guidelines also highlight the importance of monitoring to measure the effects of these initiatives.

As part of the consultation process, NICE is looking for evidence of successful measures and specifically rules out “studies which rely exclusively on modelling.”

In summary, all of the initiatives referred to in the NICE report necessitate monitoring in order to measure their effectiveness. However, most LAs do not currently possess the monitoring capability to do so, because localised monitoring would be necessary both before and after the implementation of any initiative. Such monitoring would need to be continuous, accurate and web-enabled so that air pollution can be tracked in real-time. AQMesh is therefore an ideal solution: small, lightweight, and quick and easy to install, these air quality monitors can measure all the main pollutants, including particulates, simultaneously, delivering accurate data wirelessly via the internet.

Whilst AQMesh ‘pods’ are significantly lower in cost, both to buy and to run, than traditional reference stations, they still represent a ‘new’ cost. However, any additional costs are trivial in comparison with the costs associated with the adverse health effects of poor air quality, as evidenced in the recent report from the Royal College of Physicians.

Inside Out or Outside In?

Fidas® Frog

The effects of air pollution are finally becoming better known, but almost all of the publicity focuses on outdoor air pollution. In contrast, indoor air quality is rarely in the media, except following occasional cases of Carbon Monoxide poisoning or when ‘worker lethargy’ or ‘sick building syndrome’ are addressed. However, it is important to understand the relationship between outdoor air quality and indoor air quality. Air Monitors is currently involved in a number of projects in which air quality monitoring is being undertaken both outside and inside large buildings, and the results have been extremely interesting.

Poorly ventilated offices tend to suffer from rising Carbon Dioxide levels as the working day progresses, leading to worker lethargy. In many cases HVAC systems bring in ‘fresh’ air to address this issue, but if the building is in a town or city, that air is likely to be polluted: possibly with particulates if it is not sufficiently filtered, and most likely with Nitrogen Dioxide.

Ventilating with outdoor air from street level is the most likely to bring pollution into the office, so many inlets are located at roof level. However, data from recent studies indicate that the height at which air quality is best can vary with weather conditions, so it is necessary to use a ‘smart’ system that monitors air quality at different levels outside the building, whilst also monitoring at a variety of locations inside it.

Real-time data from such a monitoring network then informs the HVAC control system, which should be able to draw air from different inlets, where available, and to set ventilation rates according to the prevailing air quality at each inlet. This allows internal CO2, temperature and humidity to be optimised whilst minimising the amount of external pollutants brought into the indoor space. Where the outside air is too polluted to be used for ventilation, it can be pre-cleaned by scrubbing the pollutant gases in the air handling system before it is introduced into the building.
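
That control logic can be sketched in a few lines. The thresholds, inlet names and the linear ventilation ramp below are all hypothetical, chosen purely to illustrate the decision, not taken from any real HVAC system:

```python
def choose_inlet(inlet_no2, indoor_co2_ppm, no2_limit=40.0, co2_target=800.0):
    """Pick the inlet with the cleanest outdoor air and scale the
    ventilation rate with indoor CO2. All thresholds are illustrative.
    inlet_no2: {inlet_name: NO2 concentration in ug/m3}."""
    best = min(inlet_no2, key=inlet_no2.get)
    if inlet_no2[best] > no2_limit:
        return best, 0.0, "scrub or recirculate"  # every inlet too polluted
    # Ramp ventilation from 0 (at/below the CO2 target) up to full rate
    # when indoor CO2 reaches double the target.
    rate = min(1.0, max(0.0, (indoor_co2_ppm - co2_target) / co2_target))
    return best, rate, "ventilate"

inlet, rate, action = choose_inlet(
    {"street": 55.0, "roof": 28.0}, indoor_co2_ppm=1200.0)
# Roof inlet chosen, ventilation at half rate.
```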

The implementation of smart monitoring and control systems for buildings is now possible thanks to advances in communications and monitoring technology. AQMesh pods can be quickly and easily installed at various heights outside buildings and further units can be deployed internally, all feeding near-live data to a central control system.

Another example of indoor air quality monitoring instrumentation developing from outdoor technology is the ‘Fidas Frog,’ a new fine dust aerosol spectrometer developed by the German company Palas. The Frog is an indoor, wireless, battery-powered version of the hugely popular, TÜV and MCERTS certified Fidas 200. Both instruments provide simultaneous determination of PM fractions, particle number and particle size distribution, including the particle size ranges PM1, PM2.5, PM4, PM10 and TSP.

Evidence of outdoor air pollution contaminating indoor air can be obtained with the latest Black Carbon monitors that can distinguish between the different optical signatures of combustion sources such as diesel, biomass, and tobacco. The new microAeth® MA200 for example, is a compact, real-time, wearable (400g) Black Carbon monitor with built-in pump, flow control, data storage, and battery with onboard GPS and satellite time synchronisation. Samples are collected on an internal filter tape and wireless communications are provided for network or smartphone app integration and connection to other wireless sensors. The MA200 is able to monitor continuously for 2-3 weeks. Alternatively, with a greater battery capacity, the MA300 is able to provide 3-12 months of continuous measurements.

In summary, a complete picture of indoor air quality can be delivered by a combination of AQMesh for gases, the Palas Frog for particulates and the microAeth instruments for Black Carbon. All of these instruments are compact, battery-powered, and operate wirelessly, but most importantly, they provide both air quality data AND information on the likely source of any contamination, so that the indoor effects of outdoor pollution can be attributed correctly.

@airmonitors #Environment #PAuto @_Enviro_News


Analyzer underpins growth of container inspection company.

10/02/2017

After a career as a customs officer in the Netherlands, Wim van Tienen was well aware of the toxic gas hazards presented by some freight containers, so in 2009 he started a company, Van Tienen Milieuadvies B.V., offering gas analysis and safety advice. The company grew quickly and now employs 23 staff. Wim attributes a large part of this success to the advanced FTIR gas detection and analysis technology upon which the company’s services depend.

Background

Wim van Tienen

It has been estimated that there are more than 17 million shipping containers in the world, and at any time about one third of them are on ships, trucks, and trains. Over a single year, the total number of container trips has been estimated to be around 200 million.

The air quality inside containers varies enormously, depending on the goods, the packing materials, transit time, temperature, humidity and the possible presence of fumigants. Consequently, many containers contain dangerous levels of toxic gases and represent a major threat to port and transport workers, customs officials, warehousemen, store employees and consumers. It is therefore essential that risks are assessed effectively before entry is permitted.

Solvent vapours and most fumigants, whilst harmful, can be detected by the human nose, but Wim says: “Some gases are odourless and some have a high odour detection threshold, which means that you can only smell the substance in high concentrations. Ethylene oxide, for example, is commonly used as a sterilant for medical devices. It is extremely toxic and has a low TLV of 0.5 ppm, yet its odour threshold is 500 ppm, so detection with instrumentation is essential.”
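
The arithmetic behind Wim’s point is stark, and the rule generalises to any gas. A trivial check, using only the ethylene oxide figures quoted above:

```python
def nose_fails_first(tlv_ppm: float, odour_threshold_ppm: float) -> bool:
    """A gas needs instrumental detection when it cannot be smelled
    until well above its occupational exposure limit (TLV)."""
    return odour_threshold_ppm > tlv_ppm

# Ethylene oxide, using the figures quoted above:
margin = 500.0 / 0.5                                # smelled only at 1000x the TLV
requires_instrument = nose_fails_first(0.5, 500.0)  # True
```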

The wide variety of potential contaminants represents a technological challenge to those responsible for testing, because if testers seek to detect specific gases, they risk failing to detect other compounds. It is also not practical to test every single container, so logical procedures must be established in order to minimise risks.

Gas Detection and Measurement
In 2009, when Wim first established the company, container gas detection was carried out with traditional field measurement techniques (gas detection sensors and tubes). “This approach was complicated, costly and time-consuming, and it was impossible to cover all risks,” he says. “With sensors and tubes, only a limited number of compounds can be measured specifically. Furthermore, the accuracy of detection tubes is poor and they can suffer from cross-sensitive reactions by interfering substances.

“Technologies such as PID-detectors respond to a wide variety of organic compounds, but they are not selective and are unable to detect commonly found substances with high ionisation potentials, such as 1,2-dichloroethane and formaldehyde.” Wim does not, therefore, believe that traditional measurement techniques are the best approach for covering all risks. “In order to test for the most common gases, it would be necessary to utilise a large number of tubes for every container, but this would still risk failing to detect other compounds and would be very expensive.

“It is possible to speciate organic compounds when using a Gas Chromatograph, but the number of compounds that can be tested is limited, and the use of a GC necessitates frequent calibration with expensive standard gases.”

Simultaneous multigas analysis
As a result of the problems associated with traditional gas detection techniques, Wim was keen to find an alternative technology and in 2013 he became aware of portable FTIR multigas analyzers from the Finnish company Gasmet Technologies. “The Gasmet DX4040 appeared to be the answer to our prayers,” Wim says. “The instrument is able to both detect and measure hundreds of compounds simultaneously; with this technique all inventoried high risk substances, such as ethylene oxide and formaldehyde, are always measured in real-time.

“With the help of Peter Broersma from Gasmet’s distributor Reaktie, a special library of over 300 gases was developed for our container monitoring application, and 8 Gasmet DX4040 FTIR instruments are now employed by our team of gas testing specialists.”

The major advantage of the Gasmet FTIR analyzers is the simultaneous multigas analysis capability, but that is not the only benefit. Wim says: “Our testing work is now much faster, more efficient and more cost-effective, not least because the analyzers are small, lightweight and relatively simple to run, and no calibration is required other than a quick daily zero check with Nitrogen.”

Van Tienen Milieuadvies also employs a fully trained and highly qualified chemist, Tim Gielen, who is able to conduct in-depth analysis of recorded FTIR spectra when necessary. This may involve comparing results with NIST reference spectra for over 5,000 compounds.

Most of the gases that are detected and measured by FTIR analyzers are cargo related. Wim says: “Off-gassing during shipment is the greatest problem, producing VOCs such as Toluene, Xylenes, MEK, 1,2 dichloroethane, blowing agents such as isopentane, and butanes from the packing materials and products.

“Formaldehyde, which evaporates from glued pallets, is most commonly found. On the other hand, less frequently found fumigants, such as sulfuryl difluoride and hydrogen cyanide are also always monitored with our FTIR analyzers.”

Container management
The need for container gas testing is driven by employers’ duty of care for employees, which is embedded in international health and safety regulations. Companies receiving containers must investigate whether employees who open or enter containers may be exposed to the dangers of suffocation, intoxication, poisoning, fire or explosion. In order for employers to protect staff from these hazards, a risk assessment is necessary, coupled with an effective plan to categorise and monitor container flows. “This is how we develop an effective testing strategy,” Wim explains. “If a flow of containers from the same source containing the same goods and packing materials is found to be safe, the number of containers being tested within that flow can be reduced. Similarly, if toxic gases are identified regularly in a container flow, the frequency of testing will be increased.”
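
The flow-based strategy Wim describes amounts to adaptive sampling. A hedged sketch follows; the initial rate, the floor and ceiling, and the doubling/halving step are all illustrative values, not Van Tienen Milieuadvies’ actual parameters:

```python
def update_test_rate(current_rate, detections,
                     min_rate=0.05, max_rate=1.0, step=2.0):
    """Adapt the fraction of containers tested in a flow: halve it when
    a sampling round comes back clean, multiply it when toxic gas is
    found. All rates and the step factor are illustrative."""
    if detections > 0:
        return min(max_rate, current_rate * step)
    return max(min_rate, current_rate / step)

rate = 0.2                                     # test 20% of a flow initially
rate = update_test_rate(rate, detections=0)    # clean round -> drop to 0.1
rate = update_test_rate(rate, detections=3)    # hits found  -> back to 0.2
```

The floor ensures that even a consistently clean flow is never left entirely untested, which matches the duty-of-care requirement above.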

Once a container has been found to contain toxic levels of a gas or gases, it must be ‘de-gassed’, a service that Van Tienen Milieuadvies provides. The process involves fitting a powerful ventilator to the door and capturing the gases with activated carbon. Once degassing is complete, it is important that the container is unloaded promptly, because the gases involved will re-accumulate quickly in a closed container, resulting in the need for repeat testing.

With the benefit of many years of experience, Wim estimates that around 10% of containers contain toxic gases. “This means that hundreds of thousands of containers are travelling the world, representing a major risk to anyone that might enter or open them, so it is vital that effective testing strategies are in place wherever that risk exists.”

“FTIR gas analysis has benefited this work enormously. For us, the main advantages are speed and peace of mind – we are now able to test more containers per day, and by testing for such a large number of target compounds, we are able to dramatically lower the risks to staff. The speed with which we are now able to test containers, coupled with the negligible requirement for service, calibration and consumables, means that the ongoing cost of monitoring is minimal.

“Van Tienen combines the Gasmet DX4040 measurements with risk analysis, which provides the best protection for staff responsible for opening containers. We have LRQA certification for the procedures that we have developed to demonstrate compliance with occupational health and safety legislation.

“Risk analysis provides cost reduction for our clients, due to the fact that measurement frequencies in safe flows can be reduced significantly. Root cause analysis is also part of our risk analysis.”

Looking forward Wim believes that the use of Gasmet FTIR will expand rapidly around the world as the risks associated with containers become better understood, and as employers become more aware of the advantages of the technology.