Sensors for Mars.

International collaboration takes Vaisala and the Finnish Meteorological Institute (FMI) to Mars onboard NASA’s Mars 2020 Perseverance rover. The rover is scheduled for launch on July 30, 2020. Vaisala’s sensor technology combined with FMI’s measurement instrumentation will be used to obtain accurate and reliable pressure and humidity data from the surface of the red planet.

The Finnish Meteorological Institute (FMI) is among the scientific partners providing measurement equipment for the new Perseverance rover, expected to launch in July and land on Mars in February 2021. The pressure and humidity measurement devices developed by the FMI are based on Vaisala’s world-renowned sensor technology and are similar to, but more advanced than, the ones sent to Mars on the Curiosity rover in 2012.

Is there anybody out there?
Join this live webcast to hear more!
Join us to learn about space-proof technology: how it works, what it does, why it’s important, and why measurements play a key role in space research. You’ll hear examples and stories from our experts and from a special guest speaker, who will share his own experiences and insights about space.
• Date: July 20, 2020
• Time: 15.30–16.30 EEST – 14.30–15.30 CEST – 08.30–09.30 EDT
• Place: Virtual event – Sign up here
The event is organized by Vaisala and the Finnish Meteorological Institute. It will be held in English and it is free of charge. Live subtitles in Finnish will be available.
Learn more about space-proof technology before the event here and follow the discussion on social media using #spacetechFI.

The new mission equipment complements the Curiosity rover. While working on Mars, the Curiosity and Perseverance rovers will form a small-scale observation network. This network is only the first step towards the extensive observation network planned for Mars in the future.

International and scientific collaboration aims to gather knowledge of the Martian atmosphere and other environmental conditions
The Mars 2020 mission is part of NASA’s Mars Exploration Program. In order to obtain data from the surface of the Red Planet, NASA selected trusted partners to provide measurement instruments for installation on the Mars rover. A Spanish-led European consortium provides the rover with the Mars Environmental Dynamics Analyzer (MEDA): a set of sensors that provides measurements of temperature, wind speed and direction, pressure, relative humidity, and the amount and size of dust particles.

As part of the consortium, FMI delivers instrumentation to MEDA for humidity and pressure measurements based on Vaisala’s top quality sensors.

“Mars, as well as Venus, the other sister planet of Earth, is a particularly important target of atmospheric investigations due to its similarities to Earth. Studying Mars also helps us better understand the behavior of Earth’s atmosphere,” comments Maria Genzer, Head of the Planetary Research and Space Technology group at FMI.

The harsh and demanding conditions on Mars call for sensor technology that delivers accurate and reliable data without maintenance or repair.

“We are honored that Vaisala’s core sensor technologies have been selected to provide accurate and reliable measurement data on Mars. In line with our mission to enable observations for a better world, we are excited to be part of this collaboration. Hopefully the measurement technology will provide tools for finding answers to the most pressing challenges of our time, such as climate change,” says Liisa Åström, Vice President, Products and Systems of Vaisala.

Same technology, different planet – utilizing Vaisala core technologies for accuracy and long-term stability
In the extreme conditions of the Martian atmosphere, NASA will be able to obtain accurate readings of pressure and humidity levels with Vaisala’s HUMICAP® and BAROCAP® sensors. The sensors’ long-term stability and accuracy, as well as their ability to tolerate dust, chemicals, and harsh environmental conditions, make them suitable for very demanding measurement needs, including in space. The same technology is used in numerous industrial and environmental applications such as weather stations, radiosondes, greenhouses and data centers.

BAROCAP® wafer

The humidity measurement device MEDA HS, developed by FMI for Perseverance, utilizes standard Vaisala HUMICAP® humidity sensors. HUMICAP® is a capacitive thin-film polymer sensor consisting of a substrate on which a thin film of polymer is deposited between two conductive electrodes. The humidity sensor onboard is a new-generation sensor with superior performance, also in the low-pressure conditions expected on the Red Planet.

In addition to humidity measurements, FMI has developed a device for pressure measurement, MEDA PS, which uses customized Vaisala BAROCAP® pressure sensors optimized to operate in the Martian climate. BAROCAP® is a silicon-based micromechanical pressure sensor that offers reliable performance in a wide variety of applications, from meteorology to pressure-sensitive industrial equipment in the semiconductor industry and laboratory pressure standard measurements. Combining two powerful technologies – single-crystal silicon material and capacitive measurement – BAROCAP® sensors feature low hysteresis combined with excellent accuracy and long-term stability, both essential for measurements in space.

“Our sensor technologies are used widely in demanding everyday measurement environments here on Earth. And why not – if they work on Mars, they will work anywhere,” Åström concludes.

@VaisalaGroup @FMIspace @NASAPersevere #Metrology #Finland

Continuous mercury monitoring benefits cement plants.

Antti Heikkilä from Gasmet Technologies highlights the challenges faced by mercury monitoring in cement kilns, and explains how a new continuous mercury monitoring system addresses these issues and provides process operators with an opportunity to improve environmental performance and demonstrate compliance with forthcoming legislation.

The production of cement clinker and lime in rotary kilns is responsible for 10.7% of mercury emissions to air (3,337 kg), according to a recent study. Most of the mercury and mercury compounds pass through the kiln and preheater; they are only partly adsorbed by the raw gas dust, depending on the temperature of the waste gas. For these reasons, monitoring and controlling emissions of mercury to air is important, and steps are being taken in several countries to impose emission limits. In the European Union BREF guidance for cement kilns (CLM BREF), mercury has a BAT-associated emission level of 0.05 mg/Nm3 (50 µg/Nm3) for the half-hour average.

New monitoring technology

Figure 1

Gasmet Technologies has launched a new continuous mercury emission monitoring system (CMM) based on the cold vapour atomic fluorescence (CVAF) measurement principle. The analyser is integrated in an air-conditioned cabinet together with a vacuum pump, an automatic calibrator and a nitrogen gas generator. The sample gas is extracted from the process duct with a dilution probe and a heated sample line specially designed for sampling mercury from harsh process conditions (see figure 1, right). The analyser has a detection limit of 0.02 µg/Nm3 and the lowest measuring range for total mercury concentration is 0–10 µg/Nm3 when a dilution ratio of 1:50 is used in the sample extraction probe.

Since the CMM analyser employs a CVAF spectrometer, the sensitivity of the instrument is excellent, and the main source of measurement uncertainty that needs to be addressed by the analyser and system design is the quenching effect, where other gases present in the sample, such as O2 and H2O, suppress the fluorescence signal from mercury atoms. In order to avoid these adverse effects, a dilution sampling approach is used, and the dilution gas is synthetic nitrogen generated in a nitrogen generator inside the analyser cabinet. As the detection limit of the analyser is much lower than needed to monitor mercury in low µg/Nm3 ranges, dilution does not compromise the sensitivity of the instrument. At the same time, dilution lowers the quenching effect by reducing the concentration of interfering gases by a factor of 50. Measuring mercury in a gas consisting of 98% nitrogen guarantees consistent measurement regardless of the fuel or emission abatement techniques used in the plant.
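The dilution arithmetic described above can be sketched in a few lines. The reading value below is illustrative, not measured data; only the 1:50 ratio and measuring range come from the article.

```python
# Sketch of the 1:50 dilution arithmetic used in the CMM sampling system.
# The specific reading value is illustrative, not measured data.

DILUTION_RATIO = 50  # 1 part sample gas : 49 parts nitrogen

def stack_concentration(diluted_reading_ugnm3: float) -> float:
    """Convert the analyser's diluted-gas reading back to the stack concentration."""
    return diluted_reading_ugnm3 * DILUTION_RATIO

# With 1:50 dilution, the sample makes up only 1/50th of the measured gas,
# so the balance (49/50 = 98%) is nitrogen, and quenching gases such as
# O2 and H2O are likewise reduced by a factor of 50.
nitrogen_fraction = (DILUTION_RATIO - 1) / DILUTION_RATIO  # 0.98

# A diluted reading of 0.13 ug/Nm3 corresponds to 6.5 ug/Nm3 in the stack,
# still well inside the 0-10 ug/Nm3 lowest measuring range quoted above.
print(stack_concentration(0.13))
print(nitrogen_fraction)
```

The same multiplication gives the top of the range: a diluted reading of 0.2 µg/Nm3 maps to the 10 µg/Nm3 full-scale value.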

The CVAF spectrometer measures atomic mercury vapour (Hg0), and in order to measure total mercury, including oxidised forms, a thermal catalytic converter is used to convert all forms of mercury, such as mercury chloride, into atomic mercury. The converter is close-coupled with the fluorescence cell to minimise the risk of recombination reactions in which the atomic mercury converts back to oxidised forms between the converter and the spectrometer.

The system has been field tested on various types of industrial plants (coal fired power plant, hazardous waste incinerator, sulphuric acid plant and a cement plant) to characterise the suitability and long-term stability of the sample probe and dilution system in various processes. Given the reactive nature of mercury, special care has been taken to ensure that mercury in the flue gas is not absorbed into dust accumulating in the sample probe filters. Mercury reacts readily with limestone dust, resulting in analyte loss and increased response time of the analyser. The Gasmet CMM solution includes a smaller filter element, which minimises the amount of dust deposition on the filter, and a two-stage blowback mechanism which first removes dust from the filter element and then in the second stage expels the dust from the probe tube back into the process.

Field test at Finnish Cement Plant

Figure 2

The CMM was installed on the emission stack of a rotary kiln cement plant with an Electrostatic Precipitator (ESP) for particulate emission control (see figure 2 above). The test period lasted 30 days. The fuels used during the test included coal, petroleum coke and recovered fuels. The flue gas composition at the measurement point is summarised in table 1. During the field trial, the raw mill was periodically stopped and the variation in mercury levels was monitored together with changes in other process parameters. The average mercury concentration when the raw mill was running was 6–8 µg/Nm3; when the raw mill was stopped, concentrations could increase to 20–40 µg/Nm3. The plant had an emission limit value of 50 µg/Nm3 for total mercury.

Figure 3

Figure 3 (above) shows a typical 24-hour period of emissions including raw mill on and raw mill off conditions. In addition to Hg0 concentration, the dust loading and raw mill state are shown because these are the main parameters expected to have an impact on the mercury analyser.

The main goal of the test was to ensure the stability and repeatability of mercury measurement in demanding process conditions and to determine whether cement dust causes analyte loss and increased response time in the sample extraction probe.

The only process variable which clearly correlates with mercury concentration is the raw mill on/off state. When the raw mill is on, the variation in dust loading or other gas concentrations (O2, H2O, acid gases such as SO2 and HCl) does not correlate with variation observed in mercury concentration. When the raw mill is switched off, all gases including mercury undergo a change in concentration but this is clearly brought about by the raw mill state.

In order to estimate the repeatability of the Hg measurement at zero and span levels, the CMM analyser was configured to perform zero tests with synthetic nitrogen and span tests with Hg0 test gas generated by the mercury calibrator in the CMM system at 4 hour intervals. The normal test interval required by the analyser is 24 hours, but in the interest of creating more test data, the interval was shortened in this test. All test gases are injected into the probe upstream of particle filters so that the test gas has to pass through the potentially contaminated filters.

Figure 4

The results from six repeated span/zero test cycles are shown in figure 4 (above). The target level for the span check was 6.5 µg/Nm3 and the average span level was 6.60 ± 0.036 µg/Nm3. The average result for the zero check was -0.006 ± 0.037 µg/Nm3. If the dust accumulating in the sample extraction probe were to cause analyte loss during span tests, the later tests would show a decrease from the span check target value, but this was not observed. If the dust in the probe were to make the response time longer (a memory effect), the later tests would show a slower response than the first tests. Again, there was no systematic change in the test results, and tests 1–6 exhibited very consistent results.
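A "mean ± spread" figure like the span result above is simply the average and sample standard deviation over the repeated checks. The six readings below are invented for illustration; only the method is shown, not the actual test data.

```python
# How span-check repeatability figures of the form "6.60 +/- 0.036 ug/Nm3"
# are derived: mean and sample standard deviation over repeated checks.
# The six readings below are hypothetical, not the article's test data.
from statistics import mean, stdev

span_checks = [6.56, 6.63, 6.58, 6.62, 6.64, 6.57]  # ug/Nm3, illustrative

avg = mean(span_checks)
spread = stdev(span_checks)  # sample standard deviation (n - 1 denominator)

print(f"{avg:.2f} +/- {spread:.3f} ug/Nm3")
```

The sample (n − 1) standard deviation is the usual choice here, since the six cycles are a sample of the analyser's behaviour rather than the whole population of checks.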

The span and zero checks also provided an opportunity to characterise the response time of the analyser, since a span test at a known concentration is followed by a zero check at zero concentration. The data from all six tests (figure 4) were combined into one dataset by synchronising on the moment when each span/zero check cycle was started. A Boltzmann sigmoidal curve (eqn 1) was fitted to the experimental data using the GRG nonlinear fitting routine in the Microsoft Excel Solver package. The parameters of the response curve are summarised in table 2. The response time was evaluated as T90-10, the time interval between a reading representing 90% of the span check value and a reading representing 10% of the span check value. The response time from this calculation was 10.15 minutes, or just over two measurement cycles (measurement data is obtained as 5-minute rolling averages of the mercury concentration). The live data from the emissions shows peaks of comparable sharpness, but these were not subjected to the same analysis as the span/zero check data.
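For a Boltzmann sigmoid, T90-10 follows analytically from the fitted slope parameter, so the calculation can be sketched without redoing the fit. The parameter values below are illustrative; the actual fitted parameters are in the article's table 2.

```python
# Sketch of the T90-10 response-time calculation for a falling Boltzmann
# step (span -> zero). Parameter values here are illustrative only; the
# article's table 2 holds the actual fitted parameters.
import math

def boltzmann(t, a1, a2, t0, dt):
    """Boltzmann sigmoid: a1 as t -> -inf, a2 as t -> +inf, midpoint at t0."""
    return a2 + (a1 - a2) / (1.0 + math.exp((t - t0) / dt))

def t90_10(dt):
    """Time between the 90% and 10% crossings of a Boltzmann step.

    Solving f = 1 / (1 + exp((t - t0)/dt)) for f = 0.9 and f = 0.1 gives
    t = t0 -/+ dt*ln(9), so the interval is 2*dt*ln(9) ~= 4.394*dt.
    """
    return 2.0 * dt * math.log(9.0)

# A fitted slope of dt ~= 2.31 min would reproduce the ~10.15 min response
# time reported above (just over two 5-minute measurement cycles).
print(round(t90_10(2.31), 2))
```

The useful point is that once the sigmoid is fitted, T90-10 depends only on the slope parameter, not on the span level or the midpoint time.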

The requirements of a Continuous Mercury Monitoring system in a Cement plant are as follows:

  • capable of measuring a low baseline level with high sensitivity when the raw mill is on and the fuel feed contains low levels of metals
  • capable of measuring excursions to higher concentrations when the raw mill is off
  • low cross-interference from other gases, e.g. SO2
  • no analyte loss or other sampling issues in high dust loading
  • stable calibration and simplified calibration check routine with built-in calibration gas generator.

The main application areas for continuous mercury monitoring systems have been hazardous and municipal waste incineration and coal-fired power stations, where conditions differ from those in cement plants. Care must therefore be taken to ensure that the monitoring system, and especially its sample extraction probe, is suitable for the process conditions. This study demonstrates that a CVAF spectrometer and a dilution sampling approach can be successfully used in this application.

Wastewater treatment plant monitors greenhouse gas emissions.


Globally, little attention is paid to gaseous emissions from wastewater treatment processes. This contrasts greatly with the regulatory monitoring that is applied to the quality of water emissions from such facilities. However, in Helsinki (FI), a large municipal wastewater treatment facility continuously monitors its emissions of greenhouse gases (GHGs) to help in the city’s efforts to combat climate change and also to help improve the wastewater treatment process.

Employing a multigas FTIR (Fourier Transform InfraRed) analyser from Gasmet, a Helsinki-based manufacturer of analytical instrumentation, the plant’s managers are able to measure the effects of process control on GHG emissions such as carbon dioxide, methane and nitrous oxide. This also provides an insight into the fate of nitrogenous compounds within the wastewater stream.

The Viikinmäki wastewater treatment plant was built in 1994 to process wastewater from both domestic (85%) and industrial (15%) sources. However, the average temperature in Helsinki between December and February is around -4 °C, with extremes below -20 °C and even -30 °C, so the plant was built almost entirely underground to avoid the freezing temperatures. Underground construction is common practice in the Nordic countries, providing other advantages such as land availability above the plant and the provision of stable conditions for process control and odour management.

Viikinmäki Wastewater HSY (FI) (Photo courtesy of HSY)

The Viikinmäki plant is the largest wastewater treatment facility in Finland, handling approximately 270,000 m³ of wastewater per day, which amounts to about 100 million m³ per year. The wastewater is treated in compliance with the Finnish Wastewater Discharge Permit, which is stricter than the EU Water Framework Directive for parameters such as nitrogen removal, phosphate content, BOD, COD and suspended solids. Following treatment, the treated wastewater is conveyed 8 km out to sea and discharged at a depth of over 20 m. This might seem excessive, but the 16 km long discharge pipe was built in the 1980s and was designed to ensure that discharged wastewater did not accumulate on the shallow, scattered shores and nature reserves along the coastline of Helsinki.

The treatment process is based on the activated sludge method and includes three phases: mechanical, biological and chemical treatment. Traditional nitrogen removal has been enhanced with a biological filter that utilises denitrification bacteria.

The organic matter contained in the sludge produced in the wastewater treatment process is exploited by digesting the sludge, and the biogas generated in the digestion process is collected for further use. Thanks to the energy produced from biogas, the treatment plant is self-sufficient in terms of heating and about 70 per cent self-sufficient in terms of electricity. However, the plant aims to be fully energy self-sufficient in the near future, and around 60,000 tonnes of dried waste sludge is sold each year for landscaping purposes.

Gas monitoring
As a result of the size of the plant (E-PRTR reporting) and the commitment of the Helsinki Region Environmental Services Authority (HSY) to the protection of the environment, it was necessary to monitor or model gaseous emissions. At the beginning of the E-PRTR reporting requirements (2007), HSY modelled the annual gaseous emissions based on grab samples. However, monitoring was relatively simple to implement because the plant is enclosed underground and a gas exhaust system was already in place.

Viikinmäki Emissions Monitor (Photo courtesy of Gasmet Technologies)

Initially, a portable FTIR analyzer from Gasmet was hired for a short period to assess the plant’s emissions and for research purposes. However, as Mari Heinonen, Process Manager at Viikinmäki, reports: “The gas emissions data were very interesting but they were not representative of the annual emissions, and posed more questions than they answered.

“We therefore purchased a continuous emissions monitoring system (CEMS) from Gasmet, which was installed in late 2012 and we now have our first full year’s data for 2013.

“Very little data has been published on the GHG emissions of wastewater treatment and as far as we are aware, Viikinmäki is the only plant in the world conducting this type of monitoring, so our data is likely to be of major significance.”

The Gasmet CEMS employs an FTIR spectrometer to obtain infrared spectra from the waste gas stream. An interferometer first collects an ‘interferogram’ of the sample signal, measuring all infrared frequencies simultaneously; the interferogram is then transformed into a spectrum from which qualitative and quantitative data are produced. For example, the CEMS at Viikinmäki continuously displays emissions data for CH4, N2O, CO2, NO, NO2, and NH3.
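The interferogram-to-spectrum step can be illustrated with a toy Fourier transform. Real FTIR processing involves apodisation, fast Fourier transforms and calibrated reference spectra; the sketch below, with made-up frequencies, only shows the core idea that one transform recovers all spectral components at once.

```python
# Toy illustration of the FTIR principle: the interferogram measures all
# frequencies at once, and a Fourier transform recovers the spectrum.
# Frequencies and amplitudes here are invented for illustration.
import cmath
import math

def dft_magnitudes(signal):
    """Naive discrete Fourier transform magnitudes (first half of the bins)."""
    n = len(signal)
    return [abs(sum(signal[j] * cmath.exp(-2j * math.pi * k * j / n)
                    for j in range(n)))
            for k in range(n // 2)]

N = 256
# Synthetic "interferogram": two superposed cosines standing in for two
# absorption features, at frequency bins 20 and 60 with different strengths.
interferogram = [math.cos(2 * math.pi * 20 * j / N)
                 + 0.5 * math.cos(2 * math.pi * 60 * j / N)
                 for j in range(N)]

spectrum = dft_magnitudes(interferogram)
strongest = sorted(range(len(spectrum)), key=spectrum.__getitem__)[-2:]
print(sorted(strongest))  # the two spectral peaks sit at bins 20 and 60
```

In an analyser, each recovered spectral feature is compared against library reference spectra to identify and quantify the corresponding gas.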

Over a number of years, Gasmet has established a library of FTIR reference spectra that now extends to simultaneous quantification of 50 gases or identification of unknowns from a collection of 5000+ gases. This means that it is possible to reanalyse produced spectra with the instrument’s PC based software (Calcmet) and thereby to identify unknown gases – a major advantage of FTIR.

Whilst FTIR is able to analyse an enormous number of gases, the technique is not suitable for noble gases, homonuclear diatomic gases (e.g. N2, Cl2, H2, F2) or H2S (detection limit too high).

Gasmet FTIR technology was chosen for the Viikinmäki plant because of its ability to monitor multiple gases simultaneously. Mari Heinonen adds: “The system has performed very well, with very little maintenance required. Zero-point calibration with nitrogen (background) takes just a few minutes each day and is fully automated. Water vapour calibration is conducted at least once per year, but under normal circumstances no other calibration is necessary.”

With the benefit of the monitoring data, Mari Heinonen has calculated the annual emissions of methane to be around 350 tonnes, and of nitrous oxide around 134 tonnes. This means that the emissions per cubic metre of wastewater equate to 3.5 g of methane and 1.34 g of nitrous oxide.
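The per-cubic-metre figures follow directly from the annual tonnages and the plant's roughly 100 million m³ annual flow quoted earlier:

```python
# The arithmetic behind the per-cubic-metre emission figures:
# tonnes per year divided by the plant's annual wastewater flow.

ANNUAL_FLOW_M3 = 100e6  # ~270,000 m3/day, as stated earlier in the article

def grams_per_m3(tonnes_per_year: float) -> float:
    return tonnes_per_year * 1e6 / ANNUAL_FLOW_M3  # 1 tonne = 1e6 g

print(grams_per_m3(350))  # methane: 3.5 g/m3
print(grams_per_m3(134))  # nitrous oxide: 1.34 g/m3
```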

Looking forward, Mari believes that it will be possible to use the gas monitoring data to improve process control: “Traditional monitoring/control systems focus on concentrations of oxygen, nitrate and ammonia in the water, but if we detect high levels of N2O gas for example, this may indicate a problem in the process that we can use as a feedback control.

“The monitoring data for gaseous nitrogen compounds (N2O, NH3, NOx) complements water analysis and provides a more complete picture of the nitrogen cycle in the treatment process.

“Clearly, further research will be required, but this work may indicate a need to consider the fate of nitrogenous compounds beyond just those in the wastewater; the removal of nitrogen from wastewater is a key objective, but if this results in high N2O emissions the process may need to be managed in a different way.”

Documenting calibrators


Benefits of using documenting calibrators

Why calibrate?
For process manufacturers, regular calibration of instruments across a manufacturing plant is common practice. In plant areas where instrument accuracy is critical to product quality or safety, calibration every six months – or even more frequently – is not unusual. However, the key final step in any calibration process – documentation – is often neglected or overlooked because of a lack of resources, time constraints or the pressure of everyday activities. Indeed, many manufacturers are now outsourcing all or some of their maintenance activities and so the contractor too is now under the same pressure to calibrate plant instruments quickly but accurately and to ensure that the results are then documented for quality assurance purposes and to provide full traceability.

Benefits in Practice
Northern Energy Services (NES) Ltd is using Beamex® CMX Calibration Software and the Beamex® MC5 Multifunction Calibrator to carry out instrument calibrations for its customers.
NES is a service provider for the power generation and petrochemicals industries. NES carries out a range of services for its clients, including routine maintenance, calibrations, installations, electrical services, fault diagnosis and repair, ATEX inspections and re-certifications.
David Tuczemskyi, Control & Instrumentation Engineer at NES, has worked in his role for more than 10 years. He comments: “Most of our work is for gas-fired power stations in Britain and Ireland, which involves a significant amount of instrument calibrations. Typically, once a year for a major inspection, the plant will shut down for a month and we are called in to carry out all instrument calibrations across the site.”
In Tuczemskyi’s experience, the industry is often guilty of relying on manual paper-based systems for documenting instrument calibrations. As he explains: “Calibration is done manually, which takes longer and is prone to manual error. Often, the field engineer calibrates the instrument, handwrites the results onto a paper form of some kind, and then has to re-enter this information into a database on his or her return to the office. Unintentional errors often creep in here and the whole process is time consuming.”
So, in the last few years, NES has issued its team of engineers with the MC5 documenting calibrator and the company has also bought the CMX calibration software. “We are actively pushing our customers, the power generation plants, to use Beamex’s hardware and software,” he explains. Why? Because, he says, you get higher accuracy, the calibration process is much faster and the customer gets full traceability.
“What surprised me most when I started using the MC5 was that the calibration tasks were being done much faster but also more accurately. In our industry, a faster job usually means the engineer has cut corners. When you’ve got to calibrate 1,000 instruments across a site, typically with five-point checks on each instrument, speed and accuracy are critical.”
After completing instrument calibrations, NES typically provides its clients with a full quality assurance report of all instruments that have been calibrated at the site, along with a calibration certificate if required. “This not only ensures full traceability but looks professional and reflects well on us as a service provider,” states Tuczemskyi.
Over the years, the biggest change that Tuczemskyi has seen in the British power industry is in regulations and the auditing process. “You simply cannot get away with it now. Everything you do has to be traceable. The problem here is that the end customer has little or no time to spend with the contractor, so the contractor has to be fully competent in everything it does. The customer wants to outsource as much of the maintenance and calibration work as possible these days. The MC5 enables us to perform all the required instrument calibrations on a site and then automatically download this information to the CMX software. We perform all the necessary back-ups for the customer and the whole process is fully integrated and efficient.”
Instruments that require calibration are normally given a priority rating by the customer. Safety-critical instruments often have the highest priority and will be checked every three to six months, with lower-priority instruments only being checked once a year or less. “CMX removes all these issues and enables users to prioritise their instruments and then to receive automatic alerts when safety-critical sensors require calibrating. When it comes to plant safety, you cannot always rely on manual paper systems and poor databases.”
Tuczemskyi also likes the fact that the combination of the MC5 and CMX means that instructions on how to calibrate an instrument, and the order in which to calibrate, can all be downloaded to the handset while the engineer is out in the field. “We did some calibration work for a customer on three gas turbines, two of which were running, the other on standby. Certain instruments we had to calibrate were on a common block trip, and we knew from our own experience that we had to calibrate these in a specific order and by a specific method, otherwise we could inadvertently have caused one of the turbines to trip. A two-hour shutdown on a gas turbine, for example, would have cost this customer around £100,000 in downtime costs. By using the MC5 and CMX software in situations like these, the contractor doesn’t have to rely on experience like we did, but can download specific instructions to the calibrator before calibrating the instruments, which ensures costly mistakes out in the field are never made.”
According to Tuczemskyi, it took NES engineers only two weeks to get to grips with the CMX software. “Technical support at Beamex is absolutely superb. Three years ago, we were under immense pressure to get a job completed, when we had a software glitch that prevented us from uploading or downloading calibration results. We contacted Beamex and the guys repaired our fault within two days.”

The purpose of calibration itself is to determine how accurate an instrument or sensor is. Although most instruments are very accurate these days, regulatory bodies often need to know just how inaccurate a particular instrument is and whether it drifts in and out of specified tolerance over time.

What is a documenting calibrator?
A documenting calibrator is a handheld electronic device that is capable of calibrating many different process signals, such as pressure, temperature and electrical signals, including frequency and pulses, and then automatically documenting the calibration results by transferring them to a fully integrated calibration management system. Some calibrators can read the HART, Foundation Fieldbus or Profibus outputs of transmitters and can even be used for configuring ‘smart’ sensors.

Heikki Laurila, Product Manager at Beamex in Finland comments: “I would define a documenting calibrator as a device that has the dual functionality of being able to save and store calibration results in its memory, but which also integrates and automatically transfers this information to some sort of calibration management software.” A non-documenting calibrator is a device that does not store data, or stores calibration data from instruments but is not integrated to a calibration management system. Calibration results have to be keyed manually into a separate database, spreadsheet or paper filing system.

Why use a documenting calibrator?
Calibrating Transmitters
By using a documenting calibrator, the calibration results are stored automatically in the calibrator’s memory during the calibration process. The engineer does not have to write any results down on paper and so the whole process is much faster and costs are therefore reduced. Quality and accuracy of calibration results will also improve, as there will be fewer mistakes due to human error. The calibration results are transferred automatically from the calibrator’s memory to the computer/database. This means the engineer does not spend time transferring the results from his notepad to final storage on a computer, again, saving time and money.

With instrument calibration, the calibration procedure itself is critical. Performing the calibration procedure in the same way each time is important for consistency of results. With a documenting calibrator, the calibration procedure can be automatically transferred from the computer to the handheld calibrator before going out into the field. As Heikki Laurila states: “Engineers out in the field performing instrument calibrations get instant pass or fail messages with a documenting calibrator. The tolerances and limits for a sensor, as well as detailed instructions on how to calibrate the transmitter, are input once in the calibration management software and then downloaded to the calibrator. This means calibrations are carried out in the same way every time, as the engineer is told by the calibrator which test point to measure next. Also, having an easy-to-use documenting calibrator is definitely the way forward, especially if calibration is one of many tasks that the user has to carry out in his or her daily maintenance routine.”
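The pass/fail decision described above boils down to comparing each test point's error against a tolerance. The sketch below is a hypothetical illustration of that logic, not Beamex's actual implementation; the function name, tolerance and readings are all invented.

```python
# Minimal sketch of the pass/fail logic a documenting calibrator applies
# across a set of test points. Hypothetical illustration only; names,
# tolerance and readings are invented, not Beamex's implementation.

def five_point_check(test_points, span, tolerance_pct_of_span):
    """test_points: [(expected, measured), ...] in the output units.

    Returns (passed, worst_error_pct), with errors expressed as a
    percentage of the instrument's output span, as is common practice.
    """
    worst = max(abs(measured - expected) for expected, measured in test_points)
    worst_pct = worst / span * 100.0
    return worst_pct <= tolerance_pct_of_span, worst_pct

# Example: a 4-20 mA transmitter checked at 0/25/50/75/100% of range
# against a 0.5%-of-span tolerance (values invented for illustration).
points = [(4.00, 4.02), (8.00, 7.99), (12.00, 12.03),
          (16.00, 15.98), (20.00, 19.97)]
passed, worst = five_point_check(points, span=16.0, tolerance_pct_of_span=0.5)
print(passed, round(worst, 3))  # passes; worst error is ~0.19% of span
```

In a documenting calibrator, the tolerance and the list of test points arrive from the calibration management software, and the (passed, error) results flow back the other way.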

With a multifunction documenting calibrator, the user doesn’t need to carry as much equipment while out in the field. It can also be used to calibrate, configure and trim Foundation Fieldbus H1 or Profibus PA transmitters.

Heikki Laurila continues: “With a documenting calibrator such as the BEAMEX MC5, the user can download calibration instructions for hundreds of different instruments into the device’s memory before going out into the field. The corresponding calibration results for these instruments can be saved in the device without the user having to return to his PC in the office to download/upload data. This means the user can work for several days in the field.”
Having a fully integrated calibration management system – using documenting calibrators and calibration management software – is important. It ensures that calibration procedures are carried out at the correct time and that calibration tasks do not get forgotten, overlooked or become overdue.