Ensuring accurate pigment dispensing.

20/06/2018

PD Edenhall Ltd is one of the largest independent concrete facing brick manufacturers in Britain. The company needed to monitor accurately the quantity of pigment being dispensed into the weigh hopper throughout brick manufacture.

Pigment must be dispensed precisely: inaccurate amounts can lead to incorrect colour blends, resulting in lost sales and profit.

Solution – Accurate DBBSM S-Beam Load Cells and Intuitive4-L Digital Indicator
To monitor the quantity of pigment going into the blend, three DBBSM S-beam load cells were fitted to the supports of the pigment weigh hopper. The load cells monitored the weight of the hopper throughout the pigment dispensing process, and their outputs were fed to an Intuitive4-L load cell digital indicator. This enabled the engineers to check continuously that the correct amount of pigment was being dispensed into the mix.
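As a rough illustration of what the indicator does internally, the sketch below converts a bridge signal into a weight reading. This is not PD Edenhall's or Applied Measurements' code; the excitation, sensitivity and capacity figures are hypothetical stand-ins.

```python
# Illustrative sketch: converting the combined output of three bridge
# load cells into a total weight. All numbers are assumptions.

EXCITATION_V = 10.0        # indicator excitation voltage
SENSITIVITY_MV_V = 2.0     # rated output in mV/V at full capacity
RATED_CAPACITY_KG = 500.0  # hypothetical rated capacity of each cell

def weight_from_signal(signal_mv: float, n_cells: int = 3) -> float:
    """Convert the averaged bridge signal (in mV) to total weight (kg).

    With identical cells wired in parallel, the indicator sees the
    average of the individual outputs while the load is shared
    across all cells.
    """
    full_scale_mv = SENSITIVITY_MV_V * EXCITATION_V  # 20 mV at rated load
    fraction_of_capacity = signal_mv / full_scale_mv
    return fraction_of_capacity * RATED_CAPACITY_KG * n_cells

# e.g. an averaged signal of 4.0 mV -> 20% of capacity -> 300 kg total
print(weight_from_signal(4.0))
```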

“We chose Applied Measurements’ DBBSM s-beam load cells as the pigment is £1000 per tonne so has to be extremely accurate,” Paul Akers, Works Manager at PD Edenhall Ltd, told us.

DBBSM S-Beam Load Cells

  • Capacities: 0-1kg up to 0-30,000kg
  • Force & Load Measuring
  • Tension & Compression
  • Output: 2mV/V to 2.7mV/V
  • High Accuracy: <±0.03% Rated Capacity
  • Custom Versions Available
  • Fast and Simple to Install

The DBBSM S-beam load cells were ideal for this application because of their accuracy of better than ±0.03% of rated capacity; on a hypothetical 500kg cell, for example, that amounts to an error of no more than ±0.15kg. Their dual bending beam design underpins this accuracy, which can be improved further by using our specially designed rod end bearings to reduce any extraneous forces.

They offer optional sealing to provide protection in this dusty environment, plus a robust 4-core polyurethane cable, making them ideal for use in the pigment dispensing machine.

Intuitive4-L Load Cell Digital Indicator

  • 6-Digit LED Display (±199999)
  • Modular Construction
  • Fast & Simple to Setup
  • Ideal for Harsh Environments – Dust Tight IP65 Protection (Once Installed)
  • Superior Accuracy – 10 Point Linearisation
  • Higher Stability – Signal Filtering Adjustment
  • Improved Resolution – 20-bit A/D Converter
  • Compatible with the INT2 Series
  • 10Vdc Load Cell Excitation @ 120mA max.
  • Powers up to 4x 350Ω Load Cells (see the quick check after this list)
  • Available in Less Than 1 Week
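
As a quick check of those last two figures: four 350Ω bridges in parallel present 87.5Ω, so a 10Vdc excitation supply sources 10V ÷ 87.5Ω ≈ 114mA, comfortably inside the 120mA maximum.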

The Intuitive4-L load cell digital indicator was chosen for this application because superior accuracy was needed given the high cost of the yellow pigment.  The Intuitive4-L offers 10-point linearisation for outstanding accuracy, coupled with a 20-bit A/D converter for high resolution.  Signal filtering adjustment options reduce the effect of noise or instability on the input signal, and an active filter reduces the effects of vibration and other external sources of system noise.
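To make multi-point linearisation concrete, here is a minimal sketch of the technique, assuming a hypothetical five-point calibration table rather than the indicator's actual firmware:

```python
# Sketch of piecewise-linear correction: raw ADC readings are mapped to
# true weights by interpolating between stored calibration points.

# Hypothetical calibration table: (raw reading, true applied weight in kg)
CAL_POINTS = [(0, 0.0), (10500, 50.0), (20800, 100.0),
              (31400, 150.0), (41800, 200.0)]

def linearise(raw: int) -> float:
    """Interpolate a raw reading against the calibration table."""
    pts = sorted(CAL_POINTS)
    if raw <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if raw <= x1:
            return y0 + (raw - x0) * (y1 - y0) / (x1 - x0)
    return pts[-1][1]  # clamp above the last calibration point

print(linearise(15650))  # falls between the 50 kg and 100 kg points -> 75.0
```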

The Intuitive4-L load cell digital indicator is fast and simple to set up, with a single-layer menu that makes options easier to find.  Dimensions and fittings are fully compatible with the existing Intuitive2 models, making the switchover to this new, improved version even easier.

Once installed, the Intuitive4-L load cell digital indicator has an IP65 dust-tight protection rating, making it ideal for use in this harsh construction environment.  If that’s not enough, we can also provide an optional waterproof front panel cover for an extra level of protection.

The Intuitive4-L digital panel meter has a modular construction, meaning that PD Edenhall Ltd could configure it to their exact specification, saving them money.  Options include voltage or current analogue outputs, 2 or 4 alarm relays, and a serial data output in one of several formats including RS232 ASCII, RS485 ASCII and RS485 Modbus RTU, making this a truly flexible load cell digital indicator.
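As a hint of how such a serial output might be consumed, the sketch below polls a weight value over Modbus RTU using the minimalmodbus Python library. The slave address, register number and scaling are assumptions for illustration; the real register map lives in the indicator's manual.

```python
# Hypothetical sketch of reading a weight over RS485 Modbus RTU.
import minimalmodbus

instrument = minimalmodbus.Instrument('/dev/ttyUSB0', slaveaddress=1)
instrument.serial.baudrate = 9600

# Assumed: gross weight held in holding register 0 with 2 decimal places.
weight_kg = instrument.read_register(0, number_of_decimals=2, functioncode=3)
print(f"Hopper weight: {weight_kg} kg")
```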

@AppMeas #PAuto @EdenhallUK

The world of virtual commissioning.

15/06/2018
Robert Glass, global food and beverage communications manager at ABB explores the concept of virtual commissioning and how system testing can benefit the food industry.

In 1895, pioneer of astronautic theory, Konstantin Tsiolkovsky, developed the concept of the space elevator, a transportation system that would allow vehicles to travel along a cable from the Earth’s surface directly into space. While early incarnations have proven unsuccessful, scientists are still virtually testing new concepts.

Industry 4.0 continues to open up new opportunities across food and beverage manufacturing. In particular, these technologies help improve manufacturing flexibility and the speed and cost at which manufacturers are able to adapt their production to new product variations. Virtual commissioning is one of these key technologies.

What is virtual commissioning?
Virtual commissioning is the creation of a digital replica of a physical manufacturing environment. For example, a robotic picking and packing cell can be modeled on a computer, along with its automation control systems, including robotic controllers, PLCs, variable speed drives, motors, and even safety products. This “virtual” model of the robot cell can be modified according to new process requirements and product specifications. Once the model is programmed, every step of the cell’s operation can be tested and verified in the virtual world. If changes are needed in the process automation or robot movement, they can be made on the same computer, allowing the robot to be reprogrammed, or changes to be made to the variable speed drives and PLC programming. The ABB Ability™ RobotStudio is one tool that enables this type of virtual commissioning.

Once reprogrammed, the system is tested again and if it passes, it’s ready for physical deployment. This is where the real benefits become tangible. By using virtual commissioning to program and test ahead of time, less process downtime is required and manufacturers can reduce the changeover risks.

Automation programming and software errors in a system can be incredibly difficult and costly to rectify, particularly if they are found late in the production process. Research by Austrian software testing firm Tricentis estimated that software bugs, glitches and security failures cost businesses across the world $1.1 trillion.

To achieve the full potential of virtual commissioning, the simulation must be integrated across the entire plant process, including the planning and engineering phases. Known as simulation-based engineering, this step is integral to the installation of reliable systems. The use of simulations in a plant is not a new concept; in fact, virtual commissioning has been researched for more than a decade.

The benefits
The implementation of virtual commissioning brings with it a number of benefits. The ‘try before you buy’ concept allows plant managers to model and test the behavior of a line before making any physical changes. This saves time as the user can program the system’s automation while testing and fixing errors. The use of a digital model can also reduce risk when changing or adding processes.

One company which has seen significant improvements in production since investing in virtual commissioning is Comau, a supplier of automotive body and powertrain manufacturing and assembly technologies. Comau’s head of engineering and automation systems, Francesco Matergia, said: “We were able to reprogram 200 robots in just three days using virtual commissioning, as opposed to roughly 10 weekends had the work been done on the factory floor.”

Just as you wouldn’t build a space elevator without meticulous planning and years of small-scale prototyping, it is far cheaper and faster to build and test in a virtual environment, where you can find the bugs, discover the unforeseen challenges and mitigate them without added downtime or loss of production. It’s much better to discover that bug while on the ground than at 100,000 feet, midway between the surface of the earth and that penthouse in space.

@ABBgroupnews #PAuto @StoneJunctionPR

Monitoring and managing the unpredictable.

07/06/2018
Energy sector investments in big data technologies have exploded. In fact, according to a study by BDO, the industry’s expenditure on this technology in 2017 increased tenfold compared to the previous year, with the firm attributing much of this growth to the need for improved management of renewables. Here, Alan Binning, Regional Sales Manager at Copa-Data UK, explores three common issues for renewables — managing demand, combining distributed systems and reporting.

Renewables are set to be the fastest-growing source of electrical energy generation in the next five years. However, this diversification of energy sources creates a challenge for existing infrastructure and systems. One of the most notable changes is the switch from reliable to fluctuating power.

Implementing energy storage
Traditional fossil-fuel plants operate at a predetermined level, providing a consistent and predictable amount of electricity. Renewables, on the other hand, are much less dependable. For example, energy output from a solar farm can drop without warning due to clouds obscuring sunlight from the panels. Similarly, wind speeds cannot be reliably forecast. To prepare for this fluctuation in advance, research and investment into energy storage systems are on the rise.

Wind power ramp events, for example, are a major challenge: the grid may not always be able to absorb excess wind power created by an unexpected increase in wind speed. Ramp control applications allow the turbine to store this extra power in a battery instead, which makes developing energy storage mechanisms essential. When combined with reliable live data, these systems can support informed models for intelligent distribution.

Britain has recently become home to one of the largest energy storage projects to use EV batteries. While it is not the first time car batteries have been used for renewable power, the Pen y Cymoedd wind farm in Wales has connected a total of 500 BMW i3 batteries to store excess power.

Combining distributed systems
Control software is the obvious solution for better monitoring of this fluctuating source of power. However, many renewable generation sites, like solar PV and wind farms, are distributed across a wide geographical area and are therefore difficult to manage without sophisticated software.

Consider offshore wind farms as an example. The world’s soon-to-be-largest offshore wind farm is currently under construction 74.5 miles off the Yorkshire coastline. To manage such vast generation sites accurately, the data from each asset needs to be combined into a single view.

This software should be able to combine many items of distributed equipment, whether that’s an entire wind park or several different forms of renewable energy sources, into one system to provide a complete visualisation of the grid.

Operators could go one step further by overlaying geographical information system (GIS) data into the software. This could provide a map-style view of renewable energy parks, or even the entire generation asset base, allowing operators to zoom in on the map to reveal greater levels of detail. This provides a full, functional map that enables organisations to make better-informed decisions.

Reporting on renewables
Controlling and monitoring renewable energy is the first step to better grid management. However, it is what energy companies do with the data generated from this equipment that will truly provide value. This is where reporting is necessary.

Software for renewable energy should be able to visualise data in an understandable manner so that operators can see the types of data they truly care about. For example, wind farm owners tend to be investors and therefore generating profit is a key consideration. In this instance, the report should compare the output of a turbine and its associated profit to better inform the operator of its financial performance.

Using intelligent software, like zenon Analyzer, operators can generate a range of reports about any information they would like to assess — and the criteria can differ massively depending on the application and the objectives of the operator. Reporting can range from a basic table of outputs, to a much more sophisticated report that includes the site’s performance against certain key performance indicators (KPIs) and predictive analytics. These reports can be generated from archived or live operational data, allowing long term trends to be recognised as well as being able to react quickly to maximise efficiency of operation.

As investments in renewable energy generation continue to increase, the need for big data technologies to manage these sites will also continue to grow. Managing these volatile energy sources is still a relatively new challenge. However, with the correct software to combine the data from these sites and report on their operation, energy companies will reap the rewards of these increasingly popular energy sources.


Challenges facing the energy sector.

21/05/2018

Leaders from Britain’s energy industry attended COPA-DATA’s zenon Energy Day 2018 at Microsoft’s Thames Valley centre. The event, held in April 2018, welcomed industry experts and energy suppliers to address the current challenges the sector is facing — renewable generation, substation automation, IoT and cyber security.


A welcome speech from COPA-DATA’s British MD, Martyn Williams, opened a day that encompassed a series of talks from industry experts. Speakers included Ian Banham, IoT Technical Sales Lead UK for Microsoft; Chris Dormer of systems integrator Capula; and Jürgen Resch, COPA-DATA Energy Industry Manager.

Preparing for renewables
Only 24 per cent of Britain’s electricity comes from renewable sources — a relatively low figure compared to some European countries. However, the percentage is growing. In 2000, Britain’s renewable capacity was 3,000 MW; by the end of 2016 it had risen eleven-fold to 33,000 MW.

To prepare for the challenges facing this market, Jürgen Resch’s presentation discussed how software can address some of the common issues associated with renewable energy generation, including the growing demand for energy storage.
“Energy storage is often used in combination with renewables because renewable energy is volatile and fluctuating,” explained Resch. “In Korea, the government is pumping $5 billion into energy storage systems. In fact, every new building that is built in Korea gets an energy storage battery fitted in the basement.”

BMW’s battery storage farm in Leipzig, Germany, was also presented as an example. The facility, which uses COPA-DATA’s zenon as the main control centre system, uses 700 high-capacity used battery packs from BMW i3s and could also provide storage capacity for local wind energy generation.

Moving on to specific issues related to wind generation, Resch discussed the potential challenge of reporting in a sector reliant on unpredictable energy sources.
“Reports are particularly important in the wind power industry,” he said. “Typically, owners of wind farms are investors and they want to see profits. Using software, like zenon Analyzer, operators can generate operational reports.

“These reports range from a basic table with the wind speeds, output of a turbine and its associated profit, to a more sophisticated report with an indication of the turbine’s performance against specific key performance indicators (KPIs).”

Best practice for substation automation
Following the morning’s keynote speeches on renewable energy, Chris Dormer of Capula presented the audience with a real-life case study. The speech discussed how smart automation helped address significant issues related to the critical assets of the National Grid’s substations, where Capula was contracted to refurbish the existing substation control system at New Cross.

“Like a lot of companies that have developed, grown and acquired assets over the years, energy providers tend to end up with a mass mixture of different types of technology, legacy equipment and various ways of handling data,” explained Dormer. “For projects like this, the first key evaluation factor is choosing control software with legacy communication. We need to ensure the software can talk to both old legacy equipment in substations and modern communication protocols, whilst also ensuring it is scalable and compliant.

“The National Grid will make large investments into IEC 61850 compatible equipment, therefore for this project, we needed an IEC 61850 solution. Any system we put in, we want to support it for the next 25 years. Everyone is talking about digital substations right now, but there are not that many of them out there. That said, we need to prepare and be ready.”

The case study, which was a collaborative project with COPA-DATA, was recognised at the UK Energy Innovation Awards 2017, where it was awarded the Best Innovation Contributing to Quality and Reliability of Electricity Supply.

“Our collaboration with COPA-DATA allows us to address modern energy challenges,” explained Mark Hardy, Managing Director of Capula upon winning the award last year. “It helps drive through the best value for energy customers.”

Cyber security – benefit or burden?
“Raise your hand if you consider cyber security to be a benefit?” Mark Clemens, Technical Product Manager at Copa Data asked the audience during his keynote speech on cyber security. “Now, raise your hand if you consider it to be a burden?”

Clemens’ question provided interesting results. Numerous attendees kept their hands raised for both questions, giving an insight into the perception of cyber security among those operating in the energy industry — a necessary evil.

“A cyber-attack on our current infrastructure could be easy to execute,” continued Clemens. “95 per cent of communication protocols in automation systems don’t provide any security features. For those that do provide security, the mechanisms are often simply bolted-on.”

Clemens continued to explain how substation design can strengthen the security of these sites. He suggested that, despite living in the era of IoT, energy companies should limit the communication between devices to only those that are necessary. The first step he suggested was to establish a list of assets, including any temporary assets like vendor connections and portable devices.

“There are lots of entry points into a substation, not only through the firewall but through vendors and suppliers too. This doesn’t have to be intentional but could be the result of a mistake. For example, if an engineer working in the substation believes they are testing in simulation mode, but they are not, it could cause detrimental problems.”

Collaborating with Microsoft
The address of Microsoft’s UK IoT Technical Sales Lead, Ian Banham, focused on the potential of cloud usage for energy companies. When he asked attendees whether they had already invested in the cloud, or planned to do so, the audience proved to be a 50:50 split of cloud enthusiasts and sceptics.

“IoT is nothing new,” stated Ian Banham, IoT Technical Sales Lead at Microsoft. “There’s plenty of kit that does IoT that is over 20 years old, it just wasn’t called IoT then. That said, there’s not a great deal of value in simply gathering data, you’ve got to do something with that data to realise the value from it.

“The change in IoT is the way the technology has developed. That’s why we are encouraging our customers to work with companies like COPA-DATA. They have done the hard work for you because they have been through the process before.”

He explained how Microsoft’s cloud platform, Azure, can be integrated with COPA-DATA’s automation software, zenon. In fact, COPA-DATA’s partnership with Microsoft is award-winning, the company having won Microsoft Partner of the Year in the IoT category in 2017.

@copadata #PAuto @Azure #Cloud #IoT


Keep making the tablets!

08/05/2018
This article shows how costly production line downtime in the pharmaceutical industry can be reduced by predictive maintenance of tablet-making machinery using Harting’s MICA industrial computing platform.

Introduction
Harting recently challenged postgraduate students from the Centre for Doctoral Training in Embedded Intelligence at Loughborough University to investigate practical applications where MICA – the company’s innovative, open-platform, ruggedised industrial edge computing device – could be applied to the benefit of manufacturing. Simple, seamless integration within existing established production processes was the target, based on the concept of machine predictive maintenance.

The key objective was to achieve immediate productivity improvements and return on investment (RoI), thus satisfying the increasing trend for Integrated Industry 4.0 implementation on the factory floor. One such proposal was suggested for volume manufacturers in the pharmaceutical industry: in particular, those companies manufacturing tablets using automated presses and punch tools.

Data from these machines can be collected using passive UHF RFID “on metal” transponders which can be retrofitted to existing tablet press machines and mounted on the actual press-die/punch tools. The RFID read/write tags can record the pressing process, i.e. the number of operations performed by a particular press die, plus any other critical sensor-monitored operating conditions. The system can then review that data against the expected normal end-of-life limits set for that die.
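A minimal sketch of that review step might look like the following; the die identifiers, press counts and 95% warning threshold are all hypothetical:

```python
# Illustrative sketch: compare each die's RFID-logged press count
# against its projected end-of-life limit and flag dies nearing it.

# Hypothetical data as it might be read back from the UHF RFID tags:
dies = {
    "die-A17": {"presses": 9_640_000, "eol_limit": 10_000_000},
    "die-B03": {"presses": 2_110_000, "eol_limit": 10_000_000},
}

WARN_FRACTION = 0.95  # alert once 95% of projected life is used

for die_id, record in dies.items():
    used = record["presses"] / record["eol_limit"]
    if used >= WARN_FRACTION:
        print(f"MAINTENANCE ALERT: {die_id} at {used:.0%} of projected life")
```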

Such data can be managed and processed through Harting’s MICA edge computing device, which can then automatically alert the machine operator that maintenance needs to take place to replace a particular die-set before it creates a catastrophic tool failure condition and breakdown in the production line – which unfortunately is still quite a common occurrence.

Open system software
MICA is easy to use, with a touch-optimised interface for end users and administrators implemented entirely in HTML5 and JavaScript. It provides an open system software environment that allows developers from both the production and IT worlds to quickly implement and customise projects without any special tools. Applications are executed in their own Linux-based containers, which contain all the necessary libraries and drivers. This means that package dependencies and incompatibilities are eliminated. In addition, such containers run in individual “sandboxes” which isolate and secure different applications from one another with their own separate log-in and IP addresses. As a result, there should be no concerns over data security when MICA is allowed access to a higher-level production ERP network.

MICA is already offered with a number of containers such as Java, Python, C/C++, OPC-UA, databases and web toolkits, all available as free downloads via the HARTING website. As a result, users should be able to download the operating software compatible with an existing machine, enabling full two-way communication with the MICA device. Relaying such manufacturing information, which can comprise many gigabytes of data in the course of a day, directly to the ERP would normally overwhelm both the network and the ERP. With the MICA, this data stream is buffered directly on the machine and can be reduced to just the essential business-critical data using proven tools from the IT world.
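The buffer-and-reduce idea can be illustrated with a short sketch; the readings and summary statistics below are invented stand-ins for whatever business-critical figures a plant would actually forward:

```python
# Sketch: high-rate sensor samples are aggregated on the edge device
# and only a compact summary is forwarded to the ERP network.
from statistics import mean

def summarise(samples: list[float]) -> dict:
    """Reduce a batch of raw readings to a few key figures."""
    return {
        "count": len(samples),
        "mean": round(mean(samples), 2),
        "max": max(samples),
        "min": min(samples),
    }

raw_pressure_readings = [101.2, 101.9, 103.4, 99.8, 102.1]  # stand-in data
print(summarise(raw_pressure_readings))  # only this summary leaves the device
```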

The resultant improvements in productivity include:

– Less downtime reduces the amount of money lost during unforeseen maintenance of damaged punch tools.
– Individual punch identification helps in removing a specific punch once it has reached its pre-set operational working limit.
– A digital log of each punch and the number of tablets that it has produced is recorded. This provides vital information for GMP (Good Manufacturing Practice) regulators such as the MHRA (Medicines & Healthcare products Regulatory Agency) or the FDA (Food & Drug Administration).

A further benefit is that MICA is very compact, with DIN rail mounting fixing options that allow it to be easily accommodated inside a machine’s main control cabinet.

@HARTING #PAuto #Pharma @CDT_EI

Researchers investigate ultra-low Mediterranean nutrient levels.

25/04/2018

Researchers at Haifa University’s Marine Biological Station in Israel are exploiting the ultra-low detection limits of advanced laboratory equipment to measure extremely low nutrient concentrations in marine water.

[Photo: H. Nativ – Morris Kahn Marine Research Station]

The University’s Prof. M. D. Krom says: “We work in the Eastern Mediterranean which has the lowest regional concentration of dissolved nutrients anywhere in the global ocean. We therefore utilize an automated segmented flow analyzer from SEAL Analytical, which has been specially adapted to accommodate ultra-low measurements.”

The SEAL AutoAnalyzer 3 (AA3) is a 4-channel system, measuring phosphate with a long flow cell that gives a detection limit of 2 nM. Ammonia is measured using a JASCO fluorometer with a similarly ultra-low detection limit, and silicate, which is present at higher concentrations, is measured using SEAL’s high-resolution colorimetric technology.

The measurement data are being used to determine the seasonal nutrient cycling in the system, which will in turn help researchers understand the nature of the food web and the effects of global environmental and climate change.

Low nutrient levels in the Mediterranean
The eastern Mediterranean Sea (EMS) has an almost unique water circulation. The surface waters (0-200m) flow into the Mediterranean through the Straits of Gibraltar and from there into the EMS at the Straits of Sicily. As the water flows towards the east it becomes increasingly saline and hence denser. When it reaches the coast of Turkey in winter it also cools and then flows back out of the Mediterranean under the surface waters to Sicily, and then eventually through the Straits of Gibraltar to the North Atlantic. This outflowing layer exists between 200m and 500m depth.

Phytoplankton grow in the surface waters (0-200m) because that is the only layer with sufficient light. This layer receives nutrients from the adjacent land, from rivers and wastewater discharges, and also from aerosols in the atmosphere. These nutrients are utilized by the plankton as they photosynthesize. When the plants die (or are eaten), their remains drop into the lower layer and are flushed out of the EMS. Because the water flows are so fast (it takes just 8 years for the entire surface layer of the EMS to be replaced), these nutrient-rich intermediate waters rapidly expel nutrients from the basin. The result is very low nutrient concentrations and very low numbers of phytoplankton – some of the lowest values anywhere in the world. Prof. Krom says: “The maximum levels of nutrients measured in the EMS are 250 nM phosphate, 6 µM nitrate and 6-12 µM silicate. Ammonia is often in the low nanomolar range. By contrast, in the North Atlantic, values are 1000 nM phosphate, 16 µM nitrate and 20 µM silicate, and the levels in the North Pacific are even higher.”

The value of data
The low levels of plankton caused by low nutrient levels result in a low biomass of fish. Nevertheless, coastal areas generally support more fish than offshore waters, so the research will seek to quantify and understand the nutrient cycle in the coastal regions, which is poorly understood at present. “We plan to develop understandings which will inform stakeholders such as government. For example, there is a discussion about the potential for fish farms off the Israeli coast, so our work will enable science-based decisions regarding the quantity of fish that the system can support.”

To date, three data sets have been taken from the EMS, and the first publishable paper is in preparation.

Choosing the right analyzer
Prof. Krom says that his first ‘real’ job was working for the (then) Water Research Centre at Medmenham in Britain, where he was involved in the development of chemical applications for the Technicon AA-II autoanalyzers, which included going on secondment to Technicon for several months. SEAL Analytical now owns and manufactures the AutoAnalyzer brand of continuous segmented flow analyzers, so his career has been connected with autoanalyzers for decades. He is now Professor (Emeritus) at the University of Leeds (GB), where, again, he worked with SEAL autoanalyzers. An AA3 instrument was employed at Leeds in a project to investigate the nature of atmospheric acid processing of mineral dusts in supplying bioavailable phosphorus to the oceans.

Explaining the reasoning behind the purchase of a new AA3 at Haifa University, Prof. Krom says: “During a research cruise, it is necessary to analyse samples within a day to avoid changes in concentration due to preservation procedures.

“Typically we analyse 50-80 samples per day, so it is useful to be able to analyze large numbers of samples automatically. However, the main reasons for choosing the SEAL AA3 were the precision, accuracy and low limits of detection that it provides.”

Commenting on this application for SEAL’s analyzers, company President Stuart Smith says: “Many of our customers analyze nutrient levels in freshwater and marine water samples, where high levels of nutrients are a concern because of increasing levels of algal blooms and eutrophication. However, Prof. Krom’s work is very interesting because, in contrast, he is looking at extremely low levels, so it is very gratifying that our instruments are able to operate at both ends of the nutrient concentration spectrum.”

Bibliography
  • Powley, H.R., Krom, M.D. and Van Cappellen, P. (2017) Understanding the unique biogeochemistry of the Mediterranean Sea: Insights from a coupled phosphorus and nitrogen model. Global Biogeochemical Cycles, 31, 1010-1031. DOI: 10.1002/2017GB005648.

  • Stockdale, A., Krom, M.D., Mortimer, R.J.G., Benning, L.G., Carslaw, K.S., Herbert, R.J., Shi, Z., Myriokefalitakis, S., Kanakidou, M. and Nenes, A. (2016) Understanding the nature of atmospheric acid processing of mineral dusts in supplying bioavailable phosphorus to the oceans. PNAS, vol. 113, no. 51.

#SealAnal #Marine @_Enviro_News

Blockchain: the future of food traceability?

20/04/2018
Shan Zhan, global business manager at ABB’s food and beverage business, looks at how blockchain* can be used to enhance food traceability.

“The blockchain can change… well, everything.” That was the prediction of Goldman Sachs in 2015. There has been a lot of talk in the media recently about blockchain, particularly around Bitcoin and other cryptocurrencies, but just as the investment banking giant predicted, the technology is starting to have more wide-reaching impacts on other sectors.

A report from research consultancy Kairos Future describes blockchain as a founding block for the digitalization of society. With multinationals such as IBM and Walmart driving a pilot project using blockchain technology for traceability, the food and beverage industry needs to consider how traceability data should be protected.

The United Nations recognizes food security as a key priority, especially in developing countries. While most countries must abide by strict traceability regulations, which are particularly strong in the EU, other regions may not have the same standards or the data may be at risk of fraud.

Food fraud is described by the Food Safety Net Services (FSNS) as the act of purposely altering, misrepresenting, mislabeling, substituting or tampering with any food product at any point along the farm-to-table supply chain. Laws have existed to protect consumers against such harm since the thirteenth century; the first recorded instance was during the reign of English monarch King John, when England introduced laws against diluting wine with water or packing flour with chalk.

The crime still exists to this day. While malicious contamination intended to damage public health is a significant concern, a bigger problem is the mislabeling of food for financial gain. The biggest areas of risk are bulk commodities such as coffee and tea, composite meat products and Marine Stewardship Council (MSC) labelled fish. For example, lower-cost types of rice such as long-grain are sometimes mixed with a small amount of higher-priced basmati rice and sold as the latter. By using blockchain technology in their traceability records, food manufacturers can prevent this from happening.

Blockchain is a type of distributed ledger technology that keeps a digital record of all transactions. The records are broadcast to a peer-to-peer (P2P) network consisting of computers known as nodes. Once a new transaction is verified, it is added as a new block to the blockchain and cannot be altered. As the authors of Blockchain Revolution explain, “the blockchain is an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value”.
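A toy sketch shows why such a ledger is hard to alter: each block's hash covers the previous block's hash, so changing any past record invalidates everything after it. Real blockchains layer consensus, signatures and a P2P network on top of this basic structure; the lot records here are invented.

```python
# Minimal hash-chain sketch of an append-only ledger.
import hashlib
import json

def block_hash(prev_hash: str, record: dict) -> str:
    """Hash a record together with the previous block's hash."""
    payload = json.dumps({"prev": prev_hash, "record": record}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

chain = []
prev = "0" * 64  # genesis value
for record in [{"lot": "RICE-001", "kg": 1000, "grade": "basmati"},
               {"lot": "RICE-001", "kg": 1000, "event": "shipped"}]:
    prev = block_hash(prev, record)
    chain.append({"hash": prev, "record": record})

print(chain[-1]["hash"])  # tampering with any earlier record changes this
```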

When records of suppliers and customers are collected manually so that the end manufacturer can trace the entire process, the confidential data of suppliers is left unprotected. Blockchain technology anonymizes the data while still providing enough information to ensure that the supply chain is up to standard.

In the case of mislabeled basmati rice, blockchain technology would help prevent food fraud because the volume of an ingredient leaving the supply chain cannot exceed the amount recorded going in; any discrepancy would flag the product as fraudulent.

Not only can blockchain help monitor food ingredients, it can also monitor conditions at the production facility. These are often very difficult to verify and, even if records are taken, they can be falsified. A photo or digital file can be taken to record the situation, such as a fish being caught, to show that it complies with the MSC’s regulations on sustainably caught seafood.

A secure digital fingerprint of this image, known as a hash, is then recorded in the blockchain. The time and location of the photograph form part of this hash, so the record cannot be manipulated. The next supplier in the chain then has a key to this hash and is able to see that the product has met the regulations.
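In code, such a fingerprint might be produced by hashing the image bytes together with the capture metadata; the file name and coordinates below are invented for illustration:

```python
# Sketch: any change to the photo, or to its time/location metadata,
# produces a completely different fingerprint.
import hashlib

def evidence_fingerprint(image_path: str, timestamp: str, gps: str) -> str:
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    meta = f"{timestamp}|{gps}".encode()
    return hashlib.sha256(image_bytes + meta).hexdigest()

# Hypothetical catch record:
print(evidence_fingerprint("catch_0142.jpg", "2018-04-20T06:12Z", "57.1N,2.3W"))
```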

Food and beverage manufacturers can also use blockchain to verify that conditions at their production facilities are being met, or to transfer securely any other data that needs to move along the production line. While the technology is not yet mature enough to implement across all food and beverage supply chains, increased digitalization and early investment in these technologies will help plant managers prepare their supply chains against the food fraud threat.

* The Wikipedia entry on Blockchain!

@ABBgroupnews #PAuto #Food @FSNSLABS @MSCecolabel