Monitoring and managing the unpredictable.

07/06/2018
Energy sector investments in big data technologies have exploded. According to a study by BDO, the industry’s expenditure on this technology increased tenfold in 2017 compared with the previous year, with the firm attributing much of this growth to the need for improved management of renewables. Here, Alan Binning, Regional Sales Manager at COPA-DATA UK, explores three common issues for renewables — managing demand, combining distributed systems and reporting.

Renewables are set to be the fastest-growing source of electrical energy generation in the next five years. However, this diversification of energy sources creates a challenge for existing infrastructure and systems. One of the most notable changes is the switch from reliable to fluctuating power.

Implementing energy storage
Traditional fossil-fuel plants operate at a predetermined level, providing a consistent and predictable amount of electricity. Renewables, on the other hand, are a much less reliable source. For example, energy output from a solar farm can drop without warning due to clouds obscuring sunlight from the panels. Similarly, wind speeds cannot be reliably forecast. To prepare for this fluctuation in advance, research and investment into energy storage systems are on the rise.

Wind power ramp events, for example, are a major challenge. The grid may not always be able to absorb the excess power created by an unexpected increase in wind speed, which makes developing energy storage mechanisms essential. Ramp control applications allow the turbine to store this extra power in a battery instead. When combined with reliable live data, these systems can support informed models for intelligent distribution.
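
To make the idea concrete, here is a minimal Python sketch of the arithmetic such a ramp control application might perform. The ramp limit, battery capacity and function names are invented for illustration and do not describe any particular product.

```python
# Minimal sketch of wind-power ramp control: when generation rises faster
# than the grid can absorb, the excess is diverted to battery storage.
# The ramp limit and battery capacity are illustrative values only.

GRID_RAMP_LIMIT_MW_PER_MIN = 2.0   # max increase the grid accepts per minute
BATTERY_CAPACITY_MWH = 10.0

def dispatch(prev_export_mw, generated_mw, battery_charge_mwh, dt_min=1.0):
    """Split generation between grid export and battery charging."""
    max_export = prev_export_mw + GRID_RAMP_LIMIT_MW_PER_MIN * dt_min
    export = min(generated_mw, max_export)
    surplus_mwh = (generated_mw - export) * dt_min / 60.0
    # Store what fits in the battery; anything beyond that must be curtailed.
    stored = min(surplus_mwh, BATTERY_CAPACITY_MWH - battery_charge_mwh)
    curtailed_mwh = surplus_mwh - stored
    return export, battery_charge_mwh + stored, curtailed_mwh

# Example: a sudden gust lifts output from 5 MW to 12 MW in one minute.
export, charge, curtailed = dispatch(5.0, 12.0, battery_charge_mwh=4.0)
print(f"export {export} MW, battery {charge:.2f} MWh, curtailed {curtailed:.3f} MWh")
```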

Britain has recently become home to one of the largest energy storage projects to use EV batteries. While it is not the first time car batteries have been used for renewable power, the Pen y Cymoedd wind farm in Wales has connected a total of 500 BMW i3 batteries to store excess power.

Combining distributed systems
Control software is the obvious solution for better monitoring of this fluctuating power source. However, many renewable energy generation sites, like solar PV and wind farms, are distributed across a wide geographical area and are therefore difficult to manage without sophisticated software.

Consider offshore wind farms as an example. The world’s soon-to-be-largest offshore wind farm is currently under construction 74.5 miles off the Yorkshire coastline. To accurately manage these vast generation sites, the data from each asset needs to be combined into a single view.

This software should be able to combine many items of distributed equipment, whether that’s an entire wind park or several different forms of renewable energy sources, into one system to provide a complete visualisation of the grid.

Operators could go one step further by overlaying geographical information system (GIS) data onto the software. This could provide a map-style view of renewable energy parks, or even the entire generation asset base, allowing operators to zoom in on the map to reveal greater levels of detail. The result is a full, functional map that enables organisations to make better-informed decisions.
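
As a rough illustration of what combining distributed equipment into one system means in data terms, the sketch below merges telemetry from several parks into a single per-site overview, with each asset carrying GIS coordinates so it could be plotted on a map. The site names, coordinates and outputs are invented.

```python
# Sketch only: telemetry from several distributed parks merged into one
# in-memory model for a grid-wide overview. All figures are invented.

from collections import defaultdict

assets = [
    {"site": "Pen y Cymoedd", "asset": "WT-07", "lat": 51.70, "lon": -3.58, "mw": 2.9},
    {"site": "Pen y Cymoedd", "asset": "WT-12", "lat": 51.71, "lon": -3.55, "mw": 3.1},
    {"site": "Hornsea",       "asset": "WT-03", "lat": 53.88, "lon": 1.79,  "mw": 6.8},
]

def grid_overview(rows):
    """Roll distributed assets up into one per-site summary."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["site"]] += row["mw"]
    return dict(totals)

print(grid_overview(assets))   # {'Pen y Cymoedd': 6.0, 'Hornsea': 6.8}
```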

Reporting on renewables
Controlling and monitoring renewable energy is the first step to better grid management. However, it is what energy companies do with the data generated from this equipment that will truly provide value. This is where reporting is necessary.

Software for renewable energy should be able to visualise data in an understandable manner so that operators can see the types of data they truly care about. For example, wind farm owners tend to be investors and therefore generating profit is a key consideration. In this instance, the report should compare the output of a turbine and its associated profit to better inform the operator of its financial performance.

Using intelligent software, like zenon Analyzer, operators can generate a range of reports about any information they would like to assess — and the criteria can differ massively depending on the application and the objectives of the operator. Reporting can range from a basic table of outputs to a much more sophisticated report that measures the site’s performance against certain key performance indicators (KPIs) and includes predictive analytics. These reports can be generated from archived or live operational data, allowing long-term trends to be recognised while still enabling a quick reaction to maximise operational efficiency.
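
zenon Analyzer builds such reports through its own tooling; purely as an illustration of the kind of calculation an output-versus-profit report embodies, here is a sketch with invented prices, turbine figures and a made-up availability KPI.

```python
# Illustrative only: the kind of output-versus-profit table a wind farm
# report might contain. Prices and turbine data are invented for the example.

POWER_PRICE_PER_MWH = 55.0  # assumed flat price, GBP

turbines = [
    {"id": "T01", "output_mwh": 310.0, "availability": 0.97},
    {"id": "T02", "output_mwh": 242.5, "availability": 0.89},
]

def report(rows, target_availability=0.95):
    for t in rows:
        profit = t["output_mwh"] * POWER_PRICE_PER_MWH
        kpi = "OK" if t["availability"] >= target_availability else "BELOW KPI"
        yield t["id"], t["output_mwh"], profit, kpi

for turbine_id, output, profit, kpi in report(turbines):
    print(f"{turbine_id}: {output:7.1f} MWh  £{profit:9.2f}  {kpi}")
```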

As investments in renewable energy generation continue to increase, the need for big data technologies to manage these sites will also continue to grow. Managing these volatile energy sources is still a relatively new challenge. However, with the correct software to combine the data from these sites and report on their operation, energy companies will reap the rewards of these increasingly popular energy sources.


Challenges facing the energy sector.

21/05/2018

Leaders from Britain’s energy industry attended COPA-DATA’s zenon Energy Day 2018 at Microsoft’s Thames Valley centre. The event, which was held in April 2018, welcomed industry experts and energy suppliers to address the current challenges the sector is facing — renewable generation, substation automation, IoT and cyber security.

A welcome speech from Martyn Williams, managing director of COPA-DATA UK, opened a day encompassing a series of talks from industry experts. Speakers included Ian Banham, IoT Technical Sales Lead UK at Microsoft; Chris Dormer of systems integrator Capula; and Jürgen Resch, Energy Industry Manager at COPA-DATA.

Preparing for renewables
Only 24 per cent of Britain’s electricity comes from renewable sources — a relatively low figure compared to some European countries. However, the percentage is growing. In 2000, Britain’s renewable capacity was 3,000 MW; by the end of 2016, it had risen eleven-fold to 33,000 MW.

To prepare for the impending challenges for this market, Jürgen Resch’s presentation discussed how software can alleviate some of the common questions associated with renewable energy generation, including the growing demand for energy storage.
“Energy storage is often used in combination with renewables because renewable energy is volatile and fluctuating,” explained Resch. “In Korea, the government is pumping $5 billion into energy storage systems. In fact, every new building built in Korea gets an energy storage battery fitted in the basement.”

BMW’s battery storage farm in Leipzig, Germany, was also presented as an example. The facility, which uses COPA-DATA’s zenon as the main control centre system, uses 700 high-capacity used battery packs from BMW i3s and could also provide storage capacity for local wind energy generation.

Moving on to specific issues related to wind generation, Resch discussed the potential challenge of reporting in a sector reliant on unpredictable energy sources.
“Reports are particularly important in the wind power industry,” he said. “Typically, owners of wind farms are investors and they want to see profits. Using software, like zenon Analyzer, operators can generate operational reports.

“These reports range from a basic table with the wind speeds, output of a turbine and its associated profit, to a more sophisticated report with an indication of the turbine’s performance against specific key performance indicators (KPIs).”

Best practice for substation automation
Following the morning’s keynote speeches on renewable energy, Chris Dormer of Capula presented the audience with a real-life case study. The speech discussed how smart automation helped to address significant issues related to the critical assets of the National Grid’s substations, where Capula was contracted to refurbish the existing substation control system at New Cross.

“Like a lot of companies that have developed, grown and acquired assets over the years, energy providers tend to end up with a mass mixture of different types of technology, legacy equipment and various ways of handling data,” explained Dormer. “For projects like this, the first key evaluation factor is choosing control software with legacy communication. We need to ensure the software can talk to old legacy equipment in substations as well as modern protocol communications, whilst also ensuring it is scalable and compliant.

“The National Grid will make large investments in IEC 61850-compatible equipment, so for this project we needed an IEC 61850 solution. Any system we put in, we want to support it for the next 25 years. Everyone is talking about digital substations right now, but there are not that many of them out there. That said, we need to prepare and be ready.”

The case study, which was a collaborative project with COPA-DATA, was recognised at the UK Energy Innovation Awards 2017, where it was awarded the Best Innovation Contributing to Quality and Reliability of Electricity Supply.

“Our collaboration with COPA-DATA allows us to address modern energy challenges,” explained Mark Hardy, Managing Director of Capula upon winning the award last year. “It helps drive through the best value for energy customers.”

Cyber security – benefit or burden?
“Raise your hand if you consider cyber security to be a benefit?” Mark Clemens, Technical Product Manager at COPA-DATA, asked the audience during his keynote speech on cyber security. “Now, raise your hand if you consider it to be a burden?”

Clemens’ question provided interesting results. Numerous attendees kept their hands raised for both questions, giving an insight into the perception of cyber security for those operating in the energy industry — a necessary evil.

“A cyber-attack on our current infrastructure could be easy to execute,” continued Clemens. “95 per cent of communication protocols in automation systems don’t provide any security features. For those that do provide security, the mechanisms are often simply bolted-on.”

Clemens continued to explain how substation design can strengthen the security of these sites. He suggested that, despite living in the era of IoT, energy companies should limit the communication between devices to only those that are necessary. The first step he suggested was to establish a list of assets, including any temporary assets like vendor connections and portable devices.
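
A trivial sketch of such an asset list as data, with the temporary entries flagged for the closest scrutiny; the fields and entries are invented for illustration only.

```python
# Invented example of a substation asset inventory, flagging temporary
# connections (vendor links, portable devices) alongside permanent kit.

assets = [
    {"name": "RTU-01",           "type": "rtu",      "temporary": False},
    {"name": "protection-relay", "type": "ied",      "temporary": False},
    {"name": "vendor-vpn",       "type": "remote",   "temporary": True},
    {"name": "test-laptop",      "type": "portable", "temporary": True},
]

# Temporary assets deserve the closest review when limiting communication.
for a in (x for x in assets if x["temporary"]):
    print("review:", a["name"])
```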

“There are lots of entry points into a substation, not only through the firewall but through vendors and suppliers too. This doesn’t have to be intentional but could be the result of a mistake. For example, if an engineer working in the substation believes they are testing in simulation mode, but they are not, it could cause detrimental problems.”

Collaborating with Microsoft
The address of Ian Banham, Microsoft’s UK IoT Technical Sales Lead, focused on the potential of cloud usage for energy companies. When he asked attendees whether they had already invested in the cloud, or planned to do so, the audience proved to be a 50:50 split of cloud enthusiasts and sceptics.

“IoT is nothing new,” stated Ian Banham, IoT Technical Sales Lead at Microsoft. “There’s plenty of kit that does IoT that is over 20 years old, it just wasn’t called IoT then. That said, there’s not a great deal of value in simply gathering data, you’ve got to do something with that data to realise the value from it.

“The change in IoT is the way the technology has developed. That’s why we are encouraging our customers to work with companies like COPA-DATA. They have done the hard work for you because they have been through the process before.”

He explained how Microsoft’s cloud platform, Azure, could be integrated with COPA-DATA’s automation software, zenon. The partnership between the two companies is award-winning: COPA-DATA won Microsoft Partner of the Year in the IoT category in 2017.

@copadata #PAuto @Azure #Cloud #IoT


GIS in power!

20/02/2018
Geographic Information Systems (GIS) are not a new phenomenon. The technology was first used during World War II to gather intelligence by taking aerial photographs. However, today’s applications for GIS are far more sophisticated. Here, Martyn Williams, managing director of COPA-DATA UK, explains how the world’s energy industry is becoming smarter, using real-time GIS.

GIS mapping is everywhere. In its most basic format, the technology is simply a computerised mapping system that collects location-based data — think Google Maps and its live traffic data. Crime mapping, computerised road networking and the tech that tags your social media posts in specific locations? That’s GIS too.

Put simply, anywhere that data can be generated, analysed and pinned to a specific geographical point, there’s potential for GIS mapping. That said, for the energy industry, GIS can provide more value than simply pinning where your most recent selfie was taken.

Managing remote assets
One of the biggest challenges for the industry is effectively managing geographically dispersed sites and unmanned assets, such as wind turbines, offshore oil rigs or remote electrical substations.

Take an electrical distribution grid as an example. Of the 400,000 substations scattered across Britain, many are remote and unmanned. Therefore, it is common for operators to rely on a connected infrastructure and control software to obtain data from these sites. While this data is valuable, it is the integration of GIS mapping that enables operators to gain a full visual overview of their entire grid.

Using GIS, operators are provided with a map of the grid and every substation associated with it. When this visualisation is combined with intelligent control software, the operator can pin data from these remote sites on one easy-to-read map.

Depending on the sophistication of the control software used, the map can illustrate the productivity, energy consumption or operational data from each substation. In fact, operators can often choose to see whatever data is relevant to them and adjust their view to retrieve either more, or less, data from the map.

When used for renewable energy generation, the operator may want to see the full geographical scope of the wind turbines they control, pin-pointed on a geographically accurate map. However, upon zooming into the map, it is possible for the operator to view the status, control and operational data from each turbine on the grid.
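
A sketch of how such zoom-dependent detail might be resolved in software; the zoom thresholds and layer names below are assumptions made for this example, not a description of any particular product.

```python
# Illustration of zoom-dependent map detail: at low zoom the map shows only
# site totals; past a threshold it reveals per-turbine status and controls.

def map_layers(zoom):
    if zoom < 8:
        return ["site_marker", "total_output"]
    if zoom < 12:
        return ["site_marker", "total_output", "turbine_markers", "status"]
    return ["turbine_markers", "status", "live_power", "alarms", "controls"]

for z in (5, 10, 14):
    print(z, map_layers(z))
```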

GIS mapping is not only advantageous for monitoring routine operations, but also for alerting operators to unexpected faults in the system.

Taking out the guesswork
Unexpected equipment failure can be devastating to any business. For energy companies, however, which provide a fundamental service to the public, the impact of downtime is even more severe.

Traditionally, energy organisations would employ huge maintenance teams to quickly react to unexpected errors, like power outages or equipment breakdowns. However, with GIS and software integration, this guesswork approach to maintenance is not necessary.

The combination of GIS with an intelligent control system means that operators will be alerted to faults in real time, whether they occur at an offshore wind turbine, a remote pumping station or a substation. When an error is identified, the operator is automatically shown exactly where the fault has occurred by a pinpoint on the map.

Enabling intelligent maintenance
In the energy industry, there is no sure-fire way to predict exactly how and when faults will occur, but there are ways to deploy reliability-centred maintenance (RCM) techniques to minimise downtime when they do.

Using GIS-mapping and alerts, an operator can accurately pinpoint the location of the error, and a maintenance engineer can be deployed to the site immediately. This allows organisations to plan more effectively from a human asset perspective, ensuring their engineers are in the right places at the right time.
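
As a minimal sketch of that dispatch decision, the snippet below picks the closest available engineer to a GIS-pinpointed fault using the standard haversine great-circle distance. The crew names and positions are invented.

```python
# Toy dispatch decision: given the GIS coordinates of a fault, find the
# closest available engineering crew. All coordinates are invented.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

engineers = {"crew_a": (53.48, -2.24), "crew_b": (52.49, -1.89)}
fault = (52.95, -1.15)  # pinpointed by the GIS alert

nearest = min(engineers, key=lambda e: haversine_km(*engineers[e], *fault))
print(nearest, round(haversine_km(*engineers[nearest], *fault), 1), "km away")
```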

In addition, using the data acquired by the control software, the engineer can take a more intelligent approach to maintenance. GIS mapping allows an operator to easily extract data from the exact location where the fault occurred, passing this to the engineer for more intelligent maintenance.

For the energy industry, GIS technology provides an opportunity to better understand remote operations, enables more effective maintenance and could dramatically reduce the downtime caused by unexpected errors. The reliability of the technology has already been proven in other areas, like crime mapping and road networking — and in novelty applications, like social media tagging.

Now, it’s time for the energy industry to make its mark on the GIS map.


Compliance – more than just red tape.

03/07/2016

A growing customer demand for regulatory compliance, combined with increased competition amongst manufacturers, has made SCADA software a minimum requirement for the pharmaceutical industry. Here, Lee Sullivan, Regional Manager at COPA-DATA UK, discusses why today, more than ever before, regulatory compliance is crucial for the pharmaceutical industry.

FDA statistics are forcing the industry to identify and implement improvements to its manufacturing processes. In its latest reports (published in Automation World), the FDA identified a significant increase in the number of drug shortages reported globally. With 65 per cent of these drug shortages directly caused by issues in product quality, it’s clear that if more pharma manufacturers aimed to meet the criteria outlined in FDA and other industry standards, drug shortages and quality issues would become less frequent.

The compulsion to become compliant obviously differs from company to company and standard to standard, but one thing is certain: development starts with batch software. The range and capabilities of batch software vary immensely, but there are three factors to consider before making a choice: flexibility, connectivity and ergonomics.

Flexibility
To assist the complex processes of pharmaceutical manufacturing, batch software needs to be flexible. The software should manage traceability of raw materials through to the finished product and communicate fluently with other equipment in the facility. To ensure it provides a consistent procedure and terminology for batch production, the software should also be in line with the ISA-88 standard.

To meet increasing demand for personalised medication, manufacturers are seeking out batch software that is capable of creating flexible recipes, which are consistently repeatable. Traditional batch control creates one sequence of each process-specific recipe. While this model may be ideal for high volume manufacturing where the recipe does not change, today’s pharmaceutical production requires more flexibility.

To remain competitive, manufacturers need to provide compliance in a quick and cost-effective manner. Modern batch control software ensures flexibility by separating the equipment and recipe control. This allows the operator to make changes in batch recipes without any modifications to the equipment or additional engineering, thus saving the manufacturer time and money.
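
A toy sketch, loosely in the spirit of ISA-88, of what separating recipe and equipment control looks like in code: the recipe is pure data, the equipment exposes generic phases, and a new product needs no equipment re-engineering. The class, phase and recipe names are invented.

```python
# Sketch of recipe/equipment separation: swapping products means swapping
# parameters only, with no change to the equipment logic.

class Mixer:
    """Equipment module exposing generic, recipe-independent phases."""
    def charge(self, material, kg):   print(f"charge {kg} kg of {material}")
    def agitate(self, rpm, minutes):  print(f"agitate at {rpm} rpm for {minutes} min")
    def discharge(self):              print("discharge batch")

# The recipe is pure data: phase order plus parameters, nothing more.
example_recipe = [
    ("charge",    {"material": "API", "kg": 25}),
    ("charge",    {"material": "binder", "kg": 5}),
    ("agitate",   {"rpm": 90, "minutes": 12}),
    ("discharge", {}),
]

def run_batch(equipment, recipe):
    for phase, params in recipe:
        getattr(equipment, phase)(**params)

run_batch(Mixer(), example_recipe)
```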

Connectivity
To avoid complications, manufacturers should choose independent software that supports a wide range of communication protocols. COPA-DATA’s zenon, for example, offers more than 300 high-performance interfaces that function in accordance with the ‘plug-and-play principle’. This makes it easy to implement and the user can start to collect, process and monitor production data straight away.

The communication model of the batch software extends upwards to integrate fully into manufacturing execution systems (MES) and enterprise resource planning (ERP) systems. This links the raw material from goods-in through to the finished product at the customer’s site, creating a communication platform that spans all layers of the production environment.

Having no association with specific hardware providers ensures that, regardless of the make and age of equipment, the batch software will be fully compatible and integrate seamlessly. This high level of connectivity minimises disruptions and quality problems, while also allowing pharmaceutical companies to collect data from the entire factory to archive digital records and ensure compliance across the processing line – allowing manufacturers to establish a fully connected smart factory.

Ergonomics
Lastly, understanding and using batch software should be stress-free. As the pharmaceutical industry becomes more complex and more manufacturers begin exploring the realms of smart manufacturing, factory operators should be able to control and change batch production without complications.

By using software fitted with clear parameters and access levels, operators gain the ability to create and validate recipes, monitor the execution of production and review the performance of industrial machinery – without accidentally altering existing recipes and data. Reducing the amount of software engineering makes the operator’s life easier and minimises potential problems that could arise.
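
A minimal sketch of such access levels; the role names and permission sets are assumptions made for illustration, not a description of any product’s user administration.

```python
# Toy access-level check: operators can run and monitor batches but cannot
# alter validated recipes. Roles and permissions are invented.

PERMISSIONS = {
    "operator":   {"start_batch", "monitor", "view_reports"},
    "supervisor": {"start_batch", "monitor", "view_reports", "edit_recipe"},
}

def authorise(role, action):
    if action not in PERMISSIONS.get(role, set()):
        raise PermissionError(f"{role} may not {action}")

authorise("operator", "start_batch")        # allowed
try:
    authorise("operator", "edit_recipe")    # blocked: recipe stays validated
except PermissionError as e:
    print(e)
```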

The benefits of complying with industry regulations and standards do not stop with an enhanced Quality Management System (QMS). Customers are more likely to buy from a manufacturer they consider reliable, and the supply chain will see improved production indicators, such as increased OEE, reduced wastage and fewer recalls. On top of all of these benefits, you also improve product and thus patient safety.

To comply with industry standards, pharmaceutical companies should take steps to modernise their manufacturing processes, beginning with upgrading their batch control software. Anything else would be like putting the cart before the horse.

 

@copadata  @COPADATAUK #PAuto #SCADA

Unlocking the value of cross-facility data sets.

29/04/2016

The great migration: cloud computing for smart manufacturing
By Martyn Williams, managing director at COPA-DATA UK.

According to an industry survey by IBM, two-thirds of mid-sized companies have already implemented – or are currently considering – a cloud-based storage model for their organisation. The analytic advantages of cloud computing in industry are no secret; in fact, 70 per cent of these cloud-using respondents said they were actively pursuing cloud-based analytics to glean greater insights and efficiency in order to achieve business goals.

For the manufacturing industry, the benefits of migrating to cloud computing have been heavily publicised, but in an industry that has been slow to embrace new technology, a mass move to the cloud can feel like a leap into the unknown. Despite increased adoption of smart manufacturing technologies, some companies may still feel hesitant. Instead, many decide to test the water by implementing a cloud storage model at just one production site. However, this implementation model can only provide limited benefits in comparison to a mass, multi-site migration to the cloud.

So what should companies expect to undertake during their cloud migration?

Define business objectives
Before migrating to the cloud, companies should first consider how it can help them achieve, and in some cases refine, their business objectives, and plan their migration with these objectives in mind. For businesses that want to improve collaboration and benchmarking across multiple locations, for example, the cloud plays a significant role.

A company with multiple production sites operating in several locations will be familiar with the complications of cross-facility benchmarking. Often, business objectives or key performance indicators (KPIs) are only set for single sites. Ideally, business objectives should be coordinated across all locations to offer a clear, company-wide mission.

To achieve better collaboration and transparency across sites, companies can use a cloud storage and computing application that gathers all available production data, from multiple production sites, in one place. Certain individuals or teams in the company can be granted access to relevant data sets and reports, depending on their responsibilities within the organisation.

Determine the ideal status
Once a business objective is clear, companies should identify the ideal status of each process. By using production data and energy information stored and analysed in the cloud, a company can gain insight into productivity, overall equipment effectiveness (OEE), energy usage and more. This insight helps companies make changes that will bring the existing production environment closer to the ideal status.

Combined with the right SCADA software, the cloud unlocks rich company-wide data sets. By bridging information from different facilities in real-time, the software generates a bird’s eye view of company-wide operations and detailed analysis of energy consumption, productivity and other operational KPIs. This makes it easier for a company to monitor progress against the original business objectives and scale up or down when necessary.
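
OEE is conventionally calculated as the product of availability, performance and quality. Purely as an illustration of the cross-facility roll-up described above, the sketch below compares invented figures from two hypothetical plants; the site names and numbers are not from the article.

```python
# OEE = availability x performance x quality, rolled up across sites so the
# company can benchmark facilities against each other. Figures invented.

sites = {
    "Plant A": {"availability": 0.92, "performance": 0.88, "quality": 0.99},
    "Plant B": {"availability": 0.85, "performance": 0.90, "quality": 0.97},
}

def oee(m):
    return m["availability"] * m["performance"] * m["quality"]

for name, metrics in sites.items():
    print(f"{name}: OEE = {oee(metrics):.1%}")

# Cross-facility benchmark: the site furthest from its ideal status.
print("lowest:", min(sites, key=lambda s: oee(sites[s])))
```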

Already, a large number of manufacturers are using industrial automation to speed up production and increase efficiency. With the large scale adoption of intelligent machinery, cloud computing is poised to become the obvious solution to store and manage the complexity of data this industry connectivity creates.

Unlike the restrictions associated with on-premises storage, cloud-based models provide practically unlimited scalability, allowing companies to store both real-time and historical data from all their production sites and integrate any new production lines or sites into their cloud solution seamlessly. When accompanied by data analytics software, like zenon Analyzer, cloud computing can help companies prevent potential problems in production and even ignite entirely new business models.

Continuous improvement
For manufacturers with strict energy efficiency and productivity targets, easy access to company-wide data is invaluable. However, the knowledge provided by the cloud does not end with past and present data, but also gives manufacturers a glimpse into the future of their facilities.

By using the cloud, companies can implement a long-term continuous improvement strategy. Continuous improvement typically follows the simple Plan-Do-Check-Act (PDCA) model often used in energy management applications. This allows companies to make decisions based on data analytics and to evaluate the effectiveness of those decisions in the short and medium term.

Using data collected from industrial machinery, companies can also employ predictive analytics technology to forecast why and when industrial machinery is likely to fail, which also means they can minimise costly downtime.

Predictive analytics allows manufacturers to identify potential problems with machinery before breakdowns occur. By avoiding expensive overheads for production downtime and costly fines for unfulfilled orders, the insights predictive analytics provides are an obvious answer to such costly problems.
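
As a hedged sketch of the underlying idea, the snippet below fits a linear trend to invented vibration readings and estimates when the machine will cross an alarm limit; real predictive analytics would use far richer models and data.

```python
# Toy predictive-maintenance estimate: fit a straight line to a machine's
# vibration trend and project when it reaches the alarm limit. Data invented.

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

days = [0, 7, 14, 21, 28]
vibration_mm_s = [2.1, 2.4, 2.9, 3.3, 3.8]   # trending upwards
ALARM_LIMIT = 6.0

slope, intercept = linear_fit(days, vibration_mm_s)
days_to_limit = (ALARM_LIMIT - intercept) / slope
print(f"predicted to reach alarm limit in ~{days_to_limit:.0f} days")
```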

Converting from the safe familiarities of on-premises storage to an advanced cloud model may seem risky. As with any major business transition, there is bound to be hesitation surrounding the potential problems the changeover could bring. Before making a decision, companies should closely assess three things: business objectives, how the cloud can help them achieve the ideal status across one or multiple production sites and how it can help them continuously improve in the long run.


The future of regulatory compliance in the pharmaceutical industry.

29/01/2016

In this short article, Martyn Williams, managing director of Copa-Data in Great Britain, explains the steps pharmaceutical manufacturers can take on the road to compliance.

The pharmaceutical industry today operates in one of the world’s most heavily regulated environments. Over the past few years, the industry has experienced significant regulatory change and, looking to the future, the strict nature of the industry doesn’t appear to be slackening.

The repercussions of failing to comply with industry standards can be detrimental for pharmaceutical manufacturers. It’s no secret that the integrity and reputation of pharmaceutical brands is integral to their success. As a result, even the smallest failure can be irreversibly damaging.

With industry standards covering crucial elements like product integrity, energy efficiency, health and safety and product testing, there is more pressure than ever on manufacturers to take steps towards compliance. In this elaborate regulatory landscape, how do manufacturers ensure production operates in an efficient and effective way?

Validation-friendly technology
Digitisation of processes and the emergence of the Industrial Internet of Things (IIoT) have transformed the entire manufacturing industry. However, for pharmaceutical manufacturers, the benefits of IIoT go far beyond an increase in automated productivity.

The introduction of smart devices enables real-time data reporting to central control systems, naturally reducing manual intervention and minimising adverse events during production. Paired with validation-friendly software, an IIoT-enabled factory provides live monitoring for regulatory reporting, potentially reducing the validation effort many manufacturers face under a risk-based approach. This can greatly help to maximise production agility, allowing manufacturers to respond to change and ultimately increase profitability.

For example, with batch control being a key step of the validation process, the combination of a smart factory and intelligent SCADA software couldn’t be more valuable. By automatically generating reliable audit trails, electronic signatures and real-time reports, compliance with complicated pharmaceutical standards like FDA 21 CFR Part 11 becomes far easier to achieve.
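
Part 11 revolves around trustworthy electronic records and signatures. Below is a toy sketch of one ingredient, a tamper-evident audit trail in which each entry is hash-chained to the previous one so any after-the-fact edit breaks the chain; the user names and actions are invented, and real systems layer much more on top.

```python
# Hypothetical tamper-evident audit trail: each entry carries a hash over
# its contents plus the previous entry's hash. Illustrative only.

import hashlib, json, time

trail = []

def record(user, action):
    prev = trail[-1]["hash"] if trail else "genesis"
    entry = {"ts": time.time(), "user": user, "action": action, "prev": prev}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    trail.append(entry)

def verify():
    prev = "genesis"
    for e in trail:
        body = {k: e[k] for k in ("ts", "user", "action", "prev")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if e["prev"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True

record("lsullivan", "recipe 42 approved (e-signature)")
record("operator1", "batch 2024-118 started")
print("audit trail intact:", verify())
```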

The cloud and compliance
As manufacturers embrace IIoT, migrating to the cloud is the obvious solution to house and manage the growing expanse of production data. However, the cloud does more than just collect and store data; it allows manufacturers to gain actionable insights directly from it.

Predictive analysis, for example, produces an intelligent forecast of when and where industrial equipment is most likely to fail. Using this data, contingency plans can be made to ensure, regardless of equipment failure, pharmaceutical production will continue to meet regulatory standards.

In addition, cloud computing enables simplified regulatory submissions. Using data stored in the cloud, manufacturers can feed digitised production information directly to regulatory bodies. This can be particularly helpful in speeding up the lengthy process of clinical trials, as well as reporting after a drug launches.

With the ever-increasing risk of drug counterfeiting, ongoing efficiency challenges, the adaptation to modern agile processes and industry-wide efforts to implement serialisation, it will be interesting to witness how IIoT continues to solve these challenges moving forward.

Getting the green light
Over the past decade, the global focus on environmental sustainability has been hard to ignore. In industry, initiatives such as the European Union’s Energy Savings Opportunity Scheme (ESOS) and the voluntary certification ISO 50001, have put pressure on manufacturers to jump on the efficiency bandwagon. Using the same IIoT and smart software combination, organisations can gather comprehensive data from the entire factory and subsequently meet these efficiency requirements.

@copadata #PAuto #Pharma

Beyond smoke and mirrors!

07/01/2016

Three things you didn’t know about IIoT examined by Martyn Williams, Managing Director of COPA-DATA UK.

The human brain is a wonderful thing that works tirelessly from the day we are born until the day we die, only stopping on special occasions, like when presenting in front of large audiences. We’ve been studying the brain for many centuries, but we still know relatively little about the trillions of connections that make it work. Creating a road map of the brain is a bit like trying to map out the Industrial Internet of Things (IIoT). IIoT is a concept that has intrigued industry for several years now, but much like the human brain, is not yet fully understood.

To gain a better understanding of the IIoT universe, we need to look at specifics. We need to understand how hardware and software, communication protocols and the human connection come together to support a stable and flexible interaction that enhances production, control and efficiency in industrial environments.

Machines to machines
Every time you form a new memory, new connections are created in the brain, making the system even more complex than before. Similarly, IIoT relies on many-to-many applications or groups of nodes to accomplish a single task. The plural of “machine” is important when discussing IIoT because it highlights the complexity of the system.

For example, on a sandwich biscuit production line, the biscuit sandwiching machine at the heart of the line should be able to communicate with the previous elements of the process, as well as the ones that come after it. The mixing, cutting and baking machines at the very start of the production process should also be able to “speak” to the conveyers, the pile packing sandwich machine, the cream feed system, lane multiplication and packaging machines. This level of communication allows the production line to be more flexible and cater for a wider range of biscuit varieties.

Regardless of whether we’re talking about biscuits, automotive manufacturing or even smart grids, IIoT has communication requirements that go beyond the standard client/server needs and conventional thinking.

Instead, the nodes act as peers in a network, each making decisions and reporting to other nodes.

Besides performing core tasks, the production system is also connected to an enterprise level that can automatically issue alarms, collect and analyse data and even make predictions or recommendations based on this analysis.
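A toy sketch of this many-to-many, peer-style communication, using the biscuit line as the example; the machine names, message topic and throttling rule are invented for illustration.

```python
# Toy peer network: each node both publishes its state and reacts to other
# nodes' messages, making local decisions without a central order.

class Node:
    def __init__(self, name, bus):
        self.name, self.bus = name, bus
        bus.append(self)

    def publish(self, topic, value):
        for peer in self.bus:
            if peer is not self:
                peer.on_message(self.name, topic, value)

    def on_message(self, sender, topic, value):
        # Peer-level decision: if the sandwiching machine changes rate,
        # the oven throttles itself to match, rather than waiting for
        # an instruction from an enterprise-level system.
        if topic == "rate_biscuits_per_min" and self.name == "oven":
            print(f"{self.name}: matching {sender} at {value}/min")

bus = []
oven = Node("oven", bus)
sandwicher = Node("sandwicher", bus)
packer = Node("packer", bus)

sandwicher.publish("rate_biscuits_per_min", 480)
```
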

A common language
IIoT will only work if it uses a compatible language across systems and industries. To help achieve this objective, industry giants AT&T, Cisco, General Electric, IBM and Intel founded the Industrial Internet Consortium in 2014. The Consortium aims to accelerate the development and adoption of interconnected machines and intelligent analytics.

As IIoT cuts across all industry sectors, from manufacturing to energy, common standards, harmonised interfaces and languages are crucial for successful implementation of the concept. The consortium hopes to lower the entry barriers to IIoT by creating a favourable ecosystem that promotes collaboration and innovation. The next step is to facilitate interoperability and open standards, allowing machines or systems from different original equipment manufacturers (OEMs) to communicate with each other and with control systems.

The old and the new
Perhaps one of the biggest challenges when it comes to implementing IIoT on a larger scale comes from integrating legacy systems with the latest generation of smart factory equipment.

Learning new things changes the structure of the brain and similarly, in manufacturing, implementing new automation equipment usually results in changes across the entire system. The solution is to use standards-based protocol gateways to integrate legacy systems in brownfield environments. This allows organisations to free data from proprietary constraints and use it for real-time and historical data collection and analysis.
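
A hypothetical sketch of such a gateway: raw reads from an invented legacy register map are translated into tagged, timestamped messages a modern system could consume. The register numbers, tag names and scaling factors are assumptions for the example; real gateways would target open standards such as OPC UA.

```python
# Hypothetical protocol gateway: legacy register reads are freed from their
# proprietary format and re-published as self-describing messages.

import time

LEGACY_REGISTER_MAP = {  # register -> (tag name, scale factor, unit)
    40001: ("line1.temperature", 0.1, "degC"),
    40002: ("line1.pressure", 0.01, "bar"),
}

def read_legacy_register(register):
    """Stand-in for a fieldbus read; returns a raw integer."""
    return {40001: 823, 40002: 151}[register]

def poll_gateway():
    for reg, (tag, scale, unit) in LEGACY_REGISTER_MAP.items():
        raw = read_legacy_register(reg)
        yield {"tag": tag, "value": raw * scale, "unit": unit, "ts": time.time()}

for msg in poll_gateway():
    print(msg)
```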

There is as much risk in sticking to a single vendor based on the current installed base as there is in embracing these new concepts with multiple new vendors and interoperability between intelligent devices. This is something we have experienced first-hand in the energy and infrastructure sector, through IEC 61850 and the interoperability concepts behind it.

Much like the human brain, the Industrial Internet of Things is always changing and there are still a lot of questions to be answered before we fully understand its requirements, implementation and potential. Luckily, these conversations are taking place and new ideas are put into practice every day. The next step is to figure out an easy way of practically implementing IIoT innovations in manufacturing environments across the world.