It all began with the War of the Currents…

24/01/2020

Today, people greatly appreciate having electrical energy available at the flip of a switch, seemingly at any time and for any occasion. But where does electricity actually come from? The answer most people would give you is: “from the wall socket, of course”. So does this automatically settle the question of security of supply? More on this later.

If we compare the history of electric current with the 75 years of the history of Camille Bauer Metrawatt AG, it is easy to see how they were interlinked at certain times in the course of their development. Why is that?

It all began with the War of the Currents – an economic dispute about a technical standard

It was around 1890 when the so-called War of the Currents started in the USA. At that time, the question was whether the direct current favoured by Thomas Alva Edison (1847-1931) or the alternating current promoted by Nikola Tesla (1856-1943) and financially backed by George Westinghouse (1846-1914) was the more suitable technology for supplying the United States of America with electrical energy over large areas and constructing power grids. Because of Westinghouse’s market dominance at that time compared to Edison General Electric (called General Electric from 1892 on), it soon became clear that the alternating voltage championed by Nikola Tesla was rapidly gaining the upper hand, not least because its approximately 25% lower transmission losses weighed unquestionably in its favour. Soon afterwards came the breakthrough for alternating voltage as the means of transmitting electrical energy. Initially, the main target application was electric lighting, which had been spurred on by Edison’s invention of the incandescent lamp. The reasons for this were logical: Westinghouse was initially a lighting manufacturing company and wanted to secure as great a market share as possible.

As developments continued, it is no surprise that by 1891, in Germany for example, the first long-distance transmission of electrical energy had already been put into operation, over a distance of more than 170 km from Lauffen am Neckar to Frankfurt am Main. It was a technological breakthrough using three-phase current technology. However, this was by no means the end of the story for direct current. Not least because of digitalization, electromobility, decentralized energy supplies and the like, DC voltage has experienced a full-blown renaissance and is now treated almost as a brand-new topic.

The Camille Bauer story.
The foundation of the Camille Bauer company dates back to 1900, immediately after the War of the Currents just described, at a time when electricity was rapidly gaining in importance. At the turn of the century, the Camille Bauer company, named after its founder Camille Bauer-Judlin, began importing measuring instruments for the trendy new phenomenon called “electricity” into Switzerland for sale to the local market. Some years later, in 1906, Dr. Siegfried Guggenheimer (1875 – 1938), formerly a research scientist under Wilhelm Conrad Röntgen (1845 – 1923), who in 1901 had become the first winner of the Nobel Prize in Physics, founded a start-up company in Nuremberg, Germany, trading under his own name. The company was engaged in the production and sale of electrical measuring instruments. However, because Dr. Guggenheimer was of Jewish descent, he was forced under pressure from the Nazis to rename the company in 1933, creating Metrawatt AG.

Four technological segments.

In 1919, a man by the name of Paul Gossen entered the picture. He was so dissatisfied with his employment under Dr. Guggenheimer that he founded his own company in Erlangen, near Nuremberg, and for decades the two rivals were in fierce competition with one another. In 1944, towards the end of the Second World War, Camille Bauer could see that its importing business had virtually come to a standstill. All the factories of its suppliers, which were mainly in Germany (for example Hartmann & Braun, Voigt & Haeffner, Lahmeyer, etc.), had been converted to supplying materials for the war. At this point, a decision had to be made quickly. Camille Bauer’s original trading company, located in Basel (CH), undertook a courageous transformation: in order to survive, it turned itself into a manufacturing company. In a first step, the recently formed manufacturing company Matter, Patocchi & Co. AG in Wohlen (CH) was taken over, in order to get the business up and running quickly with the necessary operating resources at its disposal. Thus the Swiss manufacturing base in Wohlen in the canton of Aargau was born.

The story does not end there. In 1979, Camille Bauer was taken over by Röchling, a family-owned company in Mannheim, Germany. At that time, Röchling wanted to quit the iron and steel business and enter the field of I&C technology. Later, in 1993, Gossen in Erlangen and Metrawatt in Nuremberg were reunited in a single company, after Röchling had become owner of the Gossen holding company through the acquisition of the Bergmann Group from Siemens in 1989 and had acquired Metrawatt from ABB in 1992. At the same time, Camille Bauer’s German sales operation in Frankfurt-Dreieich also became part of the company. Today the companies operate globally and successfully under the umbrella brand of GMC-I (Gossen Metrawatt Camille-Bauer-Instruments).

A new era.
The physics of electric current has not changed over the course of time. However, business conditions have changed drastically, especially over the last 5-10 years. Catch phrases such as the free electricity market, collective self-consumption, renewable energy sources, PV, wind power, climate targets, reduction of CO2 emissions, e-mobility, battery storage, Tesla, smart meters, digitalization, cyber security, network quality, etc. are all areas of interest for both people and companies. And last but not least, with today’s protest demonstrations, climate change has become a political issue. We will have to see what results from this. At the very least, the catch phrases mentioned above are perfect for developing scenarios for security of electricity supply. And it really is the case that the traditional electricity infrastructure, which is often as old as Camille Bauer Metrawatt itself, was not designed for the new types of energy behaviour, either on the consumer side or on the decentralised feed-in side. As a result, it is ever more important to have intelligent systems that work from precise measurement data in order to avoid outages, blackouts and the resulting damage.

The overall diversity of these new clusters of topics has prompted Camille Bauer Metrawatt AG once more to face the challenges with courage and, above all, to do so in an innovative and productive way. In this spirit, Camille Bauer Metrawatt AG develops, produces and distributes its product range globally in four technological segments.

These are:
(1) Measurement & Display,
(2) Power Quality,
(3) Control & Monitoring,
(4) Software, Systems and Solutions.

Through its expert staff, modern tools and external partners, Camille Bauer Metrawatt is able, for example, to analyse power quality and detect power quality problems. In addition, the Camille Bauer Metrawatt Academy, founded in 2019, focuses on knowledge transfer by experienced lecturers, with the latest and most important topics as its main priority. Furthermore, we keep in very close contact with customers, authorities, associations, specialist committees, educational institutions, practice-oriented experts and the scientific community in order to continually provide the requisite solutions to the market and interested parties.

#Camille_Bauer_Metrawatt #PAuto @irishpwrprocess


The most viewed stories in 2019.

02/01/2020
  • MCAA President Teresa Sebring has certified the election of officers and directors of the Measurement, Control & Automation Association…
  • The VP869 high performance 6U OpenVPX FPGA processing board has been announced by Abaco Systems.  Featuring two Xilinx® UltraScale+™ FPGAs …
  • Mr. Uwe Gräff has been appointed to the Board of New Technologies & Quality at the Harting Technology Group. He follows Dr. Frank Brode…
  • ISA demonstrates its undoubted strength again in providing stunning seminars allied with top class training built on member experience. Ne…
  • GE-IP Third Annual Executive Forum on delivering operational excellence. GE Intelligent Platforms recently hosted its third annual execut…
  • Leading monitoring and analysis solution makes improving SQL Server performance easier than ever. SolutionsPT has announced a new partners…
  • The International Society of Automation (ISA) has welcomed Paul Gruhn, PE, CFSE, and ISA Life Fellow, as its 2019 Society President. Pa…
  • exida, LLC announced that it has assessed the Honeywell ISA100 Wireless™ Device Manager model WDMY version R320 and certified that it meets…
  • Anglia has continued to make big strides towards full RoHS 3* compliance over six months before the deadline for meeting the new provisions…
  • The emergence of radar has been an important advance in the level measurement field. Radar represents a cost effective, accurate solution that is immune to density and other process fluid changes….

#PAuto #TandM


What is on the list of trends for 2020?

06/12/2019
Data centre trends for 2020 from Rittal

Growing volumes of data, a secure European cloud (data control), rapid upgrades of data centres and rising energy consumption are the IT/data centre trends for Rittal in 2020. For example, the use of OCP (Open Compute Project) technology and heat recovery offers solutions for the challenges of the present.

According to the market researchers at IDC (International Data Corporation), humans and machines could already be generating 175 zettabytes of data by 2025. If this amount of data were stored on conventional DVDs, it would mean 23 stacks of data discs, each of them reaching up to the moon. The mean 27 percent annual rate of data growth is also placing increasing pressure on the IT infrastructure.
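As a rough illustration of those growth figures, the short Python sketch below compounds an assumed 2018 baseline of about 33 zettabytes at the quoted 27 percent per year; the baseline is an assumption for illustration, not a figure from the article.

# Back-of-the-envelope check of the quoted growth figures (a sketch).
start_year, start_zb = 2018, 33.0   # assumed global datasphere in 2018 (not from the article)
growth_rate = 0.27                  # mean 27 per cent annual growth quoted above

volume = start_zb
for year in range(start_year + 1, 2026):
    volume *= 1 + growth_rate
    print(f"{year}: {volume:6.1f} ZB")
# The compounded figure for 2025 comes out close to the 175 ZB forecast.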

Since there is hardly any company that can afford to increase its own data storage by almost a third every year, IT managers are increasingly relying on IT services from the cloud. The trend towards the cloud has long since been a feature in Germany: A survey published in the summer of 2019 by the Bitkom ICT industry association together with KPMG showed that three out of four companies are already using cloud solutions.

However, businesses using cloud solutions from third-party providers do lose some control over their corporate data. The US Cloud Act (Clarifying Lawful Overseas Use of Data), for example, allows US authorities to access data stored in the cloud, even if local laws at the location where the data is stored prohibit this.

“Businesses will be successful in the future only if they keep pace with full digital transformation and integration. Companies will use their data more and more to provide added value – increasingly in real time – for example in the production environment,” says Dr Karl-Ulrich Köhler, CEO of Rittal International. “Retaining control over data is becoming a critical success factor for international competitiveness,” he adds.

Trend #1: Data control
The self-determined handling of data is thus becoming a key competitive factor for companies. This applies to every industry in which data security is a top priority and where the analysis of this data is decisive for business success. Examples are the healthcare, mobility, banking or manufacturing industries. Companies are now faced with the questions of how to process their data securely and efficiently, and whether to modernise their own data centre, invest in edge infrastructures or use the cloud.

The major European “Gaia-X” digital project, an initiative of the German Federal Ministry for Economic Affairs and Energy (BMWi), is set to start in 2020. The aim is to develop a European cloud for the secure digitalization and networking of industry that will also form the basis for the use of new artificial intelligence (AI) applications. In this context, the Fraunhofer-Gesellschaft has drawn up the “International Data Spaces” initiative. This virtual data room allows companies to exchange data securely, and it also ensures the compatibility (interoperability) of their own solutions with established cloud platforms.

This means that geographically widespread, smaller data centres with open cloud stacks might be able to create a new class of industrial applications that perform initial data analysis at the point where the data is created and use the cloud for downstream analysis. One solution in this context is ONCITE. This turnkey (plug-and-produce) edge cloud data centre stores and processes data directly where it arises, enabling companies to retain control over their data when networking along the entire supply chain.

Trend #2: Standardisation in data centres with OCP
The rapid upgrade of existing data centres is becoming increasingly important for companies as the volume of data needing to be processed continues to grow. Essential requirements for this growth are standardised technology, cost-efficient operation and a high level of infrastructure scalability. OCP (Open Compute Project) technology, with its central direct current distribution in the IT rack, is becoming an interesting alternative for more and more CIOs, because DC components open up new potential for cost optimisation. For instance, all the IT components in a rack can be powered centrally by n+1 power supplies, which improves cooling efficiency because fewer power supply units are present. At the same time, the high degree of standardisation of OCP components simplifies both maintenance and spare-parts management. The mean efficiency gain is around five percent of the total current.
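To put that figure into perspective, the small sketch below converts an assumed 10 kW rack load and the quoted five percent gain into energy saved per year; the rack load is an assumption for illustration only.

# Rough illustration of what a ~5 % efficiency gain per rack can mean in energy terms.
rack_load_kw = 10.0        # assumed average IT load per OCP rack (not from the article)
efficiency_gain = 0.05     # the roughly five per cent gain quoted above
hours_per_year = 8760

saved_kwh = rack_load_kw * efficiency_gain * hours_per_year
print(f"Energy saved per rack: {saved_kwh:,.0f} kWh per year")   # about 4,380 kWh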

Rittal expects that OCP will establish itself further in the data centre as an integrated system platform in 2020. New OCP products for rack cooling, power supply or monitoring will enable rapid expansion with DC components. Furthermore, new products will support the conventional concept of a central emergency power supply, where the power supply is safeguarded by a central UPS. As a result, it will no longer be necessary to protect every single OCP rack with a UPS based on lithium-ion batteries. The advantage: the fire load in the OCP data centre is reduced considerably.

Trend #3: Heat recovery and direct CPU cooling
Data centres release huge amounts of energy into the environment in the form of waste heat. As the power density in the data centre grows, so does the amount of heat that could potentially be used for other purposes. So far, however, the use of waste heat has proven too expensive, for example because consumers of the heat are rarely located in the direct vicinity of the site. In addition, at around 40 degrees Celsius, the waste heat generated by air-based IT cooling systems is clearly too cool to be used economically.

In the area of high-performance computing (HPC) in particular, IT racks generate high thermal loads, often in excess of 50 kW. For HPC, direct processor cooling with water is significantly more efficient than air cooling, and it makes return temperatures of 60 to 65 degrees Celsius available. At these temperatures it is possible, for instance, to heat domestic hot water, drive heat pumps or feed heat into a district heating network. However, CIOs should be aware that only about 80 percent of the waste heat can be drawn from an IT rack, even with direct CPU water cooling; conventional IT cooling is still needed for the remaining 20 percent.
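The split described above can be expressed in a few lines; the sketch below simply applies the 80/20 ratio to the 50 kW rack mentioned in the text.

# Heat-recovery split for a high-density HPC rack (figures taken from the text above).
rack_heat_kw = 50.0          # thermal load of an HPC rack
recoverable_share = 0.80     # roughly 80 per cent recoverable via direct CPU water cooling

recoverable_kw = rack_heat_kw * recoverable_share
residual_kw = rack_heat_kw - recoverable_kw
print(f"Usable waste heat at 60-65 degrees C: {recoverable_kw:.0f} kW")
print(f"Still handled by conventional IT cooling: {residual_kw:.0f} kW")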

At the German Government’s 2019 Digital Summit, the topic of heat recovery was discussed in the working group concerned, which identified a high need for action. For this reason, Rittal assumes that by 2020, significantly more CIOs will be involved in the issue of how the previously unused waste heat from the data centre can be used economically.

Trend #4: Integration of multi-cloud environments
Businesses need to be assured that they can run their cloud applications on commonly used platforms and in any country. This calls for a multi-cloud strategy. From management’s point of view, this is a strategic decision based on the knowledge that its own organisation will develop into a fully digitised business.

For example, an excellent user experience is guaranteed by minimising delays through suitably located availability zones. This means that companies choose one or more availability zones worldwide for their services, depending on their business requirements. Strict data protection requirements are met, for example, by a specialised local provider in the target market concerned. A vendor-open multi-cloud strategy allows exactly that: combining the functional density and scalability of hyperscalers with the data security of local and specialised providers such as Innovo Cloud. Available at the push of a button, on one dashboard, with one contact person and one invoice, and in the very second the business decision is made – this is what is making multi-cloud strategies one of the megatrends of the coming years. This is because the economy will take further steps towards digital transformation and further accelerate its own continuous integration (CI) and continuous delivery (CD) pipelines with cloud-native technologies – applications designed and developed for cloud computing architectures. Automating the integration and delivery processes will then enable the rapid, reliable and repeatable deployment of software.

#PAuto @Rittal @EURACTIV @PresseBox

 


Secure remote access in manufacturing.

24/07/2018
Jonathan Wilkins, marketing director of obsolete industrial parts supplier EU Automation, discusses secure remote access and the challenges it presents.

Whether you’re working from home, picking up e-mails on the go or away on business, it’s usually possible to remotely access your company’s network. Though such access is easy to implement in many enterprises, complexity and security present hefty barriers for many industrial businesses.

Industry 4.0 provides an opportunity for manufacturers to obtain detailed insights on production. Based on data from connected devices, plant managers can spot inefficiencies, reduce costs and minimise downtime. To do this effectively, it is useful to be able to access data and information remotely. However, this can present challenges in keeping operations secure.

Secure remote access is defined as the ability of an organisation’s users to access its non-public computing resources from locations other than the organisation’s facilities. It offers many benefits, such as enabling the monitoring of multiple plants without travel or even on-site staffing being necessary. As well as monitoring, maintenance and troubleshooting are possible from afar. According to data collected from experienced support engineers, an estimated 60 to 70 per cent of machine problems require only a simple fix, such as a software upgrade or minor parameter changes – which can be done remotely.

Remote access reduces the cost and time needed for maintenance and troubleshooting and can reduce downtime. For example, by using predictive analytics, component failures can be predicted in advance and a replacement part ordered from a reliable supplier, such as EU Automation. This streamlines the process for the maintenance technician, flagging an error instantly, even if they are not on site.

The challenges of remote access
There are still significant challenges to remote access of industrial control systems, including security, connectivity and complexity. Traditional remote-access technologies include virtual private networking (VPN) and remote desktop connection (RDC). These technologies are complex, expensive and lack the flexibility and intelligence manufacturers require.

The additional complexity added by traditional technologies can increase security vulnerabilities. Industrial control systems were not typically designed to be connected, and using a VPN connects the system to the IT network, increasing the attack surface. It also means that if hackers can access one point of the system, they can access it all. This was the case in the attacks on the Ukrainian power grid and on the US retail chain Target.

To overcome these issues, manufacturers require a secure, flexible and scalable approach to managing machines remotely. One option that can achieve this is cloud-based access, which uses a remote gateway, a cloud server and client software to flexibly access equipment from a remote location. In this way, legacy equipment can be connected to the cloud so that it can be managed and analysed in real time.

Most manufacturers find that the benefits remote access can offer outweigh the investment and operational risks. To counteract those risks, businesses should put together a security approach that mitigates the additional exposure remote access introduces. This often involves incorporating layers of security so that, if one section is breached, the entire control system is not vulnerable.

When implementing remote access into an industrial control system, manufacturers must weigh up all the available options. It is crucial to make the system as secure as possible so that it remains safe when accessed remotely, whether the user is working from home, on the go, or away on business.

@euautomation #PAuto #Industrie4

Ensuring accurate pigment dispensing.

20/06/2018

PD Edenhall Ltd is one of the largest independent concrete facing brick manufacturers in Britain. They needed to accurately monitor the quantity of pigment being dispensed into the weigh hopper throughout the manufacture of concrete facing bricks.

Pigment dispensing must be precise, as inaccurate amounts of pigment can lead to incorrect colour blends, resulting in lost sales and profit.

Solution – Accurate DBBSM S-Beam Load Cells and Intuitive4-L Digital Indicator
To monitor the quantity of pigment going into the blend, three DBBSM S-beam load cells were connected to the supports of the pigment weigh hopper.  The load cells measured the weight of the hopper throughout the pigment dispensing process, and their outputs were fed to an Intuitive4-L load cell digital indicator.  This enabled the engineers to check continuously that the correct amount of pigment was being dispensed into the mix.
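For readers wondering how three separate load cell signals become one hopper weight, the sketch below shows the principle: each cell’s mV/V output is scaled to kilograms and the three results are summed. The capacity, sensitivity and example readings are assumptions for illustration, not PD Edenhall’s actual figures.

# Minimal sketch of combining three S-beam load cell signals into one hopper weight.
CAPACITY_KG = 500.0        # assumed rated capacity per load cell
SENSITIVITY_MV_V = 2.0     # typical 2 mV/V full-scale output
EXCITATION_V = 10.0        # 10 Vdc excitation (as supplied by the indicator)

def cell_weight(signal_mv: float) -> float:
    """Convert one cell's bridge output (mV) to kilograms."""
    full_scale_mv = SENSITIVITY_MV_V * EXCITATION_V   # 20 mV at rated capacity
    return signal_mv / full_scale_mv * CAPACITY_KG

# Example readings from the three cells supporting the weigh hopper (mV)
readings_mv = [6.10, 5.95, 6.20]
hopper_weight = sum(cell_weight(mv) for mv in readings_mv)
print(f"Pigment weigh hopper load: {hopper_weight:.1f} kg")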

“We chose Applied Measurements’ DBBSM s-beam load cells as the pigment is £1000 per tonne so has to be extremely accurate,” Paul Akers, Works Manager at PD Edenhall Ltd, told us.

DBBSM S-Beam Load Cells

  • Capacities: 0-1kg up to 0-30,000kg
  • Force & Load Measuring
  • Tension & Compression
  • Output: 2mV/V to 2.7mV/V
  • High Accuracy: <±0.03% Rated Capacity
  • Custom Versions Available
  • Fast and Simple to Install

The DBBSM S-beam load cells were ideal for this application as they are extremely accurate, with an error of better than ±0.03% of rated capacity.  Coupled with their dual bending beam design, this guarantees excellent accuracy.  Accuracy can be improved further by using our specially designed rod end bearings to help reduce any extraneous forces.

They offer optional sealing to provide protection in the dusty environment, and a robust 4-core polyurethane cable, making them ideal for use in the pigment dispensing machine.

Intuitive4-L Load Cell Digital Indicator

  • 6-Digit LED Display (±199999)
  • Modular Construction
  • Fast & Simple to Setup
  • Ideal for Harsh Environments – Dust Tight IP65 Protection (Once Installed)
  • Superior Accuracy – 10 Point Linearisation
  • Higher Stability – Signal Filtering Adjustment
  • Improved Resolution – 20-bit A/D Converter
  • Compatible with the INT2 Series
  • 10Vdc Load Cell Excitation @ 120mA max.
  • Powers up to 4x 350Ω Load Cells
  • Available in Less Than 1 Week

The Intuitive4-L load cell digital indicator was chosen for this application because superior accuracy was needed due to the high cost of the yellow pigment.  The Intuitive4-L boasts 10-point linearisation, guaranteeing outstanding accuracy, coupled with a 20-bit A/D converter for high resolution.  High stability is ensured by signal filtering adjustment options, which reduce the effect of noise or instability on the input signal.  It also benefits from an active filter, which reduces the effects of vibration and other external sources of system noise.
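The sketch below illustrates what 10-point linearisation means in practice: raw counts from the A/D converter are mapped to engineering units by interpolating between calibration points. The calibration table shown is invented purely for illustration.

# Sketch of a 10-point linearisation: raw A/D counts -> engineering units.
import bisect

# Calibration table: (raw counts from the 20-bit converter, weight in kg) - illustrative values only
CAL_TABLE = [(0, 0.0), (52000, 50.0), (105000, 100.0), (160000, 150.0),
             (216000, 200.0), (273000, 250.0), (331000, 300.0),
             (390000, 350.0), (450000, 400.0), (511000, 450.0)]

def linearise(raw: int) -> float:
    """Interpolate between the two nearest calibration points."""
    counts = [c for c, _ in CAL_TABLE]
    i = max(1, min(bisect.bisect_left(counts, raw), len(CAL_TABLE) - 1))
    (c0, w0), (c1, w1) = CAL_TABLE[i - 1], CAL_TABLE[i]
    return w0 + (raw - c0) * (w1 - w0) / (c1 - c0)

print(f"{linearise(300000):.2f} kg")   # falls between the 250 kg and 300 kg calibration points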

The Intuitive4-L load cell digital indicator is fast and simple to set up, with a single-layer menu that makes the options easier to find.  Its dimensions and fittings are fully compatible with the existing Intuitive2 models, making the switch over to this new, improved version even easier.

Once installed, the Intuitive4-L load cell digital indicator has an IP65 dust-tight protection rating, making it ideal for use in this harsh construction environment.  If that is not enough, we can also provide an optional waterproof front panel cover for an extra level of protection.

The Intuitive4-L digital panel meter has a modular construction, meaning that PD Edenhall Ltd could configure it to their exact specification, saving them money.  Options available include voltage or current analogue outputs, 2 or 4 alarm relays and a serial data output in one of several formats, including RS232 ASCII, RS485 ASCII and RS485 Modbus RTU, making this a truly flexible load cell digital indicator.
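As a hedged example of how a PLC or PC might poll such an indicator over RS485 Modbus RTU, the sketch below builds a standard read-holding-registers request frame with its CRC. The slave address and register number are placeholders; the real register map would come from the Intuitive4-L documentation.

# Sketch of a Modbus RTU read request (function code 0x03) with standard CRC-16.
import struct

def crc16_modbus(frame: bytes) -> bytes:
    """Standard Modbus CRC-16 (poly 0xA001), returned low byte first."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return struct.pack("<H", crc)

def read_holding_registers(slave: int, start_reg: int, count: int) -> bytes:
    """Build a request frame ready to send over the RS485 serial port."""
    pdu = struct.pack(">BBHH", slave, 0x03, start_reg, count)
    return pdu + crc16_modbus(pdu)

# Placeholder slave address and register - consult the device manual for real values.
request = read_holding_registers(slave=1, start_reg=0x0000, count=2)
print(request.hex(" "))   # bytes to write to the serial port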

@AppMeas #PAuto @EdenhallUK

The world of virtual commissioning.

15/06/2018
Robert Glass, global food and beverage communications manager at ABB, explores the concept of virtual commissioning and how system testing can benefit the food industry.

In 1895, pioneer of astronautic theory, Konstantin Tsiolkovsky, developed the concept of the space elevator, a transportation system that would allow vehicles to travel along a cable from the Earth’s surface directly into space. While early incarnations have proven unsuccessful, scientists are still virtually testing new concepts.

Industry 4.0 continues to open up new opportunities across food and beverage manufacturing. In particular, these technologies help improve manufacturing flexibility and the speed and cost at which manufacturers are able to adapt their production to new product variations. Virtual commissioning is one of these key technologies.

What is virtual commissioning?
Virtual commissioning is the creation of a digital replica of a physical manufacturing environment. For example, a robotic picking and packing cell can be modeled on a computer, along with its automation control systems, which include robotic control systems, PLCs, variable speed drives, motors, and even safety products. This “virtual” model of the robot cell can be modified according to the new process requirements and product specifications. Once the model is programmed, every step of that cell’s operation can be tested and verified in the virtual world. If changes are needed in the process automation or robot movement, these can be made on the same computer, allowing the robot to be reprogrammed or changes to be made to the variable speed drives and PLC programming. The ABB Ability™ RobotStudio is one tool that enables this type of virtual commissioning.

Once reprogrammed, the system is tested again and, if it passes, it is ready for physical deployment. This is where the real benefits become tangible: by using virtual commissioning to program and test ahead of time, less process downtime is required and manufacturers can reduce changeover risks.
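The principle can be shown with a deliberately tiny, generic example (not RobotStudio code, just an illustrative sketch): the same control sequence that will later drive the real cell is first exercised against a software model, where logic errors surface harmlessly.

# Toy illustration of virtual commissioning: test the control logic against a model first.
class VirtualPickAndPlaceCell:
    """Minimal digital stand-in for a picking/packing cell."""
    def __init__(self):
        self.gripper_closed = False
        self.items_packed = 0

    def move_to(self, position: str):
        print(f"moving to {position}")

    def grip(self):
        self.gripper_closed = True

    def release(self):
        if not self.gripper_closed:
            raise RuntimeError("logic error: release before grip")  # caught virtually, not on the line
        self.gripper_closed = False
        self.items_packed += 1

def pack_cycle(cell):
    """The control sequence under test - the same logic would later drive the real cell."""
    cell.move_to("pick position")
    cell.grip()
    cell.move_to("place position")
    cell.release()

cell = VirtualPickAndPlaceCell()
pack_cycle(cell)
assert cell.items_packed == 1   # the cycle behaves as intended in the virtual model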

Automation programming and software errors in a system can be incredibly difficult and costly to rectify, particularly if they are found later on in the production process. Research by the Austrian software testing firm Tricentis estimated that software bugs, glitches and security failures cost businesses across the world $1.1 trillion.

To achieve the full potential of virtual commissioning, the simulation must be integrated across the entire plant process, including both the planning and engineering phases. Known as simulation-based engineering, this step is integral to the installation of reliable systems. The use of simulations in a plant is not a new concept; in fact, virtual commissioning has been researched for more than a decade.

The benefits
The implementation of virtual commissioning brings with it a number of benefits. The ‘try before you buy’ concept allows plant managers to model and test the behavior of a line before making any physical changes. This saves time as the user can program the system’s automation while testing and fixing errors. The use of a digital model can also reduce risk when changing or adding processes.

One company which has seen significant improvements in production since investing in virtual commissioning is Comau, a supplier of automotive body and powertrain manufacturing and assembly technologies. Comau’s head of engineering and automation systems, Francesco Matergia, said: “We were able to reprogram 200 robots in just three days using virtual commissioning, as opposed to roughly 10 weekends had the work been done on the factory floor.”

Just as you wouldn’t build a space elevator without meticulous planning and years of small-scale prototyping, it is far more cost- and time-effective to build and test in a virtual environment, where you can find the bugs, discover the unforeseen challenges and mitigate them without added downtime or loss of production. It is much better to discover that bug while on the ground than at 100,000 feet, midway between the surface of the Earth and that penthouse in space.

@ABBgroupnews #PAuto @StoneJunctionPR

Monitoring and managing the unpredictable.

07/06/2018
Energy sector investments in big data technologies have exploded. In fact, according to a study by BDO, the industry’s expenditure on this technology in 2017 increased tenfold compared to the previous year, with the firm attributing much of this growth to the need for improved management of renewables. Here, Alan Binning, Regional Sales Manager at Copa-Data UK, explores three common issues for renewables — managing demand, combining distributed systems and reporting.

Renewables are set to be the fastest-growing source of electrical energy generation in the next five years. However, this diversification of energy sources creates a challenge for existing infrastructure and systems. One of the most notable changes is the switch from reliable to fluctuating power.

Implementing energy storage
Traditional fossil-fuel plants operate at a predetermined level; they provide a consistent and predictable amount of electricity. Renewables, on the other hand, are a much less reliable source. For example, energy output from a solar farm can drop without warning due to clouds obscuring sunlight from the panels. Similarly, wind speeds cannot be reliably forecast. To prepare for this fluctuation in advance, research into and investment in energy storage systems are on the rise.

For example, wind power ramp events are a major challenge, and developing energy storage mechanisms to deal with them is essential. The grid may not always be able to absorb the excess wind power created by an unexpected increase in wind speed. Ramp control applications allow the turbine to store this extra power in a battery instead. When combined with reliable live data, these systems can develop informed models for intelligent distribution.
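A minimal sketch of such a ramp control loop is shown below: grid export is only allowed to change by a limited amount per time step, and any surplus (or shortfall) is absorbed by the battery. The ramp limit, battery size and wind profile are all assumed values for illustration.

# Sketch of ramp-rate limiting with a battery absorbing the surplus (all figures assumed).
MAX_RAMP_KW_PER_STEP = 200.0     # allowed change in grid export per time step
BATTERY_CAPACITY_KWH = 500.0
STEP_HOURS = 0.25                # 15-minute intervals

wind_power_kw = [800, 1400, 2100, 1900, 900]   # sudden ramp-up then a drop
export_kw, soc_kwh = wind_power_kw[0], 0.0

for power in wind_power_kw[1:]:
    # limit how fast the exported power may change
    if power > export_kw:
        export_kw = min(power, export_kw + MAX_RAMP_KW_PER_STEP)
    else:
        export_kw = max(power, export_kw - MAX_RAMP_KW_PER_STEP)
    surplus_kw = power - export_kw
    # store the surplus in (or draw the shortfall from) the battery, within its limits
    soc_kwh = min(BATTERY_CAPACITY_KWH, max(0.0, soc_kwh + surplus_kw * STEP_HOURS))
    print(f"wind {power:5.0f} kW -> export {export_kw:5.0f} kW, battery {soc_kwh:5.1f} kWh")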

Britain has recently become home to one of the largest energy storage projects to use EV batteries. While it is not the first time car batteries have been used for renewable power, the Pen y Cymoedd wind farm in Wales has connected a total of 500 BMW i3 batteries to store excess power.

Combining distributed systems
Control software is the obvious solution to better monitor this fluctuating source of power. However, many renewable energy generation sites, like solar PV and wind farms, are distributed across a wide geographical scope and are therefore more difficult to manage without sophisticated software.

Consider offshore wind farms as an example. The world’s soon-to-be-largest offshore wind farm is currently under construction 74.5 miles off the Yorkshire coastline. To accurately manage these vast generation sites, the data from each asset needs to be combined into a singular entity.

This software should be able to combine many items of distributed equipment, whether that’s an entire wind park or several different forms of renewable energy sources, into one system to provide a complete visualisation of the grid.

Operators could go one step further by overlaying geographical information system (GIS) data onto the software. This could provide a map-style view of renewable energy parks, or even of the entire generation asset base, allowing operators to zoom in on the map to reveal greater levels of detail. This provides a full, functional map, enabling organisations to make better-informed decisions.

Reporting on renewables
Controlling and monitoring renewable energy is the first step to better grid management. However, it is what energy companies do with the data generated from this equipment that will truly provide value. This is where reporting is necessary.

Software for renewable energy should be able to visualise data in an understandable manner so that operators can see the types of data they truly care about. For example, wind farm owners tend to be investors and therefore generating profit is a key consideration. In this instance, the report should compare the output of a turbine and its associated profit to better inform the operator of its financial performance.

Using intelligent software, like zenon Analyzer, operators can generate a range of reports about any information they would like to assess — and the criteria can differ massively depending on the application and the objectives of the operator. Reporting can range from a basic table of outputs, to a much more sophisticated report that includes the site’s performance against certain key performance indicators (KPIs) and predictive analytics. These reports can be generated from archived or live operational data, allowing long term trends to be recognised as well as being able to react quickly to maximise efficiency of operation.
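As a generic illustration of such a report (not zenon Analyzer output), the sketch below derives revenue and a capacity-factor KPI from daily turbine production; the turbine rating, tariff and production figures are assumed.

# Illustrative daily report: output, revenue and capacity factor per turbine (assumed data).
RATED_POWER_MW = 3.0          # assumed turbine rating
TARIFF_PER_MWH = 55.0         # assumed price in GBP per MWh

daily_output_mwh = {"T01": 41.2, "T02": 38.7, "T03": 12.4}   # e.g. T03 curtailed or faulted

print(f"{'Turbine':8}{'Output MWh':>12}{'Revenue GBP':>13}{'Capacity factor':>18}")
for turbine, mwh in daily_output_mwh.items():
    capacity_factor = mwh / (RATED_POWER_MW * 24)
    print(f"{turbine:8}{mwh:12.1f}{mwh * TARIFF_PER_MWH:13.0f}{capacity_factor:18.1%}")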

As investments in renewable energy generation continue to increase, the need for big data technologies to manage these sites will also continue to grow. Managing these volatile energy sources is still a relatively new challenge. However, with the correct software to combine the data from these sites and report on their operation, energy companies will reap the rewards of these increasingly popular energy sources.