Monitoring and managing the unpredictable.

Energy sector investments in big data technologies have exploded. In fact, according to a study by BDO, the industry’s expenditure on this technology increased tenfold in 2017 compared to the previous year, with the firm attributing much of this growth to the need for improved management of renewables. Here, Alan Binning, Regional Sales Manager at Copa-Data UK, explores three common issues for renewables: managing demand, combining distributed systems and reporting.

Renewables are set to be the fastest-growing source of electrical energy generation over the next five years. However, this diversification of energy sources creates a challenge for existing infrastructure and systems. One of the most notable changes is the switch from consistent, predictable generation to fluctuating power.

Implementing energy storage
Traditional fossil-fuel plants operate at a predetermined level, providing a consistent and predictable amount of electricity. Renewables, on the other hand, are a much less reliable source. For example, energy output from a solar farm can drop without warning due to clouds obscuring sunlight from the panels. Similarly, wind speeds cannot be reliably forecast. To prepare for this fluctuation in advance, research and investment into energy storage systems are on the rise.

Wind power ramp events are a major challenge: the grid may not always be able to absorb the excess power created by an unexpected increase in wind speed. Developing energy storage mechanisms is therefore essential. Ramp control applications allow the turbine to store this extra power in a battery instead. When combined with reliable live data, these systems can develop informed models for intelligent distribution.
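The ramp-control behaviour described above can be sketched in a few lines. This is an illustrative model only: the function name, the grid export limit and the one-hour control step (so kW and kWh interchange directly) are all assumptions, not details of any real product.

```python
# Illustrative sketch of wind-ramp control: when generation exceeds what
# the grid can absorb, divert the surplus to a battery; export the rest.
# Assumes a one-hour control step so kW and kWh interchange directly.

def dispatch(generated_kw, grid_limit_kw, battery_soc_kwh, battery_capacity_kwh):
    """Split generated power between grid export, battery charging and curtailment."""
    export = min(generated_kw, grid_limit_kw)
    surplus = generated_kw - export
    headroom = battery_capacity_kwh - battery_soc_kwh
    to_battery = min(surplus, headroom)   # charge only up to remaining capacity
    curtailed = surplus - to_battery      # anything left must be curtailed
    return export, to_battery, curtailed

# Example: a 2 MW gust against a 1.5 MW grid limit with 300 kWh of headroom
print(dispatch(2000, 1500, 700, 1000))   # (1500, 300, 200)
```

With reliable live data, the same split could be driven by forecasts rather than a fixed export limit.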

Britain has recently become home to one of the largest energy storage projects to use EV batteries. While it is not the first time car batteries have been used for renewable power, the Pen y Cymoedd wind farm in Wales has connected a total of 500 BMW i3 batteries to store excess power.

Combining distributed systems
Control software is the obvious solution for monitoring this fluctuating source of power. However, many renewable energy generation sites, like solar PV and wind farms, are distributed across a wide geographical area and are therefore difficult to manage without sophisticated software.

Consider offshore wind farms as an example. The world’s soon-to-be-largest offshore wind farm is currently under construction 74.5 miles off the Yorkshire coastline. To accurately manage these vast generation sites, the data from each asset needs to be combined into a singular entity.

This software should be able to combine many items of distributed equipment, whether that’s an entire wind park or several different forms of renewable energy sources, into one system to provide a complete visualisation of the grid.

Operators could go one step further by overlaying geographical information system (GIS) data onto the software. This could provide a map-style view of renewable energy parks, or even the entire generation asset base, allowing operators to zoom in on the map to reveal greater levels of detail. This provides a full, functional map that enables organisations to make better-informed decisions.

Reporting on renewables
Controlling and monitoring renewable energy is the first step to better grid management. However, it is what energy companies do with the data generated from this equipment that will truly provide value. This is where reporting is necessary.

Software for renewable energy should be able to visualise data in an understandable manner so that operators can see the types of data they truly care about. For example, wind farm owners tend to be investors and therefore generating profit is a key consideration. In this instance, the report should compare the output of a turbine and its associated profit to better inform the operator of its financial performance.

Using intelligent software, like zenon Analyzer, operators can generate a range of reports on any information they would like to assess, and the criteria can differ greatly depending on the application and the operator’s objectives. Reporting can range from a basic table of outputs to a much more sophisticated report that measures the site’s performance against key performance indicators (KPIs) and includes predictive analytics. These reports can be generated from archived or live operational data, allowing long-term trends to be recognised while still enabling operators to react quickly to maximise operational efficiency.
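As a minimal illustration of the output-versus-profit reporting described above, the sketch below computes per-turbine output, revenue and a simple capacity-factor KPI from archived readings. The flat tariff, the readings and the function names are assumptions for illustration; this is not zenon Analyzer’s actual API.

```python
# Illustrative per-turbine report from archived energy readings.
# TARIFF_PER_MWH is an assumed flat price, for illustration only.

TARIFF_PER_MWH = 45.0

def turbine_report(readings_mwh, rated_mw, hours):
    """Summarise one turbine's archived energy readings over a period."""
    output = sum(readings_mwh)
    revenue = output * TARIFF_PER_MWH
    # KPI: actual output as a fraction of the theoretical maximum
    capacity_factor = round(output / (rated_mw * hours), 3)
    return {"output_mwh": output,
            "revenue": revenue,
            "capacity_factor": capacity_factor}

# Four hourly readings from a hypothetical 2.5 MW turbine
print(turbine_report([2, 2, 1, 1], rated_mw=2.5, hours=4))
# {'output_mwh': 6, 'revenue': 270.0, 'capacity_factor': 0.6}
```

A real report would pull the readings from the historian and apply the actual tariff structure, but the principle of pairing raw output with its financial value is the same.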

As investments in renewable energy generation continue to increase, the need for big data technologies to manage these sites will also continue to grow. Managing these volatile energy sources is still a relatively new challenge. However, with the correct software to combine the data from these sites and report on their operation, energy companies will reap the rewards of these increasingly popular energy sources.

New fault current measurement technology shows promising results in trials with ScottishPower


Outram Research’s new fault current prediction technology, developed by John Outram, the company’s MD, has delivered accurate results during the first stage of field trials with ScottishPower. Knowing the peak fault current on their networks is critical to electrical utilities, as it defines the rating of components such as circuit breakers, which must safely withstand the large release of energy that occurs during an electrical fault.

Outram’s patent-pending technique to predict fault current from network measurements has been developed to provide greater accuracy than current approaches. The project, funded by Ofgem’s Innovation Funding Initiative (IFI), consists of three stages. In the first stage of the trials, which measured RMS “break” current, Outram’s solution delivered accuracy to within 2% of the network model. By accurately predicting fault current, utilities can ensure that network components are correctly specified, eliminating the money wasted on over-specifying components to provide a safety margin for existing, possibly inaccurate, calculation methods.

Fault current is measured using an algorithm developed by Outram Research, which runs on their PM7000 power quality analyser. The PM7000 is already widely used by companies wanting to troubleshoot, identify and resolve power quality problems quickly and efficiently. The first stage of the project was to predict the RMS “break” current, which represents the highest possible current that might have to be interrupted in the case of a fault occurring on an already live circuit. Breakers in the network must operate up to this maximum fault current to ensure that the power is safely and quickly disconnected.
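Outram’s prediction algorithm itself is proprietary, but the RMS (root mean square) quantity in which the break-current figure is expressed is standard, and can be sketched as follows (the waveform here is a synthetic example, not PM7000 data):

```python
# Standard RMS of a sampled current waveform: square each sample,
# average the squares, then take the square root.

import math

def rms(samples):
    """Root-mean-square value of a sampled waveform."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# A pure sine of peak 10 A has RMS = 10 / sqrt(2), about 7.07 A
wave = [10 * math.sin(2 * math.pi * n / 64) for n in range(64)]
print(round(rms(wave), 2))   # 7.07
```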

“The first stage of the field trials has been very encouraging, demonstrating the huge potential of Outram’s technique to predict fault current,” said Jim Sutherland, Director of Network Development at SP Energy Networks. “We’re optimistic that once all stages are complete, we will be able to use this approach to accurately specify breakers and other components in the network.”

Stage two of the project has now begun. It aims to measure the peak “make” current: the current that flows when a connection is made to a section of the network where a fault is already present.

Electricity distribution networks are becoming increasingly complex, and can no longer be modelled as a simple “waterfall” in which power flows from the power station to the load. In the event of a fault, components such as motors and embedded generation will deliver power back to the grid. This presents a potentially unknown level of fault current, since it is impossible to have prior knowledge of such events without building complex network models and having a full understanding of all customer loads. Local electricity generation from renewable sources presents a similar downstream contribution that must be accounted for. The third stage of the project will demonstrate that the Outram approach can isolate and measure the contribution such loads make to fault current.

Existing approaches to calculating peak “make” current are less accurate than those for peak “break” current, and they usually ignore the increasingly important downstream contribution.

“ScottishPower has been an excellent partner, allowing us to demonstrate the effectiveness of our new technique in a real-world situation,” said John Outram. “We’re delighted with the results of the first stage of the field trials, and are excited as we move into the second and third stages, where our refined and extended fault current prediction will provide even greater benefits to electrical utilities.”