#SPS17 Gearing specialist exhibits.

17/10/2017

Aerospace, medical and automotive engineers will be able to speak with Harmonic Drive AG about improving the accuracy and efficiency of their applications at this year’s SPS IPC Drives show in Nuremberg (D). Harmonic Drive AG will be launching its FHA-C Mini Servo Actuators with Absolute MZE encoder at stand 119 in hall four, from November 28-30, 2017.

The exhibition brings together suppliers of electric automation technologies from all over the world to discuss the future of automation and new innovative products. Visitors will have the chance to eat sushi while talking to Harmonic Drive AG engineers about its range of servo actuators and learn how its multi-turn absolute motor feedback system can increase productivity.

Designed to offer high transmission reliability, even in environments with high electromagnetic interference (EMI), the FHA-C Mini Servo Actuator has a specially developed, preloaded precision output bearing with high tilting capacity.

With a multi-turn absolute encoder mounted directly on to the motor shaft, the actuator can provide accurate signals for positioning directly at the load. This is ideal for design engineers working in the medical and automation sector, where creating accurate and reliable devices is imperative.

“Automation provides a number of opportunities for businesses to improve the efficiency of their processes,” explained Graham Mackrell, managing director of Harmonic Drive UK. “Harmonic Drive FHA Mini Servo Actuators have been created so that they can be combined with the YukonDrive® Servo Controller. When connected, the actuator can be tailored for use in demanding dynamic applications.

“In addition to being customisable, the actuators feature a multi-turn absolute motor feedback system that acquires the absolute position directly at the load, with maximum accuracy over more than 600 revolutions. The productivity of the servo actuator is increased because unproductive referencing is no longer required, allowing it to deliver smooth and quiet running characteristics.”

• The SPS IPC Drives show brought together 63,291 people at last year’s event, with over 500 exhibitors coming from countries outside of Germany.

Quality Managers are the Leaders!

09/10/2017
Jennifer Sillars, Product Marketing Executive for Ideagen, explains why quality managers are the leaders that manufacturing needs.

Having previously worked in business intelligence for Ideagen, Jennifer Sillars brings a passion for data driven decision making to the realms of quality, compliance, audit and risk. As Product Marketing Executive at Ideagen, Jennifer’s key objectives are to understand how customers use Ideagen’s software and how the company can better serve them in their GRC challenges.

Jennifer Sillars – Ideagen

Companies are becoming more aggressive in their financial goals, with cost cutting being the mantra for many years. Indeed, in some organisations, when all the obvious cuts have been made, making the less obvious ones can begin to jeopardise the smooth-running of the business.

In manufacturing, we see an industry under pressure. An industry trying to do more with less. A more proactive approach is needed now to break out of this cycle of simply fulfilling demand. While it may seem unlikely to some, your Quality Manager or Quality Director is in the perfect position to guide this strategy.

Reason One – A focus that is principled and positive
Quality leaders are focused on…quality. By this I mean that customer satisfaction and reputation building are inherent in their goals. ‘Quality’ often means meeting customer requirements or meeting safety objectives. They are high-integrity individuals who make analytical decisions based on fact, with no hidden motive. It is this kind of input that is needed when difficult choices have to be made.

John Burrows – HepcoMotion

HepcoMotion, a manufacturer of linear motion systems and automation components, is an example of a company that leads with quality. As HepcoMotion manufacture everything in-house, no responsibility for quality can be delegated away. John Burrows, Group Quality Manager, defines his job as “ensuring that our customers get the best possible products.” In industries striving for blocked-out order books filled with repeat business, the Quality department’s smooth operation can make this a reality.

Quality departments are often unfairly profiled as the inspector who comes in looking for problems at the end of a project. The truth behind this is that quality leaders, like John Burrows, believe the company can achieve greatness. So, if something goes wrong, there is a process or control that needs to be fixed. John is trusted at an operational level as someone to turn to when an issue is found. Leading from the front like this allows issues to be addressed earlier in the manufacturing process, instead of hidden from the inspector.

Reason Two – Business intelligence in non-conformances
They understand how the business runs. They know the small details and daily struggles as well as the big-picture strategic goals. They work beyond silos to understand the internal processes that take requirements and turn them into commercial products. This gives Quality Managers a deep understanding of where things can, and do, go wrong. They see the trends that provide opportunities for improvement. Quality leaders track non-conformances and the cost of these issues.

“The ability to cost corrective and preventive actions (CAPAs) allows me to highlight areas that require attention,” John continues, “where costs have been accrued for a variety of reasons (work in progress, customer issues or final inspection issues, for example). This gives us a much better picture of particular products that may require different processes or additional inspections. We can easily add the costs of re-work, extra inspections and so on to an issue, and so can get an accurate picture of what it has cost the business.”
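A costing system of the kind John describes comes down to summing issue costs per product. A minimal sketch of that aggregation (the products, categories and figures below are invented for illustration and are not HepcoMotion or Ideagen data):

```python
# Illustrative sketch: aggregating CAPA costs by product to spot
# products that may need different processes or extra inspections.
from collections import defaultdict

capas = [
    # (product, cost_category, cost_gbp) -- hypothetical records
    ("ring-slide", "rework", 420.0),
    ("ring-slide", "final-inspection", 180.0),
    ("linear-guide", "customer-issue", 950.0),
    ("ring-slide", "rework", 310.0),
]

cost_by_product = defaultdict(float)
for product, category, cost in capas:
    cost_by_product[product] += cost

# Products sorted by accrued non-conformance cost, highest first
for product, total in sorted(cost_by_product.items(), key=lambda kv: -kv[1]):
    print(f"{product}: £{total:.2f}")
```

A real system would also keep the per-category breakdown, so rework costs can be separated from customer issues when deciding where to act.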

This is the kind of data-driven decision making that all companies are striving for. Executives often overlook the Quality department as a partner in business intelligence. HepcoMotion have made great efforts to improve reporting and analysis, keeping in mind the strategic needs of the company.

Reason Three – A stabilising force
In many companies, each location operates within its own rules and business constraints, with little sharing of best practice. Each location focuses on local optimisation. In itself this is a worthwhile activity that can have significant cost-saving benefits, but on its own it may not be enough to make an impact.

In a competitive market customers put a premium on suppliers that they can rely on. Reliability and repeatability are fundamental goals within the quality leader’s principled and positive focus. It is part of “ensuring that our customers get the best possible products”, as John says. When every customer knows they will get exactly what they need, orders grow. The wider market notices and the effects are transformative.

When part of the manufacture is sub-contracted to suppliers, part of the burden of reliability is shared, as long as you have chosen wisely, set the requirements and been realistic with lead times. The situation is different where you are the sole provider. This is the position HepcoMotion is in, as they do all their manufacturing in-house.

In recent years John’s day to day focus has shifted as HepcoMotion adapt to deal with demand. Downtime is minimal and many of their machines run 24/7. Their order books are full; their focus on quality is undoubtedly a contributing factor. Customers expect the “HepcoMotion” standard, not the standard of a particular site. John Burrows understands that it takes more than a written policy to ensure success. For that reason, John spends increasing amounts of time at different manufacturing sites engaging people and establishing a true Group-wide approach to quality.

Because when the reputation for quality and reliability is built, the impact of failing to meet a customer’s expectation in a single instance can be devastating.

It is for these reasons that Quality Managers are the leaders that manufacturing needs.

@Ideagen_Plc #PAuto @HepcoMotion

No escape even for agrochemicals!

28/09/2017
This article discusses key points covered in depth in the IDTechEx report “Agricultural Robots and Drones 2017-2027: Technologies, Markets, Players” by Dr Khasha Ghaffarzadeh and Dr Harry Zervos.

New robotics is already quietly transforming many aspects of agriculture, and the agrochemicals business is no exception. Here, intelligent and autonomous robots can enable ultraprecision agriculture, potentially changing the nature of the agrochemicals business. In this process, bulk commodity chemical suppliers will be transformed into speciality chemical companies, whilst many will have to reinvent themselves, learning to view data and artificial intelligence (AI) as a strategic part of their overall crop protection offerings.

Computer vision
Computer vision is already commercially used in agriculture. In one use case, simple row-following algorithms are employed, enabling a tractor-pulled implement to automatically adjust its position. This relieves the pressure on the driver to maintain an ultra-accurate driving path when weeding to avoid inadvertent damage to the crops.

The computer vision technology is however already evolving past this primitive stage. Now, implements are being equipped with full computer systems, enabling them to image small areas, to detect the presence of plants, and to distinguish between crop and weed. The system can then instruct the implement to take a site-specific precision action to, for example, eliminate the weed. In the future, the system has the potential to recognize different crop and weed types, enabling it to take further targeted precision action.
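The detect-classify-act loop described above can be sketched in a few lines. This is a stand-in illustration, not any vendor's actual system: the labels are assumed to come from a trained vision model, which is stubbed out here as pre-labelled detections.

```python
# Minimal sketch of the detect -> classify -> act loop: each detected plant
# gets a site-specific action. In a real implement, the "label" field would
# come from a vision model classifying crop vs weed in each image patch.
from dataclasses import dataclass

@dataclass
class Detection:
    x_mm: float   # position along the crop row
    label: str    # "crop" or "weed", as output by the (hypothetical) classifier

def plan_actions(detections):
    """Map each detection to a precision action for the implement."""
    actions = []
    for d in detections:
        if d.label == "weed":
            actions.append((d.x_mm, "spray"))   # targeted action at the weed
        else:
            actions.append((d.x_mm, "skip"))    # leave the crop untouched
    return actions

detections = [Detection(120.0, "crop"), Detection(145.5, "weed")]
print(plan_actions(detections))
```

The same dispatch structure extends naturally to the future case the article describes, where the classifier distinguishes multiple crop and weed types and each type maps to its own action.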

This technology is already commercial, although at a small scale and only for specific crops. The implements are still very much custom built, assembled and ruggedized for agriculture by the start-ups themselves. This situation will continue until the market is proven, forcing the developers to be both hardware and software specialists. Furthermore, the implements are not yet fully reliable and easy to operate, and the upfront machine costs are high, leading the developers to favour a robotic-as-a-service business model.

Nonetheless, the direction of travel is clear: data will increasingly take on a more prominent (strategic) role in agriculture. This is because the latest image processing techniques, based on deep learning, feed on large datasets to train themselves. Indeed, a time-consuming challenge in applying deep learning techniques to agriculture is in assembling large-scale sets of tagged data as training fodder for the algorithms. The industry needs its equivalents of image databases used for facial recognition and developed with the help of internet images and crowd-sourced manual labelling.

In the not-too-distant future, a series of image processing algorithms will emerge, each focused on a particular set of crop or weed types. In time, these capabilities will inevitably expand, allowing the algorithms to become applicable to a wider set of circumstances. In parallel, and in tandem with more accumulated data (not just images but other indicators such as NDVI too), algorithms will offer more insight into the status of different plants, laying the foundation of ultra-precision farming on an individual-plant basis.

Agriculture is a challenging environment for image processing. Seasons, light and soil conditions change, whilst the plants themselves change shape as they progress through their stages of growth. Nonetheless, the accuracy threshold that algorithms in agriculture must meet is lower than in many other applications, such as general autonomous driving. This is because an erroneous recognition will, at worst, result in the elimination of a few healthy crops, not in fatalities. This, of course, matters economically, but it is not a safety-critical issue and is thus not a showstopper.

This lower threshold is important because achieving higher levels of accuracy becomes increasingly challenging. After an initial substantial gain, accuracy improvement enters a diminishing-returns phase in which far more data is needed for small gains. Consequently, algorithms can be commercially rolled out in agriculture far sooner, based on orders of magnitude less data and with lower accuracy, than in many other applications.
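The diminishing-returns argument can be made concrete with a toy saturating curve, in which each doubling of training data closes a fixed fraction of the remaining gap to an accuracy ceiling. All numbers here are invented purely for illustration; no real dataset or model is being described.

```python
# Toy model of diminishing returns: each doubling of training data closes
# half of the remaining gap to a 99% accuracy ceiling, starting from 70%.
def accuracy(n_doublings, start=0.70, ceiling=0.99, fraction=0.5):
    remaining = ceiling - start
    return ceiling - remaining * (1 - fraction) ** n_doublings

# Accuracy gained by each successive doubling of the dataset
gains = [accuracy(n + 1) - accuracy(n) for n in range(6)]
print([round(g, 4) for g in gains])  # early doublings gain far more than later ones
```

Under these assumptions the first doubling buys over thirty times the accuracy of the sixth, which is why a lower accuracy threshold translates directly into much smaller data requirements.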

Navigational autonomy
Agriculture is already a leading adopter of autonomous mobility technology. Here, autosteer and autoguide technology, based on outdoor RTK GPS localization, is already well-established. The technology is, however, already moving towards full level-5 autonomy. The initial versions are likely to retain the cab, enabling the farmer/driver to stay in charge, ready to intervene, during critical tasks such as harvesting. Unmanned, cab-less versions will also emerge once technology reliability is proven and users begin to define staying in charge as remote fleet supervision.

The evolution towards full unmanned autonomy has major implications. As we have discussed in previous articles, it may give rise to fleets of small, slow, lightweight agricultural robots (agrobots). These fleets today have limited autonomous navigational capability and suffer from limited productivity, both in individual and fleet forms. This will however ultimately change as designs/components become standardized and as the cost of autonomous mobility hardware inevitably goes down a steep learning curve.

Agrobots of the future
Now the silhouette of the agrobots of the future can be seen: small, intelligent, autonomous mobile robots taking precise action on an individual-plant basis. These robots can be connected to the cloud to share learning and data, and to receive updates en masse. They can be modular, enabling the introduction of different sensor/actuator units as required. They will never be individually as productive as today’s powerful farm vehicles, but they can be in fleet form if hardware costs are lowered and the fleet-size-to-supervisor ratio is increased.

What this may mean for the agrochemicals business is also emerging. First, data and AI will become an indispensable part of the general field of crop protection, of which agrochemical supply will become only a subset, albeit still a major one. This will mandate a major rethinking of the chemical companies’ business model and skillsets. Second, non-selective blockbuster agrochemicals (together with engineered herbicide resistant seeds) may lose their total dominance. This is because the robots will apply a custom action for each plant, potentially requiring many specialized selective chemicals.

These changes will not happen overnight. The current approach is highly productive, particularly over large areas, and off-patent generic chemicals will drive costs down further. Today’s robots are low-lying, restricting them to short crops, and achieving precision spraying from high-clearance ‘high-boy’ platforms will be a mechanical and control engineering challenge. But these changes will come, diffusing into general use step by step and plant by plant. True, this is a long-term game, but playing it cannot be kicked into the long grass for long.

@IDTechEx #Robotics #Agriculture #PAuto

World’s first LiFi enabled light bar!

21/09/2017
LiFi will reach mainstream adoption through LED light bars, which will replace the most widely used light source in the world: fluorescent tubes.

The first LED “light bar” is forecast to replace the most conventional form of lighting within commercial and industrial facilities, fluorescent tubes, of which an estimated 3-4 billion are installed throughout the world.

pureLiFi and Linmore LED will demonstrate this new technology at LuxLive, 15-16 November 2017 (London GB), as part of their LiFi experience zone.

WiFi versus LiFi

Wireless connectivity is evolving. The spectrum now has to accommodate more mobile users and is forecast to grow to 20 billion devices (forming the IoT) by the year 2020, which will result in what is known as the spectrum crunch. However, LiFi can open up 1,000 times more spectrum for wireless communications to combat this phenomenon. LiFi is a transformative technology that is changing the way we connect to the Internet by using the same light we use to illuminate our offices, homes and streets.

Integration of LiFi within LED strip lights will drive mass adoption, enabling LiFi to easily move into full-scale implementation within offices, schools, warehouses and anywhere illumination is required.

Alistair Banham, CEO of pureLiFi, says: “This partnership marks a step change for LiFi adoption. We can now offer new solutions that will help industry future-proof its spaces, devices and technology to ensure they are ready to cope with the increased demand for high-speed, secure and mobile wireless communications.”

LiFi utilizes LED lights that illuminate both our workspace and homes to transmit high-speed, bi-directional, secure and fully networked wireless internet.

What is LiFi?
LiFi is high-speed, bi-directional, networked and mobile communication of data using light. LiFi comprises multiple light bulbs that form a wireless network, offering a substantially similar user experience to Wi-Fi, except using the light spectrum.

Lighting manufacturers are important players in the adoption of LiFi technology. Linmore LED has built its reputation in the retrofit market, and it ensures its portfolio of LED products performs in the top 1% for energy efficiency in the industry.

Retrofit fixtures are in great demand as many facilities seek to drive down energy costs, by as much as 70-80%, by converting to LED technology. This trend is also driven by the longer operating life that LEDs provide and by concerns about the toxic mercury used in fluorescent lamps, which complicates disposal. The result is a scenario in which building owners and facility managers can adopt LiFi technology while dramatically decreasing lighting-related energy costs.

Paul Chamberlain, CEO of Linmore LED says: “Utilizing an existing part of a building’s infrastructure – lighting – opens up endless possibilities for many other technologies to have a deployment backbone.  Internet of Things (IoT), RFID, product and people movement systems, facility maintenance, and a host of other technologies are taken to the next level with LiFi available throughout a facility.”

John Gilmore, Linmore’s VP of Sales, talks about early adopters of the technology: “We’re very excited to be aligning ourselves with pureLiFi. We firmly believe the US Government will be an early adopter of this technology. Our position on the GSA schedule will help buyers easily access the technology.”

LiFi offers lighting innovators the opportunity to enter new markets and drive completely new sources of revenue by providing wireless communications systems. LiFi is a game changer not only for the communications industry but also for the lighting industry, and with LiFi, Linmore certainly has a brighter future. 

@purelifi #Pauto @LinmoreLED ‏#bes

Simulating agricultural climate change scenarios.

19/09/2017
Extreme weather, believed to result from climate change and increased atmospheric CO2 levels, is a concern for many. And beyond extreme events, global warming is also expected to impact agriculture.(Charlotte Observer, 7 Sept 2017)

Although it is expected that climate change will significantly affect agriculture and cause decreases in crop yields, the full effects of climate change on agriculture and human food supplies are not yet understood. (1, 2 & 3 below)

Simulating a Changing Climate
To fully understand the effects that changes in temperature, CO2, and water availability caused by climate change may have on crop growth and food availability, scientists often employ controlled growth chambers to grow plants in conditions that simulate the expected atmospheric conditions at the end of the century. Growth chambers enable precise control of CO2 levels, temperature, water availability, humidity, soil quality and light quality, enabling researchers to study how plant growth changes in elevated CO2 levels, elevated temperatures, and altered water availability.

However, plant behavior in the field often differs significantly from that in growth chambers. Due to differences in light quality, light intensity, temperature fluctuations, evaporative demand, and other biotic and abiotic stress factors, the growth of plants in small, controlled growth chambers doesn’t always adequately reflect plant growth in the field. The less realistic the experimental conditions used during climate change simulation experiments, the less likely the resultant predictions are to reflect reality.3

Over the past 30 years, there have been several attempts to simulate climate change growing scenarios more closely, including open-top chambers, free-air CO2 enrichment, temperature gradient tunnels and free-air temperature increases, though each of these methods has significant drawbacks.

For example, chamber-less CO2 exposure systems do not allow rigorous control of gas concentrations, while other systems suffer from “chamber effects”, including changes in wind velocity, humidity, temperature, light quality and soil quality.3,4

Recently, researchers in Spain have reported growth chamber greenhouses and temperature gradient greenhouses, designed to remove some of the disadvantages of simulating the effects of climate change on crop growth in growth chambers. A paper reporting their methodology was published in Plant Science in 2014 and describes how they used growth chamber greenhouses and temperature gradient greenhouses to simulate climate change scenarios and investigate plant responses.3

Choosing the Right Growth Chamber
Growth chamber and temperature gradient greenhouses offer increased working area compared with traditional growth chambers, enabling them to work as greenhouses without the need for isolation panels, while still enabling precise control of CO2 concentration, temperature, water availability, and other environmental factors.

Such greenhouses have been used to study the potential effects of climate change on the growth of lettuce, alfalfa, and grapevine.

CO2 Sensors for Climate Change Research
For researchers to study the effects of climate change on plant growth using growth chambers or greenhouses, highly accurate CO2 measurements are required.

The Spanish team used the Edinburgh Sensors Guardian sensor in their greenhouses to provide precise, reliable CO2 measurements. Edinburgh Sensors is a customer-focused provider of high-quality gas sensing solutions and has been supplying gas sensors to the research community since the 1980s.3,5

The Guardian NG from Edinburgh Sensors provides accurate CO2 measurements in research greenhouses mimicking climate change scenarios. The Edinburgh Sensors Guardian NG provides near-analyzer quality continuous measurement of CO2 concentrations. The CO2 detection range is 0-3000 ppm, and the sensor can operate in 0-95% relative humidity and temperatures of 0-45 °C, making it ideal for use in greenhouses with conditions intended to mimic climate change scenarios.
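Given the operating envelope quoted above (0-3000 ppm CO2, 0-95% relative humidity, 0-45 °C), a simple range check can flag readings logged outside the sensor's specified conditions. The function and field names below are our own illustration and are not part of any Edinburgh Sensors API:

```python
# Sanity-check a logged reading against the Guardian NG operating envelope
# stated above. Values outside the envelope are not necessarily wrong, but
# fall outside the conditions for which the sensor is specified.
def within_envelope(co2_ppm, rh_percent, temp_c):
    return (0 <= co2_ppm <= 3000
            and 0 <= rh_percent <= 95
            and 0 <= temp_c <= 45)

# A typical elevated-CO2 greenhouse setpoint sits well inside the envelope
print(within_envelope(700, 80, 25))
```

A check like this is useful in long unattended experiments, where a drifting humidifier or heater could otherwise silently push conditions outside the sensor's specified range.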

Furthermore, the Guardian NG is easy to install as a stand-alone product in greenhouses to measure CO2, or in combination with CO2 controllers, as done by the Spanish team in their growth chamber and temperature gradient greenhouses.4,6

Conclusions
Simulating climate change scenarios with elevated CO2 concentrations is essential for understanding the potential effects of climate change on plant growth and crop yields. Accurate CO2 concentration measurements are essential for such studies, and the Edinburgh Sensors Guardian NG is an excellent option for researchers building research greenhouses for climate change simulation.

References

  1. Walthall CL, Hatfield J, Backlund P, et al. ‘Climate Change and Agriculture in the United States: Effects and Adaptation.’ USDA Technical Bulletin 1935, 2012. Available from: http://lib.dr.iastate.edu/cgi/viewcontent.cgi?article=1000&context=ge_at_reports
  2. https://www.co2.earth/2100-projections Accessed September 7th, 2017.
  3. Morales F, Pascual I, Sánchez-Díaz M, Aguirreolea J, Irigoyen JJ, Goicoechea N Antolín MC, Oyarzun M, Urdiain A, ‘Methodological advances: Using greenhouses to simulate climate change scenarios’ Plant Science 226:30-40, 2014.
  4. Aguirreolea J, Irigoyen JJ, Perez P, Martinez-Carrasco R, Sánchez-Díaz M, ‘The use of temperature gradient tunnels for studying the combined effect of CO2, temperature and water availability in N2 fixing alfalfa plants’ Annals of Applied Biology, 146:51-60, 2005.
  5. https://edinburghsensors.com/products/gas-monitors/guardian-ng/ Accessed September 7th, 2017.
@Edinst #PAuto #Food

Understanding risk: cybersecurity for the modern grid.

23/08/2017
Didier Giarratano, Marketing Cyber Security at Energy Digital Solutions/Energy, Schneider Electric, discusses how the challenge for utilities is to provide reliable energy delivery with a focus on efficiency and sustainable sources.

There’s an evolution taking place in the utilities industry to build a modern distribution automation grid. As the demand for digitised, connected and integrated operations increases across all industries, the challenge for utilities is to provide reliable energy delivery with a focus on efficiency and sustainable sources.

The pressing need to improve the uptime of critical power distribution infrastructure is forcing change. However, as power networks merge and become ‘smarter’, the benefits of improved connectivity also bring greater cybersecurity risks, threatening to impact progress.

Grid complexity in a new world of energy
Electrical distribution systems across Europe were originally built for centralised generation and passive loads – not for handling evolving levels of energy consumption or complexity. Yet, we are entering a new world of energy. One with more decentralised generation, intermittent renewable sources like solar and wind, a two-way flow of decarbonised energy, as well as an increasing engagement from demand-side consumers.

The grid is now moving to a more decentralised model, disrupting traditional power delivery and creating more opportunities for consumers and businesses to contribute back into the grid with renewables and other energy sources. As a result, the coming decades will see a new kind of energy consumer: one that manages energy production and usage to drive cost, reliability and sustainability tailored to their specific needs.

The rise of distributed energy is increasing grid complexity. It is evolving the industry from a traditional value chain to a more collaborative environment. One where customers dynamically interface with the distribution grid and energy suppliers, as well as the wider energy market. Technology and business models will need to evolve for the power industry to survive and thrive.

The new grid will be considerably more digitised, more flexible and dynamic. It will be increasingly connected, with greater requirements for performance in a world where electricity makes up a higher share of the overall energy mix. There will be new actors involved in the power ecosystem such as transmission system operators (TSOs), distribution system operators (DSOs), distributed generation operators, aggregators and prosumers.

Regulation and compliance
Cyber security deployment has so far focused on meeting standards and regulatory compliance. This approach benefits the industry by increasing awareness of the risks and challenges associated with a cyberattack. As the electrical grid evolves in complexity, with the additions of distributed energy resource integration and feeder automation, a new approach is required: one oriented towards risk management.

Currently, utility stakeholders are applying cyber security processes learned from their IT peers, which is putting them at risk. Within the substation environment, proprietary devices once dedicated to specialised applications are now vulnerable. Sensitive information available online that describes how these devices work, can be accessed by anyone, including those with malicious intent.

With the right skills, malicious actors can hack a utility and damage systems that control the grid. In doing so, they also risk the economy and security of a country or region served by that grid.

Regulators have anticipated the need for a structured cyber security approach. In the U.S. the North American Electric Reliability Corporation Critical Infrastructure Protection (NERC CIP) requirements set out what is needed to secure North America’s electric system. The European Programme for Critical Infrastructure Protection (EPCIP) does much the same in Europe. We face new and complex attacks every day, some of which are organised by state actors, which is leading to a reconsideration of these and the overall security approach for the industry.

Developing competencies and cross-functional teams for IT-OT integration

Due to the shift towards open communication platforms, such as Ethernet and IP, systems that manage critical infrastructure have become increasingly vulnerable. As operators of critical utility infrastructure investigate how to secure their systems, they often look to more mature cybersecurity practices. However, the IT approach to cybersecurity is not always appropriate with the operational constraints utilities are facing.

These differences in approach mean that cybersecurity solutions and expertise geared toward the IT world are often inappropriate for operational technology (OT) applications. Sophisticated attacks today are able to leverage cooperating services, like IT and telecommunications. As utilities experience the convergence of IT and OT, it becomes necessary to develop cross-functional teams to address the unique challenges of securing technology that spans both worlds.

Protecting against cyber threats now requires greater cross-domain activity, where engineers, IT managers and security managers share their expertise to identify the potential issues and attacks affecting their systems.

A continuous process: assess, design, implement and manage
Cybersecurity experts agree that standards by themselves will not bring the appropriate security level. It’s not a matter of having ‘achieved’ a cyber secure state. Adequate protection from cyber threats requires a comprehensive set of measures, processes, technical means and an adapted organisation.

It is important for utilities to think about how organisational cybersecurity strategies will evolve over time. This is about staying current with known threats in a planned and iterative manner. Ensuring a strong defence against cyberattacks is a continuous process and requires an ongoing effort and a recurring annual investment. Cybersecurity is about people, processes and technology. Utilities need to deploy a complete programme consisting of proper organisation, processes and procedures to take full advantage of cybersecurity protection technologies.

To establish and maintain cyber secure systems, utilities can follow a four-point approach:

1. Conduct a risk assessment
The first step involves conducting a comprehensive risk assessment based on internal and external threats. By doing so, OT specialists and other utility stakeholders can understand where the largest vulnerabilities lie, and inform the creation of the security policy and risk mitigation plan.

2. Design a security policy and processes
A utility’s cybersecurity policy provides a formal set of rules to be followed. These should be guided by the International Organisation for Standardisation (ISO) and International Electrotechnical Commission (IEC) family of standards (ISO27k), which provides best-practice recommendations on information security management. The purpose of a utility’s policy is to inform employees, contractors and other authorised users of their obligations regarding the protection of technology and information assets. It lists the assets that must be protected, identifies threats to those assets, describes authorised users’ responsibilities and associated access privileges, and defines unauthorised actions and the resulting accountability for violations of the policy. Well-designed security processes are equally important. As system security baselines change to address emerging vulnerabilities, cybersecurity processes must be reviewed and updated regularly to follow this evolution. One key to maintaining an effective security baseline is to conduct a review once or twice a year.

3. Execute projects that implement the risk mitigation plan
Select cybersecurity technology based on international standards to ensure the security policy and proposed risk mitigation actions can be followed. A ‘secure by design’ approach based on international standards such as IEC 62351 and IEEE 1686 can further reduce risk when securing system components.

4. Manage the security programme
Effectively managing a cybersecurity programme requires not only the previous three points but also management of information and communication asset lifecycles. To do that, it is important to maintain accurate, living documentation of asset firmware, operating systems and configurations. It also requires a comprehensive understanding of technology upgrade and obsolescence schedules, together with full awareness of known vulnerabilities and existing patches. Certain events should also trigger fresh assessments, such as key points in an asset’s life cycle or a detected threat.
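The "living documentation" idea above can be made concrete with a minimal sketch (again illustrative only, with invented firmware versions, dates and patch entries) of an asset register review that flags pending patches and obsolete equipment:

```python
# Hedged illustration: a minimal asset register review for communication
# assets. All versions, dates and patch mappings are invented examples.

from datetime import date

# Hypothetical register of assets and their installed firmware.
asset_register = [
    {"asset": "feeder relay A", "firmware": "2.1.0",
     "end_of_support": date(2019, 6, 30)},
    {"asset": "gateway B", "firmware": "4.4.2",
     "end_of_support": date(2025, 1, 1)},
]

# Firmware versions for which a newer vendor patch exists (illustrative).
patched_versions = {"2.1.0": "2.1.3"}

def review(register, today):
    """Return assets needing action: a pending patch or past end of support."""
    findings = []
    for a in register:
        if a["firmware"] in patched_versions:
            findings.append((a["asset"],
                             "patch to " + patched_versions[a["firmware"]]))
        if a["end_of_support"] < today:
            findings.append((a["asset"], "obsolete: past end of support"))
    return findings

print(review(asset_register, date(2020, 1, 1)))
```

Running such a review on a schedule, and whenever a new vulnerability is published, is one way to implement the event-triggered assessments described above.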

For utilities, security is everyone’s business. Politicians and the public are increasingly aware that national security depends on the robustness of local utilities. Mitigating risk and anticipating attack vulnerabilities on utility grids and systems is not just about installing technology. Utilities must also implement organisational processes to meet the challenges of a decentralised grid. This means regular assessment and continuous improvement of their cybersecurity and physical security processes to safeguard our new world of energy.

@SchneiderElec #PAuto #Power

Towards a liveable Earth!

08/08/2017

Addressing global issues through co-innovation to create new value!

Yokogawa has developed sustainability goals for the year 2050 that will guide its efforts to make the world a better place for future generations.

Yokogawa’s efforts to achieve a sustainable society are in keeping with the Paris Agreement, which was adopted in 2015 at the 21st session of the Conference of the Parties (COP21) to the United Nations Framework Convention on Climate Change to provide a basis for global efforts to tackle issues related to climate change. The agreement calls for the achievement of net-zero greenhouse gas emissions by the second half of this century. Also in 2015, the UN adopted the 2030 Agenda for Sustainable Development, centering on the Sustainable Development Goals (SDGs). Through these initiatives, a global consensus is developing on how to address these issues, and the direction that companies should take is becoming clear.

Yokogawa’s efforts to achieve sustainability and build a brighter future for all are based on the company’s corporate philosophy, which states: “As a company, our goal is to contribute to society through broad-ranging activities in the areas of measurement, control, and information. Individually, we aim to combine good citizenship with the courage to innovate.” To ensure a flexible response to environmental and technology changes and guide its long-term efforts to address social issues, Yokogawa is committing itself to the achievement of goals that are based on a vision of where our society should be by the year 2050. Through the selection of products and solutions and the formulation of medium-term business plans and the like that are based on environmental, economic, and societal considerations, Yokogawa will carry out the detailed tasks needed to achieve these goals.

Commenting on this initiative, Takashi Nishijima, Yokogawa President and CEO, says: “Companies have a growing responsibility to respond to issues such as population growth and the rising use of fossil fuels that are addressed in the Paris Agreement and the SDGs. Yokogawa provides solutions that improve the stability, efficiency, and safety of operations at industrial plants and other infrastructure facilities by, for example, speeding up processes, reducing workloads, and saving energy. Yokogawa needs to work harder to broaden its solutions so that it can address other issues that impact our society. Yokogawa will establish key performance indicators (KPIs) to evaluate on a medium-term basis the achievement of its sustainability goals, and will continue to create new value through co-innovation with its stakeholders.”


Statement on Yokogawa’s aspiration for sustainability
Yokogawa will work to achieve net-zero emissions, make the transition to a circular economy, and ensure the well-being of all by 2050, thus making the world a better place for future generations.

We will undergo the necessary transformation to achieve these goals by (1) becoming more adaptable and resilient, (2) evolving our businesses to engage in regenerative value creation, and (3) promoting co-innovation with our stakeholders.

Achieve net-zero emissions; stopping climate change
Climate change is an urgent issue that requires a global response. We aim for net-zero emissions, a state in which greenhouse gas concentrations in the atmosphere do not rise because emissions are balanced by absorption; this can be accomplished through the introduction of renewable energy and the efficient use of energy. We are also working to reduce the impact of natural disasters and respond to biodiversity issues.

Make the transition to a circular economy; circulation of resources and efficiency
The transformation from a one-way economy based on the take-make-dispose model to an economy where resources are circulated without waste, together with the transition to businesses that emphasize services, is under way. We aim to realize a social framework and ecosystem in which various resources are circulated without waste and assets are utilized effectively. We are also contributing to the efficient use of water resources and the supply of safe drinking water.

Ensure well-being; quality life for all
With the aim of achieving the physical, mental, and social well-being described in the 2030 Agenda for Sustainable Development adopted by the United Nations in 2015, we support people’s health and prosperity through the achievement of safe and comfortable workplaces and our pursuits in such areas as the life sciences and drug discovery. We promote human resource development and employment creation in local communities, alongside diversity and inclusion.


@YokogawaIA #PAuto @UNFCCC